The Journal of Interrupted Studies


“…The Journal of Interrupted Studies is an interdisciplinary journal dedicated to the work of academics whose studies have been interrupted by forced migration. Publishing both complete and incomplete articles, the Journal is currently accepting submissions in the sciences and humanities….

By embracing a multidisciplinary approach, the journal offers a platform for all academic endeavours thwarted by forced migration, especially with regard to the ongoing crises in Syria, Afghanistan and Eritrea. We invite any and all students and academics who were interrupted in their studies and are now considered refugees to submit work.

Engaging in this process, we hope to create a conversation in which all participants can shape the discourse on terms of dignity and mutual respect. We believe academia allows us to initiate such a dialogue and, in the process, create something of value for all parties.

Refugee status according to the European Union’s directives 2013/32/EU and 2013/33/EU is by no means a requirement for submitting to the Journal. We also wish to attract exiled academics who cannot return to their countries and universities without putting their lives at risk.

We believe that when academic voices are silenced by adversity it is not only the intellectual community that suffers…(More)

Special issue on “the behavioural turn in public policy: new evidence from experiments”


Introduction to the special issue in Economia Politica by Francesco Bogliacino, Cristiano Codagnone and Giuseppe A. Veltri: “Since the publication of the best seller Nudge (Thaler and Sunstein 2008), the growth in the relevance of ‘Behavioural Economics’ (BE) and ‘Nudging’ has been exponential, both in terms of the adoption of behavioural perspectives in policy making and of ongoing academic research. With some simplification three strands can be singled out. First, the widespread application and institutionalisation of behaviourally inspired policy-making beyond the two initial cases of the US and the UK (Lunn 2014; Sousa Lourenço et al. 2016). Second, a discussion within the field of economics as to the place and contribution of BE toward ‘Evidence Based Economics’ (Chetty 2015; Thaler 2016). Third, the explosion between 2010 and 2016 of a multidisciplinary and multi-domain meta-literature of commentaries and essays for and against ‘Nudging’ that deal with its conceptual, theoretical, and philosophical underpinnings, as well as with its political and ethical implications…

In this editorial we briefly consider the three trends outlined above (diffusion of behavioural policy-making, evidence-based economics, and the meta-literature on nudging) and argue in favour of a fruitful dialogue, which is currently missing. In doing this, we sketch the policy triangle of politics, value and evidence as a potential guidance…(More).

Comparing resistance to open data performance measurement


Paper by Gregory Michener and Otavio Ritter in Public Administration: “Much is known about governmental resistance to disclosure laws, less so about multi-stakeholder resistance to open data. This study compares open data initiatives within the primary and secondary school systems of Brazil and the UK, focusing on stakeholder resistance and corresponding policy solutions. The analytical framework is based on the ‘Three-Ps’ of open data resistance to performance metrics, corresponding to professional, political, and privacy-related concerns. Evidence shows that resistance is highly nuanced, as stakeholders alternately serve as both principals and agents. School administrators, for example, are simultaneously principals to service providers and teachers, and at once agents to parents and politicians. Relying on a different systems comparison, in-depth interviews, and newspaper content analyses, we find that similar stakeholders across countries demonstrate strikingly divergent levels of resistance. In overcoming stakeholder resistance – across socioeconomic divides – context-conscientious ‘data-informed’ evaluations may promote greater acceptance than narrowly ‘data-driven’ performance measurements…(More)”

Making Sense of Statistics


Report for the BBC Trust: “The BBC, as the UK’s main public service broadcaster, has a particularly important role to play in bringing statistics to public attention and helping audiences to digest, understand and apply them to their daily lives. Accuracy and impartiality have a specific meaning when applied to statistics. Reporting accurately and impartially on critical and sometimes controversial topics requires understanding the data that informs them and accurate and impartial presentation of that data.

Overall, the BBC is to be commended for its approach to the use of statistics. People at the BBC place great value on using statistics responsibly. Journalists often go to some lengths to verify the statistics they receive. They exercise judgement when deciding which statistics to cover and the BBC has a strong record in selecting and presenting statistics effectively. Journalists and programme makers often make attempts to challenge conventional wisdom and provide independent assessments of stories reported elsewhere. Many areas of the BBC give careful thought to the way in which statistics are presented for audiences and the BBC has prioritised responsiveness to mistakes in recent years.

Informed by the evidence supporting this report, including Cardiff University’s content analysis and Oxygen Brand Consulting’s audience research study, we have nevertheless identified some areas for improvement. These include the following:

Contextualising statistics: Numbers are sometimes used by the BBC in ways which make it difficult for audiences to understand whether they are really big or small, worrying or not. Audiences have difficulty in particular in interpreting “big numbers”. And a number on its own, without trends or comparisons, rarely means much. We recommend that much more is done to ensure that statistics are always contextualised in such a way that audiences can understand their significance.

Interpreting, evaluating and “refereeing” statistics: …The BBC needs to get better and braver in interpreting and explaining rival statistics and guiding the audience. Going beyond the headlines: there is also a need for more regular, deeper investigation of the figures underlying sources such as press releases. This is especially pertinent as the Government is the predominant source of statistics on the BBC. We cannot expect, and do not suggest it is necessary for, all journalists to have access to and a full understanding of every single statistic which is in the public domain. But there is a need to look beyond the headlines to ask how the figures were obtained and whether they seem sensible. Failure to dig deeper into the data also represents lost opportunities to provide new and broader insights on topical issues. For example, reporting GDP per head of population might give a different perspective on the economy than GDP alone, and we would like to see such analyses covered by the BBC more often. Geographic breakdowns could enhance reporting on the devolved UK.

We recommend that “Reality Check” becomes a permanent feature of the BBC’s activities, with a prominent online presence, reinforcing the BBC’s commitment to providing well-informed, accurate information on topical and important issues.

…The BBC needs to have the internal capacity to question press releases, relate them to other data sources and, if necessary, do some additional calculations – for example translating relative to absolute risk. There remains a need for basic training on, for example, percentages and percentage change, and nominal and real financial numbers….(More)”
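
The kinds of calculation the report has in mind are simple but easy to get wrong on air. Below is a minimal, hypothetical Python sketch of the two translations it mentions, relative to absolute risk and nominal to real change; the figures and function names are illustrative assumptions, not taken from the report.

```python
# Illustrative only: hypothetical figures, not taken from the BBC Trust report.

def absolute_from_relative(baseline_risk: float, relative_increase: float) -> float:
    """Translate a relative risk increase into an absolute risk.

    A '20% increased risk' on a baseline of 5 in 1,000 means roughly
    6 in 1,000: an absolute rise of 1 person per 1,000.
    """
    return baseline_risk * (1 + relative_increase)


def real_change(nominal_now: float, nominal_then: float, inflation: float) -> float:
    """Percentage change after stripping out inflation (nominal -> real terms)."""
    real_now = nominal_now / (1 + inflation)
    return (real_now - nominal_then) / nominal_then * 100


baseline = 5 / 1000  # hypothetical 0.5% baseline risk
new_risk = absolute_from_relative(baseline, 0.20)
print(f"Absolute risk rises from {baseline:.3%} to {new_risk:.3%}")
# -> from 0.500% to 0.600%: one extra case per 1,000, despite the '20%' headline.

print(f"Real-terms change: {real_change(103.0, 100.0, 0.025):+.1f}%")
# -> a 3% nominal rise with 2.5% inflation is only about +0.5% in real terms.
```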

Playing politics: exposing the flaws of nudge thinking


Book Review by Pat Kane in The New Scientist: “The cover of this book echoes its core anxiety. A giant foot presses down on a sullen, Michael Jackson-like figure – a besuited citizen coolly holding off its massive weight. This is a sinister image to associate with a volume (and its author, Cass Sunstein) that should be able to proclaim a decade of success in the government’s use of “behavioural science”, or nudge theory. But doubts are brewing about its long-term effectiveness in changing public behaviour – as well as about its selective account of evolved human nature.

Nudging has had a strong and illustrious run at the highest level. Outgoing US President Barack Obama and former UK Prime Minister David Cameron both set up behavioural science units at the heart of their administrations (Sunstein was the administrator of the White House Office of Information and Regulatory Affairs from 2009 to 2012).

Sunstein insists that the powers that be cannot avoid nudging us. Every shop floor plan, every new office design, every commercial marketing campaign, every public information campaign, is an “architecting of choices”. As anyone who ever tried to leave IKEA quickly will suspect, that endless, furniture-strewn path to the exit is no accident.

Nudges “steer people in particular directions, but also allow them to go their own way”. They are entreaties to change our habits, to accept old or new norms, but they presume that we are ultimately free to refuse the request.

However, our freedom is easily constrained by “cognitive biases”. Our brains, say the nudgers, are lazy, energy-conserving mechanisms, often overwhelmed by information. So a good way to ensure that people pay into their pensions, for example, is to set payment as a “default” in employment contracts, so the employee has to actively untick the box. Defaults of all kinds exploit our preference for inertia and the status quo in order to increase future security….

Sunstein makes useful distinctions between nudges and the other things governments and enterprises can do. Nudges are not “mandates” (laws, regulations, punishments). A mandate would be, for example, a rigorous and well-administered carbon tax, secured through a democratic or representative process. A “nudge” puts smiley faces on your energy bill, and compares your usage to that of the eco-efficient Joneses next door (nudgers like to game our herd-like social impulses).

In a fascinating survey section, which asks Americans and others what they actually think about being the subjects of the “architecting” of their choices, Sunstein discovers that “if people are told that they are being nudged, they will react adversely and resist”.

This is why nudge thinking may be faltering – its understanding of human nature unnecessarily (and perhaps expediently) downgrades our powers of conscious thought….(More)

See The Ethics of Influence: Government in the Age of Behavioral Science by Cass R. Sunstein, Cambridge University Press

Talent Gap Is a Main Roadblock as Agencies Eye Emerging Tech


Theo Douglas in GovTech: “U.S. public service agencies are closely eyeing emerging technologies, chiefly advanced analytics and predictive modeling, according to a new report from Accenture, but like their counterparts globally they must address talent and complexity issues before adoption rates will rise.

The report, Emerging Technologies in Public Service, compiled a nine-nation survey of IT officials across all levels of government in policing and justice, health and social services, revenue, border services, pension/Social Security and administration, and was released earlier this week.

It revealed a deep interest in emerging tech from the public sector, finding 70 percent of agencies are evaluating their potential — but a much lower adoption level, with just 25 percent going beyond piloting to implementation….

The revenue and tax industries have been early adopters of advanced analytics and predictive modeling, he said, while biometrics and video analytics are resonating with police agencies.

In Australia, the tax office found using voiceprint technology could save 75,000 work hours annually.

Closer to home, Utah Chief Technology Officer Dave Fletcher told Accenture that consolidating data centers into a virtualized infrastructure improved speed and flexibility, so some processes that once took weeks or months can now happen in minutes or hours.

Nationally, 70 percent of agencies have either piloted or implemented an advanced analytics or predictive modeling program. Biometrics and identity analytics were the next most popular technologies, with 29 percent piloting or implementing, followed by machine learning at 22 percent.

Those numbers contrast globally with Australia, where 68 percent of government agencies have charged into piloting and implementing biometric and identity analytics programs; and Germany and Singapore, where 27 percent and 57 percent of agencies respectively have piloted or adopted video analytic programs.

Overall, 78 percent of respondents said they were either underway or had implemented some machine-learning technologies.

The identified benefits of embracing emerging tech ranged from finding better ways of working through automation to innovating, developing new services and reducing costs.

Agencies told Accenture their No. 1 objective was increasing customer satisfaction. But 89 percent said they’d expect a return on implementing intelligent technology within two years. Four-fifths, or 80 percent, agreed intelligent tech would improve employees’ job satisfaction….(More).

From Tech-Driven to Human-Centred: Opengov has a Bright Future Ahead


Essay by Martin Tisné: “The anti-corruption and transparency field ten years ago was in pre-iPhone mode. Few if any of us spoke of the impact or relevance of technology to what would become known as the open government movement. When the wave of smartphone and other technology hit from the late 2000s onwards, it hit hard, and scaled fast. The ability of technology to create ‘impact at scale’ became the obvious truism of our sector, so much so that pointing out the failures of techno-utopianism became a favorite pastime for pundits and academics. The technological developments of the next ten years will be more human-centered — less ‘build it and they will come’ — and more aware of the unintended consequences of technology (e.g. the fairness of Artificial Intelligence decision-making) whilst still being deeply steeped in the technology itself.

By 2010, two major open data initiatives had launched and were already seen as successful in the US and UK, one of President Obama’s first memorandums was on openness and transparency, and an international research project had tracked 63 different instances of uses of technology for transparency around the world (from Reclamos in Chile, to I Paid a Bribe in India, via Maji Matone in Tanzania). Open data projects numbered over 200 worldwide within barely a year of data.gov.uk launching and, to everyone’s surprise, topped the list of Open Government Partnership commitments a few years hence.

The technology genie won’t go back into the bottle: the field will continue to grow alongside technological developments. But it would take a bold or foolish pundit to guess which of blockchain or other developments will have radically changed the field by 2025.

What is clearer is that the sector is more questioning towards technology, more human-centered both in the design of those technologies and in seeking to understand and pre-empt their impact….

We’ve moved from cyber-utopianism less than ten years ago to born-digital organisations taking a much more critical look at the deployment of technology. The evangelical phase of the open data movement is coming to an end. The movement no longer needs to preach the virtues of unfettered openness to get a foot in the door. It seeks to frame the debate as to whether, when and how data might legitimately be shared or closed, and what impacts those releases may have on privacy, surveillance and discrimination. An open government movement that is more human-centered and aware of the unintended consequences of technology has a bright and impactful future ahead….(More)”

What’s wrong with big data?


James Bridle in the New Humanist: “In a 2008 article in Wired magazine entitled “The End of Theory”, Chris Anderson argued that the vast amounts of data now available to researchers made the traditional scientific process obsolete. No longer would they need to build models of the world and test them against sampled data. Instead, the complexities of huge and totalising datasets would be processed by immense computing clusters to produce truth itself: “With enough data, the numbers speak for themselves.” As an example, Anderson cited Google’s translation algorithms which, with no knowledge of the underlying structures of languages, were capable of inferring the relationship between them using extensive corpora of translated texts. He extended this approach to genomics, neurology and physics, where scientists are increasingly turning to massive computation to make sense of the volumes of information they have gathered about complex systems. In the age of big data, he argued, “Correlation is enough. We can stop looking for models.”

This belief in the power of data, of technology untrammelled by petty human worldviews, is the practical cousin of more metaphysical assertions. A belief in the unquestionability of data leads directly to a belief in the truth of data-derived assertions. And if data contains truth, then it will, without moral intervention, produce better outcomes. Speaking at Google’s private London Zeitgeist conference in 2013, Eric Schmidt, Google Chairman, asserted that “if they had had cellphones in Rwanda in 1994, the genocide would not have happened.” Schmidt’s claim was that technological visibility – the rendering of events and actions legible to everyone – would change the character of those actions. Not only is this statement historically inaccurate (there was plenty of evidence available of what was occurring during the genocide from UN officials, US satellite photographs and other sources), it’s also demonstrably untrue. Analysis of unrest in Kenya in 2007, when over 1,000 people were killed in ethnic conflicts, showed that mobile phones not only spread but accelerated the violence. But you don’t need to look to such extreme examples to see how a belief in technological determinism underlies much of our thinking and reasoning about the world.

“Big data” is not merely a business buzzword, but a way of seeing the world. Driven by technology, markets and politics, it has come to determine much of our thinking, but it is flawed and dangerous. It runs counter to our actual findings when we employ such technologies honestly and with the full understanding of their workings and capabilities. This over-reliance on data, which I call “quantified thinking”, has come to undermine our ability to reason meaningfully about the world, and its effects can be seen across multiple domains.

The assertion is hardly new. Writing in the Dialectic of Enlightenment in 1947, Theodor Adorno and Max Horkheimer decried “the present triumph of the factual mentality” – the predecessor to quantified thinking – and succinctly analysed the big data fallacy, set out by Anderson above. “It does not work by images or concepts, by the fortunate insights, but refers to method, the exploitation of others’ work, and capital … What men want to learn from nature is how to use it in order wholly to dominate it and other men. That is the only aim.” What is different in our own time is that we have built a world-spanning network of communication and computation to test this assertion. While it occasionally engenders entirely new forms of behaviour and interaction, the network most often shows to us with startling clarity the relationships and tendencies which have been latent or occluded until now. In the face of the increased standardisation of knowledge, it becomes harder and harder to argue against quantified thinking, because the advances of technology have been conjoined with the scientific method and social progress. But as I hope to show, technology ultimately reveals its limitations….

“Eroom’s law” – Moore’s law backwards – was recently formulated to describe a problem in pharmacology. Drug discovery has been getting more expensive. Since the 1950s, the number of drugs approved for use in human patients per billion US dollars spent on research and development has halved every nine years. This problem has long perplexed researchers. According to the principles of technological growth, the trend should be in the opposite direction. In a 2012 paper in Nature entitled “Diagnosing the decline in pharmaceutical R&D efficiency”, the authors propose and investigate several possible causes for this. They begin with social and physical influences, such as increased regulation, increased expectations and the exhaustion of easy targets (the “low hanging fruit” problem). Each of these is – with qualifications – disposed of, leaving open the question of the discovery process itself….(More)
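
To make the “halved every nine years” claim concrete, here is a small, illustrative Python sketch of the implied compound decline; the time spans chosen are assumptions for illustration, not figures taken from the Nature paper.

```python
# Back-of-the-envelope sketch of the trend described above: if drugs approved
# per billion (inflation-adjusted) R&D dollars halve every nine years, the
# cumulative decline compounds quickly. Time spans are illustrative assumptions.

HALVING_PERIOD_YEARS = 9


def decline_factor(years: float) -> float:
    """How many times smaller the approvals-per-billion-dollars figure becomes."""
    return 2 ** (years / HALVING_PERIOD_YEARS)


for years in (9, 30, 60):
    print(f"After {years} years: roughly {decline_factor(years):.0f}x fewer "
          f"approvals per billion dollars of R&D spending")
# After 60 years (e.g. 1950 to 2010) the implied decline is on the order of 100-fold.
```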

Is Social Media Killing Democracy?


Phil Howard at Culture Digitally: “This is the big year for computational propaganda—using immense data sets to manipulate public opinion over social media.  Both the Brexit referendum and US election have revealed the limits of modern democracy, and social media platforms are currently setting those limits. 

Platforms like Twitter and Facebook now provide a structure for our political lives.  We’ve always relied on many kinds of sources for our political news and information.  Family, friends, news organizations, charismatic politicians certainly predate the internet.  But whereas those are sources of information, social media now provides the structure for political conversation.  And the problem is that these technologies permit too much fake news, encourage our herding instincts, and aren’t expected to provide public goods.

First, social algorithms allow fake news stories from untrustworthy sources to spread like wildfire over networks of family and friends.  …

Second, social media algorithms provide very real structure to what political scientists often call “elective affinity” or “selective exposure”…

The third problem is that technology companies, including Facebook and Twitter, have been given a “moral pass” on the obligations we hold journalists and civil society groups to….

Facebook has run several experiments now, published in scholarly journals, demonstrating that they have the ability to accurately anticipate and measure social trends.  Whereas journalists and social scientists feel an obligation to openly analyze and discuss public preferences, we do not expect this of Facebook.  The network effects that clearly were unmeasured by pollsters were almost certainly observable to Facebook.  When it comes to news and information about politics, or public preferences on important social questions, Facebook has a moral obligation to share data and prevent computational propaganda.  The Brexit referendum and US election have taught us that Twitter and Facebook are now media companies.  Their engineering decisions are effectively editorial decisions, and we need to expect more openness about how their algorithms work.  And we should expect them to deliberate about their editorial decisions.

There are some ways to fix these problems.  Opaque software algorithms shape what people find in their news feeds.  We’ve all noticed fake news stories, often called clickbait, and while these can be an entertaining part of using the internet, it is bad when they are used to manipulate public opinion.  These algorithms work as “bots” on social media platforms like Twitter, where they were used in both the Brexit and US Presidential campaigns to aggressively advance the case for leaving Europe and the case for electing Trump.  Similar algorithms work behind the scenes on Facebook, where they govern what content from your social networks actually gets your attention. 

So the first way to strengthen democratic practices is for academics, journalists, policy makers and the interested public to audit social media algorithms….(More)”.

Digital Government: Leveraging Innovation to Improve Public Sector Performance and Outcomes for Citizens


Book edited by Svenja Falk, Andrea Römmele and Michael Silverman: “This book focuses on the implementation of digital strategies in the public sectors in the US, Mexico, Brazil, India and Germany. The case studies presented examine different digital projects by looking at their impact as well as their alignment with their national governments’ digital strategies. The contributors assess the current state of digital government, analyze the contribution of digital technologies in achieving outcomes for citizens, discuss ways to measure digitalization and address the question of how governments oversee the legal and regulatory obligations of information technology. The book argues that most countries formulate good strategies for digital government, but do not effectively prescribe and implement corresponding policies and programs. Showing specific programs that deliver results can help policy makers, knowledge specialists and public-sector researchers to develop best practices for future national strategies….(More)”