Comparing resistance to open data performance measurement


Paper by Gregory Michener and Otavio Ritter in Public Administration: “Much is known about governmental resistance to disclosure laws, less so about multi-stakeholder resistance to open data. This study compares open data initiatives within the primary and secondary school systems of Brazil and the UK, focusing on stakeholder resistance and corresponding policy solutions. The analytical framework is based on the ‘Three-Ps’ of open data resistance to performance metrics, corresponding to professional, political, and privacy-related concerns. Evidence shows that resistance is highly nuanced, as stakeholders alternately serve as both principals and agents. School administrators, for example, are simultaneously principals to service providers and teachers, and at once agents to parents and politicians. Relying on a most different systems comparison, in-depth interviews, and newspaper content analyses, we find that similar stakeholders across countries demonstrate strikingly divergent levels of resistance. In overcoming stakeholder resistance – across socioeconomic divides – context-conscientious ‘data-informed’ evaluations may promote greater acceptance than narrowly ‘data-driven’ performance measurements…(More)”

Making Sense of Statistics


Report for the BBC Trust: “The BBC, as the UK’s main public service broadcaster, has a particularly important role to play in bringing statistics to public attention and helping audiences to digest, understand and apply them to their daily lives. Accuracy and impartiality have a specific meaning when applied to statistics. Reporting accurately and impartially on critical and sometimes controversial topics requires understanding the data that informs them and accurate and impartial presentation of that data.

Overall, the BBC is to be commended in its approach to the use of statistics. People at the BBC place great value on using statistics responsibly. Journalists often go to some lengths to verify the statistics they receive. They exercise judgement when deciding which statistics to cover and the BBC has a strong record in selecting and presenting statistics effectively. Journalists and programme makers often make attempts to challenge conventional wisdom and provide independent assessments of stories reported elsewhere. Many areas of the BBC give careful thought to the way in which statistics are presented for audiences and the BBC has prioritised responsiveness to mistakes in recent years.

Informed by the evidence supporting this report, including Cardiff University’s content analysis and Oxygen Brand Consulting’s audience research study, we have nevertheless identified some areas for improvement. These include the following:

Contextualising statistics: Numbers are sometimes used by the BBC in ways which make it difficult for audiences to understand whether they are really big or small, worrying or not. Audiences have difficulty in particular in interpreting “big numbers”. And a number on its own, without trends or comparisons, rarely means much. We recommend that much more is done to ensure that statistics are always contextualised in such a way that audiences can understand their significance.

Interpreting, evaluating and “refereeing” statistics: …The BBC needs to get better and braver in interpreting and explaining rival statistics and guiding the audience. Going beyond the headlines: There is also a need for more regular, deeper investigation of the figures underlying sources such as press releases. This is especially pertinent as the Government is the predominant source of statistics on the BBC. We cannot expect, and do not suggest it is necessary for, all journalists to have access to and a full understanding of every single statistic which is in the public domain. But there is a need to look beyond the headlines to ask how the figures were obtained and whether they seem sensible. Failure to dig deeper into the data also represents lost opportunities to provide new and broader insights on topical issues. For example, reporting GDP per head of population might give a different perspective on the economy than GDP alone, and we would like to see such analyses covered by the BBC more often. Geographic breakdowns could enhance reporting on the devolved UK.

We recommend that “Reality Check” becomes a permanent feature of the BBC’s activities, with a prominent online presence, reinforcing the BBC’s commitment to providing well-informed, accurate information on topical and important issues.

…The BBC needs to have the internal capacity to question press releases, relate them to other data sources and, if necessary, do some additional calculations – for example, translating relative risk into absolute risk. There remains a need for basic training on, for example, percentages and percentage change, and nominal and real financial numbers….(More)”
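
To make the distinction concrete, here is a minimal sketch of the two calculations mentioned above (an editor's illustration of the arithmetic, not code from the BBC report): translating a relative risk into an absolute one, and separating a percentage change from a percentage-point change.

```python
# Illustrative arithmetic only; the figures below are made up.

def absolute_risk_increase(baseline_risk: float, relative_risk: float) -> float:
    """Translate a relative risk into an absolute increase.
    A "50% higher risk" headline (relative_risk = 1.5) on a 2% baseline
    is an absolute increase of just one percentage point."""
    return baseline_risk * (relative_risk - 1)

def percentage_change(old: float, new: float) -> float:
    """A rate moving from 4% to 5% is a 25% rise..."""
    return (new - old) / old * 100

def percentage_point_change(old: float, new: float) -> float:
    """...but only a 1 percentage-point rise."""
    return new - old

if __name__ == "__main__":
    print(absolute_risk_increase(0.02, 1.5))   # 0.01, i.e. one percentage point
    print(percentage_change(4.0, 5.0))         # 25.0 (%)
    print(percentage_point_change(4.0, 5.0))   # 1.0 (percentage points)
```

Conflating these framings is precisely the kind of error the report wants newsrooms equipped to catch.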

Playing politics: exposing the flaws of nudge thinking


Book Review by Pat Kane in The New Scientist: “The cover of this book echoes its core anxiety. A giant foot presses down on a sullen, Michael Jackson-like figure – a besuited citizen coolly holding off its massive weight. This is a sinister image to associate with a volume (and its author, Cass Sunstein) that should be able to proclaim a decade of success in the government’s use of “behavioural science”, or nudge theory. But doubts are brewing about its long-term effectiveness in changing public behaviour – as well as about its selective account of evolved human nature.


Nudging has had a strong and illustrious run at the highest level. Outgoing US President Barack Obama and former UK Prime Minister David Cameron both set up behavioural science units at the heart of their administrations (Sunstein was the administrator of the White House Office of Information and Regulatory Affairs from 2009 to 2012).

Sunstein insists that the powers that be cannot avoid nudging us. Every shop floor plan, every new office design, every commercial marketing campaign, every public information campaign, is an “architecting of choices”. As anyone who ever tried to leave IKEA quickly will suspect, that endless, furniture-strewn path to the exit is no accident.

Nudges “steer people in particular directions, but also allow them to go their own way”. They are entreaties to change our habits, to accept old or new norms, but they presume that we are ultimately free to refuse the request.

However, our freedom is easily constrained by “cognitive biases”. Our brains, say the nudgers, are lazy, energy-conserving mechanisms, often overwhelmed by information. So a good way to ensure that people pay into their pensions, for example, is to set payment as a “default” in employment contracts, so the employee has to actively untick the box. Defaults of all kinds exploit our preference for inertia and the status quo in order to increase future security….

Sunstein makes useful distinctions between nudges and the other things governments and enterprises can do. Nudges are not “mandates” (laws, regulations, punishments). A mandate would be, for example, a rigorous and well-administered carbon tax, secured through a democratic or representative process. A “nudge” puts smiley faces on your energy bill, and compares your usage to that of the eco-efficient Joneses next door (nudgers like to game our herd-like social impulses).

In a fascinating survey section, which asks Americans and others what they actually think about being the subjects of the “architecting” of their choices, Sunstein discovers that “if people are told that they are being nudged, they will react adversely and resist”.

This is why nudge thinking may be faltering – its understanding of human nature unnecessarily (and perhaps expediently) downgrades our powers of conscious thought….(More)

See The Ethics of Influence: Government in the Age of Behavioral Science by Cass R. Sunstein, Cambridge University Press.

Talent Gap Is a Main Roadblock as Agencies Eye Emerging Tech


Theo Douglas in GovTech: “U.S. public service agencies are closely eyeing emerging technologies, chiefly advanced analytics and predictive modeling, according to a new report from Accenture, but like their counterparts globally they must address talent and complexity issues before adoption rates will rise.

The report, Emerging Technologies in Public Service, compiled a nine-nation survey of IT officials across all levels of government in policing and justice, health and social services, revenue, border services, pension/Social Security and administration, and was released earlier this week.

It revealed a deep interest in emerging tech from the public sector, finding 70 percent of agencies are evaluating their potential — but a much lower adoption level, with just 25 percent going beyond piloting to implementation….

The revenue and tax industries have been early adopters of advanced analytics and predictive modeling, he said, while biometrics and video analytics are resonating with police agencies.

In Australia, the tax office found using voiceprint technology could save 75,000 work hours annually.

Closer to home, Utah Chief Technology Officer Dave Fletcher told Accenture that consolidating data centers into a virtualized infrastructure improved speed and flexibility, so some processes that once took weeks or months can now happen in minutes or hours.

Nationally, 70 percent of agencies have either piloted or implemented an advanced analytics or predictive modeling program. Biometrics and identity analytics were the next most popular technologies, with 29 percent piloting or implementing, followed by machine learning at 22 percent.

Those numbers contrast globally with Australia, where 68 percent of government agencies have charged into piloting and implementing biometric and identity analytics programs; and Germany and Singapore, where 27 percent and 57 percent of agencies respectively have piloted or adopted video analytic programs.

Overall, 78 percent of respondents said they were either underway or had implemented some machine-learning technologies.

The benefits of embracing emerging tech that were identified ranged from finding better ways of working through automation to innovating and developing new services and reducing costs.

Agencies told Accenture their No. 1 objective was increasing customer satisfaction. But 89 percent said they’d expect a return on implementing intelligent technology within two years. Four-fifths, or 80 percent, agreed intelligent tech would improve employees’ job satisfaction….(More).

From Tech-Driven to Human-Centred: Opengov has a Bright Future Ahead


Essay by Martin Tisné: “The anti-corruption and transparency field ten years ago was in pre-iPhone mode. Few if any of us spoke of the impact or relevance of technology to what would become known as the open government movement. When the wave of smart phone and other technology hit from the late 2000s onwards, it hit hard, and scaled fast. The ability of technology to create ‘impact at scale’ became the obvious truism of our sector, so much so that pointing out the failures of techno-utopianism became a favorite pastime for pundits and academics. The technological developments of the next ten years will be more human-centered — less ‘build it and they will come’ — and more aware of the un-intended consequences of technology (e.g. the fairness of Artificial Intelligence decision making) whilst still being deeply steeped in the technology itself.

By 2010, two major open data initiatives had launched and were already seen as successful in the US and UK, one of President Obama’s first memorandums was on openness and transparency, and an international research project had tracked 63 different instances of uses of technology for transparency around the world (from Reclamos in Chile, to I Paid a Bribe in India, via Maji Matone in Tanzania). Open data projects numbered over 200 world-wide within barely a year of data.gov.uk launching and to everyone’s surprise topped the list of Open Government Partnership commitments a few years hence.

The technology genie won’t go back into the bottle: the field will continue to grow alongside technological developments. But it would take a bold or foolish pundit to guess which of blockchain or other developments will have radically changed the field by 2025.

What is clearer is that the sector is more questioning towards technology, more human-centered both in the design of those technologies and in seeking to understand and pre-empt their impact….

We’ve moved from cyber-utopianism less than ten years ago to born-digital organisations taking a much more critical look at the deployment of technology. The evangelical phase of the open data movement is coming to an end. The movement no longer needs to preach the virtues of unfettered openness to get a foot in the door. It seeks to frame the debate as to whether, when and how data might legitimately be shared or closed, and what impacts those releases may have on privacy, surveillance and discrimination. An open government movement that is more human-centered and aware of the un-intended consequences of technology has a bright and impactful future ahead….(More)”

What’s wrong with big data?


James Bridle in the New Humanist: “In a 2008 article in Wired magazine entitled “The End of Theory”, Chris Anderson argued that the vast amounts of data now available to researchers made the traditional scientific process obsolete. No longer would they need to build models of the world and test them against sampled data. Instead, the complexities of huge and totalising datasets would be processed by immense computing clusters to produce truth itself: “With enough data, the numbers speak for themselves.” As an example, Anderson cited Google’s translation algorithms which, with no knowledge of the underlying structures of languages, were capable of inferring the relationship between them using extensive corpora of translated texts. He extended this approach to genomics, neurology and physics, where scientists are increasingly turning to massive computation to make sense of the volumes of information they have gathered about complex systems. In the age of big data, he argued, “Correlation is enough. We can stop looking for models.”

This belief in the power of data, of technology untrammelled by petty human worldviews, is the practical cousin of more metaphysical assertions. A belief in the unquestionability of data leads directly to a belief in the truth of data-derived assertions. And if data contains truth, then it will, without moral intervention, produce better outcomes. Speaking at Google’s private London Zeitgeist conference in 2013, Eric Schmidt, Google Chairman, asserted that “if they had had cellphones in Rwanda in 1994, the genocide would not have happened.” Schmidt’s claim was that technological visibility – the rendering of events and actions legible to everyone – would change the character of those actions. Not only is this statement historically inaccurate (there was plenty of evidence available of what was occurring during the genocide from UN officials, US satellite photographs and other sources), it’s also demonstrably untrue. Analysis of unrest in Kenya in 2007, when over 1,000 people were killed in ethnic conflicts, showed that mobile phones not only spread but accelerated the violence. But you don’t need to look to such extreme examples to see how a belief in technological determinism underlies much of our thinking and reasoning about the world.

“Big data” is not merely a business buzzword, but a way of seeing the world. Driven by technology, markets and politics, it has come to determine much of our thinking, but it is flawed and dangerous. It runs counter to our actual findings when we employ such technologies honestly and with the full understanding of their workings and capabilities. This over-reliance on data, which I call “quantified thinking”, has come to undermine our ability to reason meaningfully about the world, and its effects can be seen across multiple domains.

The assertion is hardly new. Writing in the Dialectic of Enlightenment in 1947, Theodor Adorno and Max Horkheimer decried “the present triumph of the factual mentality” – the predecessor to quantified thinking – and succinctly analysed the big data fallacy, set out by Anderson above. “It does not work by images or concepts, by the fortunate insights, but refers to method, the exploitation of others’ work, and capital … What men want to learn from nature is how to use it in order wholly to dominate it and other men. That is the only aim.” What is different in our own time is that we have built a world-spanning network of communication and computation to test this assertion. While it occasionally engenders entirely new forms of behaviour and interaction, the network most often shows to us with startling clarity the relationships and tendencies which have been latent or occluded until now. In the face of the increased standardisation of knowledge, it becomes harder and harder to argue against quantified thinking, because the advances of technology have been conjoined with the scientific method and social progress. But as I hope to show, technology ultimately reveals its limitations….

“Eroom’s law” – Moore’s law backwards – was recently formulated to describe a problem in pharmacology. Drug discovery has been getting more expensive. Since the 1950s the number of drugs approved for use in human patients per billion US dollars spent on research and development has halved every nine years. This problem has long perplexed researchers. According to the principles of technological growth, the trend should be in the opposite direction. In a 2012 paper in Nature entitled “Diagnosing the decline in pharmaceutical R&D efficiency” the authors propose and investigate several possible causes for this. They begin with social and physical influences, such as increased regulation, increased expectations and the exhaustion of easy targets (the “low hanging fruit” problem). Each of these are – with qualifications – disposed of, leaving open the question of the discovery process itself….(More)
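
As a back-of-the-envelope illustration of the trend Bridle cites (an editor's sketch with a placeholder baseline, not figures from the essay or the Nature paper), halving every nine years compounds dramatically:

```python
# Rough illustration of "Eroom's law" as described above: approved drugs per
# billion US$ of R&D spending halving roughly every nine years.
# The 1950 baseline of 100 is a made-up placeholder, not a real figure.

def drugs_per_billion(year: int, baseline: float = 100.0,
                      start_year: int = 1950, halving_period: float = 9.0) -> float:
    """R&D efficiency assuming a constant halving period."""
    return baseline * 0.5 ** ((year - start_year) / halving_period)

if __name__ == "__main__":
    for year in (1950, 1977, 2004, 2022):
        print(year, round(drugs_per_billion(year), 2))
    # Eight halvings over ~72 years leaves efficiency at 1/256 of the baseline,
    # which is why the decline has so perplexed researchers.
```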

Is Social Media Killing Democracy?


Phil Howard at Culture Digitally: “This is the big year for computational propaganda—using immense data sets to manipulate public opinion over social media.  Both the Brexit referendum and US election have revealed the limits of modern democracy, and social media platforms are currently setting those limits. 

Platforms like Twitter and Facebook now provide a structure for our political lives.  We’ve always relied on many kinds of sources for our political news and information.  Family, friends, news organizations, charismatic politicians certainly predate the internet.  But whereas those are sources of information, social media now provides the structure for political conversation.  And the problem is that these technologies permit too much fake news, encourage our herding instincts, and aren’t expected to provide public goods.

First, social algorithms allow fake news stories from untrustworthy sources to spread like wildfire over networks of family and friends.  …

Second, social media algorithms provide very real structure to what political scientists often call “elective affinity” or “selective exposure”…

The third problem is that technology companies, including Facebook and Twitter, have been given a “moral pass” on the obligations we hold journalists and civil society groups to….

Facebook has run several experiments now, published in scholarly journals, demonstrating that they have the ability to accurately anticipate and measure social trends.  Whereas journalists and social scientists feel an obligation to openly analyze and discuss public preferences, we do not expect this of Facebook.  The network effects that clearly were unmeasured by pollsters were almost certainly observable to Facebook.  When it comes to news and information about politics, or public preferences on important social questions, Facebook has a moral obligation to share data and prevent computational propaganda.  The Brexit referendum and US election have taught us that Twitter and Facebook are now media companies.  Their engineering decisions are effectively editorial decisions, and we need to expect more openness about how their algorithms work.  And we should expect them to deliberate about their editorial decisions.

There are some ways to fix these problems.  Opaque software algorithms shape what people find in their news feeds.  We’ve all noticed fake news stories, often called clickbait, and while these can be an entertaining part of using the internet, it is bad when they are used to manipulate public opinion.  These algorithms work as “bots” on social media platforms like Twitter, where they were used in both the Brexit and US Presidential campaigns to aggressively advance the case for leaving Europe and the case for electing Trump.  Similar algorithms work behind the scenes on Facebook, where they govern what content from your social networks actually gets your attention.
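
To make the point about engineering-as-editorial concrete, here is a deliberately simplified, hypothetical sketch of an engagement-weighted feed ranker. The weights and fields are invented for illustration; no real platform's algorithm is this simple or publicly specified.

```python
# Hypothetical engagement-weighted ranking: choosing these weights is,
# in effect, an editorial decision about what gets attention.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    from_close_friend: bool

def score(post: Post, w_likes: float = 1.0, w_shares: float = 3.0,
          w_friend: float = 5.0) -> float:
    # Weighting shares (virality) above likes favours whatever spreads fastest,
    # accurate or not, which is one reason clickbait travels so well.
    return (w_likes * post.likes + w_shares * post.shares
            + (w_friend if post.from_close_friend else 0.0))

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Sensational claim, heavily shared", likes=40, shares=200, from_close_friend=False),
        Post("Careful fact-check from a news desk", likes=120, shares=10, from_close_friend=False),
        Post("A friend's holiday photos", likes=15, shares=1, from_close_friend=True),
    ])
    for post in feed:
        print(round(score(post), 1), post.text)
```

Shift those weights slightly and a different set of stories reaches millions of people, which is why such choices deserve the same scrutiny as editorial ones.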

So the first way to strengthen democratic practices is for academics, journalists, policy makers and the interested public to audit social media algorithms….(More)”.

Digital Government: Leveraging Innovation to Improve Public Sector Performance and Outcomes for Citizens


Book edited by Svenja Falk, Andrea Römmele and Michael Silverman: “This book focuses on the implementation of digital strategies in the public sectors in the US, Mexico, Brazil, India and Germany. The case studies presented examine different digital projects by looking at their impact as well as their alignment with their national governments’ digital strategies. The contributors assess the current state of digital government, analyze the contribution of digital technologies in achieving outcomes for citizens, discuss ways to measure digitalization and address the question of how governments oversee the legal and regulatory obligations of information technology. The book argues that most countries formulate good strategies for digital government, but do not effectively prescribe and implement corresponding policies and programs. Showing specific programs that deliver results can help policy makers, knowledge specialists and public-sector researchers to develop best practices for future national strategies….(More)”

Self-organised scientific crowds to remedy research bureaucracy


At EuroScientist: “Imagine a world without peer review committees, project proposals or activity reports. Imagine a world where research funds seamlessly flow where they are best employed, like nutrients in a food-web or materials in a river network. Many scientists would immediately sign up to live in such a world.

The Netherlands is set to become the place where this academic paradise will be tested in the next few years. In July 2016, the Dutch parliament approved a motion related to implementing alternative funding procedures to alleviate the research bureaucracy, which is increasingly burdening scientists. Here EuroScientist investigates whether the self-organisation power of the scientific community could help resolve one of researchers’ worst burdens.

Self-organisation

The Dutch national funding agency is planning to adopt a radically new system to allocate part of its funding, promoted by ecologist Marten Scheffer, who is professor of aquatic ecology and water quality management at Wageningen University and Research Centre. Under the proposed approach, funds would initially be evenly divided among all scientists in the country. Then, they would each have to allocate half of what they have received to the person who, in their opinion, is the most deserving scientist in their network. The process would then be iterated.

The promoters of the system believe that the “wisdom of the crowd” of the scientific community would assign more funds to the most deserving scientists among them, with a minimal amount of paperwork. The Dutch initiative is part of a broader effort to use a scientific approach to improve science.
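
A minimal simulation sketch of the allocation rule described above (an editor's illustration with invented names and nominations, not the agency's actual procedure): everyone starts with an equal share, then repeatedly passes half of their current funds to the one colleague they judge most deserving.

```python
# Sketch of the proposed self-organised allocation: equal shares, then each
# scientist transfers half of their current funds to their chosen peer.
# Names, budget and nominations are invented for illustration.

def redistribute(funds: dict[str, float], nominations: dict[str, str],
                 rounds: int = 10) -> dict[str, float]:
    for _ in range(rounds):
        incoming = {name: 0.0 for name in funds}
        for giver, receiver in nominations.items():
            half = funds[giver] / 2
            funds[giver] -= half
            incoming[receiver] += half
        for name, amount in incoming.items():
            funds[name] += amount
    return funds

if __name__ == "__main__":
    scientists = ["ana", "ben", "carla", "dev"]
    total_budget = 1_000_000.0
    funds = {s: total_budget / len(scientists) for s in scientists}
    # Each scientist names the peer they consider most deserving (fixed here for
    # brevity; in the real proposal nominations could change every round).
    nominations = {"ana": "carla", "ben": "carla", "carla": "dev", "dev": "carla"}
    for name, amount in sorted(redistribute(funds, nominations).items()):
        print(name, round(amount))
```

The total budget is conserved at every step; only its distribution changes, which is the property that lets funds "flow where they are best employed" without proposals or committees.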

In other words, it is part of a trend aiming to employ scientific evidence to tweak the social mechanisms of academia. Specifically, findings from what is known as complexity research are increasingly brought forward as a way of reducing bureaucracy, removing red tape, and maximising the time scientists spend in thinking….

Abandoning the current bureaucratic, top-down system to evaluate and fund research, based on labour-intensive peer-review, may not be too much of a loss. “Peer-review is an imperfect, fragile mechanism. Our simulations show that assigning funds at random would not distort too much the results of the traditional mechanism,” says Flaminio Squazzoni, an economist at the University of Brescia, Italy, and the coordinator of the PEERE-New Frontiers of Peer Review COST action.

In reality peer-review is never quite neutral. “If scientists behave perfectly, then peer review works,” Squazzoni explains, “but if strategic motivations are taken into account, like saving time or competition, then the results are worse than random.” Squazzoni believes that automation, economic incentives, or the creation of professional reviewers may improve the situation….(More)”

Portugal has announced the world’s first nationwide participatory budget


Graça Fonseca at apolitical: “Portugal has announced the world’s first participatory budget on a national scale. The project will let people submit ideas for what the government should spend its money on, and then vote on which ideas are adopted.

Although participatory budgeting has become increasingly popular around the world in the past few years, it has so far been confined to cities and regions, and no country that we know of has attempted it nationwide. To reach as many people as possible, Portugal is also examining another innovation: letting people cast their votes via ATMs.

‘It’s about quality of life, it’s about the quality of public space, it’s about the quality of life for your children, it’s about your life, OK?’ Graça Fonseca, the minister responsible, told Apolitical. ‘And you have a huge deficit of trust between people and the institutions of democracy. That’s the point we’re starting from and, if you look around, Portugal is not an exception in that among Western societies. We need to build that trust and, in my opinion, it’s urgent. If you don’t do anything, in ten, twenty years you’ll have serious problems.’

Although the official window for proposals begins in January, some have already been submitted to the project’s website. One suggests equipping kindergartens with technology to teach children about robotics. Using the open-source platform Arduino, the plan is to let children play with the tech and so foster scientific understanding from the earliest age.

Proposals can be made in the areas of science, culture, agriculture and lifelong learning, and there will be more than forty events in the new year for people to present and discuss their ideas.

The organisers hope that it will go some way to restoring closer contact between government and its citizens. Previous projects have shown that people who don’t vote in general elections often do cast their ballot on the specific proposals that participatory budgeting entails. Moreover, those who make the proposals often become passionate about them, campaigning for votes, flyering, making YouTube videos, going door-to-door and so fuelling a public discussion that involves ever more people in the process.

On the other side, it can bring public servants nearer to their fellow citizens by sharpening their understanding of what people want and what their priorities are. It can also raise the quality of public services by directing them more precisely to where they’re needed as well as by tapping the collective intelligence and imagination of thousands of participants….

Although it will not be used this year, because the project is still very much in the trial phase, the use of ATMs is potentially revolutionary. As Fonseca puts it, ‘In every remote part of the country, you might have nothing else, but you have an ATM.’ Moreover, an ATM could display proposals and allow people to vote directly, not least because it already contains a secure way of verifying their identity. At the moment, for comparison, people can vote by text or online, sending in the number from their ID card, which is checked against a database….(More)”.
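
As a rough sketch of the verification step described above (an editor's illustration with invented identifiers, not the actual Portuguese system): a vote counts only if the submitted ID number appears in the citizen register and has not already been used.

```python
# Hypothetical sketch of the ID check behind text/online (and, eventually, ATM) voting.
# The register contents, ID format and proposal names are all invented.

citizen_register = {"12345678", "87654321", "11223344"}   # valid ID card numbers
votes: dict[str, str] = {}                                 # id_number -> chosen proposal

def cast_vote(id_number: str, proposal: str) -> str:
    if id_number not in citizen_register:
        return "rejected: ID not found in register"
    if id_number in votes:
        return "rejected: this ID has already voted"
    votes[id_number] = proposal
    return f"accepted: vote recorded for '{proposal}'"

if __name__ == "__main__":
    print(cast_vote("12345678", "Robotics kits for kindergartens"))
    print(cast_vote("12345678", "New rural cycle paths"))   # duplicate vote rejected
    print(cast_vote("99999999", "New rural cycle paths"))   # unknown ID rejected
```

An ATM adds value here because the card-and-PIN check it already performs is a stronger identity verification than an ID number typed into a text message or web form.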