Is Open Data the Death of FOIA?


Beth Noveck at the Yale Law Journal: “For fifty years, the Freedom of Information Act (FOIA) has been the platinum standard for open government in the United States. The statute is considered the legal bedrock of the public’s right to know about the workings of our government. More than one hundred countries and all fifty states have enacted their own freedom of information laws. At the same time, FOIA’s many limitations have also become evident: a cumbersome process, delays in responses, and redactions that frustrate journalists and other information seekers. Politically motivated nuisance requests bedevil government agencies. With over 700,000 FOIA requests filed every year, the federal government faces the costs of a mounting backlog.

In recent years, however, an entirely different approach to government transparency in line with the era of big data has emerged: open government data. Open government data—generally shortened to open data—has many definitions but is generally considered to be publicly available information that can be universally and readily accessed, used, and redistributed free of charge in digital form. Open data is not limited to statistics, but also includes text such as the United States Federal Register, the daily newspaper of government, which was released as open data in bulk form in 2010.

To understand how significant the open data movement is for FOIA, this Essay discusses the impact of open data on the institutions and functions of government and the ways open data contrasts markedly with FOIA. Open data emphasizes the proactive publication of whole classes of information. Open data includes data about the workings of government but also data collected by the government about the economy and society posted online in a centralized repository for use by the wider public, including academic users seeking information as the basis for original research and commercial users looking to create new products and services. For example, Pixar used open data from the United States Geological Survey to create more realistic detail in scenes from its movie The Good Dinosaur.

By contrast, FOIA promotes ex post publication of information created by the government especially about its own workings in response to specific demands by individual requestors. I argue that open data’s more systematic and collaborative approach represents a radical and welcome departure from FOIA because open data concentrates on information as a means to solve problems to the end of improving government effectiveness. Open data is legitimated by the improved outcomes it yields and grounded in a theory of government effectiveness and, as a result, eschews the adversarial and ad hoc FOIA approach. Ultimately, however, each tactic offers important complementary benefits. The proactive information disclosure regime of open data is strengthened by FOIA’s rights of legal enforcement. Together, they stand to become the hallmark of government transparency in the fifty years ahead….(More)”.

Comparing resistance to open data performance measurement


Paper by Gregory Michener and Otavio Ritter in Public Administration: “Much is known about governmental resistance to disclosure laws, less so about multi-stakeholder resistance to open data. This study compares open data initiatives within the primary and secondary school systems of Brazil and the UK, focusing on stakeholder resistance and corresponding policy solutions. The analytical framework is based on the ‘Three-Ps’ of open data resistance to performance metrics, corresponding to professional, political, and privacy-related concerns. Evidence shows that resistance is highly nuanced, as stakeholders alternately serve as both principals and agents. School administrators, for example, are simultaneously principals to service providers and teachers, and at once agents to parents and politicians. Relying on a different systems comparison, in-depth interviews, and newspaper content analyses, we find that similar stakeholders across countries demonstrate strikingly divergent levels of resistance. In overcoming stakeholder resistance – across socioeconomic divides – context conscientious ‘data-informed’ evaluations may promote greater acceptance than narrowly ‘data-driven’ performance measurements…(More)”

Towards a DataPlace: mapping data in a game to encourage participatory design in smart cities


Paper by Barker, Matthew; Wolff, Annika and van der Linden, Janet: “The smart city has been envisioned as a place where citizens can participate in city decision making and in the design of city services. As a key part of this vision, pervasive digital technology and open data legislation are being framed as vehicles for citizens to access rich data about their city. It has become apparent, though, that simply providing access to these resources does not automatically lead to the development of data-driven applications. If we are going to engage more of the citizenry in smart city design and raise productivity, we are going to need to make the data itself more accessible, engaging and intelligible for non-experts. This ongoing study is exploring one method for doing so. As part of the MK:Smart City project team, we are developing a tangible data look-up interface that acts as an alternative to the conventional DataBase. This interface, or DataPlace as we are calling it, takes the form of a map on which the user places sensors to physically capture real-time data. This is a simulation of the physical act of capturing data in the real world. We discuss the design of the DataPlace prototype under development and the planned user trials to test out our hypothesis: that a DataPlace can make handling data more accessible, intelligible and engaging for non-experts than conventional interface types….(More)”
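
To make the interaction pattern the authors describe more concrete, the following is a minimal, purely hypothetical sketch (in Python) of a DataPlace-style lookup, in which placing a sensor token at a point on the physical map is translated into a query for the nearest real-time data stream. The names, coordinates, and readings are invented for illustration and do not reflect the MK:Smart team’s actual implementation.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical sketch of a DataPlace-style lookup: placing a sensor token on a
# physical map is treated as a query for the nearest real-time data stream.
# All names, coordinates and readings below are invented for illustration.

@dataclass
class DataStream:
    name: str
    x: float               # map coordinates of the stream's source
    y: float
    latest_reading: float  # most recent value from that stream

STREAMS = [
    DataStream("air_quality_city_centre", 2.0, 3.0, 41.5),
    DataStream("traffic_flow_ring_road", 7.5, 1.0, 88.0),
    DataStream("river_level_gauge", 4.0, 8.0, 1.2),
]

def place_sensor(x: float, y: float) -> DataStream:
    """Return the data stream closest to where the user placed the token."""
    return min(STREAMS, key=lambda s: hypot(s.x - x, s.y - y))

# The tangible act of putting a token at (3, 4) on the map stands in for
# writing a query against a conventional database.
stream = place_sensor(3.0, 4.0)
print(f"{stream.name}: latest reading {stream.latest_reading}")
```

The design intent described in the abstract is that a lookup of this kind happens through a physical gesture rather than a query language, which is what the authors hypothesise makes handling data more accessible for non-experts.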

New UN resolution on the right to privacy in the digital age: crucial and timely


Deborah Brown at the Internet Policy Review: “The rapid pace of technological development enables individuals all over the world to use new information and communications technologies (ICTs) to improve their lives. At the same time, technology is enhancing the capacity of governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy. In this context, the adoption on 21 November by the UN General Assembly’s Third Committee of a new resolution on the right to privacy in the digital age comes as timely and crucial for protecting the right to privacy in light of new challenges.

As with previous UN resolutions on this topic, the resolution adopted on 21 November 2016 recognises the importance of respecting international commitments in relation to the right to privacy. It underscores that any legitimate concerns states may have with regard to their security can and should be addressed in a manner consistent with obligations under international human rights law.

Recognising that more and more personal data is being collected, processed, and shared, this year’s resolution expresses concern about the sale or multiple re-sales of personal data, which often happens without the individual’s free, explicit and informed consent. It calls for the strengthening of prevention of and protection against such violations, and calls on states to develop preventative measures, sanctions, and remedies.

This year, the resolution more explicitly acknowledges the role of the private sector. It calls on states to put in place (or maintain) effective sanctions and remedies to prevent the private sector from committing violations and abuses of the right to privacy. This is in line with states’ obligations under the UN Guiding Principles on Business and Human Rights, which require states to protect against abuses by businesses within their territories or jurisdictions. The resolution specifically calls on states to refrain from requiring companies to take steps that interfere with the right to privacy in an arbitrary or unlawful way. With respect to companies, it recalls the responsibility of the private sector to respect human rights, and specifically calls on them to inform users about company policies that may impact their right to privacy….(More)”

New Institute Pushes the Boundaries of Big Data


Press Release: “Each year thousands of genomes are sequenced, millions of neuronal activity traces are recorded, and light from hundreds of millions of galaxies is captured by our newest telescopes, all creating datasets of staggering size. These complex datasets are then stored for analysis.

Ongoing analysis of these information streams has illuminated a problem, however: Scientists’ standard methodologies are inadequate to the task of analyzing massive quantities of data. The development of new methods and software to learn from data and to model — at sufficient resolution — the complex processes they reflect is now a pressing concern in the scientific community.

To address these challenges, the Simons Foundation has launched a substantial new internal research group called the Flatiron Institute (FI). The FI is the first multidisciplinary institute focused entirely on computation. It is also the first center of its kind to be wholly supported by private philanthropy, providing a permanent home for up to 250 scientists and collaborating expert programmers all working together to create, deploy and support new state-of-the-art computational methods. Few existing institutions support the combination of scientists and programmers, instead leaving programming to relatively impermanent graduate students and postdoctoral fellows, and none have done so at the scale of the Flatiron Institute or with such a broad scope, at a single location.

The institute will hold conferences and meetings and serve as a focal point for computational science around the world….(More)”.

Governance and Service Delivery: Practical Applications of Social Accountability Across Sectors


Book edited by Derick W. Brinkerhoff, Jana C. Hertz, and Anna Wetterberg: “…Historically, donors and academics have sought to clarify what makes sectoral projects effective and sustainable contributors to development. Among the key factors identified have been (1) the role and capabilities of the state and (2) the relationships between the state and citizens, phenomena often lumped together under the broad rubric of “governance.” Given the importance of a functioning state and positive interactions with citizens, donors have treated governance as a sector in its own right, with projects ranging from public sector management reform, to civil society strengthening, to democratization (Brinkerhoff, 2008). The link between governance and sectoral service delivery was highlighted in the World Bank’s 2004 World Development Report, which focused on accountability structures and processes (World Bank, 2004).

Since then, sectoral specialists’ awareness that governance interventions can contribute to service delivery improvements has increased substantially, and there is growing recognition that both technical and governance elements are necessary facets of strengthening public services. However, expanded awareness has not reliably translated into effective integration of governance into sectoral programs and projects in, for example, health, education, water, agriculture, or community development. The bureaucratic realities of donor programming offer a partial explanation…. Beyond bureaucratic barriers, though, lie ongoing gaps in practical knowledge of how best to combine attention to governance with sector-specific technical investments. What interventions make sense, and what results can reasonably be expected? What conditions support or limit both improved governance and better service delivery? How can citizens interact with public officials and service providers to express their needs, improve services, and increase responsiveness? Various models and compilations of best practices have been developed, but debates remain, and answers to these questions are far from settled. This volume investigates these questions and contributes to building understanding that will enhance both knowledge and practice. In this book, we examine six recent projects, funded mostly by the United States Agency for International Development and implemented by RTI International, that pursued several different paths to engaging citizens, public officials, and service providers on issues related to accountability and sectoral services…(More)”

Digital Kenya: An Entrepreneurial Revolution in the Making


(Open Access) book edited by Bitange Ndemo and Tim Weiss: “Presenting rigorous and original research, this volume offers key insights into the historical, cultural, social, economic and political forces at play in the creation of world-class ICT innovations in Kenya. Following the arrival of fiber-optic cables in 2009, Digital Kenya examines why the initial entrepreneurial spirit and digital revolution have begun to falter despite support from motivated entrepreneurs, international investors, policy experts and others. Written by engaged scholars and professionals in the field, the book offers 15 eye-opening chapters and 14 one-on-one conversations with entrepreneurs and investors to ask why establishing ICT start-ups on a continental and global scale remains a challenge on the “Silicon Savannah”. The authors present evidence-based recommendations to help Kenya to continue producing globally impactful ICT innovations that improve the lives of those still waiting on the side-lines, and to inspire other nations to do the same….(More)”

Making Sense of Statistics


Report for the BBC Trust: “The BBC, as the UK’s main public service broadcaster, has a particularly important role to play in bringing statistics to public attention and helping audiences to digest, understand and apply them to their daily lives. Accuracy and impartiality have a specific meaning when applied to statistics. Reporting accurately and impartially on critical and sometimes controversial topics requires understanding the data that informs them and accurate and impartial presentation of that data.

Overall, the BBC is to be commended in its approach to the use of statistics. People at the BBC place great value on using statistics responsibly. Journalists often go to some lengths to verify the statistics they receive. They exercise judgement when deciding which statistics to cover and the BBC has a strong record in selecting and presenting statistics effectively. Journalists and programme makers often make attempts to challenge conventional wisdom and provide independent assessments of stories reported elsewhere. Many areas of the BBC give careful thought to the way in which statistics are presented for audiences and the BBC has prioritised responsiveness to mistakes in recent years.

Informed by the evidence supporting this report, including Cardiff University’s content analysis and Oxygen Brand Consulting’s audience research study, we have nevertheless identified some areas for improvement. These include the following:

Contextualising statistics: Numbers are sometimes used by the BBC in ways which make it difficult for audiences to understand whether they are really big or small, worrying or not. Audiences have difficulty in particular in interpreting “big numbers”. And a number on its own, without trends or comparisons, rarely means much. We recommend that much more is done to ensure that statistics are always contextualised in such a way that audiences can understand their significance.

Interpreting, evaluating and “refereeing” statistics: …The BBC needs to get better and braver in interpreting and explaining rival statistics and guiding the audience.

Going beyond the headlines: There is also a need for more regular, deeper investigation of the figures underlying sources such as press releases. This is especially pertinent as the Government is the predominant source of statistics on the BBC. We cannot expect, and do not suggest it is necessary for, all journalists to have access to and a full understanding of every single statistic which is in the public domain. But there is a need to look beyond the headlines to ask how the figures were obtained and whether they seem sensible. Failure to dig deeper into the data also represents lost opportunities to provide new and broader insights on topical issues. For example, reporting GDP per head of population might give a different perspective of the economy than just GDP alone, and we would like to see such analyses covered by the BBC more often. Geographic breakdowns could enhance reporting on the devolved UK.

We recommend that “Reality Check” becomes a permanent feature of the BBC’s activities, with a prominent online presence, reinforcing the BBC’s commitment to providing well-informed, accurate information on topical and important issues.

…The BBC needs to have the internal capacity to question press releases, relate them to other data sources and, if necessary, do some additional calculations – for example translating relative to absolute risk. There remains a need for basic training on, for example, percentages and percentage change, and nominal and real financial numbers….(More)”
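
To illustrate the kinds of translation the report says journalists need (relative versus absolute risk, and nominal versus real financial changes), here is a minimal sketch in Python; the numbers are entirely hypothetical and are not taken from the report or from BBC output.

```python
# Hypothetical figures chosen only to show the arithmetic; not from the report.

# Relative vs. absolute risk: the same change can be reported two ways.
baseline_risk = 0.001   # e.g. 1.0 in 1,000 people affected (hypothetical)
new_risk = 0.0015       # e.g. 1.5 in 1,000 people affected (hypothetical)

relative_increase = (new_risk - baseline_risk) / baseline_risk  # "risk up 50%"
absolute_increase = new_risk - baseline_risk                    # "up 0.05 percentage points"

print(f"Relative increase: {relative_increase:.0%}")
print(f"Absolute increase: {absolute_increase * 100:.2f} percentage points")

# Nominal vs. real change: a cash increase can look much smaller (or become a
# cut) once inflation is taken into account.
budget_last_year = 100.0   # hypothetical spending figure
budget_this_year = 103.0   # +3% in cash (nominal) terms
inflation = 0.025          # hypothetical 2.5% inflation over the same period

nominal_change = budget_this_year / budget_last_year - 1
real_change = (budget_this_year / (1 + inflation)) / budget_last_year - 1

print(f"Nominal change: {nominal_change:.1%}")   # 3.0%
print(f"Real change:    {real_change:.1%}")      # roughly 0.5%
```

A “50% increase in risk” and a “0.05 percentage-point increase” describe the same hypothetical change, which is exactly the sort of contextualisation the report argues audiences need.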

Playing politics: exposing the flaws of nudge thinking


Book Review by Pat Kane in The New Scientist: “The cover of this book echoes its core anxiety. A giant foot presses down on a sullen, Michael Jackson-like figure – a besuited citizen coolly holding off its massive weight. This is a sinister image to associate with a volume (and its author, Cass Sunstein) that should be able to proclaim a decade of success in the government’s use of “behavioural science”, or nudge theory. But doubts are brewing about its long-term effectiveness in changing public behaviour – as well as about its selective account of evolved human nature.

Nudging has had a strong and illustrious run at the highest level. Outgoing US President Barack Obama and former UK Prime Minister David Cameron both set up behavioural science units at the heart of their administrations (Sunstein was the administrator of the White House Office of Information and Regulatory Affairs from 2009 to 2012).

Sunstein insists that the powers that be cannot avoid nudging us. Every shop floor plan, every new office design, every commercial marketing campaign, every public information campaign, is an “architecting of choices”. As anyone who ever tried to leave IKEA quickly will suspect, that endless, furniture-strewn path to the exit is no accident.

Nudges “steer people in particular directions, but also allow them to go their own way”. They are entreaties to change our habits, to accept old or new norms, but they presume that we are ultimately free to refuse the request.

However, our freedom is easily constrained by “cognitive biases”. Our brains, say the nudgers, are lazy, energy-conserving mechanisms, often overwhelmed by information. So a good way to ensure that people pay into their pensions, for example, is to set payment as a “default” in employment contracts, so the employee has to actively untick the box. Defaults of all kinds exploit our preference for inertia and the status quo in order to increase future security….

Sunstein makes useful distinctions between nudges and the other things governments and enterprises can do. Nudges are not “mandates” (laws, regulations, punishments). A mandate would be, for example, a rigorous and well-administered carbon tax, secured through a democratic or representative process. A “nudge” puts smiley faces on your energy bill, and compares your usage to that of the eco-efficient Joneses next door (nudgers like to game our herd-like social impulses).

In a fascinating survey section, which asks Americans and others what they actually think about being the subjects of the “architecting” of their choices, Sunstein discovers that “if people are told that they are being nudged, they will react adversely and resist”.

This is why nudge thinking may be faltering – its understanding of human nature unnecessarily (and perhaps expediently) downgrades our powers of conscious thought….(More)

See The Ethics of Influence: Government in the Age of Behavioral Science by Cass R. Sunstein, Cambridge University Press.

Talent Gap Is a Main Roadblock as Agencies Eye Emerging Tech


Theo Douglas in GovTech: “U.S. public service agencies are closely eyeing emerging technologies, chiefly advanced analytics and predictive modeling, according to a new report from Accenture, but like their counterparts globally they must address talent and complexity issues before adoption rates will rise.

The report, Emerging Technologies in Public Service, compiled a nine-nation survey of IT officials across all levels of government in policing and justice, health and social services, revenue, border services, pension/Social Security and administration, and was released earlier this week.

It revealed a deep interest in emerging tech from the public sector, finding 70 percent of agencies are evaluating their potential — but a much lower adoption level, with just 25 percent going beyond piloting to implementation….

The revenue and tax industries have been early adopters of advanced analytics and predictive modeling, he said, while biometrics and video analytics are resonating with police agencies.

In Australia, the tax office found using voiceprint technology could save 75,000 work hours annually.

Closer to home, Utah Chief Technology Officer Dave Fletcher told Accenture that consolidating data centers into a virtualized infrastructure improved speed and flexibility, so some processes that once took weeks or months can now happen in minutes or hours.

Nationally, 70 percent of agencies have either piloted or implemented an advanced analytics or predictive modeling program. Biometrics and identity analytics were the next most popular technologies, with 29 percent piloting or implementing, followed by machine learning at 22 percent.

Those numbers contrast globally with Australia, where 68 percent of government agencies have charged into piloting and implementing biometric and identity analytics programs; and Germany and Singapore, where 27 percent and 57 percent of agencies respectively have piloted or adopted video analytic programs.

Overall, 78 percent of respondents said they either had some machine-learning technologies underway or had already implemented them.

The benefits of embracing emerging tech that were identified ranged from finding better ways of working through automation to innovating and developing new services and reducing costs.

Agencies told Accenture their No. 1 objective was increasing customer satisfaction. But 89 percent said they’d expect a return on implementing intelligent technology within two years. Four-fifths, or 80 percent, agreed intelligent tech would improve employees’ job satisfaction….(More).