Statistics and Open Data: Harvesting unused knowledge, empowering citizens and improving public services


House of Commons Public Administration Committee (Tenth Report):
“1. Open data is playing an increasingly important role in Government and society. It is data that is accessible to all, free of restrictions on use or redistribution and also digital and machine-readable so that it can be combined with other data, and thereby made more useful. This report looks at how the vast amounts of data generated by central and local Government can be used in open ways to improve accountability, make Government work better and strengthen the economy.

2. In this inquiry, we examined progress against a series of major government policy announcements on open data in recent years, and considered the prospects for further development. We heard of government open data initiatives going back some years, including the decision in 2009 to release some Ordnance Survey (OS) data as open data, and the Public Sector Mapping Agreement (PSMA), which makes OS data available for free to the public sector. The 2012 Open Data White Paper ‘Unleashing the Potential’ says that transparency through open data is “at the heart” of the Government’s agenda and that opening up would “foster innovation and reform public services”. In 2013, the report of the independently-chaired review of the use, re-use, funding and regulation of Public Sector Information, led by Stephan Shakespeare, Chief Executive of the market research and polling company YouGov, urged Government to move fast to make use of data. He criticised traditional public service attitudes to data before setting out his vision:

    • To paraphrase the great retailer Sir Terry Leahy, to run an enterprise without data is like driving by night with no headlights. And yet that is what Government often does. It has a strong institutional tendency to proceed by hunch, or prejudice, or by the easy option. So the new world of data is good for government, good for business, and above all good for citizens. Imagine if we could combine all the data we produce on education and health, tax and spending, work and productivity, and use that to enhance the myriad decisions which define our future; well, we can, right now. And Britain can be first to make it happen for real.

3. This was followed by publication in October 2013 of a National Action Plan which sets out the Government’s view of the economic potential of open data as well as its aspirations for greater transparency.

4. This inquiry is part of our wider programme of work on statistics and their use in Government. A full description of the studies is set out under the heading “Statistics” in the inquiries section of our website, which can be found at www.parliament.uk/pasc. For this inquiry we received 30 pieces of written evidence and took oral evidence from 12 witnesses. We are grateful to all those who have provided evidence and to our Specialist Adviser on statistics, Simon Briscoe, for his assistance with this inquiry.”
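The report's opening definition notes that machine-readable open data becomes more useful when it can be combined with other data. A minimal sketch of what that means in practice, using only the Python standard library; the two datasets, their field names, and all values below are invented for illustration:

```python
import csv
import io

# Two hypothetical open datasets published as machine-readable CSV:
# one lists schools, the other air-quality readings, both keyed by postcode.
schools_csv = """postcode,school
N1 9GU,Acme Primary
SW1A 2AA,Riverside Academy
"""
air_csv = """postcode,no2_ugm3
N1 9GU,41
SW1A 2AA,28
"""

def load(text, key):
    """Parse a CSV string into a dict of rows keyed on the given column."""
    return {row[key]: row for row in csv.DictReader(io.StringIO(text))}

schools = load(schools_csv, "postcode")
air = load(air_csv, "postcode")

# Because both files are machine-readable and share a key, combining them
# is a one-line join over the common postcodes.
combined = {pc: {**schools[pc], **air[pc]} for pc in schools.keys() & air.keys()}

# CSV carries no types, so the reading comes back as the string "41".
print(combined["N1 9GU"]["no2_ugm3"])
```

The same join would fail immediately if either dataset were published only as a PDF table or a scanned image, which is why the report's definition insists on machine-readability.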

Table of Contents:

Summary
1 Introduction
2 Improving accountability through open data
3 Open Data and Economic Growth
4 Improving Government through open data
5 Moving faster to make a reality of open data
6 A strategic approach to open data?
Conclusion
Conclusions and recommendations

Building a More Open Government


Corinna Zarek at the White House: “It’s Sunshine Week again—a chance to celebrate transparency and participation in government and freedom of information. Every year in mid-March, we take stock of our progress and where we are headed to make our government more open for the benefit of citizens.
In December 2013, the Administration announced 23 ambitious commitments to further open up government over the next two years in the U.S. Government’s second Open Government National Action Plan. Those commitments are now all underway or in development, including:
·         Launching an improved Data.gov: The updated Data.gov debuted in January, 2014, and continues to grow with thousands of updated or new government data sets being proactively made available to the public.
·         Increasing public collaboration: Through crowdsourcing, citizen science, and other methods, Federal agencies continue to expand the ways they collaborate with the public. For example, the National Aeronautics and Space Administration recently launched its third Asteroid Grand Challenge, a broad call to action seeking the best and brightest ideas from non-traditional partners to enhance and accelerate the work NASA is already doing for planetary defense.
·         Improving We the People: The online petition platform We the People gives the public a direct way to participate in their government and is currently incorporating improvements to make it easier for the public to submit petitions and signatures.”

New Field Guide Explores Open Data Innovations in Disaster Risk and Resilience


World Bank: “From Indonesia to Bangladesh to Nepal, community members armed with smartphones and GPS systems are contributing to some of the most extensive and versatile maps ever created, helping inform policy and better prepare their communities for disaster risk.
In Jakarta, more than 500 community members have been trained to collect data on thousands of hospitals, schools, private buildings, and critical infrastructure. In Sri Lanka, government and academic volunteers mapped over 30,000 buildings and 450 km of roadways using a collaborative online resource called OpenStreetMap.
These are just a few of the projects that have been catalyzed by the Open Data for Resilience Initiative (OpenDRI), developed by the World Bank’s Global Facility for Disaster Reduction and Recovery (GFDRR). Launched in 2011, OpenDRI is active in more than 20 countries today, mapping tens of thousands of buildings and urban infrastructure, providing more than 1,000 geospatial datasets to the public, and developing innovative application tools.
To expand this work, the World Bank Group has launched the OpenDRI Field Guide as a showcase of successful projects and a practical guide for governments and other organizations to shape their own open data programs….
The field guide walks readers through the steps to build open data programs based on the OpenDRI methodology. One of the first steps is data collation. Relevant datasets are often locked because of proprietary arrangements or fragmented in government bureaucracies. The field guide explores tools and methods to enable the participatory mapping projects that can fill in gaps and keep existing data relevant as cities rapidly expand.

GeoNode: Mapping Disaster Damage for Faster Recovery
One example is GeoNode, a locally controlled and open source cataloguing tool that helps manage and visualize geospatial data. The tool, already in use in two dozen countries, can be modified and easily be integrated into existing platforms, giving communities greater control over mapping information.
GeoNode was used extensively after Typhoon Yolanda (Haiyan) swept the Philippines with 300 km/hour winds and a storm surge of over six meters last fall. The storm displaced nearly 11 million people and killed more than 6,000.
An event-specific GeoNode project was created immediately and ultimately collected more than 72 layers of geospatial data, from damage assessments to situation reports. The data and quick analysis capability contributed to recovery efforts and is still operating in response mode at Yolandadata.org.
InaSAFE: Targeting Risk Reduction
A sister project, InaSAFE, is an open, easy-to-use tool for creating impact assessments for targeted risk reduction. The assessments are based on how an impact layer – such as a tsunami, flood, or earthquake – affects exposure data, such as population or buildings.
With InaSAFE, users can generate maps and statistical information that can be easily disseminated and even fed back into projects like GeoNode for simple, open source sharing.
The initiative, developed in collaboration with AusAID and the Government of Indonesia, was put to the test in the 2012 flood season in Jakarta, and its successes provoked a rapid national rollout and widespread interest from the international community.
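The overlay idea behind InaSAFE, intersecting a hazard ("impact") layer with an exposure layer, can be sketched in miniature. InaSAFE itself operates on real geospatial layers with far richer impact functions; the grids, the 0.8 m threshold, and every number below are invented, and this toy only totals exposure in cells where the hazard exceeds a cutoff:

```python
# Toy impact assessment in the spirit of InaSAFE: overlay a hazard grid
# (flood depth in metres) on an exposure grid (population per cell) and
# total the population in cells where depth exceeds a threshold.
flood_depth = [
    [0.0, 0.2, 1.4],
    [0.0, 0.9, 2.1],
    [0.0, 0.0, 0.5],
]
population = [
    [120, 300, 450],
    [80, 200, 610],
    [50, 90, 150],
]

def affected_population(hazard, exposure, threshold=0.8):
    """Sum exposure in every cell where the hazard exceeds the threshold."""
    return sum(
        exp
        for hazard_row, exposure_row in zip(hazard, exposure)
        for depth, exp in zip(hazard_row, exposure_row)
        if depth > threshold
    )

# People in cells flooded deeper than 0.8 m: 450 + 200 + 610
print(affected_population(flood_depth, population))  # prints 1260
```

The statistical summaries InaSAFE disseminates are aggregations of exactly this kind, computed per administrative area rather than per toy grid cell.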
Open Cities: Improving Urban Planning & Resilience
The Open Cities project, another program operating under the OpenDRI platform, aims to catalyze the creation, management and use of open data to produce innovative solutions for urban planning and resilience challenges across South Asia.
In 2013, Kathmandu was chosen as a pilot city, in part because the population faces the highest mortality threat from earthquakes in the world. Under the project, teams from the World Bank assembled partners and community mobilizers to help execute the largest regional community mapping project to date. The project surveyed more than 2,200 schools and 350 health facilities, along with road networks, points of interest, and digitized building footprints – representing nearly 340,000 individual data nodes.”

Why the wealthiest countries are also the most open with their data


Emily Badger in the Washington Post: “The Oxford Internet Institute this week posted a nice visualization of the state of open data in 70 countries around the world, reflecting the willingness of national governments to release everything from transportation timetables to election results to machine-readable national maps. The tool is based on the Open Knowledge Foundation’s Open Data Index, an admittedly incomplete but telling assessment of who willingly publishes updated, accurate national information on, say, pollutants (Sweden) and who does not (ahem, South Africa).

Tally up the open data scores for these 70 countries, and the picture looks like this, per the Oxford Internet Institute:
…With apologies for the tiny, tiny type (and the fact that many countries aren’t listed here at all), a couple of broad trends are apparent. For one, there’s a prominent global “openness divide,” in the words of the Oxford Internet Institute. The high scores mostly come from Europe and North America, the low scores from Asia, Africa and Latin America. Wealth is strongly correlated with “openness” by this measure, whether we look at World Bank income groups or Gross National Income per capita. By the OII’s calculation, wealth accounts for about a third of the variation in these Open Data Index scores.
Perhaps this is an obvious correlation, but the reasons why open data looks like the luxury of rich economies are many, and they point to the reality that poor countries face a lot more obstacles to openness than do places like the United States. For one thing, openness is also closely correlated with Internet penetration. Why open your census results if people don’t have ways to access it (or means to demand it)? It’s no easy task to do this, either.”
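The OII's finding that wealth accounts for about a third of the variation in Open Data Index scores is an R-squared statement. A sketch of that computation using only the standard library; the income and score figures below are invented stand-ins for GNI per capita and index scores, not the OII's actual data:

```python
import math

# Invented (income, openness-score) pairs; the real OII analysis
# covered 70 countries' Open Data Index scores.
gni = [55000, 48000, 39000, 12000, 8000, 3500, 1500, 900]
score = [940, 870, 820, 510, 560, 300, 350, 210]

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

r = pearson_r(gni, score)
r_squared = r * r  # share of score variation "explained" by income
print(round(r, 3), round(r_squared, 3))
```

On the OII's real data, r_squared comes out near 0.33; the invented data above is deliberately more correlated than that, so the sketch only shows the mechanics.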

The intelligent citizen


John Bell in AlJazeera: “A quarter century after the demise of the Soviet Union, is the Western model of government under threat? …. The pressures are coming from several directions.
All states are feeling the pressure from unregulated global flows of capital that create obscene concentrations of wealth, and an inability of the nation-state to respond. Relatedly, citizens either ignore or distrust traditional institutions, and ethnic groups demand greater local autonomy.
A recent Pew survey shows that Americans aged 18-33 mostly identify as political independents and distrust institutions. The classic model is indeed frayed, and new developments have made it feel like very old cloth.
One natural reflex is to assert even greater control, a move suited to the authoritarians, such as China, Russia or General Abdel Fattah al-Sisi‘s Egypt: Strengthen the nation by any means to withstand the pressures. The reality, however, is that all systems, democracies or otherwise, were designed for an industrial age, and the management of an anonymous mass, and cannot cope with globalised economics and the information world of today.
The question remains: What can effectively replace the Western model? The answer may not lie only in the invention of new structures, but in the improvement of a key component found in all: The citizen.
The citizen today is mostly a consumer, focused on the purchase of goods or services, or the insistent consumption of virtual information, often as an ersatz politics. Occasionally, when a threat rises, he or she also becomes a demandeur of absolute security from the state. Indeed, some are using the new technologies for democratic purposes: they are better informed, criticise abuse or corruption, and organise and rally movements.
But, the vast majority is politically disengaged and cynical of political process; many others are too busy trying to survive to even care. A grand apathy has set in, the stage left vacant for a very few extremists, or pied pipers of the old tunes of nationalisms and tribal belonging disguised as leaders. The citizen is either invisible in this circus, an endangered species in the 21st century, or increasingly drawn to dark and polarising forces.
Some see the glass as half full and believe that technology and direct democracy can bridge the gaps. Indeed, the internet provides a plethora of information and a greater sense of empowerment. Lesser-known protests in Bosnia have led to direct democracy plenums, and the Swiss regularly resort to national referenda. However, whether direct or representative, democracy will still depend on the quality of the citizen, and his or her decisions.
Opinion, dogma and bias
Opinion, dogma and bias remain the common political operating system and, as a result, our politics are still an unaffordable game of chance. The optimists may be right, but discussions in social media on issues ranging from Ukraine to gun control reveal deep bias and the lure of excitement more than the pursuit of a constructive answer.
People crave excitement in their politics. Whether it is through asserting their own opinion or in battling others, politics offers a great ground for this high. The cost, however, comes in poor judgment and dangerous decisions. George W. Bush was elected twice, Vladimir Putin has much support, climate change is denied, and an intoxicated Mayor of Toronto, Rob Ford, may be re-elected.
Few are willing to admit their role in this state of affairs, but they will gladly see the ill in others. Even fewer, including often myself, will admit that they don’t really know how to think through a challenge, political or otherwise. This may seem absurd, since thinking feels as natural as walking, but the formation of political opinion is a complex affair, a flawed duet between our minds and outside input. Media, government propaganda, family, culture, and our own unique set of personal experiences, from traumas to chance meetings, all play into the mix. High states of emotion, “excitement”, also weigh in, making us dumb and easily manipulated….
This step may also be a precursor for another that involves the ordinary person. Today being a citizen involves occasional voting, politics as spectator sport, and, among some, being a watchdog against corruption or street activism. What may be required is more citizens’ participation in local democracy, not just in assemblies or casting votes, but in local management and administration.
This will help people understand the complexities of politics, gain greater responsibility, and mitigate some of the vices of centralisation and representative democracy. It may also harmonise with the information age, where everyone, it seems, wishes to be empowered.
Do people have time in their busy lives? A rotational involvement in local affairs can help solve this, and many might even find the engagement enjoyable. This injection of a real citizen into the mix may improve the future of politics while large institutions continue to hum their tune.
In the end, a citizen who has responsibility for his actions can probably make any structure work, while rejecting any that would endanger his independence and dignity. The rise of a more intelligent and committed citizen may clarify politics, improve liberal democracies, and make populism and authoritarianism less appealing and likely paths.”

How Maps Drive Decisions at EPA


Joseph Marks at NextGov: “The Environmental Protection Agency office charged with taking civil and criminal actions against water and air polluters used to organize its enforcement targeting meetings and conference calls around spreadsheets and graphs.

The USA National Wetlands Inventory is one of the interactive maps produced by the Geoplatform.gov tool.

Those spreadsheets detailed places with large oil and gas production and other possible pollutants where EPA might want to focus its own inspection efforts or reach out to state-level enforcement agencies.
During the past two years, the agency has largely replaced those spreadsheets and tables with digital maps, which make it easier for participants to visualize precisely where the top polluting areas are and how those areas correspond to population centers, said Harvey Simon, EPA’s geospatial information officer. As a result, the agency can focus inspections and enforcement efforts where they will do the most good.
“Rather than verbally going through tables and spreadsheets you have a lot of people who are not [geographic information systems] practitioners who are able to share map information,” Simon said. “That’s allowed them to take a more targeted and data-driven approach to deciding what to do where.”
The change is a result of the EPA Geoplatform, a tool built off Esri’s ArcGIS Online product, which allows companies and government agencies to build custom Web maps using base maps provided by Esri mashed up with their own data.
When the EPA Geoplatform launched in May 2012 there were about 250 people registered to create and share mapping data within the agency. That number has grown to more than 1,000 during the past 20 months, Simon said.
“The whole idea of the platform effort is to democratize the use of geospatial information within the agency,” he said. “It’s relatively simple now to make a Web map and mash up data that’s useful for your work, so many users are creating Web maps themselves without any support from a consultant or from a GIS expert in their office.”
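The "mash up your own data on a base map" pattern Simon describes rests on publishing a layer in a standard format the map client can draw. A minimal sketch converting tabular records into a GeoJSON FeatureCollection, the interchange format web maps commonly consume; the records and field names below are hypothetical, and nothing here touches EPA's or Esri's actual data or APIs:

```python
import json

# Hypothetical enforcement records. A web-map overlay is, at bottom,
# just a standard GeoJSON document drawn over a base map by the client.
records = [
    {"site": "Plant A", "lon": -95.36, "lat": 29.76, "violations": 4},
    {"site": "Plant B", "lon": -90.07, "lat": 29.95, "violations": 1},
]

def to_geojson_layer(rows):
    """Convert tabular records into a GeoJSON FeatureCollection of points."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                # GeoJSON coordinate order is [longitude, latitude].
                "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
                "properties": {"site": r["site"], "violations": r["violations"]},
            }
            for r in rows
        ],
    }

layer = to_geojson_layer(records)
print(json.dumps(layer, indent=2))
```

Because the output is a plain standard document, the same layer can be dropped onto any base map, which is what makes the mashup usable by non-GIS staff.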
A governmentwide Geoplatform launched in 2012, spurred largely by agencies’ frustrations with the difficulty of sharing mapping data after the 2010 explosion of the Deepwater Horizon oil rig in the Gulf of Mexico. The platform’s goal was twofold. First, officials wanted to share mapping data more widely between agencies, both to avoid duplicating each other’s work and to make data easier to share during an emergency.
Second, the government wanted to simplify the process for viewing and creating Web maps so they could be used more easily by nonspecialists.
EPA’s geoplatform has essentially the same goals. The majority of the maps the agency builds using the platform aren’t publicly accessible, so the EPA doesn’t have to worry about scrubbing maps of data that could reveal personal information about citizens or proprietary data about companies. It publishes some maps that don’t pose any privacy concerns on EPA websites as well as on the national geoplatform and to Data.gov, the government data repository.
Once ArcGIS Online is judged compliant with the Federal Information Security Management Act, or FISMA, which is expected this month, EPA will be able to share significantly more nonpublic maps through the national geoplatform and rely on more maps produced by other agencies, Simon said.
EPA’s geoplatform has also made it easier for the agency’s environmental justice office to share common data….”

Open Data is a Civil Right


Yo Yoshida, Founder & CEO, Appallicious in GovTech: “As Americans, we expect a certain standardization of basic services, infrastructure and laws — no matter where we call home. When you live in Seattle and take a business trip to New York, the electric outlet in the hotel you’re staying in is always compatible with your computer charger. When you drive from San Francisco to Los Angeles, I-5 doesn’t all of a sudden turn into a dirt country road because some cities won’t cover maintenance costs. If you take a 10-minute bus ride from Boston to the city of Cambridge, you know the money in your wallet is still considered legal tender.

But what if these expectations of consistency were not always a given? What if cities, counties and states had absolutely zero coordination when it came to basic services? This is what it is like for us in the open data movement. There are so many important applications and products that have been built by civic startups and concerned citizens. However, all too often these efforts are confined to city limits, and unavailable to anyone outside of them. It’s time to start reimagining the way cities function and how local governments operate. There is a wealth of information housed in local governments that should be public by default to help fuel a new wave of civic participation.
Appallicious’ Neighborhood Score provides an overall health and sustainability score, block-by-block, for every neighborhood in the city of San Francisco. It is the first time metrics have been applied to neighborhoods, so we can judge how government allocates our resources and better plan how to move forward. But if you’re thinking about moving to Oakland, just a subway stop away from San Francisco, and want to see the score for a neighborhood, our app can’t help you, because that city has yet to release the data sets we need.
In Contra Costa County, there is the lifesaving PulsePoint app, which notifies smartphone users who are trained in CPR when someone nearby may be in need of help. This is an amazing app—for residents of Contra Costa County. But if someone in neighboring Alameda County needs CPR, the app, unfortunately, is completely useless.
Buildingeye visualizes planning and building permit data to allow users to see what projects are being proposed in their area or city. However, buildingeye is only available in a handful of places, simply because most cities have yet to make permits publicly available. Think about what this could do for the construction sector — an industry that has millions of jobs for Americans. Buildingeye also gives concerned citizens access to public documents like never before, so they can see what might be built in their cities or on their streets.
Along with other open data advocates, I have been going from city-to-city, county-to-county and state-to-state, trying to get governments and departments to open up their massive amounts of valuable data. Each time one city, or one county, agrees to make their data publicly accessible, I can’t help but think it’s only a drop in the bucket. We need to think bigger.
Every government, every agency and every department in the country that has already released this information to the public is a case study that points to the success of open data — and why every public entity should follow their lead. There needs to be a national referendum that instructs that all government data should be open and accessible to the public.
Last May, President Obama issued an executive order requiring that going forward, any data generated by the federal government must be made available to the public in open, machine-readable formats. In the executive order, Obama stated that, “openness in government strengthens our democracy, promotes the delivery of efficient and effective services to the public, and contributes to economic growth.”
If this is truly the case, Washington has an obligation to compel local and state governments to release their data as well. Many have tried to spur this effort. California Lt. Gov. Gavin Newsom created the Citizenville Challenge to speed up adoption on the local level. The U.S. Conference of Mayors has also been vocal in promoting open data efforts. But none of these initiatives could have the same effect of a federal mandate.
What I am proposing is no small feat, and it won’t happen overnight. But there should be a concerted effort by those in the technology industry, specifically civic startups, to call on Congress to draft legislation that would require every city in the country to make their data open, free and machine readable. Passing federal legislation will not be an easy task — but creating a “universal open data” law is possible. It would require little to no funding, and it is completely nonpartisan. It’s actually not a political issue at all; it is, for lack of a better word, an administrative issue.
Often good legislation is blocked because lawmakers and citizens are concerned about project funding. While there should be support to help cities and towns achieve the capability of opening their data, a lot of the time, they don’t need it. In 2009, the city and county of San Francisco opened up its data with zero dollars. Many other cities have done the same. There will be cities and municipalities that will need financial assistance to accomplish this. But it is worth it, and it will not require a significant investment for a substantial return. There are free online open data portals, like CKAN, DKAN and a new effort from Accela, CivicData.com, to centralize open data efforts.
When the UK Government recently announced a £1.5 million investment to support open data initiatives, its Cabinet Office Minister said, “We know that it creates a more accountable, efficient and effective government. Open Data is a raw material for economic growth, supporting the creation of new markets, business and jobs and helping us compete in the global race.”
We should not fall behind these efforts. There is too much at stake for our citizens, not to mention our economy. A recent McKinsey report found that open data has the potential to create $3 trillion in value worldwide.
Former Speaker Tip O’Neill famously said, “all politics is local.” But we in the civic startup space believe all data is local. Data is reporting potholes in your neighborhood and identifying high crime areas in your communities. It’s seeing how many farmers’ markets there are in your town compared to liquor stores. Data helps predict which areas of a city are most at risk during a heat wave and other natural disasters. A federal open data law would give the raw material needed to create tools to improve the lives of all Americans, not just those who are lucky enough to live in a city that has released this information on its own.
It’s a different way of thinking about how a government operates and the relationship it has with its citizens. Open data gives the public an amazing opportunity to be more involved with governmental decisions. We can increase accountability and transparency, but most importantly we can revolutionize the way local residents communicate and work with their government.
Access to this data is a civil right. If this is truly a government by, of and for the people, then its data needs to be available to all of us. By opening up this wealth of information, we will design a better government that takes advantage of the technology and skills of civic startups and innovative citizens….”

A Framework for Benchmarking Open Government Data Efforts


DS Sayogo, TA Pardo, M Cook in the HICSS ’14 Proceedings of the 2014 47th Hawaii International Conference on System Sciences: “This paper presents a preliminary exploration of the status of open government data worldwide, as well as an in-depth evaluation of selected open government data portals. Using web content analysis of the open government data portals of 35 countries, this study outlines the progress of open government data efforts at the national government level. The paper also conducts an in-depth evaluation of selected cases to justify the application of a proposed framework for understanding the status of open government data initiatives. The paper suggests that the findings of this exploration offer a new level of understanding of the depth, breadth, and impact of current open government data efforts. The review results also point to different stages of open government data portal development in terms of data content, data manipulation capability, and participatory and engagement capability. This finding suggests that the development of open government portals follows an incremental approach similar to that of e-government development stages in general. Subsequently, the paper offers several observations on the policy and practical implications of open government data portal development, drawn from the application of the proposed framework.”
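The three portal dimensions the abstract names, data content, data manipulation capability, and participatory and engagement capability, lend themselves to a composite benchmark score. A toy sketch of that idea; the weights and the example portals' sub-scores below are invented, since the abstract does not publish a weighting:

```python
# Toy benchmarking sketch along the three dimensions the paper names.
# Weights and both portals' sub-scores (0-100) are invented for illustration.
WEIGHTS = {"content": 0.5, "manipulation": 0.3, "participation": 0.2}

def benchmark(portal):
    """Weighted composite score on a 0-100 scale."""
    return sum(portal[dim] * weight for dim, weight in WEIGHTS.items())

# A mature portal strong on content and manipulation vs. a young,
# catalogue-only portal: the stages-of-development pattern the paper reports.
portal_a = {"content": 80, "manipulation": 60, "participation": 30}
portal_b = {"content": 50, "manipulation": 20, "participation": 10}

print(benchmark(portal_a), benchmark(portal_b))
```

The incremental-development finding would show up in such scores as portals clustering by stage: high content scores appear first, with manipulation and participation capabilities lagging behind.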

New Research Network to Study and Design Innovative Ways of Solving Public Problems


MacArthur Foundation Research Network on Opening Governance formed to gather evidence and develop new designs for governing 

NEW YORK, NY, March 4, 2014: The Governance Lab (The GovLab) at New York University today announced the formation of a Research Network on Opening Governance, which will seek to develop blueprints for more effective and legitimate democratic institutions to help improve people’s lives.
Convened and organized by the GovLab, the MacArthur Foundation Research Network on Opening Governance is made possible by a three-year grant of $5 million from the John D. and Catherine T. MacArthur Foundation as well as a gift from Google.org, which will allow the Network to tap the latest technological advances to further its work.
Combining empirical research with real-world experiments, the Research Network will study what happens when governments and institutions open themselves to diverse participation, pursue collaborative problem-solving, and seek input and expertise from a range of people. Network members include twelve experts (see below) in computer science, political science, policy informatics, social psychology and philosophy, law, and communications. This core group is supported by an advisory network of academics, technologists, and current and former government officials. Together, they will assess existing innovations in governing and experiment with new practices and how institutions make decisions at the local, national, and international levels.
Support for the Network from Google.org will be used to build technology platforms to solve problems more openly and to run agile, real-world, empirical experiments with institutional partners such as governments and NGOs to discover what can enhance collaboration and decision-making in the public interest.
The Network’s research will be complemented by theoretical writing and compelling storytelling designed to articulate and demonstrate clearly and concretely how governing agencies might work better than they do today. “We want to arm policymakers and practitioners with evidence of what works and what does not,” says Professor Beth Simone Noveck, Network Chair and author of Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful, “which is vital to drive innovation, re-establish legitimacy and more effectively target scarce resources to solve today’s problems.”
“From prize-backed challenges to spur creative thinking to the use of expert networks to get the smartest people focused on a problem no matter where they work, this shift from top-down, closed, and professional government to decentralized, open, and smarter governance may be the major social innovation of the 21st century,” says Noveck. “The MacArthur Research Network on Opening Governance is the ideal crucible for helping the transition from closed and centralized to open and collaborative institutions of governance in a way that is scientifically sound and yields new insights to inform future efforts, always with an eye toward real-world impacts.”
MacArthur Foundation President Robert Gallucci added, “Recognizing that we cannot solve today’s challenges with yesterday’s tools, this interdisciplinary group will bring fresh thinking to questions about how our governing institutions operate, and how they can develop better ways to help address seemingly intractable social problems for the common good.”
Members
The MacArthur Research Network on Opening Governance comprises:
Chair: Beth Simone Noveck
Network Coordinator: Andrew Young
Chief of Research: Stefaan Verhulst
Faculty Members:

  • Sir Tim Berners-Lee (Massachusetts Institute of Technology (MIT)/University of Southampton, UK)
  • Deborah Estrin (Cornell Tech/Weill Cornell Medical College)
  • Erik Johnston (Arizona State University)
  • Henry Farrell (George Washington University)
  • Sheena S. Iyengar (Columbia Business School/Jerome A. Chazen Institute of International Business)
  • Karim Lakhani (Harvard Business School)
  • Anita McGahan (University of Toronto)
  • Cosma Shalizi (Carnegie Mellon/Santa Fe Institute)

Institutional Members:

  • Christian Bason and Jesper Christiansen (MindLab, Denmark)
  • Geoff Mulgan (National Endowment for Science, Technology and the Arts – NESTA, United Kingdom)
  • Lee Rainie (Pew Research Center)

The Network is eager to hear from and engage with the public as it undertakes its work. Please contact Stefaan Verhulst to share your ideas or identify opportunities to collaborate.”

The Economics of Access to Information


Article by Mariano Mosquera at Edmond J. Safra Research Lab: “There has been an important development in the study of the right of access to public information and the so-called economics of information: by combining these two premises, it is possible to outline an economic theory of access to public information.


Moral Hazard
The legal development of the right of access to public information has been remarkable. Many international conventions, laws and national regulations have been passed on this matter. In this regard, access to information has consolidated within the framework of international human rights law.
The Inter-American Court of Human Rights was the first international court to acknowledge that access to information is a human right that is part of the right to freedom of speech. The Court recognized this right in two parts, as the individual right of any person to search for information and as a positive obligation of the state to ensure the individual’s right to receive the requested information.
This right and obligation can also be seen as the demand and supply of information.
The so-called economics of information has focused on the issue of information asymmetry between the principal and the agent. The principal (society) and the agent (state) enter into a contract. This contract is based on the idea that the agent’s specialization and professionalism (or the politician’s, according to Weber) enables him to attend to the principal’s affairs, such as public affairs in this case. This representation contract does not provide for a complete delegation, but rather it involves the principal’s commitment to monitoring the agent.
When we study corruption, it is important to note that monitoring aims to ensure that the agent adjusts its behavior to comply with the contract, in order to pursue public goals, and not to serve private interests. Stiglitz [4] describes moral hazard as a situation arising from information asymmetry between the principal and the agent. The principal takes a risk when acting without comprehensive information about the agent’s actions. Moral hazard means that the agent’s handling of closed, privileged information could bring about negative consequences for the principal.
In this case, it is a risk related to corrupt practices, since a public official could use the state’s power and information to achieve private benefits, and not to resolve public issues in accordance with the principal-agent contract. This creates negative social consequences.
In this model, there are a number of safeguards against moral hazard, such as monitoring institutions (with members of the opposition) and rewards for efficient and effective administration [5], among others. Access to public information could also serve as an effective means of monitoring the agent, so that the agent adjusts its behavior to comply with the contract.
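The monitoring argument above can be sketched as a toy decision rule. The following is a minimal illustration under assumed numbers, not a model from the article: the function name, parameters, and all values are hypothetical, and the only point it demonstrates is that raising the probability of being monitored changes the agent’s rational choice from diverting resources to complying with the contract.

```python
def agent_choice(monitoring_prob, penalty, private_gain):
    """Hypothetical rational agent: it diverts public resources only when
    the private gain exceeds the expected penalty from being monitored."""
    expected_penalty = monitoring_prob * penalty
    return "comply" if expected_penalty >= private_gain else "divert"

# Illustrative numbers only: weak oversight vs. strong oversight.
print(agent_choice(monitoring_prob=0.1, penalty=5, private_gain=2))  # → divert
print(agent_choice(monitoring_prob=0.8, penalty=5, private_gain=2))  # → comply
```

In this reading, open access to public information is one way of raising `monitoring_prob` without creating a new oversight institution.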
The Economic Principle of Public Information
According to this principal-agent model, public information should be defined as:
information whose social interpretation enables the state to act in the best interests of society. This definition is based on the idea of information for monitoring purposes and uses a systematic approach to feedback. This definition also implies that the state is not entirely effective at adjusting its behavior by itself.
Technically, as an economic principle of public information, public information is:
information whose interpretation by the principal is useful for the agent, so that the latter adjusts its behavior to comply with the principal-agent contract. It should be noted that this is very different from the legal definition of public information, such as “any information produced or held by the state.” This type of legal definition is focused only on supply, but not on demand.
In this principal-agent model, public information stems from two different rationales: the principal’s interpretation and the usefulness for the agent. The measure of the principal’s interpretation is the likelihood of being useful for the agent. The measure of usefulness for the agent is the likelihood of adjusting the principal-agent contract.
A different matter altogether is the development of institutions that ensure the application of this principle. For example, the channels through which information is supplied and demanded, and the channels of feedback, could be strengthened so that the social interpretation that is useful for the state actually reaches the public authorities that are able to adjust policies….”