Datafication and empowerment: How the open data movement re-articulates notions of democracy, participation, and journalism


Paper by Stefan Baack at Big Data and Society: “This article shows how activists in the open data movement re-articulate notions of democracy, participation, and journalism by applying practices and values from open source culture to the creation and use of data. Focusing on the Open Knowledge Foundation Germany and drawing from a combination of interviews and content analysis, it argues that this process leads activists to develop new rationalities around datafication that can support the agency of datafied publics. Three modulations of open source are identified: First, by regarding data as a prerequisite for generating knowledge, activists transform the sharing of source code to include the sharing of raw data. Sharing raw data should break the interpretative monopoly of governments and would allow people to make their own interpretation of data about public issues. Second, activists connect this idea to an open and flexible form of representative democracy by applying the open source model of participation to political participation. Third, activists acknowledge that intermediaries are necessary to make raw data accessible to the public. This leads them to an interest in transforming journalism to become an intermediary in this sense. At the same time, they try to act as intermediaries themselves and develop civic technologies to put their ideas into practice. The article concludes with suggesting that the practices and ideas of open data activists are relevant because they illustrate the connection between datafication and open source culture and help to understand how datafication might support the agency of publics and actors outside big government and big business….(More)

We are data: the future of machine intelligence


Douglas Coupland in the Financial Times: “…But what if the rise of Artificial Intuition instead blossoms under the aegis of theology or political ideology? With politics we can see an interesting scenario developing in Europe, where Google is by far the dominant search engine. What is interesting there is that people are perfectly free to use Yahoo or Bing yet they choose to stick with Google and then they get worried about Google having too much power — which is an unusual relationship dynamic, like an old married couple. Maybe Google could be carved up into baby Googles? But no. How do you break apart a search engine? AT&T was broken into seven more or less regional entities in 1982 but you can’t really do that with a search engine. Germany gets gaming? France gets porn? Holland gets commerce? It’s not a pie that can be sliced.

The time to fix this data search inequity isn’t right now, either. The time to fix this problem was 20 years ago, and the only country that got it right was China, which now has its own search engine and social networking systems. But were the British or Spanish governments — or any other government — to say, “OK, we’re making our own proprietary national search engine”, that would somehow be far scarier than having a private company running things. (If you want paranoia, let your government control what you can and can’t access — which is what you basically have in China. Irony!)

The tendency in theocracies would almost invariably be one of intense censorship, extreme limitations of access, as well as machine intelligence endlessly scouring its system in search of apostasy and dissent. The Americans, on the other hand, are desperately trying to implement a two-tiered system to monetise information in the same way they’ve monetised medicine, agriculture, food and criminality. One almost gets misty-eyed looking at North Koreans who, if nothing else, have yet to have their neurons reconfigured, thus turning them into a nation of click junkies. But even if they did have an internet, it would have only one site to visit, and its name would be gloriousleader.nk.

. . .

To summarise. Everyone, basically, wants access to and control over what you will become, both as a physical and metadata entity. We are also on our way to a world of concrete walls surrounding any number of niche beliefs. On our journey, we get to watch machine intelligence become profoundly more intelligent while, as a society, we get to watch one labour category after another be systematically burped out of the labour pool. (Doug’s Law: An app is only successful if it puts a lot of people out of work.)…(More)”

The digital revolution liberating Latin American people


Luis Alberto Moreno in the Financial Times: “Imagine a place where citizens can deal with the state entirely online, where all health records are electronic and the wait for emergency care is just seven minutes. Singapore? Switzerland? Try Colima, Mexico.

Pessimists fear the digital revolution will only widen social and economic disparities in the developing world — particularly in Latin America, the world’s most unequal region. But Colima, though small and relatively prosperous, shows how some of the region’s governments are harnessing these tools to modernise services, improve quality of life and share the benefits of technology more equitably.

In the past 10 years, this state of about 600,000 people has transformed the way government works, going completely digital. Its citizens can carry out 62 procedures online, from applying for permits to filing crime reports. No internet at home? Colima offers hundreds of free WiFi hotspots.

Colombia and Peru are taking broadband to remote corners of their rugged territories. Bogotá has subsidised the expansion of its fibre optic network, which now links virtually every town in the country. Peru is expanding a programme that aims to bring WiFi to schools, hospitals and other public buildings in each of its 25 regions. The Colombian plan, Vive Digital, fosters internet adoption among all its citizens. Taxes on computers, tablets and smartphones have been scrapped. Low-income families have been given vouchers to sign up for broadband. In five years, the percentage of households connected to the internet jumped from 16 per cent to 50 per cent. Among small businesses it soared from 7 per cent to 61 per cent.

Inexpensive devices and ubiquitous WiFi, however, do not guarantee widespread usage. Diego Molano Vega, an architect of Vive Digital, found that many programs designed for customers in developed countries were ill suited to most Colombians. “There are no poor people in Silicon Valley,” he says. Latin American governments should use their purchasing power to push for development of digital services easily adopted by their citizens and businesses. Chile is a leader: it has digitised hundreds of trámites — bureaucratic procedures involving endless forms and queues. In a 4,300km-long country of mountains, deserts and forests, this enables access to all sorts of services through the internet. Entrepreneurs can now register businesses online for free in a single day.

Technology can be harnessed to boost equity in education. Brazil’s Mato Grosso do Sul state launched a free online service to prepare high school students for a tough national exam in which a good grade is a prerequisite for admission to federal universities. On average the results of the students who used the service were 31 per cent higher than those of their peers, prompting 10 other states to adopt the system.

Digital tools can also help raise competitiveness in business. Uruguay’s livestock information system keeps track of the country’s cattle. The publicly financed electronic registry ensures every beast can be traced, making it easier to monitor outbreaks of diseases….(More)”

A computational algorithm for fact-checking


Kurzweil News: “Computers can now do fact-checking for any body of knowledge, according to Indiana University network scientists, writing in an open-access paper published June 17 in PLoS ONE.

Using factual information from summary infoboxes from Wikipedia* as a source, they built a “knowledge graph” with 3 million concepts and 23 million links between them. A link between two concepts in the graph can be read as a simple factual statement, such as “Socrates is a person” or “Paris is the capital of France.”
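
The construction described here can be sketched in a few lines. This is a minimal illustration using invented example triples, not the IU team's actual code or Wikipedia-infobox extraction pipeline:

```python
# Minimal illustrative sketch (not the IU pipeline): build a concept graph
# from (subject, predicate, object) triples, where each link between two
# concepts encodes one simple factual statement. Triples are invented examples.
from collections import defaultdict

triples = [
    ("Socrates", "is_a", "person"),
    ("Paris", "capital_of", "France"),
    ("France", "is_a", "country"),
]

graph = defaultdict(set)  # concept -> set of neighbouring concepts
for subj, _predicate, obj in triples:
    graph[subj].add(obj)   # undirected: a statement links both concepts
    graph[obj].add(subj)

print(sorted(graph["France"]))  # → ['Paris', 'country']
```

The real graph, of course, has millions of concepts and links; the point is only that each edge stands for one simple factual statement of the "Paris is the capital of France" kind.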

In the first use of this method, IU scientists created a simple computational fact-checker that assigns “truth scores” to statements concerning history, geography and entertainment, as well as random statements drawn from the text of Wikipedia. In multiple experiments, the automated system consistently matched the assessment of human fact-checkers in terms of the humans’ certitude about the accuracy of these statements.

Dealing with misinformation and disinformation

In what the IU scientists describe as an “automatic game of trivia,” the team applied their algorithm to answer simple questions related to geography, history, and entertainment, including statements that matched states or nations with their capitals, presidents with their spouses, and Oscar-winning film directors with the movie for which they won the Best Picture award. The majority of tests returned highly accurate truth scores.

Lastly, the scientists used the algorithm to fact-check excerpts from the main text of Wikipedia, which were previously labeled by human fact-checkers as true or false, and found a positive correlation between the truth scores produced by the algorithm and the answers provided by the fact-checkers.

Significantly, the IU team found their computational method could even assess the truthfulness of statements about information not directly contained in the infoboxes. For example, it could infer that Steve Tesich — the Serbian-American screenwriter of the classic Hoosier film “Breaking Away” — graduated from IU, even though that fact is not specifically stated in the infobox about him.

Using multiple sources to improve accuracy and richness of data

“The measurement of the truthfulness of statements appears to rely strongly on indirect connections, or ‘paths,’ between concepts,” said Giovanni Luca Ciampaglia, a postdoctoral fellow at the Center for Complex Networks and Systems Research in the IU Bloomington School of Informatics and Computing, who led the study….

“These results are encouraging and exciting. We live in an age of information overload, including abundant misinformation, unsubstantiated rumors and conspiracy theories whose volume threatens to overwhelm journalists and the public. Our experiments point to methods to abstract the vital and complex human task of fact-checking into a network analysis problem, which is easy to solve computationally.”
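
The path-based idea Ciampaglia describes can be sketched compactly. This is an illustrative reconstruction, not the IU team's code: the toy graph is invented, and the scoring rule (penalising paths that run through generic, highly connected "hub" concepts) only loosely follows the generality-penalised path measure reported in the paper:

```python
import math

# Toy concept graph: each edge stands for one factual statement (invented data).
edges = [
    ("Socrates", "person"), ("Plato", "person"),
    ("Socrates", "Athens"), ("Plato", "Athens"),
    ("Paris", "France"),
]
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def truth_score(subj, obj):
    """Best score over all simple paths from subj to obj. Intermediate
    concepts are penalised by their degree, so indirect connections through
    specific concepts count more than those through generic hubs."""
    best = 0.0
    stack = [(subj, [subj])]
    while stack:
        node, path = stack.pop()
        for nxt in graph.get(node, ()):
            if nxt in path:
                continue
            if nxt == obj:
                intermediates = path[1:]  # concepts strictly between subj and obj
                score = 1.0 / (1.0 + sum(math.log(len(graph[v]))
                                         for v in intermediates))
                best = max(best, score)
            else:
                stack.append((nxt, path + [nxt]))
    return best

print(round(truth_score("Socrates", "person"), 2))  # direct link → 1.0
print(round(truth_score("Socrates", "Plato"), 2))   # indirect connection → 0.59
print(round(truth_score("Socrates", "Paris"), 2))   # no path at all → 0.0
```

A directly stated fact scores 1.0, an unsupported statement scores 0.0, and a statement supported only indirectly (like the Tesich example above) falls in between, which matches the intuition in the quote: truthfulness rides on paths between concepts.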

Expanding the knowledge base

Although the experiments were conducted using Wikipedia, the IU team’s method does not assume any particular source of knowledge. The scientists aim to conduct additional experiments using knowledge graphs built from other sources of human knowledge, such as Freebase, the open-knowledge base built by Google, and note that multiple information sources could be used together to account for different belief systems….(More)”

Can We Focus on What Works?


John Kamensky in GovExec: “Can we shift the conversation in Washington from “waste, fraud, and abuse” to “what works and let’s fund it” instead?

I attended a recent Senate hearing on wasteful spending in the federal government, and some of the witnesses pointed to examples such as the legislative requirement that the Defense Department ship coal to Germany to heat American bases there. Others pointed to failures of large-scale computer projects and the dozens of programs on the Government Accountability Office’s High Risk List.

While many of the examples were seen as shocking, there was little conversation about focusing on what works and expanding those programs.

Interestingly, there is a movement underway across the U.S. to do just that. There are advocacy groups, foundations, states and localities promoting the idea of “let’s find out what works and fund it.” Some call this “evidence-based government,” “Moneyball government,” or “pay for success.” The federal government has dipped its toes in the water as well, with several pilot programs in various agencies and bipartisan legislation pending in Congress.

The hot, new thing that has captured the imaginations of many policy wonks is called “Pay for Success,” or in some circles, “social impact bonds.”

In 2010, the British government launched an innovative funding scheme, which it called social impact bonds, where private sector investors committed funding upfront to pay for improved social outcomes that result in public sector savings. The investors were repaid by the government only when the outcomes were determined to have been achieved.

This funding scheme has attracted substantial attention in the U.S. where it and many variations are being piloted.

What is “Pay for Success?” According to the Urban Institute, PFS is a type of performance-based contracting used to support the delivery of targeted, high-impact preventive social services, in which intervention at an early stage can reduce the need for higher-cost services in the future.

For example, experts believe that preventing asthma attacks among at-risk children reduces emergency room visits and hospitalization, which are more costly than preventive services. When the government pays for preventive services, it hopes to lower its costs….(More)”
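
The contingent-payment logic described above can be made concrete with a small sketch. All names and figures here are hypothetical, purely to illustrate the structure; real PFS contracts involve negotiated outcome metrics, independent evaluators and staged payment schedules:

```python
# Illustrative sketch of Pay for Success repayment (hypothetical figures):
# investors fund a preventive service upfront; the government repays the
# principal plus a premium only if the agreed outcome target is achieved.

def pfs_repayment(invested, target_reduction, achieved_reduction,
                  success_premium=0.05):
    """Return what the government owes investors: principal plus premium
    if the outcome target is met, nothing if it is missed."""
    if achieved_reduction >= target_reduction:
        return invested * (1 + success_premium)
    return 0.0

# Hypothetical asthma-prevention deal: $1m upfront, target of a 20% drop
# in emergency-room visits among enrolled children.
print(pfs_repayment(1_000_000, 0.20, 0.25))  # target met → 1050000.0
print(pfs_repayment(1_000_000, 0.20, 0.10))  # target missed → 0.0
```

The key design point is that outcome risk sits with the investors, not the government, which pays only out of the savings the intervention was meant to generate.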

Want to fix the world? Start by making clean energy a default setting


Chris Mooney in the Washington Post: “In recent years, psychologists and behavioral scientists have begun to decipher why we make the choices that we do when it comes to using energy. And the bottom line is that it’s hard to characterize those choices as fully “rational.”

Rather than acting like perfect homo economicuses, they’ve found, we’re highly swayed by the energy use of our neighbors and friends — peer pressure, basically. At the same time, we’re also heavily biased by the status quo — we delay in switching to new energy choices, even when they make a great deal of economic sense.

All of which has led to the popular idea of “nudging,” or the idea that you can subtly sway people to change their behavior by changing, say, the environment in which they make choices, or the kinds of information they receive. Not in a coercive way, but rather, through gentle tweaks and prompts. And now, a major study in Nature Climate Change demonstrates that one very popular form of energy-use nudging that might be called “default switching,” or the “default effect,” does indeed work — and indeed, could possibly work at a very large scale.

“This is the first demonstration of a large-scale nudging effect using defaults in the domain of energy choices,” says Sebastian Lotz of Stanford University and the University of Lausanne in Switzerland, who conducted the research with Felix Ebeling of the University of Cologne in Germany….(More)”

Shedding light on government, one dataset at a time


Bill Below of the OECD Directorate for Public Governance and Territorial Development at OECD Insights: “…As part of its Open Government Data (OGD) work, the OECD has created OURdata, an index that assesses governments’ efforts to implement OGD in three critical areas: Openness, Usefulness and Re-usability. The results are promising. Those countries that began the process in earnest some five years ago, today rank very high on the scale. According to this Index, which closely follows the principles of the G8 Open Data Charter, Korea is leading the implementation of OGD initiatives with France a close second.
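
As a rough illustration of how a composite index of this kind combines its pillars, one can average per-pillar scores. This is not the OECD's actual OURdata methodology or weighting, and the scores below are invented:

```python
# Hypothetical composite index over the three OURdata pillars
# (openness, usefulness, re-usability); equal weights assumed for
# illustration only — not the OECD's published methodology.

def ourdata_style_index(openness, usefulness, reusability):
    """Equal-weight average of three pillar scores, each in [0, 1]."""
    return (openness + usefulness + reusability) / 3

print(round(ourdata_style_index(0.9, 0.8, 0.7), 2))  # → 0.8
```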

[Chart: OURdata Index country rankings]

Those who have started the process but who are lagging (such as Poland) can draw on the experience of other OECD countries, and benefit from a clear roadmap to guide them.

Indeed, bringing one’s own country’s weaknesses out into the light is the first, and sometimes most courageous, step towards achieving the benefits of OGD. Poland has just completed its Open Government Data country review with the OECD revealing some sizable challenges ahead in transforming the internal culture of its institutions. For the moment, a supply-side rather than people-driven approach to data release is prevalent. Also, OGD in Poland is not widely understood to be a source of value creation and growth….(More)”

Constitutional Conventions in the Digital Era: Lessons from Iceland and Ireland


Paper by Silvia Suteu: “Mechanisms of constitutional development have recently attracted significant attention, specifically, instances where popular involvement was central to the constitutional change. Examples include attempts by British Columbia, the Netherlands, and Ontario at electoral reform, in addition to the more sweeping reforms sought in Iceland and Ireland. Each of these countries’ attempts exemplifies varied innovative avenues to reform involving participatory and partially citizen-led processes aimed at revitalizing politics. The little legal scholarship on these developments has provided an insufficient analytical account of such novel approaches to constitution-making. This Essay seeks to build upon the current descriptive work on constitutional conventions by focusing on the cases of Iceland and Ireland. The Essay further aims to evaluate whether the means undertaken by each country translates into novelty at a more substantive level, namely, the quality of the process and legitimacy of the end product. The Essay proposes standards of direct democratic engagements that adequately fit these new developments and further identifies lessons for participatory constitution-making processes in the digital twenty-first century….(More)”

Selected Readings on Data Governance


Jos Berens (Centre for Innovation, Leiden University) and Stefaan G. Verhulst (GovLab)

The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data governance was originally published in 2015.

Context
The field of Data Collaboratives is premised on the idea that sharing and opening-up private sector datasets has great – and yet untapped – potential for promoting social good. At the same time, the potential of data collaboratives depends on the level of societal trust in the exchange, analysis and use of the data exchanged. Strong data governance frameworks are essential to ensure responsible data use. Without such governance regimes, the emergent data ecosystem will be hampered and the (perceived) risks will dominate the (perceived) benefits. Further, without adopting a human-centered approach to the design of data governance frameworks, including iterative prototyping and careful consideration of the experience, the responses may fail to be flexible and targeted to real needs.

Selected Readings List (in alphabetical order)

Annotated Selected Readings List (in alphabetical order)

Better Place Lab, “Privacy, Transparency and Trust.” Mozilla, 2015. Available from: http://www.betterplace-lab.org/privacy-report.

  • This report looks specifically at the risks involved in the social sector having access to datasets, and the main risks development organizations should focus on to develop a responsible data use practice.
  • Focusing on five specific countries (Brazil, China, Germany, India and Indonesia), the report displays specific country profiles, followed by a comparative analysis centering around the topics of privacy, transparency, online behavior and trust.
  • Some of the key findings mentioned are:
    • A general concern on the importance of privacy, with cultural differences influencing conception of what privacy is.
    • Cultural differences determining how transparency is perceived, and how much value is attached to achieving it.
    • To build trust, individuals need to feel a personal connection or get a personal recommendation – it is hard to build trust regarding automated processes.

Montjoye, Yves-Alexandre de; Kendall, Jake; and Kerry, Cameron F. “Enabling Humanitarian Use of Mobile Phone Data.” The Brookings Institution, 2015. Available from: http://www.brookings.edu/research/papers/2014/11/12-enabling-humanitarian-use-mobile-phone-data.

  • Focusing in particular on mobile phone data, this paper explores ways of mitigating privacy harms involved in using call detail records for social good.
  • Key takeaways are the following recommendations for using data for social good:
    • Engaging companies, NGOs, researchers, privacy experts, and governments to agree on a set of best practices for new privacy-conscientious metadata sharing models.
    • Accepting that no framework for maximizing data for the public good will offer perfect protection for privacy, but there must be a balanced application of privacy concerns against the potential for social good.
    • Establishing systems and processes for recognizing trusted third-parties and systems to manage datasets, enable detailed audits, and control the use of data so as to combat the potential for data abuse and re-identification of anonymous data.
    • Simplifying the process among developing governments with regard to the collection and use of mobile phone metadata for research and public good purposes.

Center for Democracy & Technology, “Health Big Data in the Commercial Context.” Center for Democracy & Technology, 2015. Available from: https://cdt.org/insight/health-big-data-in-the-commercial-context/.

  • Focusing particularly on the privacy issues related to using data generated by individuals, this paper explores the overlap in privacy questions this field has with other data uses.
  • The authors note that although the Health Insurance Portability and Accountability Act (HIPAA) has proven a successful approach in ensuring accountability for health data, most of these standards do not apply to developers of the new technologies used to collect these new data sets.
  • For non-HIPAA-covered, customer-facing technologies, the paper proposes an alternative framework for considering privacy issues, based on the Fair Information Practice Principles and three rounds of stakeholder consultations.

Center for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice.” Centre for Information Policy Leadership, Hunton & Williams LLP, 2015. Available from: https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/white_paper_1-a_risk_based_approach_to_privacy_improving_effectiveness_in_practice.pdf.

  • This white paper is part of a project aiming to explain what is often referred to as a new, risk-based approach to privacy, and the development of a privacy risk framework and methodology.
  • With the pace of technological progress often outstripping the capabilities of privacy officers to keep up, this method aims to offer the ability to approach privacy matters in a structured way, assessing privacy implications from the perspective of possible negative impact on individuals.
  • With the intended outcomes of the project being “materials to help policy-makers and legislators to identify desired outcomes and shape rules for the future which are more effective and less burdensome”, insights from this paper might also feed into the development of innovative governance mechanisms aimed specifically at preventing individual harm.

Centre for Information Policy Leadership, “Data Governance for the Evolving Digital Market Place.” Centre for Information Policy Leadership, Hunton & Williams LLP, 2011. Available from: http://www.huntonfiles.com/files/webupload/CIPL_Centre_Accountability_Data_Governance_Paper_2011.pdf.

  • This paper argues that as a result of the proliferation of large scale data analytics, new models governing data inferred from society will shift responsibility to the side of organizations deriving and creating value from that data.
  • Noting the challenge corporations face in enabling agile and innovative data use, the paper argues: “In exchange for increased corporate responsibility, accountability [and the governance models it mandates, ed.] allows for more flexible use of data.”
  • Proposed as a means to shift responsibility to the side of data users, the accountability principle has been researched by a worldwide group of policymakers. Tracing the history of the accountability principle, the paper argues that it “(…) requires that companies implement programs that foster compliance with data protection principles, and be able to describe how those programs provide the required protections for individuals.”
  • The following essential elements of accountability are listed:
    • Organisation commitment to accountability and adoption of internal policies consistent with external criteria
    • Mechanisms to put privacy policies into effect, including tools, training and education
    • Systems for internal, ongoing oversight and assurance reviews and external verification
    • Transparency and mechanisms for individual participation
    • Means of remediation and external enforcement

Crawford, Kate; and Schultz, Jason. “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms.” NYU School of Law, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2325784&download=yes.

  • Considering the privacy implications of large-scale analysis of numerous data sources, this paper proposes the implementation of a ‘procedural data due process’ mechanism to arm data subjects against potential privacy intrusions.
  • The authors acknowledge that some privacy protection structures already know similar mechanisms. However, due to the “inherent analytical assumptions and methodological biases” of big data systems, the authors argue for a more rigorous framework.

Letouzé, Emmanuel; and Vinck, Patrick. “The Ethics and Politics of Call Data Analytics.” Data-Pop Alliance, 2015. Available from: http://static1.squarespace.com/static/531a2b4be4b009ca7e474c05/t/54b97f82e4b0ff9569874fe9/1421442946517/WhitePaperCDRsEthicFrameworkDec10-2014Draft-2.pdf.

  • Focusing on the use of Call Detail Records (CDRs) for social good in development contexts, this whitepaper explores both the potential of these datasets – in part by detailing recent successful efforts in the space – and political and ethical constraints to their use.
  • Drawing from the Menlo Report Ethical Principles Guiding ICT Research, the paper explores how these principles might be unpacked to inform an ethics framework for the analysis of CDRs.

Data for Development External Ethics Panel, “Report of the External Ethics Review Panel.” Orange, 2015. Available from: http://www.d4d.orange.com/fr/content/download/43823/426571/version/2/file/D4D_Challenge_DEEP_Report_IBE.pdf.

  • This report presents the findings of the external expert panel overseeing the Orange Data for Development Challenge.
  • Several types of issues faced by the panel are described, along with the various ways in which the panel dealt with those issues.

Federal Trade Commission Staff Report, “Mobile Privacy Disclosures: Building Trust Through Transparency.” Federal Trade Commission, 2013. Available from: www.ftc.gov/os/2013/02/130201mobileprivacyreport.pdf.

  • This report looks at ways to address privacy concerns regarding mobile phone data use. Specific advice is provided for the following actors:
    • Platforms, or operating systems providers
    • App developers
    • Advertising networks and other third parties
    • App developer trade associations, along with academics, usability experts and privacy researchers

Mirani, Leo. “How to use mobile phone data for good without invading anyone’s privacy.” Quartz, 2015. Available from: http://qz.com/398257/how-to-use-mobile-phone-data-for-good-without-invading-anyones-privacy/.

  • This paper considers the privacy implications of using call detail records for social good, and ways to mitigate risks of privacy intrusion.
  • Taking example of the Orange D4D challenge and the anonymization strategy that was employed there, the paper describes how classic ‘anonymization’ is often not enough. The paper then lists further measures that can be taken to ensure adequate privacy protection.

Bernholz, Lucy. “Several Examples of Digital Ethics and Proposed Practices.” Stanford Ethics of Data conference, 2014. Available from: http://www.scribd.com/doc/237527226/Several-Examples-of-Digital-Ethics-and-Proposed-Practices.

  • This list of readings prepared for Stanford’s Ethics of Data conference lists some of the leading available literature regarding ethical data use.

Abrams, Martin. “A Unified Ethical Frame for Big Data Analysis.” The Information Accountability Foundation, 2014. Available from: http://www.privacyconference2014.org/media/17388/Plenary5-Martin-Abrams-Ethics-Fundamental-Rights-and-BigData.pdf.

  • Going beyond privacy, this paper discusses the following elements as central to developing a broad framework for data analysis:
    • Beneficial
    • Progressive
    • Sustainable
    • Respectful
    • Fair

Lane, Julia; Stodden, Victoria; Bender, Stefan; and Nissenbaum, Helen. “Privacy, Big Data and the Public Good.” Cambridge University Press, 2014. Available from: http://www.dataprivacybook.org.

  • This book treats the privacy issues surrounding the use of big data for promoting the public good.
  • The questions being asked include the following:
    • What are the ethical and legal requirements for scientists and government officials seeking to serve the public good without harming individual citizens?
    • What are the rules of engagement?
    • What are the best ways to provide access while protecting confidentiality?
    • Are there reasonable mechanisms to compensate citizens for privacy loss?

Richards, Neil M.; and King, Jonathan H. “Big Data Ethics.” Wake Forest Law Review, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2384174.

  • This paper describes the growing impact of big data analytics on society, and argues that because of this impact, a set of ethical principles to guide data use is called for.
  • The four proposed themes are: privacy, confidentiality, transparency and identity.
  • Finally, the paper discusses how big data can be integrated into society, going into multiple facets of this integration, including the law, roles of institutions and ethical principles.

OECD, “OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data”. Available from: http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.

  • A globally used set of principles to inform thinking about handling personal data, the OECD privacy guidelines serve as one of the leading standards for informing privacy policies and data governance structures.
  • The basic principles of national application are the following:
    • Collection Limitation Principle
    • Data Quality Principle
    • Purpose Specification Principle
    • Use Limitation Principle
    • Security Safeguards Principle
    • Openness Principle
    • Individual Participation Principle
    • Accountability Principle

The White House Big Data and Privacy Working Group, “Big Data: Seizing Opportunities, Preserving Values.” White House, 2014. Available from: https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf.

  • Documenting the findings of the White House big data and privacy working group, this report lists, among others, the following key recommendations regarding data governance:
    • Bringing greater transparency to the data services industry
    • Stimulating international conversation on big data, with multiple stakeholders
    • With regard to educational data: ensuring data are used only for the purposes for which they were collected
    • Paying attention to the potential for big data to facilitate discrimination, and expanding technical understanding to stop discrimination

William Hoffman, “Pathways for Progress”, World Economic Forum, 2015. Available from: http://www3.weforum.org/docs/WEFUSA_DataDrivenDevelopment_Report2015.pdf.

  • This paper identifies, inter alia, the lack of well-defined and balanced governance mechanisms as a key obstacle preventing corporate-sector data in particular from being shared in a controlled space.
  • An approach that balances the benefits of large-scale data usage in a development context against its risks, building trust among all stakeholders in the data ecosystem, is viewed as key.
  • Furthermore, this white paper notes that new governance models are required not only because of the growing volume of data, increased analytical capacity, and more refined methods of analysis. The current “super-structure” of information flows between institutions is also seen as a key reason to develop alternatives to the current, outdated approaches to data governance.

The perils of extreme democracy

The Economist: “California cannot pass timely budgets even in good years, which is one reason why its credit rating has, in one generation, fallen from one of the best to the absolute worst among the 50 states. How can a place which has so much going for it—from its diversity and natural beauty to its unsurpassed talent clusters in Silicon Valley and Hollywood—be so poorly governed? ….But as our special report this week argues, the main culprit has been direct democracy: recalls, in which Californians fire elected officials in mid-term; referendums, in which they can reject acts of their legislature; and especially initiatives, in which the voters write their own rules. Since 1978, when Proposition 13 lowered property-tax rates, hundreds of initiatives have been approved on subjects from education to the regulation of chicken coops.

This citizen legislature has caused chaos. Many initiatives have either limited taxes or mandated spending, making it even harder to balance the budget. Some are so ill-thought-out that they achieve the opposite of their intent: for all its small-government pretensions, Proposition 13 ended up centralising California’s finances, shifting them from local to state government. Rather than being the curb on elites that they were supposed to be, ballot initiatives have become a tool of special interests, with lobbyists and extremists bankrolling laws that are often bewildering in their complexity and obscure in their ramifications. And they have impoverished the state’s representative government. Who would want to sit in a legislature where 70-90% of the budget has already been allocated?

This has been a tragedy for California, but it matters far beyond the state’s borders. Around half of America’s states and an increasing number of countries have direct democracy in some form (article). Next month Britain will have its first referendum for years (on whether to change its voting system), and there is talk of voter recalls for aberrant MPs. The European Union has just introduced the first supranational initiative process. With technology making it ever easier to hold referendums and Western voters ever more angry with their politicians, direct democracy could be on the march.

And why not? There is, after all, a successful model: in Switzerland direct democracy goes back to the Middle Ages at the local level and to the 19th century at the federal. This mixture of direct and representative democracy seems to work well. Surely it is just a case of California (which explicitly borrowed the Swiss model) executing a good idea poorly?

Not entirely. Very few people, least of all this newspaper, want to ban direct democracy. Indeed, in some cases referendums are good things: they are a way of holding a legislature to account. In California reforms to curb gerrymandering and non-partisan primaries, both improvements, have recently been introduced by initiatives; and they were pushed by Arnold Schwarzenegger, a governor elected through the recall process. But there is a strong case for proceeding with caution, especially when it comes to allowing people to circumvent a legislature with citizen-made legislation.

The debate about the merits of representative and direct democracy goes back to ancient times. To simplify a little, the Athenians favoured pure democracy (“people rule”, though in fact oligarchs often had the last word); the Romans chose a republic, as a “public thing”, where representatives could make trade-offs for the common good and were accountable for the sum of their achievements. America’s Founding Fathers, especially James Madison and Alexander Hamilton, backed the Romans. Indeed, in their guise of “Publius” in the “Federalist Papers”, Madison and Hamilton warn against the dangerous “passions” of the mob and the threat of “minority factions” (ie, special interests) seizing the democratic process.

Proper democracy is far more than a perpetual ballot process. It must include deliberation, mature institutions and checks and balances such as those in the American constitution. Ironically, California imported direct democracy almost a century ago as a “safety valve” in case government should become corrupt. The process began to malfunction only relatively recently. With Proposition 13, it stopped being a valve and instead became almost the entire engine.

….More important, direct democracy must revert to being a safety valve, not the engine. Initiatives should be far harder to introduce. They should be shorter and simpler, so that voters can actually understand them. They should state what they cost, and where that money is to come from. And, if successful, initiatives must be subject to amendment by the legislature. Those would be good principles to apply to referendums, too….(More)”