India wants all government organizations to develop open APIs


Medianama: “The department of electronics and information technology (DeitY) is looking to frame a policy (pdf) for adopting and developing open application programming interfaces (APIs) in government organizations to promote software interoperability for all e-governance applications & systems. The policy shall be applicable to all central government organizations and to those state governments that choose to adopt the policy.

DeitY also said that all information and data of a government organisation shall be made available through open APIs, in line with the National Data Sharing and Accessibility Policy, and shall adhere to the National Cyber Security Policy.

Policy points

– Each published API of a Government organization shall be provided free of charge, whenever possible, to other government organizations and the public.

– Each published API shall be properly documented with sample code and sufficient information for developers to make use of the API.

– The life-cycle of the open API shall be made available by the API publishing Government organisation. The API shall be backward compatible with at least two earlier versions.

– Government organizations may use an authentication mechanism to enable service interoperability and single sign-on.

– All Open API systems built and data provided shall adhere to GoI security policies and guidelines.
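The backward-compatibility requirement above (the current API version plus at least two earlier versions) can be sketched as a simple version-resolution rule. This is a minimal illustration; the version names and defaulting behavior are hypothetical, not taken from the policy:

```python
# Minimal sketch of the policy's backward-compatibility rule: a publishing
# organization serves the current API version plus at least the two earlier
# versions. The version names here are hypothetical.
SUPPORTED_VERSIONS = ["v3", "v2", "v1"]  # current version first

def resolve_version(requested=None):
    """Return the API version to serve, defaulting to the current one."""
    if requested is None:
        return SUPPORTED_VERSIONS[0]
    if requested not in SUPPORTED_VERSIONS:
        raise ValueError(f"unsupported API version: {requested}")
    return requested
```

Under such a rule, a client pinned to `v1` keeps working after `v2` and `v3` ship, which is the interoperability guarantee the policy point is after.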

…. This would allow anyone to build a website or an application that pulls government information into the public domain. Everyone knows navigating a government website can be nightmarish. For example, Indian Railways provides open APIs, which enabled the development of applications such as RailYatri. Through the eRail APIs, the application pulls information including lists of stations, trains between stations, train routes, fares, PNR status, live running status, seat availability, and details of cancelled, rescheduled, or diverted trains. …(More)”
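As a rough illustration of what consuming such an open API looks like, the sketch below parses a made-up PNR-status response. The field names and values are invented for illustration and do not reflect the actual eRail API schema:

```python
import json

# Hypothetical sketch of consuming an open rail API. The field names and
# values below are invented, not the actual eRail API schema.
sample_response = json.loads("""
{
  "pnr": "1234567890",
  "train": {"number": "12301", "name": "Howrah Rajdhani"},
  "status": "CNF"
}
""")

def summarize_pnr(resp):
    """Render a one-line summary of a PNR-status response."""
    train = resp["train"]
    return f"PNR {resp['pnr']}: {train['name']} ({train['number']}) - {resp['status']}"

print(summarize_pnr(sample_response))
# → PNR 1234567890: Howrah Rajdhani (12301) - CNF
```

An application like the one described would make such a call per query and layer a friendlier interface on top, which is exactly the value the open API policy is meant to unlock.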

See also “Policy on Open Application Programming Interfaces (APIs) for Government of India.”

Selected Readings on Data Governance


Jos Berens (Centre for Innovation, Leiden University) and Stefaan G. Verhulst (GovLab)

The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data governance was originally published in 2015.

Context
The field of Data Collaboratives is premised on the idea that sharing and opening-up private sector datasets has great – and yet untapped – potential for promoting social good. At the same time, the potential of data collaboratives depends on the level of societal trust in the exchange, analysis and use of the data exchanged. Strong data governance frameworks are essential to ensure responsible data use. Without such governance regimes, the emergent data ecosystem will be hampered and the (perceived) risks will dominate the (perceived) benefits. Further, without adopting a human-centered approach to the design of data governance frameworks, including iterative prototyping and careful consideration of the experience, the responses may fail to be flexible and targeted to real needs.

Annotated Selected Readings List (in alphabetical order)

Better Place Lab, “Privacy, Transparency and Trust.” Mozilla, 2015. Available from: http://www.betterplace-lab.org/privacy-report.

  • This report looks specifically at the risks involved in the social sector having access to datasets, and the main risks development organizations should focus on to develop a responsible data use practice.
  • Focusing on five specific countries (Brazil, China, Germany, India and Indonesia), the report displays specific country profiles, followed by a comparative analysis centering around the topics of privacy, transparency, online behavior and trust.
  • Some of the key findings mentioned are:
    • A general concern about the importance of privacy, with cultural differences influencing conceptions of what privacy is.
    • Cultural differences determining how transparency is perceived, and how much value is attached to achieving it.
    • To build trust, individuals need to feel a personal connection or get a personal recommendation – it is hard to build trust regarding automated processes.

de Montjoye, Yves-Alexandre; Kendall, Jake; and Kerry, Cameron F. “Enabling Humanitarian Use of Mobile Phone Data.” The Brookings Institution, 2014. Available from: http://www.brookings.edu/research/papers/2014/11/12-enabling-humanitarian-use-mobile-phone-data.

  • Focusing in particular on mobile phone data, this paper explores ways of mitigating privacy harms involved in using call detail records for social good.
  • Key takeaways are the following recommendations for using data for social good:
    • Engaging companies, NGOs, researchers, privacy experts, and governments to agree on a set of best practices for new privacy-conscientious metadata sharing models.
    • Accepting that no framework for maximizing data for the public good will offer perfect protection for privacy, but there must be a balanced application of privacy concerns against the potential for social good.
    • Establishing systems and processes for recognizing trusted third-parties and systems to manage datasets, enable detailed audits, and control the use of data so as to combat the potential for data abuse and re-identification of anonymous data.
    • Simplifying the process for governments in developing countries with regard to the collection and use of mobile phone metadata for research and public good purposes.

Center for Democracy & Technology, “Health Big Data in the Commercial Context.” Center for Democracy & Technology, 2015. Available from: https://cdt.org/insight/health-big-data-in-the-commercial-context/.

  • Focusing particularly on the privacy issues related to using data generated by individuals, this paper explores the overlap in privacy questions this field has with other data uses.
  • The authors note that although the Health Insurance Portability and Accountability Act (HIPAA) has proven a successful approach in ensuring accountability for health data, most of these standards do not apply to developers of the new technologies used to collect these new data sets.
  • For non-HIPAA-covered, customer-facing technologies, the paper proposes an alternative framework for considering privacy issues, based on the Fair Information Practice Principles and three rounds of stakeholder consultations.

Centre for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice.” Centre for Information Policy Leadership, Hunton & Williams LLP, 2015. Available from: https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/white_paper_1-a_risk_based_approach_to_privacy_improving_effectiveness_in_practice.pdf.

  • This white paper is part of a project aiming to explain what is often referred to as a new, risk-based approach to privacy, and the development of a privacy risk framework and methodology.
  • With the pace of technological progress often outstripping the capabilities of privacy officers to keep up, this method aims to offer the ability to approach privacy matters in a structured way, assessing privacy implications from the perspective of possible negative impact on individuals.
  • With the intended outcomes of the project being “materials to help policy-makers and legislators to identify desired outcomes and shape rules for the future which are more effective and less burdensome”, insights from this paper might also feed into the development of innovative governance mechanisms aimed specifically at preventing individual harm.

Centre for Information Policy Leadership, “Data Governance for the Evolving Digital Market Place”, Centre for Information Policy Leadership, Hunton & Williams LLP, 2011. Available from: http://www.huntonfiles.com/files/webupload/CIPL_Centre_Accountability_Data_Governance_Paper_2011.pdf.

  • This paper argues that as a result of the proliferation of large scale data analytics, new models governing data inferred from society will shift responsibility to the side of organizations deriving and creating value from that data.
  • Noting the challenge corporations face in enabling agile and innovative data use, the paper observes that “in exchange for increased corporate responsibility, accountability [and the governance models it mandates, ed.] allows for more flexible use of data.”
  • Proposed as a means to shift responsibility to data users, the accountability principle has been researched by a worldwide group of policymakers. Tracing the history of the accountability principle, the paper argues that it “(…) requires that companies implement programs that foster compliance with data protection principles, and be able to describe how those programs provide the required protections for individuals.”
  • The following essential elements of accountability are listed:
    • Organisational commitment to accountability and adoption of internal policies consistent with external criteria
    • Mechanisms to put privacy policies into effect, including tools, training and education
    • Systems for internal, ongoing oversight and assurance reviews and external verification
    • Transparency and mechanisms for individual participation
    • Means of remediation and external enforcement

Crawford, Kate; Schultz, Jason. “Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms.” NYU School of Law, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2325784&download=yes.

  • Considering the privacy implications of large-scale analysis of numerous data sources, this paper proposes the implementation of a ‘procedural data due process’ mechanism to arm data subjects against potential privacy intrusions.
  • The authors acknowledge that some existing privacy protection structures already include similar mechanisms. However, due to the “inherent analytical assumptions and methodological biases” of big data systems, the authors argue for a more rigorous framework.

Letouzé, Emmanuel; and Vinck, Patrick. “The Ethics and Politics of Call Data Analytics.” Data-Pop Alliance, 2015. Available from: http://static1.squarespace.com/static/531a2b4be4b009ca7e474c05/t/54b97f82e4b0ff9569874fe9/1421442946517/WhitePaperCDRsEthicFrameworkDec10-2014Draft-2.pdf.

  • Focusing on the use of Call Detail Records (CDRs) for social good in development contexts, this whitepaper explores both the potential of these datasets – in part by detailing recent successful efforts in the space – and political and ethical constraints to their use.
  • Drawing from the Menlo Report Ethical Principles Guiding ICT Research, the paper explores how these principles might be unpacked to inform an ethics framework for the analysis of CDRs.

Data for Development External Ethics Panel, “Report of the External Ethics Review Panel.” Orange, 2015. Available from: http://www.d4d.orange.com/fr/content/download/43823/426571/version/2/file/D4D_Challenge_DEEP_Report_IBE.pdf.

  • This report presents the findings of the external expert panel overseeing the Orange Data for Development Challenge.
  • Several types of issues faced by the panel are described, along with the various ways in which the panel dealt with those issues.

Federal Trade Commission Staff Report, “Mobile Privacy Disclosures: Building Trust Through Transparency.” Federal Trade Commission, 2013. Available from: www.ftc.gov/os/2013/02/130201mobileprivacyreport.pdf.

  • This report looks at ways to address privacy concerns regarding mobile phone data use. Specific advice is provided for the following actors:
    • Platforms, or operating systems providers
    • App developers
    • Advertising networks and other third parties
    • App developer trade associations, along with academics, usability experts and privacy researchers

Mirani, Leo. “How to use mobile phone data for good without invading anyone’s privacy.” Quartz, 2015. Available from: http://qz.com/398257/how-to-use-mobile-phone-data-for-good-without-invading-anyones-privacy/.

  • This article considers the privacy implications of using call detail records for social good, and ways to mitigate risks of privacy intrusion.
  • Taking the example of the Orange D4D challenge and the anonymization strategy employed there, the article describes how classic ‘anonymization’ is often not enough, and lists further measures that can be taken to ensure adequate privacy protection.
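The point that classic anonymization often falls short can be illustrated with a toy ‘unicity’ check in the spirit of this line of research: even with names stripped, a couple of spatiotemporal points can single a user out. The traces below are entirely invented:

```python
from itertools import combinations

# Toy "anonymized" traces: user -> set of (cell_tower, hour) points.
# The data is invented purely for illustration.
traces = {
    "u1": {("A", 9), ("B", 12), ("C", 18)},
    "u2": {("A", 9), ("B", 12), ("D", 20)},
    "u3": {("E", 8), ("B", 12), ("C", 18)},
}

def is_unique(user, points, traces):
    """True if the given points appear together in no other user's trace."""
    return all(not points <= trace for u, trace in traces.items() if u != user)

# Even two observed points can single out a user despite the absence of names.
for user, trace in traces.items():
    identifiable = any(
        is_unique(user, set(pair), traces)
        for pair in combinations(trace, 2)
    )
    print(user, identifiable)
```

In this toy dataset every user is identifiable from some pair of points, which is why the readings above argue for protections beyond simply removing identifiers.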

Bernholz, Lucy. “Several Examples of Digital Ethics and Proposed Practices.” Stanford Ethics of Data conference, 2014. Available from: http://www.scribd.com/doc/237527226/Several-Examples-of-Digital-Ethics-and-Proposed-Practices.

  • This reading list, prepared for Stanford’s Ethics of Data conference, collects some of the leading available literature on ethical data use.

Abrams, Martin. “A Unified Ethical Frame for Big Data Analysis.” The Information Accountability Foundation, 2014. Available from: http://www.privacyconference2014.org/media/17388/Plenary5-Martin-Abrams-Ethics-Fundamental-Rights-and-BigData.pdf.

  • Going beyond privacy, this paper discusses the following elements as central to developing a broad framework for data analysis:
    • Beneficial
    • Progressive
    • Sustainable
    • Respectful
    • Fair

Lane, Julia; Stodden, Victoria; Bender, Stefan; and Nissenbaum, Helen. “Privacy, Big Data and the Public Good.” Cambridge University Press, 2014. Available from: http://www.dataprivacybook.org.

  • This book addresses the privacy issues surrounding the use of big data for promoting the public good.
  • The questions being asked include the following:
    • What are the ethical and legal requirements for scientists and government officials seeking to serve the public good without harming individual citizens?
    • What are the rules of engagement?
    • What are the best ways to provide access while protecting confidentiality?
    • Are there reasonable mechanisms to compensate citizens for privacy loss?

Richards, Neil M.; and King, Jonathan H. “Big Data Ethics.” Wake Forest Law Review, 2014. Available from: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2384174.

  • This paper describes the growing impact of big data analytics on society, and argues that because of this impact, a set of ethical principles to guide data use is called for.
  • The four proposed themes are: privacy, confidentiality, transparency and identity.
  • Finally, the paper discusses how big data can be integrated into society, going into multiple facets of this integration, including the law, roles of institutions and ethical principles.

OECD, “OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data”. Available from: http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm.

  • A globally used set of principles informing how personal data is handled, the OECD privacy guidelines serve as one of the leading standards for privacy policies and data governance structures.
  • The basic principles of national application are the following:
    • Collection Limitation Principle
    • Data Quality Principle
    • Purpose Specification Principle
    • Use Limitation Principle
    • Security Safeguards Principle
    • Openness Principle
    • Individual Participation Principle
    • Accountability Principle

The White House Big Data and Privacy Working Group, “Big Data: Seizing Opportunities, Preserving Values.” White House, 2014. Available from: https://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf.

  • Documenting the findings of the White House big data and privacy working group, this report lists, among others, the following key recommendations regarding data governance:
    • Bringing greater transparency to the data services industry
    • Stimulating international conversation on big data, with multiple stakeholders
    • With regard to educational data: ensuring data is used for the purpose it is collected for
    • Paying attention to the potential for big data to facilitate discrimination, and expanding technical understanding to stop discrimination

Hoffman, William. “Pathways for Progress.” World Economic Forum, 2015. Available from: http://www3.weforum.org/docs/WEFUSA_DataDrivenDevelopment_Report2015.pdf.

  • This paper identifies, among other things, the lack of well-defined and balanced governance mechanisms as one of the key obstacles preventing corporate-sector data in particular from being shared in a controlled space.
  • An approach that balances the benefits against the risks of large-scale data usage in a development context, building trust among all stakeholders in the data ecosystem, is viewed as key.
  • Furthermore, this whitepaper notes that new governance models are required not only because of the growing volume of data, analytical capacity, and more refined methods of analysis; the current “super-structure” of information flows between institutions is also seen as a key reason to develop alternatives to the current, outdated approaches to data governance.

India asks its citizens: please digitise our files


Joshua Chambers in FutureGov: “India has asked its citizens to help digitise records so that it can move away from paper processes.

Using its crowdsourcing web site MyGov, the government wrote that “we cannot talk of Digital India and transforming India into a knowledge society if most of the transactions continue to be physical.”

It is “essential” that paper records are converted into machine readable digital versions, the government added, but “the cost of such digitisation is very large and existing budgetary constraints of government and many other organisations do not allow such lavish digitisation effort.”

Consequently, the government is asking citizens for advice on how to build a cheap content management system and tools that will allow it to crowdsource records transcription. Citizens would be rewarded for every word they transcribe through a points system, with points redeemable for cash prizes.

“The proposed platform will create earning and income generation opportunities for our literate rural and urban citizens, develop digital literacy and IT skills and include them in the making of Digital India,” the government added.

The announcement also noted the importance of privacy, suggesting that documents be split so that no portion gives any clue regarding the overall content of the document.

In addition, two people will be given the same words to transcribe, and the software will compare their entries to ensure accuracy. Only successful transcriptions will be rewarded with points….(More)”
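The double-entry scheme described above can be sketched in a few lines. The matching rule and the points-per-word rate here are assumptions for illustration, not details from the MyGov announcement:

```python
# Sketch of double-entry verification: the same chunk of words goes to two
# transcribers, and points are awarded only if their entries fully agree.
# The points-per-word rate is an invented parameter.
def award_points(entry_a, entry_b, points_per_word=1):
    """Compare two independent transcriptions; reward only a full match."""
    if len(entry_a) != len(entry_b):
        return 0
    if all(a.strip() == b.strip() for a, b in zip(entry_a, entry_b)):
        return points_per_word * len(entry_a)
    return 0  # mismatch: no reward; the chunk would be reassigned

print(award_points(["Digital", "India"], ["Digital", "India"]))  # → 2
print(award_points(["Digital", "India"], ["Digital", "lndia"]))  # → 0
```

Requiring two independent matching entries turns individual typos into disagreements that can be caught and reassigned rather than silently entering the digitized record.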

Nepal Aid Workers Helped by Drones, Crowdsourcing


Shirley Wang et al in the Wall Street Journal: “….It is too early to gauge the exact impact of the technology in Nepal relief efforts, which have just begun amid chaos on the ground. Aid organizations have reported that hospitals are overstretched, a shortage of capacity at Katmandu’s airport is crippling aid distribution, and damaged roads and the mountainous country’s difficult terrain make reaching villages difficult.

Still, technology is playing an increasing role in the global response to humanitarian crises. Within hours of Saturday’s 7.8-magnitude temblor, U.S. giants such as Google Inc. and Facebook Inc. were offering their networks for use in verifying survivors and helping worried friends and relatives locate their loved ones.

Advances in online mapping—long used to calculate distances and plot driving routes—and the capabilities of camera-equipped drones are playing an increasingly important role in coordinating emergency responses at ground zero of any disaster.

A community of nonprofit groups uses satellite images, private images and open-source mapping technology to remap areas affected by the earthquake. They mark damaged buildings and roads so rescuers can identify the worst-hit areas and assess how accessible different areas are. The technology complements more traditional intelligence from aircraft.

Such crowdsourced real-time mapping technologies were first used in the 2010 Haiti earthquake, according to Chris Grundy, a professor in Geographical Information Systems at the London School of Hygiene and Tropical Medicine. The technology “has been advancing a little bit every time [every situation where it is used] as we start to see what works,” said Prof. Grundy.

The American Red Cross supplied its relief team on the Wednesday night flight to Nepal from Washington, D.C. with 50 digital maps and an inch-thick pile of paper maps that help identify where the needs are. The charity has a mapping project with the British Red Cross, Doctors Without Borders and the Humanitarian OpenStreetMap Team, a crowdsourced data-sharing group.


Mapping efforts have grown substantially since Haiti, according to Dale Kunce, head of the geographic information systems team at the American Red Cross. In the two months after the Haiti temblor, 600 mapping contributors made 1.5 million edits, while in the first 48 hours after the Nepal earthquake, 2,000 mappers had already made three million edits, Mr. Kunce said.

Some 3,400 volunteers from around the world are now inspecting images of Nepal online to identify road networks and conditions, to assess the extent of damage and pinpoint open spaces where displaced persons tend to congregate, according to Nama Budhathoki, executive director of a nonprofit technology company called Katmandu Living Labs.

His group is operating from a cramped but largely undamaged meeting room in a central-Katmandu office building to help coordinate the global effort of various mapping organizations with the needs of agencies like Doctors Without Borders and the international Red Cross community.

In recent days the Nepal Red Cross and Nepalese army have requested and been supplied with updated maps of severely damaged districts, said Dr. Budhathoki….(More)”

How Google and Facebook are finding victims of the Nepal earthquake


Caitlin Dewey in the Washington Post: “As the death toll from Saturday’s 7.8-magnitude Nepalese earthquake inches higher, help in finding and identifying missing persons has come from an unusual source: Silicon Valley tech giants.

Both Google and Facebook deployed collaborative, cellphone-based tools over the weekend to help track victims of the earthquake. In the midst of both companies’ big push to bring Internet access to the developing world, it’s an important illustration of exactly how powerful that connectivity could be. And yet, in a country like Nepal — where there are only 77 cellphone subscriptions per 100 people, versus 96 in the U.S. and 125 in the U.K. — it’s also a reminder of how very far that effort still has to go.

Facebook Safety Check

Facebook’s Safety Check essentially lets users do two things, depending on where they are. Users in an area impacted by a natural disaster can log onto the site and mark themselves as “safe.” Meanwhile, users around the world can log into the site and check if any of their friends are in the impacted area. The tool was built by Japanese engineers in response to the 2011 earthquake and tsunami that devastated coastal Japan.

Facebook hasn’t publicized how many people have used the tool, though the network has only 4.4 million users in the country, based on estimates by its ad platform. Notably, you must also have a smartphone running the Facebook app to use this feature — and smartphone penetration in Nepal is quite low.

Google Person Finder

Like Safety Check, Google Person Finder is intended to connect people in a disaster area with friends and family around the world. Google’s five-year-old project also operates on a larger scale, however: it provides a massive, open platform for collaboratively tracking missing-person reports. Google has previously deployed the tool to help victims in the wake of Typhoon Haiyan and the Boston bombing.

 

New surveys reveal dynamism, challenges of open data-driven businesses in developing countries


Alla Morrison at World Bank Open Data blog: “Was there a class of entrepreneurs emerging to take advantage of the economic possibilities offered by open data, were investors keen to back such companies, were governments tuned to and responsive to the demands of such companies, and what were some of the key financing challenges and opportunities in emerging markets? As we began our work on the concept of an Open Fund, we partnered with Ennovent (India), MDIF (East Asia and Latin America) and Digital Data Divide (Africa) to conduct short market surveys to answer these questions, with a focus on trying to understand whether a financing gap truly existed in these markets. The studies were fairly quick (4-6 weeks) and reached only a small number of companies (193 in India, 70 in Latin America, 63 in South East Asia, and 41 in Africa – and not everybody responded) but the findings were fairly consistent.

  • Open data is still a very nascent concept in emerging markets, and there’s only a small class of entrepreneurs/investors aware of the economic possibilities; there’s a lot of work to do in the ‘enabling environment’
    • In many regions the distinction between open data, big data, and private sector generated/scraped/collected data was blurry at best among entrepreneurs and investors (some of our findings are consequently better indicators of data-driven rather than open data-driven businesses)
  • There’s a small but growing number of open data-driven companies in all the markets we surveyed and these companies target a wide range of consumers/users and are active in multiple sectors
    • A large percentage of identified companies operate in sectors with high social impact – health and wellness, environment, agriculture, transport. For instance, in India, after excluding business analytics companies, a third of data companies seeking financing are in healthcare and a fifth in food and agriculture, and some of them have the low-income population or the rural segment of India as an intended beneficiary segment. In Latin America, the number of companies in business services, research and analytics was closely followed by health, environment and agriculture. In Southeast Asia, business, consumer services, and transport came out in the lead.
    • We found the highest number of companies in Latin America and Asia with the following countries leading the way – Mexico, Chile, and Brazil, with Colombia and Argentina closely behind in Latin America; and India, Indonesia, Philippines, and Malaysia in Asia
  • An actionable pipeline of data-driven companies exists in Latin America and in Asia
    • We heard demand for different kinds of financing (equity, debt, working capital) but the majority of the need was for equity and quasi-equity in amounts ranging from $100,000 to $5 million USD, with averages of between $2 and $3 million USD depending on the region.
  • There’s a significant financing gap in all the markets
    • The investment sizes required, while they range up to several million dollars, are generally small. Analysis of more than 300 data companies in Latin America and Asia indicates a total estimated need for financing of more than $400 million
  • Venture capital firms generally don’t recognize data as a separate sector and group data-driven companies with their standard information and communication technology (ICT) investments
    • Interviews with founders suggest that moving beyond seed stage is particularly difficult for data-driven startups. While many companies are able to cobble together an initial seed round augmented by bootstrapping to get their idea off the ground, they face a great deal of difficulty when trying to raise a second, larger seed round or Series A investment.
    • From the perspective of startups, investors favor banal e-commerce (e.g., according to Tech in Asia, out of the $645 million in technology investments made public across the region in 2013, 92% were related to fashion and online retail) or consumer service startups and ignore open data-focused startups even if they have a strong business model and solid key performance indicators. The space is ripe for a long-term investor with a generous risk appetite and multiple bottom line goals.
  • Poor data quality was the number one issue these companies reported.
    • Companies reported significant waste and inefficiency in accessing/scraping/cleaning data.

The analysis below borrows heavily from the work done by the partners. We should of course mention that the findings are provisional and should not be considered authoritative (please see the section on methodology for more details)….(More).”

Big Data for Social Good


Introduction to a Special Issue of the Journal “Big Data” by Charlie Catlett and Rayid Ghani: “…organizations focused on social good are realizing the potential as well but face several challenges as they seek to become more data-driven. The biggest challenge they face is a paucity of examples and case studies on how data can be used for social good. This special issue of Big Data is targeted at tackling that challenge and focuses on highlighting some exciting and impactful examples of work that uses data for social good. The special issue is just one example of the recent surge in such efforts by the data science community. …

This special issue solicited case studies and problem statements that would either highlight (1) the use of data to solve a social problem or (2) social challenges that need data-driven solutions. From roughly 20 submissions, we selected 5 articles that exemplify this type of work. These cover five broad application areas: international development, healthcare, democracy and government, human rights, and crime prevention.

“Understanding Democracy and Development Traps Using a Data-Driven Approach” (Ranganathan et al.) details a data-driven model linking democracy, cultural values, and socioeconomic indicators, identifying two types of “traps” that hinder the development of democracy. The authors use historical data to detect causal factors and make predictions about the time expected for a given country to overcome these traps.

“Targeting Villages for Rural Development Using Satellite Image Analysis” (Varshney et al.) discusses two case studies that use data and machine learning techniques for international economic development—solar-powered microgrids in rural India and targeting financial aid to villages in sub-Saharan Africa. In the process, the authors stress the importance of understanding the characteristics and provenance of the data and the criticality of incorporating local “on the ground” expertise.

In “Human Rights Event Detection from Heterogeneous Social Media Graphs,” Chen and Neil describe efficient and scalable techniques to use social media in order to detect emerging patterns in human rights events. They test their approach on recent events in Mexico and show that they can accurately detect relevant human rights–related tweets prior to international news sources, and in some cases, prior to local news reports, which could potentially lead to more timely, targeted, and effective advocacy by relevant human rights groups.

“Finding Patterns with a Rotten Core: Data Mining for Crime Series with Core Sets” (Wang et al.) describes a case study with the Cambridge Police Department, using a subspace clustering method to analyze the department’s full housebreak database, which contains detailed information from thousands of crimes from over a decade. They find that the method allows human crime analysts to handle vast amounts of data and provides new insights into true patterns of crime committed in Cambridge…..(More)

UNESCO demonstrates global impact through new transparency portal


“Opendata.UNESCO.org is intended to present comprehensive, high-quality, and timely information about UNESCO’s projects, enabling users to find information by country/region, funding source, and sector, and providing comprehensive project data, including budget, expenditure, completion status, implementing organization, project documents, and more. It publishes program and financial information in line with the IATI (International Aid Transparency Initiative) standard and other relevant transparency initiatives, drawing on UN system experience. UNESCO is now one of more than 230 organizations that have published to the IATI Registry, which brings together donor and developing countries, civil society organizations, and other experts in aid information who are committed to working together to increase the transparency of aid.

Since its creation 70 years ago, UNESCO has tirelessly championed the causes of education, culture, natural sciences, social and human sciences, and communication and information, globally. For instance, the programme for the Enhancement of Literacy in Afghanistan (ELA), launched in March 2010 with a $19.5 million contribution from Japan, aimed to improve the literacy, numeracy, and vocational skills of the adult population in 70 districts across 15 provinces of Afghanistan. Over the next three years, until April 2013, the ELA programme helped some 360,000 adult learners achieve general literacy competency. An interactive map allows for easy identification of UNESCO’s high-impact programs and provides up-to-date information on current and future aid allocations within and across countries.

Public participation and interactivity are key to the success of any open data project. http://Opendata.UNESCO.org will evolve as Member States and partners get involved by displaying data on their own websites, sharing data among different networks, building and sharing applications, and providing feedback, comments, and recommendations. …(More)”

2015 Edelman Trust Barometer


The 2015 Edelman Trust Barometer shows a global decline in trust over the last year, and the number of countries with trusted institutions has fallen to an all-time low among the informed public.

Among the general population, the trust deficit is even more pronounced, with nearly two-thirds of countries falling into the “distruster” category.
In the last year, trust has declined for three of the four institutions measured. NGOs continue to be the most trusted institution, but trust in NGOs declined from 66 to 63 percent. Sixty percent of countries now distrust media. Trust in government increased slightly, driven by big gains in India, Russia and Indonesia but government is still distrusted in 19 of the 27 markets surveyed. And trust in business is below 50 percent in half of those markets.

Why Is Democracy Performing So Poorly?


Essay by Francis Fukuyama in the Journal of Democracy: “The Journal of Democracy published its inaugural issue a bit past the midpoint of what Samuel P. Huntington labeled the “third wave” of democratization, right after the fall of the Berlin Wall and just before the breakup of the former Soviet Union. The transitions in Southern Europe and most of those in Latin America had already happened, and Eastern Europe was moving at dizzying speed away from communism, while the democratic transitions in sub-Saharan Africa and the former USSR were just getting underway. Overall, there has been remarkable worldwide progress in democratization over a period of almost 45 years, raising the number of electoral democracies from about 35 in 1970 to well over 110 in 2014.
But as Larry Diamond has pointed out, there has been a democratic recession since 2006, with a decline in aggregate Freedom House scores every year since then. The year 2014 has not been good for democracy, with two big authoritarian powers, Russia and China, on the move at either end of Eurasia. The “Arab Spring” of 2011, which raised expectations that the Arab exception to the third wave might end, has degenerated into renewed dictatorship in the case of Egypt, and into anarchy in Libya, Yemen, and also Syria, which along with Iraq has seen the emergence of a new radical Islamist movement, the Islamic State in Iraq and Syria (ISIS).
It is hard to know whether we are experiencing a momentary setback in a general movement toward greater democracy around the world, similar to a stock-market correction, or whether the events of this year signal a broader shift in world politics and the rise of serious alternatives to democracy. In either case, it is hard not to feel that the performance of democracies around the world has been deficient in recent years. This begins with the most developed and successful democracies, those of the United States and the European Union, which experienced massive economic crises in the late 2000s and seem to be mired in a period of slow growth and stagnating incomes. But a number of newer democracies, from Brazil to Turkey to India, have also been disappointing in their performance in many respects, and subject to their own protest movements.
Spontaneous democratic movements against authoritarian regimes continue to arise out of civil society, from Ukraine and Georgia to Tunisia and Egypt to Hong Kong. But few of these movements have been successful in leading to the establishment of stable, well-functioning democracies. It is worth asking why the performance of democracy around the world has been so disappointing.
In my view, a single important factor lies at the core of many democratic setbacks over the past generation. It has to do with a failure of institutionalization—the fact that state capacity in many new and existing democracies has not kept pace with popular demands for democratic accountability. It is much harder to move from a patrimonial or neopatrimonial state to a modern, impersonal one than it is to move from an authoritarian regime to one that holds regular, free, and fair elections. It is the failure to establish modern, well-governed states that has been the Achilles heel of recent democratic transitions… (More)”