World Justice Project (WJP) Open Government Index


“The World Justice Project (WJP) Open Government Index™ provides scores and rankings on four dimensions of government openness: publicized laws and government data, right to information, civic participation, and complaint mechanisms (full descriptions below). Scores are based on responses to household surveys and in-country expert questionnaires collected for the WJP Rule of Law Index. The WJP Open Government Index 2015 covers a total of 102 countries and jurisdictions.

This index is the product of two years of development, consultation, and vetting with policy makers, civil society groups, and academics from several countries. It is our hope that over time this diagnostic tool will help identify strengths and weaknesses in countries under review and encourage policy choices that enhance openness, promote effective public oversight, and increase collaboration amongst public and private sectors.”

Mission Control: A History of the Urban Dashboard


Futuristic control rooms have proliferated in dozens of global cities. Baltimore has its CitiStat Room, where department heads stand at a podium before a wall of screens and account for their units’ performance. The Mayor’s office in London’s City Hall features a 4×3 array of iPads mounted in a wooden panel, which seems an almost parodic, Terry Gilliam-esque take on Rio de Janeiro’s Operations Center. Meanwhile, British Prime Minister David Cameron commissioned an iPad app – the “No. 10 Dashboard” (a reference to his residence at 10 Downing Street) – which gives him access to financial, housing, employment, and public opinion data. As The Guardian reported, “the prime minister said that he could run government remotely from his smartphone.”

This is the age of Dashboard Governance, heralded by gurus like Stephen Few, founder of the “visual business intelligence” and “sensemaking” consultancy Perceptual Edge, who defines the dashboard as a “visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.” A well-designed dashboard, he says — one that makes proper use of bullet graphs, sparklines, and other visualization techniques informed by the “brain science” of aesthetics and cognition — can afford its users not only a perceptual edge, but a performance edge, too. The ideal display offers a big-picture view of what is happening in real time, along with information on historical trends, so that users can divine the how and why and redirect future action. As David Nettleton emphasizes, the dashboard’s utility extends beyond monitoring “the current situation”; it also “allows a manager to … make provisions, and take appropriate actions.”….

The dashboard market now extends far beyond the corporate world. In 1994, New York City police commissioner William Bratton adapted former officer Jack Maple’s analog crime maps to create the CompStat model of aggregating and mapping crime statistics. Around the same time, the administrators of Charlotte, North Carolina, borrowed a business idea — Robert Kaplan’s and David Norton’s “total quality management” strategy known as the “Balanced Scorecard” — and began tracking performance in five “focus areas” defined by the City Council: housing and neighborhood development, community safety, transportation, economic development, and the environment. Atlanta followed Charlotte’s example in creating its own city dashboard.

In 1999, Baltimore mayor Martin O’Malley, confronting a crippling crime rate and high taxes, designed CitiStat, “an internal process of using metrics to create accountability within his government.” (This rhetoric of data-tested internal “accountability” is prevalent in early dashboard development efforts.) The project turned to face the public in 2003, when Baltimore launched a website of city operational statistics, which inspired DCStat (2005), Maryland’s StateStat (2007), and NYCStat (2008). Since then, myriad other states and metro areas — driven by a “new managerialist” approach to urban governance, committed to “benchmarking” their performance against other regions, and obligated to demonstrate compliance with sustainability agendas — have developed their own dashboards.

The Open Michigan Mi Dashboard is typical of these efforts. The state website presents data on education, health and wellness, infrastructure, “talent” (employment, innovation), public safety, energy and environment, financial health, and seniors. You (or “Mi”) can monitor the state’s performance through a side-by-side comparison of “prior” and “current” data, punctuated with a thumbs-up or thumbs-down icon indicating the state’s “progress” on each metric. Another click reveals a graph of annual trends and a citation for the data source, but little detail about how the data are actually derived. How the public is supposed to use this information is an open question….(More)”
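
The Mi Dashboard page offers little detail on the logic behind those thumbs-up/thumbs-down icons, but the prior/current comparison is presumably a simple directional check. A minimal Python sketch of what such logic might look like; the metrics, values, and “higher is better” flags below are invented for illustration, not taken from the Michigan site:

```python
# Minimal sketch of the "prior vs. current" progress logic a dashboard
# like Mi Dashboard appears to apply. Metric names, values, and the
# higher_is_better flags are illustrative assumptions, not state data.

METRICS = [
    # (name, prior, current, higher_is_better)
    ("High school graduation rate (%)", 74.3, 77.0, True),
    ("Unemployment rate (%)", 8.4, 7.2, False),
    ("Infant mortality (per 1,000 births)", 7.1, 7.4, False),
]

def progress_icon(prior, current, higher_is_better):
    """Return an up/down indicator based on the direction of change."""
    if current == prior:
        return "no change"
    improved = (current > prior) if higher_is_better else (current < prior)
    return "thumbs up" if improved else "thumbs down"

for name, prior, current, better in METRICS:
    print(f"{name:38s} prior={prior:6.1f} current={current:6.1f} "
          f"-> {progress_icon(prior, current, better)}")
```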

Big Data for Social Good


Introduction to a Special Issue of the Journal “Big Data” by Charlie Catlett and Rayid Ghani: “…organizations focused on social good are realizing the potential as well but face several challenges as they seek to become more data-driven. The biggest challenge they face is a paucity of examples and case studies on how data can be used for social good. This special issue of Big Data is targeted at tackling that challenge and focuses on highlighting some exciting and impactful examples of work that uses data for social good. The special issue is just one example of the recent surge in such efforts by the data science community. …

This special issue solicited case studies and problem statements that would either highlight (1) the use of data to solve a social problem or (2) social challenges that need data-driven solutions. From roughly 20 submissions, we selected 5 articles that exemplify this type of work. These cover five broad application areas: international development, healthcare, democracy and government, human rights, and crime prevention.

“Understanding Democracy and Development Traps Using a Data-Driven Approach” (Ranganathan et al.) details a data-driven model of the relationships between democracy, cultural values, and socioeconomic indicators, identifying two types of “traps” that hinder the development of democracy. The authors use historical data to detect causal factors and make predictions about the time expected for a given country to overcome these traps.

“Targeting Villages for Rural Development Using Satellite Image Analysis” (Varshney et al.) discusses two case studies that use data and machine learning techniques for international economic development—solar-powered microgrids in rural India and targeting financial aid to villages in sub-Saharan Africa. In the process, the authors stress the importance of understanding the characteristics and provenance of the data and the criticality of incorporating local “on the ground” expertise.

In “Human Rights Event Detection from Heterogeneous Social Media Graphs,” Chen and Neil describe efficient and scalable techniques to use social media in order to detect emerging patterns in human rights events. They test their approach on recent events in Mexico and show that they can accurately detect relevant human rights–related tweets prior to international news sources, and in some cases, prior to local news reports, which could potentially lead to more timely, targeted, and effective advocacy by relevant human rights groups.

“Finding Patterns with a Rotten Core: Data Mining for Crime Series with Core Sets” (Wang et al.) describes a case study with the Cambridge Police Department, using a subspace clustering method to analyze the department’s full housebreak database, which contains detailed information from thousands of crimes from over a decade. They find that the method allows human crime analysts to handle vast amounts of data and provides new insights into true patterns of crime committed in Cambridge….(More)”
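
The paper’s method is richer than a digest entry can convey, but the core idea, a crime series as a set of incidents that agree on a “core” subset of modus operandi attributes, can be sketched in a few lines. The attributes, records, and greedy grouping below are invented illustrations, not the authors’ algorithm:

```python
# Illustrative sketch (not the paper's algorithm) of the "core set" idea:
# a crime series is a group of incidents that agree on a core subset of
# modus operandi attributes. Attributes and records are invented examples.

CRIMES = {
    1: {"entry": "window", "time": "day",   "premise": "apartment", "tool": "pry bar"},
    2: {"entry": "window", "time": "day",   "premise": "apartment", "tool": "unknown"},
    3: {"entry": "window", "time": "night", "premise": "apartment", "tool": "pry bar"},
    4: {"entry": "door",   "time": "night", "premise": "house",     "tool": "kick"},
}

def shared_core(ids):
    """Attributes on which every crime in `ids` agrees."""
    first = CRIMES[ids[0]]
    return {a: v for a, v in first.items()
            if all(CRIMES[i].get(a) == v for i in ids)}

def grow_series(seed, min_core=2):
    """Greedily add crimes while the shared core stays large enough."""
    series = [seed]
    for cid in CRIMES:
        if cid not in series and len(shared_core(series + [cid])) >= min_core:
            series.append(cid)
    return series, shared_core(series)

series, core = grow_series(seed=1)
print("series:", series)   # [1, 2, 3]
print("core:  ", core)     # {'entry': 'window', 'premise': 'apartment'}
```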

An In-Depth Analysis of Open Data Portals as an Emerging Public E-Service


Paper by Martin Lnenicka: “Governments collect and produce large amounts of data. Increasingly, governments worldwide have started to implement open data initiatives and to launch open data portals that release these data in open and reusable formats. As a result, a large number of open data repositories, catalogues and portals have been emerging around the world. The greater availability of interoperable and linkable open government data catalyzes secondary use of such data, so they can be used to build useful applications that leverage their value, allow insight, provide access to government services, and support transparency. The efficient development of successful open data portals makes it necessary to evaluate them systematically, in order to understand them better, assess the various types of value they generate, and identify the required improvements for increasing this value. Thus, the attention of this paper is directed particularly to the field of open data portals. The main aim of this paper is to compare selected national-level open data portals using content analysis and to propose a new evaluation framework to further improve the quality of these portals. It also establishes a set of considerations for involving businesses and citizens to create e-services and applications that leverage the datasets available from these portals….(More)”

The Platform for Political Innovation


The Platform for Political Innovation “…is a project aiming to strengthen the role and efficiency of Civil Society in Greece, focusing on the imperative need for innovation in policy-making. The project is based on the collaboration of three Greek Civil Society Organizations, which combine and build on their experience and know-how in order to maximize the impact of activities for Political Innovation. From October 2014 to May 2015, experimental applications of innovative processes and tools were initiated in different Greek cities, focusing on the re-design of decision-making processes at the local and national levels. The proposed action plan constitutes phase B of the wider social project POLITEIA 2.0, which won the Audience Award in the 2012 European Investment Bank Social Innovation Tournament.

The activities of the Platform for Political Innovation focus on Research, Networking, Training, Digital Tools and Innovation Workshops Development in four Greek cities…including:

Syntagma 2.0: workshops for the participatory design of a new Constitution for Greece by its citizens.

Pedio_Agora: workshops for the participatory design of a public space. Focus area: Varvakeios Square, Athens….(More)”

Data scientists rejoice! There’s an online marketplace selling algorithms from academics


Silicon Republic: “Algorithmia, an online marketplace that connects computer science researchers’ algorithms with developers who may have uses for them, has exited its private beta.

Algorithms are essential to our online experience. Google uses them to determine which search results are the most relevant. Facebook uses them to decide what should appear in your news feed. Netflix uses them to make movie recommendations.

Founded in 2013, Algorithmia could be described as an app store for algorithms, with over 800 of them available in its library. These algorithms provide the means of completing various tasks in the fields of machine learning, audio and visual processing, and computer vision.

Algorithmia found a way to monetise algorithms by creating a platform where academics can share their creations and charge a royalty fee per use, while developers and data scientists can request specific algorithms in return for a monetary reward. One such suggestion is for ‘punctuation prediction’, which would insert correct punctuation and capitalisation in speech-to-text translation.

While it’s not the first algorithm marketplace online, Algorithmia will accept and sell any type of algorithm, hosting them all on its servers. What this means is that developers need only add a simple piece of code to their software in order to send a query to Algorithmia’s servers, so the algorithm itself doesn’t have to be integrated in its entirety….
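
The article does not reproduce that “simple piece of code”, but the integration it describes amounts to a single HTTP request. A hedged Python sketch follows; the endpoint path, algorithm name, and authorization header are assumptions modeled on Algorithmia’s publicly documented REST conventions at the time:

```python
# Hedged sketch of such a query. The endpoint, algorithm path, and
# "Authorization: Simple <key>" header follow Algorithmia's documented
# REST pattern, but treat every specific value here as an assumption.
import requests

API_KEY = "YOUR_API_KEY"        # issued by the marketplace
ALGO = "demo/Hello/0.1.1"       # hypothetical user/algorithm/version path

response = requests.post(
    f"https://api.algorithmia.com/v1/algo/{ALGO}",
    json="Algorithmia",         # input payload; its shape is algorithm-specific
    headers={"Authorization": f"Simple {API_KEY}"},
)
response.raise_for_status()
print(response.json())          # e.g. {"result": "Hello Algorithmia", ...}
```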

Computer science researchers can spend years developing algorithms, only for them to be published in a scientific journal and never read by software engineers.

Algorithmia intends to create a community space where academics and engineers can meet to discuss and refine these algorithms for practical use. A voting and commenting system on the site will allow users to engage and even share insights on how contributions can be improved.

To that end, Algorithmia’s ultimate goal is to advance the development of algorithms as well as their discovery and use….(More)”

Turning smartphones into personal, real-time pollution-location monitors


Kurzweil Newsletter: “Scientists reporting in the ACS journal Environmental Science & Technology have used smartphone and sensing technology to better pinpoint times and locations of the worst air pollution, which is associated with respiratory and cardiovascular problems.

Most such studies create a picture of exposure based on air pollution levels outside people’s homes. This approach ignores big differences in air quality in school and work environments. It also ignores spikes in pollution that happen over the course of the day, such as during rush hour.

To fill in these gaps, Mark J. Nieuwenhuijsen and colleagues in Spain, the Netherlands, and the U.S. equipped 54 school children from 29 different schools around Barcelona with smartphones that could track their location and physical activity. The children also received sensors that continuously measured the ambient levels of black carbon, a component of soot. Although most children spent less than 4 percent of their day traveling to and from school, this commute contributed 13 percent of their total potential black carbon exposure.
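
The arithmetic behind that finding, exposure as concentration weighted by time spent in each microenvironment, is easy to illustrate. The numbers below are invented and merely reproduce the qualitative pattern reported (a short but polluted commute dominating exposure); they are not the study’s data:

```python
# Invented illustration of time-weighted exposure by microenvironment.
# Only the arithmetic (exposure share = concentration x hours) reflects
# the study's approach; the segments and values are not its data.

# (microenvironment, hours per day, mean black carbon in ug/m3)
DAY = [
    ("home",    14.0, 1.2),
    ("school",   7.0, 1.5),
    ("commute",  0.8, 6.0),   # ~3% of the day, but heavily polluted
    ("other",    2.2, 1.4),
]

total_time = sum(hours for _, hours, _ in DAY)
total_exposure = sum(hours * conc for _, hours, conc in DAY)

for env, hours, conc in DAY:
    time_share = hours / total_time
    exposure_share = hours * conc / total_exposure
    print(f"{env:8s} {time_share:5.1%} of the day -> "
          f"{exposure_share:5.1%} of exposure")
```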

The study was associated with BREATHE, an epidemiological study of the relation between air pollution and brain development.

The researchers conclude that mobile technologies could contribute valuable new insights into air pollution exposure….

More: Mark J. Nieuwenhuijsen, David Donaire-Gonzalez, Ioar Rivas, Montserrat de Castro, Marta Cirach, Gerard Hoek, Edmund Seto, Michael Jerrett, Jordi Sunyer. Variability in and Agreement between Modeled and Personal Continuously Measured Black Carbon Levels Using Novel Smartphone and Sensor Technologies. Environmental Science & Technology, 2015. DOI: 10.1021/es505362x

Study to examine Australian businesses’ use of government data


ComputerWorld: “New York University’s GovLab and the federal Department of Communications have embarked on a study of how Australian organisations are employing government data sets.

The ‘Open Data 500’ study was launched today at the Locate15 conference. It aims to provide a basis for assessing the value of open data and encourage the development of new businesses based on open data, as well as encourage discussion about how to make government data more useful to businesses and not-for-profit organisations.

The study is part of a series of studies taking place under the auspices of the OD500 Global Network.

“This study will help ensure the focus of Government is on the publication of high value datasets, with an emphasis on quality rather than quantity,” a statement issued by the Department of Communications said.

“Open Data 500 advances the government’s policy of increasing the number of high value public datasets in Australia in an effort to drive productivity and innovation, as well as its commitment to greater consultation with private sector stakeholders on open data,” Communications Minister Malcolm Turnbull said in remarks prepared for the Locate15 conference….(More)”

Wittgenstein, #TheDress and Google’s search for a bigger truth


Robert Shrimsley at the Financial Times: “As the world burnt with a BuzzFeed-prompted debate over whether a dress was black and blue or white and gold, the BBC published a short article posing the question everyone was surely asking: “What would Wittgenstein say about that dress?”

Wittgenstein died in 1951, so we cannot know if the philosopher of language, truth and context would have been a devotee of BuzzFeed. (I guess it depends on whether we are talking of the early or the late Ludwig. The early Wittgenstein, it is well known, was something of an enthusiast for LOLs, whereas the later was more into WTFs and OMGs.)

The dress will now join the pantheon of web phenomena such as “Diet Coke and Mentos” and “Charlie bit my finger”. But this trivial debate on perceived truth captured in miniature a wider issue for the web: how to distil fact from noise when opinion drowns out information and value is determined by popularity.

At about the same time as the dress was turning the air blue — or was it white? — the New Scientist published a report on how one web giant might tackle this problem, a development in which Wittgenstein might have been very interested. The magazine reported on a Google research paper about how the company might reorder its search rankings to promote sites that could be trusted to tell the truth. (Google produces many such papers a year so this is a long way short of official policy.) It posits a formula for finding and promoting sites with a record of reliability.

This raises an interesting question over how troubled we should be by the notion that a private company with its own commercial interests and a huge concentration of power could be the arbiter of truth. There is no current reason to see sinister motives in Google’s search for a better web: it is both honourable and good business. But one might ask how, for example, Google Truth might determine established truths on net neutrality….

The paper suggests using fidelity to proven facts as a proxy for trust. This is easiest with single facts, such as a date or place of birth. For example, it suggests claiming Barack Obama was born in Kenya would push a site down the rankings. This would be good for politics, but facts are not always neutral. Google would risk being depicted as part of “the mainstream media”. Fox Search here we come….(More)”
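
The research paper’s model is probabilistic, but the proxy it rests on, agreement with already-established facts, can be caricatured in a few lines of Python. A toy illustration only, not Google’s formula, reusing the article’s Obama example; the “knowledge base” and sources are invented:

```python
# Toy illustration of the idea (not Google's actual formula): score a
# source by the fraction of its extracted (subject, predicate, object)
# facts that agree with a reference knowledge base. All data invented.

KNOWLEDGE_BASE = {
    ("Barack Obama", "born_in"): "Hawaii",
    ("Ludwig Wittgenstein", "died"): "1951",
}

SOURCES = {
    "site-a.example": [
        ("Barack Obama", "born_in", "Hawaii"),
        ("Ludwig Wittgenstein", "died", "1951"),
    ],
    "site-b.example": [
        ("Barack Obama", "born_in", "Kenya"),   # contradicts the KB
        ("Ludwig Wittgenstein", "died", "1951"),
    ],
}

def trust_score(facts):
    """Fraction of checkable facts that match the knowledge base."""
    checkable = [(s, p, o) for s, p, o in facts if (s, p) in KNOWLEDGE_BASE]
    if not checkable:
        return None  # nothing to verify against
    correct = sum(KNOWLEDGE_BASE[(s, p)] == o for s, p, o in checkable)
    return correct / len(checkable)

for site, facts in SOURCES.items():
    print(site, trust_score(facts))   # site-a: 1.0, site-b: 0.5
```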

Models and Patterns of Trust


Paper presented by Bran Knowles et al. at the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing: “As in all collaborative work, trust is a vital ingredient of successful computer supported cooperative work, yet there is little in the way of design principles to help practitioners develop systems that foster trust. To address this gap, we present a set of design patterns, based on our experience designing systems with the explicit intention of increasing trust between stakeholders. We contextualize these patterns by describing our own learning process, from the development, testing and refinement of a trust model, to our realization that the insights we gained along the way were most usefully expressed through design patterns. In addition to a set of patterns for trust, this paper seeks to demonstrate the value of patterns as a means of communicating the nuances revealed through ethnographic investigation….(More)”