Buenos Aires, A Pocket of Civic Innovation in Argentina


Rebecca Chao in TechPresident: “…In only a few years, the government, civil society and media in Buenos Aires have actively embraced open data. The Buenos Aires city government has been publishing data under a creative commons license and encouraging civic innovation through hackathons. NGOs have launched a number of tech-driven tools and Argentina’s second largest newspaper, La Nación, has published several hard-hitting data journalism projects. The result is a fledgling but flourishing open data culture in Buenos Aires, in a country that has not yet adopted a freedom of information law.

A Wikipedia for Open Government Data

In late August of this year, the Buenos Aires government declared a creative commons license for all of its digital content, which allows it to be used for free, like Wikipedia content, with proper attribution. This applies to the city’s new open data catalog, which lets users visualize the data, examine apps built with it and even post app ideas in a design lab. Although the catalog launched only in March, the government has already published fairly substantial data sets, including the salaries of city officials. The website also embodies the principles of openness in its design; it is built with open-source software and its code is available for reuse via GitHub.
“We were the first city in Argentina doing open government,” Rudi Borrmann tells techPresident over Skype. Borrmann is the Director of Buenos Aires’ Open Government Initiative. Previously, he was the social media editor at the city’s New Media Office but he also worked for many years in digital media…
While the civil society and media sectors have forged ahead in using open data, Borrmann tells techPresident that up in the ivory tower, openness to open data has been lagging. “Only technical schools are starting to create areas focused on working on open data,” he says.
In an interview with NYU’s govlab, Borrmann explained the significance of academia in using and pushing for more open data. “They have the means, the resources, the methodology to analyze…because in government you don’t have that time to analyze,” he said.
Another issue with open data is getting other branches of the government to modernize. Borrmann says that a lot of the Open Government’s work is done behind the scenes. “In general, you have very poor IT infrastructure all over Latin America” that interferes with the gathering and publishing of data, he says. “So in some cases it’s not about publishing or not publishing,” but about “having robust infrastructure for the information.”
It seems that the behind the scenes work is bearing some fruit. Just last week, on Dec. 6, the team behind the Buenos Aires open data website launched an impressive, interactive timeline, based on a similar timelapse map developed by a 2013 Knight-Mozilla Fellow, Noah Veltman. Against faded black and white photos depicting the subway from different decades over the last century, colorful pops of the Subterráneo lines emerge alongside factoids that go all the way back to 1910.”

Tech challenge develops algorithms to predict atrocities


SciDevNet: “Mathematical models that use existing socio-political data to predict mass atrocities could soon inform governments and NGOs on how and where to take preventative action.
The models emerged from one strand of the Tech Challenge for Atrocity Prevention, a competition run by the US Agency for International Development (USAID) and the NGO Humanity United. The winners were announced last month (18 November) and will now work with the organisers to further develop and pilot their innovations.
The five winners, who came from different countries and won between US$1,000 and US$12,000, were among nearly 100 entrants who developed algorithms to predict when and where mass atrocities are likely to happen.
Around 1.5 billion people live in countries affected by conflict, sometimes including atrocities such as genocides, mass rape and ethnic cleansing, according to the World Bank’s World Development Report 2011. Many of these countries are in the developing world.
The competition organisers hope the new algorithms could help governments and human rights organisations identify at-risk regions, potentially allowing them to intervene before mass atrocities happen.
The competition started from the premise that certain social and political measurements are linked to increased likelihood of atrocities. Yet because such factors interact in complex ways, organisations working to prevent atrocities lack a reliable method of predicting when and where they might happen next.
The algorithms use sociopolitical indicators and data on past atrocities as their inputs. The data was drawn from archives such as the Global Database of Events, Language and Tone, a data set that encodes more than 200 million globally newsworthy events, recording cultural information such as the people involved, their location and any religious connections.”
Link to the winners of the Model Challenge
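The article does not detail the winning entries' methods, but the task they share has a recognizable supervised-learning shape: fit a classifier on historical sociopolitical indicators labeled with past atrocities, then rank regions by predicted risk. The sketch below illustrates that shape only; the synthetic features, the risk threshold and the choice of scikit-learn's LogisticRegression are assumptions for illustration, not any entrant's actual model.

```python
# Illustrative sketch only: synthetic GDELT-style event counts stand in
# for the real sociopolitical indicators described in the article.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Each row is one hypothetical country-month; the three columns are
# counts of protest events, violent events and instability reports.
X = rng.poisson(lam=[5.0, 2.0, 1.0], size=(500, 3)).astype(float)

# Synthetic ground truth: atrocity risk grows with violence/instability.
latent_risk = 0.4 * X[:, 1] + 0.6 * X[:, 2]
y = (latent_risk + rng.normal(0.0, 1.0, 500) > 3.0).astype(int)

model = LogisticRegression().fit(X, y)

# Rank two hypothetical regions by predicted risk so that preventative
# action can be prioritised.
regions = np.array([[3.0, 1.0, 0.0],   # calm
                    [4.0, 8.0, 5.0]])  # high violence and instability
probs = model.predict_proba(regions)[:, 1]
# The second region should receive the higher predicted risk.
```

Real entries would differ in features, data cleaning and model class; the point is that ranked risk scores, not binary verdicts, are what give governments and NGOs something to act on.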

The Brainstorm Begins: Initial Ideas for Evolving ICANN


“The ICANN Strategy Panel on Multistakeholder Innovation (MSI Panel) is working to curate a set of concrete proposals for ways that the Internet Corporation for Assigned Names & Numbers (ICANN) could prototype new institutional arrangements for the 21st century. The Panel aims to identify how ICANN can open itself to more global participation in its governance functions. Specifically, the MSI Panel is charged with:

  • Proposing new models for international engagement, consensus-driven policymaking and institutional structures to support such enhanced functions; and
  • Designing processes, tools and platforms that enable the global ICANN community to engage in these new forms of participatory decision-making.

To help answer this charter, the MSI Panel launched an “Idea Generation” or ideation platform, designed to brainstorm with the global public on how to evolve the way ICANN could operate given the innovations in governance happening across the world.

We’re now three weeks into this Idea Generation stage – taking place online at thegovlab.ideascale.com – and we wanted to share with you what the Panel and The GovLab have heard so far regarding what tools, technologies, platforms and techniques ICANN could learn from or adapt to help design an innovative approach to problem-solving within the Domain Name System going forward.

These initial ideas begin to paint a picture of what 21st century coordination of a shared global commons might involve. These brainstorms all point to certain core principles the Panel believes provide the groundwork for an institution to legitimately operate in the global public interest today. These principles include:

  • Openness – Ensuring open channels as well as very low or no barriers to meaningful participation.
  • Transparency – Providing public access to information and deliberation data.
  • Accessibility – Developing simple and legible organizational communications.
  • Inclusivity and Lack of Domination – Ensuring access to global participation and that no one player, entity or interest dominates processes or outcomes.
  • Accountability – Creating mechanisms for the global public to check institutional power.
  • Effectiveness – Improving decision-making through greater reliance on evidence and a focus on flexibility and agility.
  • Efficiency – Streamlining processes to better leverage time, resources and human capital.

With these core principles as the backdrop, the ideas we’ve heard so far roughly fall within the following categories…
See also thegovlab.ideascale.com

Google Global Impact Award Expands Zooniverse


Press Release: “A $1.8 million Google Global Impact Award will enable Zooniverse, a nonprofit collaboration led by the Adler Planetarium and the University of Oxford, to make setting up a citizen science project as easy as starting a blog and could lead to thousands of innovative new projects around the world, accelerating the pace of scientific research.
The award supports the further development of the Zooniverse, the world’s leading ‘citizen science’ platform, which has already given more than 900,000 online volunteers the chance to contribute to science by taking part in activities including discovering planets, classifying plankton and searching through old ships’ logs for observations of interest to climate scientists. As part of the Global Impact Award, the Adler will receive $400,000 to support the Zooniverse platform.
With the Google Global Impact Award, Zooniverse will be able to rebuild its platform so that research groups with no web development expertise can build and launch their own citizen science projects.
“We are entering a new era of citizen science – this effort will enable prolific development of science projects in which hundreds of thousands of additional volunteers will be able to work alongside professional scientists to conduct important research – the potential for discovery is limitless,” said Michelle B. Larson, Ph.D., Adler Planetarium president and CEO. “The Adler is honored to join its fellow Zooniverse partner, the University of Oxford, as a Google Global Impact Award recipient.”
The Zooniverse – the world’s leading citizen science platform – is a global collaboration across several institutions that design and build citizen science projects. The Adler is a founding partner of the Zooniverse, which has already engaged more than 900,000 online volunteers as active scientists by discovering planets, mapping the surface of Mars and detecting solar flares. Adler-directed citizen science projects include: Galaxy Zoo (astronomy), Solar Stormwatch (solar physics), Moon Zoo (planetary science), Planet Hunters (exoplanets) and The Milky Way Project (star formation). The Zooniverse (zooniverse.org) also includes projects in environmental, biological and medical sciences. Google’s investment in the Adler and its Zooniverse partner, the University of Oxford, will further the global reach, making thousands of new projects possible.”

Selected Readings on Crowdsourcing Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing data was originally published in 2013.

As institutions seek to improve decision-making through data and put public data to use to improve the lives of citizens, new tools and projects are allowing citizens to play a role in both the collection and utilization of data. Participatory sensing and other citizen data collection initiatives, notably in the realm of disaster response, are allowing citizens to crowdsource important data, often using smartphones, that would be either impossible or prohibitively time-consuming for institutions to collect themselves. Civic hacking, on the other hand, often performed at hackathon events, is a growing trend in which governments encourage citizens to transform data from government and other sources into useful tools that benefit the public good.

Annotated Selected Reading List (in alphabetical order)

Baraniuk, Chris. “Power Politechs.” New Scientist 218, no. 2923 (June 29, 2013): 36–39. http://bit.ly/167ul3J.

  • In this article, Baraniuk discusses civic hackers, “an army of volunteer coders who are challenging preconceptions about hacking and changing the way your government operates. In a time of plummeting budgets and efficiency drives, those in power have realised they needn’t always rely on slow-moving, expensive outsourcing and development to improve public services. Instead, they can consider running a hackathon, at which tech-savvy members of the public come together to create apps and other digital tools that promise to enhance the provision of healthcare, schools or policing.”
  • While recognizing that “civic hacking has established a pedigree that demonstrates its potential for positive impact,” Baraniuk argues that a “more rigorous debate over how this activity should evolve, or how authorities ought to engage in it” is needed.

Barnett, Brandon, Muki Hansteen Izora, and Jose Sia. “Civic Hackathon Challenges Design Principles: Making Data Relevant and Useful for Individuals and Communities.” Hack for Change, https://bit.ly/2Ge6z09.

  • In this paper, researchers from Intel Labs offer “guiding principles to support the efforts of local civic hackathon organizers and participants as they seek to design actionable challenges and build useful solutions that will positively benefit their communities.”
  • The authors’ proposed design principles are:
    • Focus on the specific needs and concerns of people or institutions in the local community. Solve their problems and challenges by combining different kinds of data.
    • Seek out data far and wide (local, municipal, state, institutional, non-profits, companies) that is relevant to the concern or problem you are trying to solve.
    • Keep it simple! This can’t be overstated. Focus [on] making data easily understood and useful to those who will use your application or service.
    • Enable users to collaborate and form new communities and alliances around data.

Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-Quality, Data?” Perspectives on Psychological Science 6, no. 1 (January 1, 2011): 3–5. http://bit.ly/H56lER.

  • This article examines the capability of Amazon’s Mechanical Turk to act as a source of data for researchers, in addition to its traditional role as a microtasking platform.
  • The authors examine the demographics of MTurkers and find that “(a) MTurk participants are slightly more demographically diverse than are standard Internet samples and are significantly more diverse than typical American college samples; (b) participation is affected by compensation rate and task length, but participants can still be recruited rapidly and inexpensively; (c) realistic compensation rates do not affect data quality; and (d) the data obtained are at least as reliable as those obtained via traditional methods.”
  • The paper concludes that, just as MTurk can be a strong tool for crowdsourcing tasks, data derived from MTurk can be high quality while also being inexpensive and obtained rapidly.

Goodchild, Michael F., and J. Alan Glennon. “Crowdsourcing Geographic Information for Disaster Response: a Research Frontier.” International Journal of Digital Earth 3, no. 3 (2010): 231–241. http://bit.ly/17MBFPs.

  • This article examines issues of data quality in the face of the new phenomenon of geographic information being generated by citizens, in order to examine whether this data can play a role in emergency management.
  • The authors argue that “[d]ata quality is a major concern, since volunteered information is asserted and carries none of the assurances that lead to trust in officially created data.”
  • Due to the fact that time is crucial during emergencies, the authors argue that, “the risks associated with volunteered information are often outweighed by the benefits of its use.”
  • The paper examines four wildfires in Santa Barbara in 2007-2009 to discuss current challenges with volunteered geographical data, and concludes that further research is required to answer how volunteer citizens can be used to provide effective assistance to emergency managers and responders.

Hudson-Smith, Andrew, Michael Batty, Andrew Crooks, and Richard Milton. “Mapping for the Masses: Accessing Web 2.0 Through Crowdsourcing.” Social Science Computer Review 27, no. 4 (November 1, 2009): 524–538. http://bit.ly/1c1eFQb.

  • This article describes the way in which “we are harnessing the power of web 2.0 technologies to create new approaches to collecting, mapping, and sharing geocoded data.”
  • The authors examine GMapCreator and MapTube, which allow users to do a range of map-related functions such as create new maps, archive existing maps, and share or produce bottom-up maps through crowdsourcing.
  • They conclude that “these tools are helping to define a neogeography that is essentially ‘mapping for the masses,’” while noting that there are many issues of quality, accuracy, copyright, and trust that will influence the impact of these tools on map-based communication.

Kanhere, Salil S. “Participatory Sensing: Crowdsourcing Data from Mobile Smartphones in Urban Spaces.” In Distributed Computing and Internet Technology, edited by Chittaranjan Hota and Pradip K. Srimani, 19–26. Lecture Notes in Computer Science 7753. Springer Berlin Heidelberg, 2013. https://bit.ly/2zX8Szj.

  • This paper provides a comprehensive overview of participatory sensing — a “new paradigm for monitoring the urban landscape” in which “ordinary citizens can collect multi-modal data streams from the surrounding environment using their mobile devices and share the same using existing communications infrastructure.”
  • In addition to examining a number of innovative applications of participatory sensing, Kanhere outlines the following key research challenges:
    • Dealing with incomplete samples
    • Inferring user context
    • Protecting user privacy
    • Evaluating data trustworthiness
    • Conserving energy
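Kanhere's paper surveys these challenges rather than prescribing solutions. As an illustration only, the sketch below shows one naive way a participatory-sensing backend might tackle the data-trustworthiness challenge: form a robust consensus from the citizen readings at one location and flag contributors whose values are outliers by a median-absolute-deviation test. The function name, signature and tolerance are hypothetical, not drawn from the paper.

```python
# Hedged sketch, not from Kanhere's paper: robust aggregation of
# untrusted smartphone readings for a single sensed location.
import statistics

def aggregate_readings(readings, tolerance=3.0):
    """Return (consensus, trusted, flagged) for one location's readings.

    readings: list of (user_id, value) pairs from citizen smartphones.
    A reading is flagged as untrustworthy when it deviates from the
    median by more than `tolerance` times the median absolute deviation.
    """
    values = [v for _, v in readings]
    med = statistics.median(values)
    # Median absolute deviation; fall back to 1.0 if all values agree.
    mad = statistics.median(abs(v - med) for v in values) or 1.0
    trusted, flagged = [], []
    for user, value in readings:
        (trusted if abs(value - med) <= tolerance * mad else flagged).append(user)
    # Consensus is the median of the readings that passed the test.
    consensus = statistics.median(v for u, v in readings if u in trusted)
    return consensus, trusted, flagged
```

For readings [("a", 55), ("b", 57), ("c", 56), ("d", 120)], the reading from "d" is flagged and the consensus is 56; a production system would also weight contributors by reputation over time.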

Government's Crowdsourcing Revolution


John M. Kamensky in Governing: “In a recent report for the IBM Center for the Business of Government, Daren Brabham says that an important distinction between crowdsourcing and other forms of online participation is that crowdsourcing “entails a mix of top-down, traditional, hierarchical process and a bottom-up, open process involving an online community.”
Crowdsourcing in the public sector can be done within government, among employees as a way to surface ideas — such as the New York City government’s “Simplicity” initiative — or it can be done by nonprofit groups in ways that influence government operations. For example, a transportation advocacy group in New York City has created a site where citizens can report “near miss” accidents, which are then mapped to determine patterns. The idea is that, while the city government already maps accidents that have happened, hazardous traffic zones can be detected and resolved faster by mapping near-misses without waiting for a large number of actual accidents.
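The article does not say how the advocacy group's map actually detects patterns. One minimal approach, sketched here purely as an assumption, is to bin crowdsourced near-miss coordinates into a coarse grid and surface the densest cells as candidate hazard zones; the function and cell size below are illustrative, not the group's method.

```python
# Hedged sketch: grid-binning crowdsourced near-miss reports to find
# candidate hazard zones before actual accidents accumulate.
from collections import Counter

def hazard_hotspots(reports, cell_size=0.01, top_n=3):
    """reports: list of (lat, lon) near-miss locations.

    Snaps each report to a grid cell of `cell_size` degrees and
    returns the top_n cells by report count, densest first.
    """
    cells = Counter(
        (round(lat / cell_size) * cell_size,
         round(lon / cell_size) * cell_size)
        for lat, lon in reports
    )
    return cells.most_common(top_n)
```

With three reports clustered around one intersection and a fourth elsewhere, the dense cell surfaces first; a real deployment would likely use proper spatial clustering (e.g. DBSCAN) rather than a fixed grid.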
Brabham offers a strategic view of crowdsourcing and when it is useful to address public problems. His report also identifies four specific approaches, describing which is most useful for a given category of problem:
Knowledge discovery and management: This approach is best for information-gathering and cataloging problems through an online community, such as the reporting of earth tremors or potholes to a central source. This approach could also be used to report conditions of parks or hiking trails or for cataloging public art projects, as has been done in several cities across the country.
Distributed human-intelligence tasking: This approach is most useful when human intelligence is more effective than computer analysis. It involves distributing “micro-tasks” that require human intelligence to solve, such as transcribing handwritten historical documents into electronic files. For example, when the handwritten 1940 census records were publicly released in 2012, the National Archives catalyzed the electronic tagging of more than 130 million records so they could be searchable online. More than 150,000 people volunteered.
Broadcast search: This approach is most useful when an agency is attempting to find creative solutions to problems. It involves broadcasting a problem-solving challenge widely on the Internet and offering an award for the best solution. NASA, for example, offered a prize for an algorithm to predict solar flares. The federal government sponsors Challenge.gov, a contest and award Web platform that various federal agencies can use to post their challenges. To date, hundreds of diverse challenges have been posted, with thousands of people proposing solutions.
Peer-vetted creative production: This approach is most useful when an agency is looking for innovative ideas that must meet a test of taste or market support. It involves an online community that both proposes possible solutions and is empowered to collectively choose among them. For example, the Utah Transit Authority sponsored the Next Stop Design project, allowing citizens to design and vote on an ideal bus-stop shelter. Nearly 3,200 people participated, submitting 260 high-quality architectural renderings, and there were more than 10,000 votes leading to a final selection….”

The Decay of American Political Institutions


Francis Fukuyama in the American Interest: “Many political institutions in the United States are decaying. This is not the same thing as the broader phenomenon of societal or civilization decline, which has become a highly politicized topic in the discourse about America. Political decay in this instance simply means that a specific political process—sometimes an individual government agency—has become dysfunctional. This is the result of intellectual rigidity and the growing power of entrenched political actors that prevent reform and rebalancing. This doesn’t mean that America is set on a permanent course of decline, or that its power relative to other countries will necessarily diminish. Institutional reform is, however, an extremely difficult thing to bring about, and there is no guarantee that it can be accomplished without a major disruption of the political order. So while decay is not the same as decline, neither are the two discussions unrelated.
There are many diagnoses of America’s current woes. In my view, there is no single “silver bullet” cause of institutional decay, or of the more expansive notion of decline. In general, however, the historical context of American political development is all too often given short shrift in much analysis. If we look more closely at American history as compared to that of other liberal democracies, we notice three key structural characteristics of American political culture that, however they developed and however effective they have been in the past, have become problematic in the present.
The first is that, relative to other liberal democracies, the judiciary and the legislature (including the roles played by the two major political parties) continue to play outsized roles in American government at the expense of Executive Branch bureaucracies. Americans’ traditional distrust of government thus leads to judicial solutions for administrative problems. Over time this has become a very expensive and inefficient way to manage administrative requirements.
The second is that the accretion of interest group and lobbying influences has distorted democratic processes and eroded the ability of the government to operate effectively. What biologists label kin selection and reciprocal altruism (the favoring of family and friends with whom one has exchanged favors) are the two natural modes of human sociability. It is to these types of relationships that people revert when modern, impersonal government breaks down.
The third is that under conditions of ideological polarization in a federal governance structure, the American system of checks and balances, originally designed to prevent the emergence of too strong an executive authority, has become a vetocracy. The decision system has become too porous—too democratic—for its own good, giving too many actors the means to stifle adjustments in public policy. We need stronger mechanisms to force collective decisions but, because of the judicialization of government and the outsized role of interest groups, we are unlikely to acquire such mechanisms short of a systemic crisis. In that sense these three structural characteristics have become intertwined….
In short, the problems of American government flow from a structural imbalance between the strength and competence of the state, on the one hand, and the institutions that were originally designed to constrain the state, on the other. There is too much law and too much “democracy”, in the form of legislative intervention, relative to American state capacity. Some history can make this assertion clearer….
In well-functioning governance systems, moreover, a great deal of deliberation occurs not just in legislatures but within bureaucracies. This is not a matter of bureaucrats simply talking to one another, but rather a complex series of consultations between government officials and businesses, outside implementers and service providers, civil society groups, the media and other sources of information about societal interests and opinions. The Congress wisely mandated consultation in the landmark 1946 Administrative Procedure Act, which requires regulatory agencies to publicly post proposed rule changes and to solicit comment about them. But these consultative procedures have become highly routinized and pro forma, with actual decisions being the outcome not of genuine deliberation, but of political confrontations between well organized interest groups….”

Power to the people: how open data is improving health service delivery


The Guardian: “…What’s really interesting is how this data can be utilised by citizens to enable them to make more informed choices and demand improved services in sectors such as health. A growing community of technologists and social activists is emerging across Africa, supported by a burgeoning network of technology innovation hubs. They’re beginning to explore the ways in which data can be utilised to improve health outcomes.
In Northern Uganda, the brutal Lord’s Resistance Army conflict displaced two million people, leaving the social infrastructure in tatters. In 2008, the government launched a Peace, Recovery and Development Plan, but progress has been limited. There are insufficient health centres to serve the population, a severe shortage of staff, drugs and equipment, and corruption is widespread.
Cipesa – an organisation that uses communication technologies to support poverty reduction and development – and Numec, a local media organisation, have launched the iParticipate project. A multimedia platform is being populated with baseline data outlining the current status of the health service across three districts….
In the same region, Wougnet is training women to use information technologies to tackle social challenges. Local officials and community members have formed voluntary social accountability committees and been trained in the use of an online platform to capture and store information relating to poor governance and corruption in the health sector, often via mobile phones.
The platform strengthened campaign efforts which resulted in the construction of a health centre in Aloni Parish. In Amuru district, five health workers were arrested following reports highlighting negligence.
In the village of Bagega in Nigeria, 400 children died and thousands suffered significant health problems as the result of lead poisoning caused by poor mining practices. The government pledged $5.3m (£3.23m) for remediation, but the funds never reached the affected region.
A local organisation, Follow the Money, created an infographic highlighting the government’s commitments and combined this with real life testimonies and photographs showing the actual situation on the ground. Within 48 hours of a targeted Twitter campaign, the president committed to releasing funds to the village and, in February this year, children started receiving long overdue medical attention.
All these initiatives depend on access to critical government data and on active citizens who feel empowered to effect change in their own lives and communities. At present, it’s often hard to access data that is sufficiently granular, particularly at district or local level. For citizens to be engaged with information from government, it also needs to be accessible in ways that are simple to understand and linked to campaigns that impact their daily lives.
Tracking expenditure can also operate across borders. Donors are beginning to open up aid data by publishing to the IATI registry. This transparency by donor governments should improve the effectiveness of aid spending and contribute towards improved health outcomes.
It’s hard to draw general conclusions about how technology can contribute towards improving health outcomes, particularly when context is so critical and the field is so new. Nonetheless, some themes are emerging which can maximise the chances of an intervention’s success.
It can at times be challenging to encourage citizens to report for an array of reasons, including a lack of belief in their ability to effect change, cultural norms, a lack of time and both perceived and real risks. Still, participation seems to increase when citizens receive feedback from reports submitted and when mechanisms are in place that enable citizens to take collective action. On-the-ground testimonies and evidence can also help shift public opinion and amplify critical messages.
Interventions are dramatically strengthened when integrated into wider programmes, implemented by organisations that have established a strong relationship with the communities in which they work. They need to be backed by at least one strong civil society organisation that can follow up on any reports, queries or challenges which may arise. Where possible, engagement from government and local leaders can make a real difference. Identifying champions within government can also significantly improve responsiveness.”

The value and challenges of public sector information


A paper by M. Henninger in Cosmopolitan Civil Societies: An Interdisciplinary Journal: “The aim of this paper is to explore the concept of public sector information (PSI), what it is, its history and evolution, what constitutes its corpus of documents and the issues and challenges it presents to society, its institutions and to those who use and manage it. The paper, by examining the literatures of the law, political science, civil society, economics and information and library science explores the inherent tensions of access to and use of PSI—pragmatism vs. idealism; openness vs. secrecy; commerce vs. altruism; property vs. commons; public good vs. private good. It focusses on open government data (OGD)—a subset of what is popularly referred to as ‘big data’—its background and development since much of the current debate of its use concerns its commercial value for both the private sector and the public sector itself. In particular it looks at the information itself which, driven by technologies of networks, data mining and visualisation gives value in industrial and economic terms, and in its ability to enable new ideas and knowledge.”