Buenos Aires, A Pocket of Civic Innovation in Argentina


Rebecca Chao in TechPresident: “…In only a few years, the government, civil society and media in Buenos Aires have actively embraced open data. The Buenos Aires city government has been publishing data under a Creative Commons license and encouraging civic innovation through hackathons. NGOs have launched a number of tech-driven tools and Argentina’s second-largest newspaper, La Nación, has published several hard-hitting data journalism projects. The result is a fledgling but flourishing open data culture in Buenos Aires, in a country that has not yet adopted a freedom of information law.

A Wikipedia for Open Government Data

In late August of this year, the Buenos Aires government declared a Creative Commons license for all of its digital content, which allows it to be used for free, like Wikipedia content, with proper attribution. This applies to their new open data catalog, which allows users to visualize the data and examine apps that have been created using it, and even includes a design lab for posting app ideas. Launched only in March, the government has already published fairly substantial data sets, including the salaries of city officials. The website also embodies the principles of openness in its design; it is built with open-source software and its code is available for reuse via GitHub.
“We were the first city in Argentina doing open government,” Rudi Borrmann tells techPresident over Skype. Borrmann is the Director of Buenos Aires’ Open Government Initiative. Previously, he was the social media editor at the city’s New Media Office but he also worked for many years in digital media…
While the civil society and media sectors have forged ahead in using open data, Borrmann tells techPresident that up in the ivory tower, openness to open data has been lagging. “Only technical schools are starting to create areas focused on working on open data,” he says.
In an interview with NYU’s govlab, Borrmann explained the significance of academia in using and pushing for more open data. “They have the means, the resources, the methodology to analyze…because in government you don’t have that time to analyze,” he said.
Another issue with open data is getting other branches of the government to modernize. Borrmann says that a lot of the Open Government’s work is done behind the scenes. “In general, you have very poor IT infrastructure all over Latin America” that interferes with the gathering and publishing of data, he says. “So in some cases it’s not about publishing or not publishing,” but about “having robust infrastructure for the information.”
It seems that the behind-the-scenes work is bearing some fruit. Just last week, on Dec. 6, the team behind the Buenos Aires open data website launched an impressive, interactive timeline, based on a similar timelapse map developed by a 2013 Knight-Mozilla Fellow, Noah Veltman. Against faded black and white photos depicting the subway from different decades over the last century, colorful pops of the Subterráneo lines emerge alongside factoids that go all the way back to 1910.”
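The article does not include the timeline's code, but the basic idea, drawing each Subterráneo line from its opening year to the present, can be sketched in a few lines of Python. The opening years and colours below are approximate and purely illustrative.

```python
# A minimal sketch (not the city's actual code) of the idea behind the subway
# timeline: draw each Subte line from its approximate opening year to 2013.
# Years and colours are illustrative approximations.
import matplotlib.pyplot as plt

lines = {  # line name -> (approximate opening year, illustrative colour)
    "Línea A": (1913, "#18cccc"),
    "Línea B": (1930, "#eb0029"),
    "Línea C": (1934, "#233aa8"),
    "Línea D": (1937, "#02db2e"),
    "Línea E": (1944, "#c618cc"),
    "Línea H": (2007, "#ffdd00"),
}

fig, ax = plt.subplots(figsize=(8, 3))
for i, (name, (year, colour)) in enumerate(
        sorted(lines.items(), key=lambda kv: kv[1][0])):
    ax.hlines(i, year, 2013, colors=colour, linewidth=6)   # years in service
    ax.text(year - 2, i, f"{name} ({year})", ha="right", va="center", fontsize=8)

ax.set_yticks([])
ax.set_xlim(1895, 2015)
ax.set_xlabel("Year")
ax.set_title("Buenos Aires Subte lines over time (illustrative)")
plt.tight_layout()
plt.show()
```

The city's version layers archival photographs and historical factoids on top of this kind of baseline timeline.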

Digital Passivity


Jaron Lanier in the New York Times: “I fear that 2013 will be remembered as a tragic and dark year in the digital universe, despite the fact that a lot of wonderful advances took place.

It was the year in which tablets became ubiquitous and advanced gadgets like 3-D printers and wearable interfaces emerged as pop phenomena; all great fun. Our gadgets have widened access to our world. We now regularly communicate with people we would not have been aware of before the networked age. We can find information about almost anything, any time.

But 2013 was also the year in which we became aware of the corner we’ve backed ourselves into. We learned — through the leaks of Edward J. Snowden, the former U.S. National Security Agency contractor, and the work of investigative journalists — how much our gadgets and our digital networks are being used to spy on us by ultra-powerful, remote organizations. We are being dissected more than we dissect.

I wish I could separate the two big trends of the year in computing — the cool gadgets and the revelations of digital spying — but I cannot.

Back at the dawn of personal computing, the idealistic notion that drove most of us was that computers were tools for leveraging human intelligence to ever-greater achievement and fulfillment. This was the idea that burned in the hearts of pioneers like Alan Kay, who a half-century ago was already drawing illustrations of how children would someday use tablets.

But tablets do something unforeseen: They enforce a new power structure. Unlike a personal computer, a tablet runs only programs and applications approved by a central commercial authority. You control the data you enter into a PC, while data entered into a tablet is often managed by someone else.

Steve Jobs, who oversaw the introduction of the spectacularly successful iPad at Apple, declared that personal computers were now “trucks” — tools for working-class guys in T-shirts and visors, but not for upwardly mobile cool people. The implication was that upscale consumers would prefer status and leisure to influence or self-determination.

I am not sure who is to blame for our digital passivity. Did we give up on ourselves too easily?

This would be bleak enough even without the concurrent rise of the surveillance economy. Not only have consumers prioritized flash and laziness over empowerment; we have also acquiesced to being spied on all the time.

The two trends are actually one. The only way to persuade people to voluntarily accept the loss of freedom is by making it look like a great bargain at first.

Consumers were offered free stuff (like search and social networking) in exchange for agreeing to be watched. Vast fortunes can be made by those who best use the personal data you voluntarily hand them. Instagram, introduced in 2010, had only 13 employees and no business plan when it was bought by Facebook less than two years later for $1 billion.

One can argue that network technology enhances democracy because it makes it possible, for example, to tweet your protests. But complaining is not yet success. Social media didn’t create jobs for young people in Cairo during the Arab Spring…”

Tech challenge develops algorithms to predict atrocities


SciDevNet: “Mathematical models that use existing socio-political data to predict mass atrocities could soon inform governments and NGOs on how and where to take preventative action.
The models emerged from one strand of the Tech Challenge for Atrocity Prevention, a competition run by the US Agency for International Development (USAID) and NGO Humanity United. The winners were announced last month (18 November) and will now work with the organiser to further develop and pilot their innovations.
The five winners, who come from different countries and won between US$1,000 and US$12,000, were among nearly 100 entrants who developed algorithms to predict when and where mass atrocities are likely to happen.
Around 1.5 billion people live in countries affected by conflict, sometimes including atrocities such as genocides, mass rape and ethnic cleansing, according to the World Bank’s World Development Report 2011. Many of these countries are in the developing world.
The competition organisers hope the new algorithms could help governments and human rights organisations identify at-risk regions, potentially allowing them to intervene before mass atrocities happen.
The competition started from the premise that certain social and political measurements are linked to increased likelihood of atrocities. Yet because such factors interact in complex ways, organisations working to prevent atrocities lack a reliable method of predicting when and where they might happen next.
The algorithms use sociopolitical indicators and data on past atrocities as their inputs. The data was drawn from archives such as the Global Database of Events, Language and Tone, a data set that encodes more than 200 million globally newsworthy events, recording cultural information such as the people involved, their location and any religious connections.”
Link to the winners of the Model Challenge
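The winning models themselves are not described in detail, but the general approach the article outlines, feeding sociopolitical indicators and counts of past events into a model that outputs a risk score, can be sketched as follows. The features, synthetic data and logistic regression classifier are illustrative assumptions, not any winner's method.

```python
# A minimal, hypothetical sketch of the general approach described above:
# predict a binary "mass atrocity within N months" label from sociopolitical
# indicators and counts of past events (e.g. GDELT-style event counts).
# Feature names and the synthetic data are illustrative, not any winner's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500  # hypothetical country-month observations

X = np.column_stack([
    rng.poisson(5, n),          # protest events in prior month
    rng.poisson(2, n),          # violent events in prior month
    rng.uniform(0, 1, n),       # political instability index
    rng.uniform(0, 1, n),       # press-freedom index
])
# Synthetic label: higher instability and violence -> higher risk.
risk = 0.08 * X[:, 1] + 1.5 * X[:, 2] - 0.8 * X[:, 3]
y = (risk + rng.normal(0, 0.3, n) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("held-out accuracy:", round(model.score(X_test, y_test), 2))
print("predicted risk for a new observation:",
      model.predict_proba([[12, 6, 0.9, 0.2]])[0, 1])
```

In practice the hard part is the one the article emphasizes: assembling reliable indicator and event data, since the modeling itself can stay relatively simple.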

Participation Dynamics in Crowd-Based Knowledge Production: The Scope and Sustainability of Interest-Based Motivation


New paper by Henry Sauermann and Chiara Franzoni: “Crowd-based knowledge production is attracting growing attention from scholars and practitioners. One key premise is that participants who have an intrinsic “interest” in a topic or activity are willing to expend effort at lower pay than in traditional employment relationships. However, it is not clear how strong and sustainable interest is as a source of motivation. We draw on research in psychology to discuss important static and dynamic features of interest and derive a number of research questions regarding interest-based effort in crowd-based projects. Among others, we consider the specific versus general nature of interest, highlight the potential role of matching between projects and individuals, and distinguish the intensity of interest at a point in time from the development and sustainability of interest over time. We then examine users’ participation patterns within and across 7 different crowd science projects that are hosted on a shared platform. Our results provide novel insights into contribution dynamics in crowd science projects. Moreover, given that extrinsic incentives such as pay, status, self-use, or career benefits are largely absent in these particular projects, the data also provide unique insights into the dynamics of interest-based motivation and into its potential as a driver of effort.”
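The paper's data are not reproduced here, but the kind of participation pattern it studies, how long contributors stay active within a project and whether they contribute across projects, can be summarised from a simple contribution log along these lines. The log format and column names are hypothetical.

```python
# Hypothetical sketch: summarising participation dynamics from a contribution
# log with one row per (user, project, date). Column names are assumed.
import pandas as pd

log = pd.DataFrame({
    "user":    ["u1", "u1", "u1", "u2", "u2", "u3"],
    "project": ["galaxies", "galaxies", "whales", "galaxies", "galaxies", "whales"],
    "date":    pd.to_datetime(["2013-01-02", "2013-01-09", "2013-02-01",
                               "2013-01-03", "2013-01-04", "2013-03-15"]),
})

# Within-project dynamics: how long does each user stay active in a project?
tenure = (log.groupby(["user", "project"])["date"]
             .agg(first="min", last="max", contributions="count"))
tenure["active_days"] = (tenure["last"] - tenure["first"]).dt.days

# Across-project dynamics: how many distinct projects does each user join?
breadth = log.groupby("user")["project"].nunique().rename("projects_joined")

print(tenure)
print(breadth)
```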

20 Innovations that Mattered in 2013


New Guide from GovLoop: “The end of the year means two things: setting unrealistic New Year’s resolutions and endless retrospectives.  While we can’t force you to put down the cake and pick up a carrot, we can help you to do your job better by highlighting some of the biggest and best innovations to come out of government in the last 365 days.
The past year brought us the Interior Department’s Instagram feed and Colorado’s redesigned website. It also brought us St. Louis’ optimized data analytics that make their city safer and North Carolina’s iCenter that adopted a “try before you buy” policy.
All of these new technologies and tactics saved time and resources, critical outcomes in the current government landscape where budget cuts are making each new purchase risky.
But these were not the only buzzworthy projects for government technology in 2013. In this end-of-year issue, GovLoop analyzed the 20 best innovations in government in four different categories:

  • Mobile Apps Movers and Shakers
  • Big Data Dynamos
  • Social Media Mavericks
  • Website Wonders

We also asked two of the most innovative Chief Information Officers in the country to don some Google Glass. In a year when the government shutdown and sequestration brought progress to a screeching halt, many agencies were able to rise above the inauspicious environment and produce groundbreaking and innovative programs.
For instance, when the horrible bombings brought terror to the finish line of the Boston Marathon, the local police department sprang into action. They immediately mobilized their forces on the ground. But then they did something else, too. The Boston Police Department took to Twitter. The social media team was informative, timely, and accurate. The BPD flipped the script on emergency media management. They innovated in a time of crisis and got people the information they needed in a timely and appropriate manner.
We know that oftentimes it is hard to see through the budget cuts and government shutdowns, but government innovation is all around us. Our goal with this guide is to showcase programs that are not only making a difference but demonstrate risk and reward….”

Building tech-powered public services


New publication by Sarah Bickerstaffe from IPPR (UK): “Given the rapid pace of technological change and take-up by the public, it is a question of when, not if, public services become ‘tech-powered’. This new paper asks how we can ensure that innovations are successfully introduced and deployed.
Can technology improve the experience of people using public services, or does it simply mean job losses and a depersonalised offer to users?
Could tech-powered public services be an affordable, sustainable solution to some of the challenges of these times of austerity?
This report looks at 20 case studies of digital innovation in public services, using these examples to explore the impact of new and disruptive technologies. It considers how tech-powered public services can be delivered, focusing on the area of health and social care in particular.
We identify three key benefits of increasing the role of technology in public services: saving time, boosting user participation, and encouraging users to take responsibility for their own wellbeing.
In terms of how to successfully implement technological innovations in public services, five particular lessons stood out clearly and consistently:

  1. User-based iterative design is critical to delivering a product that solves real-world problems. It builds trust and ensures the technology works in the context in which it will be used.
  2. Public sector expertise is essential for a project to make the connections necessary for initial development and early funding.
  3. Access to seed and bridge funding is necessary to get projects off the ground and allow them to scale up.
  4. Strong leadership from within the public sector is crucial to overcoming the resistance that practitioners and managers often show initially.
  5. A strong business case that sets out the quality improvements and cost savings that the innovation can deliver is important to get attention and interest from public services.

The seven headline case studies in this report are:

  • Patchwork creates an elegant solution to join up professionals working with troubled families, in an effort to ensure that frontline support is truly coordinated.
  • Casserole Club links people who like cooking with their neighbours who are in need of a hot meal, employing the simplest possible technology to grow social connections.
  • ADL Smartcare uses a facilitated assessment tool to make professional expertise accessible to staff and service users without years of training, meaning they can carry out assessments together, engaging people in their own care and freeing up occupational therapists to focus where they are needed.
  • Mental Elf makes leading research in mental health freely available via social media, providing accessible summaries to practitioners and patients who would not otherwise have the time or ability to read journal articles, which are often hidden behind a paywall.
  • Patient Opinion provides an online platform for people to give feedback on the care they have received and for healthcare professionals and providers to respond, disrupting the typical complaints process and empowering patients and their families.
  • The Digital Pen and form system has saved the pilot hospital trust three minutes per patient by avoiding the need for manual data entry, freeing up clinical and administrative staff for other tasks.
  • Woodland Wiggle allows children in hospital to enter a magical woodland world through a giant TV screen, where they can have fun, socialise, and do their physiotherapy.”

The Brainstorm Begins: Initial Ideas for Evolving ICANN


“The ICANN Strategy Panel on Multistakeholder Innovation (MSI Panel) is underway, working to curate a set of concrete proposals for ways that the Internet Corporation for Assigned Names & Numbers (ICANN) could prototype new institutional arrangements for the 21st century. The Panel is working to identify how ICANN can open itself to more global participation in its governance functions. Specifically, the MSI Panel is charged with:

  • Proposing new models for international engagement, consensus-driven policymaking and institutional structures to support such enhanced functions; and
  • Designing processes, tools and platforms that enable the global ICANN community to engage in these new forms of participatory decision-making.

To help answer this charge, the MSI Panel launched an “Idea Generation” (ideation) platform, designed to brainstorm with the global public on how ICANN could evolve the way it operates, given the innovations in governance happening across the world.

We’re now three weeks into this Idea Generation stage – taking place online here: thegovlab.ideascale.com – and we wanted to share with you what the Panel and The GovLab have heard so far regarding what tools, technologies, platforms and techniques ICANN could learn from or adapt to help design an innovative approach to problem-solving within the Domain Name System going forward.

These initial ideas begin to paint a picture of what 21st century coordination of a shared global commons might involve. These brainstorms all point to certain core principles the Panel believes provide the groundwork for an institution to legitimately operate in the global public interest today. These principles include:

  • Openness – Ensuring open channels as well as very low or no barriers to meaningful participation.
  • Transparency – Providing public access to information and deliberation data.
  • Accessibility – Developing simple and legible organizational communications.
  • Inclusivity and Lack of Domination – Ensuring access to global participation and that no one player, entity or interest dominates processes or outcomes.
  • Accountability – Creating mechanisms for the global public to check institutional power.
  • Effectiveness – Improving decision-making through greater reliance on evidence and a focus on flexibility and agility.
  • Efficiency – Streamlining processes to better leverage time, resources and human capital.

With these core principles as the backdrop, the ideas we’ve heard so far roughly fall within the following categories…
See also thegovlab.ideascale.com

Selected Readings on Crowdsourcing Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing data was originally published in 2013.

As institutions seek to improve decision-making through data and put public data to use to improve the lives of citizens, new tools and projects are allowing citizens to play a role in both the collection and utilization of data. Participatory sensing and other citizen data collection initiatives, notably in the realm of disaster response, are allowing citizens to crowdsource important data, often using smartphones, that would be either impossible or burdensomely time-consuming for institutions to collect themselves. Civic hacking, often performed in hackathon events, on the other hand, is a growing trend in which governments encourage citizens to transform data from government and other sources into useful tools to benefit the public good.
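As a concrete, purely hypothetical illustration of participatory sensing, a phone-side client might bundle a reading with its location and time and post it to a collection service. The endpoint URL and payload schema below are invented for illustration.

```python
# Hypothetical sketch of a participatory-sensing report: a phone-side client
# bundles a sensor reading with location and time and posts it to a collection
# service. The endpoint URL and payload schema are invented for illustration.
import json
import time
import urllib.request

def submit_report(noise_db, lat, lon,
                  endpoint="https://example.org/api/reports"):  # hypothetical URL
    payload = {
        "sensor": "noise",
        "value_db": noise_db,
        "lat": lat,
        "lon": lon,
        "timestamp": int(time.time()),
    }
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example: report a 72 dB reading from downtown Buenos Aires.
# submit_report(72.0, -34.6037, -58.3816)
```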

Selected Reading List (in alphabetical order)

Annotated Selected Reading List (in alphabetical order)

Baraniuk, Chris. “Power Politechs.” New Scientist 218, no. 2923 (June 29, 2013): 36–39. http://bit.ly/167ul3J.

  • In this article, Baraniuk discusses civic hackers, “an army of volunteer coders who are challenging preconceptions about hacking and changing the way your government operates. In a time of plummeting budgets and efficiency drives, those in power have realised they needn’t always rely on slow-moving, expensive outsourcing and development to improve public services. Instead, they can consider running a hackathon, at which tech-savvy members of the public come together to create apps and other digital tools that promise to enhance the provision of healthcare, schools or policing.”
  • While recognizing that “civic hacking has established a pedigree that demonstrates its potential for positive impact,” Baraniuk argues that a “more rigorous debate over how this activity should evolve, or how authorities ought to engage in it” is needed.

Barnett, Brandon, Muki Hansteen Izora, and Jose Sia. “Civic Hackathon Challenges Design Principles: Making Data Relevant and Useful for Individuals and Communities.” Hack for Change, https://bit.ly/2Ge6z09.

  • In this paper, researchers from Intel Labs offer “guiding principles to support the efforts of local civic hackathon organizers and participants as they seek to design actionable challenges and build useful solutions that will positively benefit their communities.”
  • The authors’ proposed design principles are:
    • Focus on the specific needs and concerns of people or institutions in the local community. Solve their problems and challenges by combining different kinds of data.
    • Seek out data far and wide (local, municipal, state, institutional, non-profits, companies) that is relevant to the concern or problem you are trying to solve.
    • Keep it simple! This can’t be overstated. Focus [on] making data easily understood and useful to those who will use your application or service.
    • Enable users to collaborate and form new communities and alliances around data.

Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-Quality, Data?” Perspectives on Psychological Science 6, no. 1 (January 1, 2011): 3–5. http://bit.ly/H56lER.

  • This article examines the capability of Amazon’s Mechanical Turk to act as a source of data for researchers, in addition to its traditional role as a microtasking platform.
  • The authors examine the demographics of MTurkers and find that “(a) MTurk participants are slightly more demographically diverse than are standard Internet samples and are significantly more diverse than typical American college samples; (b) participation is affected by compensation rate and task length, but participants can still be recruited rapidly and inexpensively; (c) realistic compensation rates do not affect data quality; and (d) the data obtained are at least as reliable as those obtained via traditional methods.”
  • The paper concludes that, just as MTurk can be a strong tool for crowdsourcing tasks, data derived from MTurk can be high quality while also being inexpensive and obtained rapidly.

Goodchild, Michael F., and J. Alan Glennon. “Crowdsourcing Geographic Information for Disaster Response: a Research Frontier.” International Journal of Digital Earth 3, no. 3 (2010): 231–241. http://bit.ly/17MBFPs.

  • This article examines issues of data quality in the face of the new phenomenon of geographic information being generated by citizens, in order to examine whether this data can play a role in emergency management.
  • The authors argue that “[d]ata quality is a major concern, since volunteered information is asserted and carries none of the assurances that lead to trust in officially created data.”
  • Because time is crucial during emergencies, the authors argue that “the risks associated with volunteered information are often outweighed by the benefits of its use.”
  • The paper examines four wildfires in Santa Barbara in 2007-2009 to discuss current challenges with volunteered geographical data, and concludes that further research is required to answer how volunteer citizens can be used to provide effective assistance to emergency managers and responders.

Hudson-Smith, Andrew, Michael Batty, Andrew Crooks, and Richard Milton. “Mapping for the Masses: Accessing Web 2.0 Through Crowdsourcing.” Social Science Computer Review 27, no. 4 (November 1, 2009): 524–538. http://bit.ly/1c1eFQb.

  • This article describes the way in which “we are harnessing the power of web 2.0 technologies to create new approaches to collecting, mapping, and sharing geocoded data.”
  • The authors examine GMapCreator and MapTube, which allow users to do a range of map-related functions such as create new maps, archive existing maps, and share or produce bottom-up maps through crowdsourcing.
  • They conclude that “these tools are helping to define a neogeography that is essentially ‘mapping for the masses,’” while noting that there are many issues of quality, accuracy, copyright, and trust that will influence the impact of these tools on map-based communication.

Kanhere, Salil S. “Participatory Sensing: Crowdsourcing Data from Mobile Smartphones in Urban Spaces.” In Distributed Computing and Internet Technology, edited by Chittaranjan Hota and Pradip K. Srimani, 19–26. Lecture Notes in Computer Science 7753. Springer Berlin Heidelberg, 2013. https://bit.ly/2zX8Szj.

  • This paper provides a comprehensive overview of participatory sensing — a “new paradigm for monitoring the urban landscape” in which “ordinary citizens can collect multi-modal data streams from the surrounding environment using their mobile devices and share the same using existing communications infrastructure.”
  • In addition to examining a number of innovative applications of participatory sensing, Kanhere outlines the following key research challenges:
    • Dealing with incomplete samples
    • Inferring user context
    • Protecting user privacy
    • Evaluating data trustworthiness
    • Conserving energy

Public Open Sensor Data: Revolutionizing Smart Cities


New Paper in IEEE Technology and Society Magazine (Volume 32, Issue 4): “Local governments have decided to take advantage of the presence of wireless sensor networks (WSNs) in their cities to efficiently manage several applications in their daily responsibilities. The enormous amount of information collected by sensor devices allows the automation of several real-time services to improve city management by using intelligent traffic-light patterns during rush hour, reducing water consumption in parks, or efficiently routing garbage collection trucks throughout the city [1]. The sensor information required by these examples is mostly self-consumed by city-designed applications and managers.”
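The paper describes rather than specifies these applications; as a purely illustrative sketch, sensor-driven traffic-light timing might allocate green time in proportion to vehicle counts reported by road sensors. The proportional rule, bounds and numbers below are assumptions, not the paper's method.

```python
# Purely illustrative sketch (not the paper's method): allocate green time at
# an intersection in proportion to vehicle counts reported by road sensors,
# within a fixed cycle length and a minimum green time per approach.
CYCLE_S = 90          # total signal cycle length, seconds (assumed)
MIN_GREEN_S = 15      # safety floor per approach, seconds (assumed)

def green_times(counts):
    """counts: {approach_name: vehicles detected in the last cycle}."""
    usable = CYCLE_S - MIN_GREEN_S * len(counts)
    total = sum(counts.values()) or 1          # avoid division by zero
    return {
        approach: MIN_GREEN_S + usable * n / total
        for approach, n in counts.items()
    }

# Example: rush-hour counts from wireless sensors on a two-street crossing.
print(green_times({"Av. Corrientes": 42, "Av. Callao": 14}))
# -> {'Av. Corrientes': 60.0, 'Av. Callao': 30.0}
```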