Research Handbook On Transparency


New book edited by Padideh Ala’i and Robert G. Vaughn: ‘”Transparency” has multiple, contested meanings. This broad-ranging volume accepts that complexity and thoughtfully contrasts alternative views through conceptual pieces, country cases, and assessments of policies–such as freedom of information laws, whistleblower protections, financial disclosure, and participatory policymaking procedures.’
– Susan Rose-Ackerman, Yale University Law School, US
In the last two decades transparency has become a ubiquitous and stubbornly ambiguous term. Typically understood to promote rule of law, democratic participation, anti-corruption initiatives, human rights, and economic efficiency, transparency can also legitimate bureaucratic power, advance undemocratic forms of governance, and aid in global centralization of power. This path-breaking volume, comprising original contributions on a range of countries and environments, exposes the many faces of transparency by allowing readers to see the uncertainties, inconsistencies and surprises contained within the current conceptions and applications of the term….
The expert contributors identify the goals, purposes and ramifications of transparency while presenting both its advantages and shortcomings. Through this framework, they explore transparency from a number of international and comparative perspectives. Some chapters emphasize cultural and national aspects of the issue, with country-specific examples from China, Mexico, the US and the UK, while others focus on transparency within global organizations such as the World Bank and the WTO. A number of relevant legal considerations are also discussed, including freedom of information laws, financial disclosure of public officials and whistleblower protection…”

Ambulance Drone is a flying first aid kit that could save lives


Springwise: “When a medical emergency takes place, the response time can make all the difference between a life saved and a life lost. Unfortunately, ambulances can get stuck in traffic and on average they arrive 10 minutes after the emergency call has been made, in which time a cardiac arrest victim may have already succumbed to a lack of oxygen to the brain. We’ve already seen Germany’s Defikopter use drones to ensure defibrillators are on scene by the time a medical professional arrives, but now the Ambulance Drone is an all-purpose medical toolkit that can be automatically flown to any emergency situation and used to guide citizens through non-technical lifesaving procedures.
Created by Alec Momont, a graduate of the Delft University of Technology, the drone is custom designed to deliver medical supplies in the event of an emergency. Inside, it houses a compact defibrillator, medication and CPR aids, as well as other essential supplies for the layperson to use while they wait for a medical professional. The idea is that those at the scene can phone emergency services as normal, giving their location. An ambulance and the Ambulance Drone are despatched immediately, with the drone capable of arriving in around 1 minute.
Once it’s there, the call can be transferred to the drone, which has in-built speakers. This frees the caller’s hands to perform tasks such as placing the victim in the recovery position and preparing the defibrillator, with vocal guidance from the emergency response team. The team can see live video of the event to make sure that any procedures are completed correctly, as well as passing on relevant info to the approaching ambulance…”

Taproot Foundation Starts Online Matchmaker for Charities Seeking Pro Bono Help


Nicole Wallace at the Chronicle of Philanthropy: “The Taproot Foundation has created an online marketplace it hopes will become the Match.com of pro bono, linking skilled volunteers with nonprofits that need assistance in areas like marketing, database design, and strategic planning.
The new site, Taproot+, allows nonprofits to describe projects needing help. Taproot Foundation employees will review proposals and help improve any unclear project descriptions….
People looking to share their skills can browse projects on the site. Some charities ask for in-person help, while other projects can use volunteers working remotely. In some cases, Taproot will post the projects on sites run by partner organizations, like the LinkedIn for Volunteers, to help find the right volunteer. As the site grows, the group plans to work closely with other pro bono organizations, like NPower and DataKind.
“We want to make sure that we’re helping on the front end,” says Ms. Hamburg. “But once that project description is created, we want to make sure that the nonprofit is accessing the best talent out there, no matter where it is.”
After a nonprofit and pro bono volunteer agree to work together, Taproot+ helps them plan the steps of the project and set deadlines for milestones, which are tracked on the site…”

Ebola and big data: Call for help


The Economist: “With at least 4,500 people dead, public-health authorities in west Africa and worldwide are struggling to contain Ebola. Borders have been closed, air passengers screened, schools suspended. But a promising tool for epidemiologists lies unused: mobile-phone data.
When people make mobile-phone calls, the network generates a call data record (CDR) containing such information as the phone numbers of the caller and receiver, the time of the call and the tower that handled it—which gives a rough indication of the device’s location. This information provides researchers with an insight into mobility patterns. Indeed phone companies use these data to decide where to build base stations and thus improve their networks, and city planners use them to identify places to extend public transport.
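To make the structure of such records concrete, here is a minimal Python sketch of a CDR and the kind of coarse mobility aggregation researchers derive from it; the field names and record layout are illustrative assumptions, not any operator’s actual schema.
```python
# Minimal sketch of CDR-based mobility aggregation.
# Field names and the record layout are hypothetical illustrations,
# not any operator's real schema.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CallDataRecord:
    caller: str        # hashed caller identifier
    receiver: str      # hashed receiver identifier
    timestamp: datetime
    tower_id: str      # cell tower that handled the call (rough location)

def daily_tower_flows(records):
    """Count movements between the towers that handled a caller's
    consecutive calls on the same day - a crude mobility proxy."""
    by_caller = {}
    for r in sorted(records, key=lambda r: r.timestamp):
        by_caller.setdefault((r.caller, r.timestamp.date()), []).append(r.tower_id)
    flows = Counter()
    for towers in by_caller.values():
        for origin, destination in zip(towers, towers[1:]):
            if origin != destination:
                flows[(origin, destination)] += 1
    return flows
```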
But perhaps the most exciting use of CDRs is in the field of epidemiology. Until recently the standard way to model the spread of a disease relied on extrapolating trends from census data and surveys. CDRs, by contrast, are empirical, immediate and updated in real time. You do not have to guess where people will flee to or move. Researchers have used them to map malaria outbreaks in Kenya and Namibia and to monitor the public response to government health warnings during Mexico’s swine-flu epidemic in 2009. Models of population movements during a cholera outbreak in Haiti following the earthquake in 2010 used CDRs and provided the best estimates of where aid was most needed.
Doing the same with Ebola would be hard: in west Africa most people do not own a phone. But CDRs are nevertheless better than simulations based on stale, unreliable statistics. If researchers could track population flows from an area where an outbreak had occurred, they could see where it would be likeliest to break out next—and therefore where they should deploy their limited resources. Yet despite months of talks, and the efforts of the mobile-network operators’ trade association and several smaller UN agencies, telecoms firms have not let researchers use the data (see article).
One excuse is privacy, which is certainly a legitimate worry, particularly in countries fresh from civil war, or where tribal tensions exist. But the phone data can be anonymised and aggregated in a way that alleviates these concerns. A bigger problem is institutional inertia. Big data is a new field. The people who grasp the benefits of examining mobile-phone usage tend to be young, and lack the clout to free the data for research use.”
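The article does not spell out how that anonymisation and aggregation would work in practice; one common approach, sketched below with hypothetical field names and thresholds, is to hash subscriber identifiers with a secret salt and release only per-tower daily counts above a minimum size.
```python
# Sketch of one common anonymisation-and-aggregation approach:
# hash subscriber IDs with a secret salt, then publish only
# per-tower daily counts above a small threshold.
# Hypothetical illustration, not any operator's actual pipeline.
import hashlib
from collections import Counter

SALT = b"replace-with-secret-salt"   # kept by the operator, never released
MIN_COUNT = 10                       # suppress small, re-identifiable cells

def pseudonymise(subscriber_id: str) -> str:
    return hashlib.sha256(SALT + subscriber_id.encode()).hexdigest()

def aggregate(records):
    """records: iterable of (subscriber_id, tower_id, date) tuples."""
    daily_presence = {(pseudonymise(s), tower, day) for s, tower, day in records}
    counts = Counter((tower, day) for _, tower, day in daily_presence)
    # Release only aggregated counts, and only where they are large enough.
    return {key: n for key, n in counts.items() if n >= MIN_COUNT}
```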

Chicago uses big data to save itself from urban ills


Aviva Rutkin in the New Scientist: “This year in Chicago, some kids will get lead poisoning from the paint or pipes in their homes. Some restaurants will cook food in unsanitary conditions and, here and there, a street corner will be suddenly overrun with rats. These kinds of dangers are hard to avoid in a city of more than 2.5 million people. The problem is, no one knows for certain where or when they will pop up.

The Chicago city government is hoping to change that by knitting powerful predictive models into its everyday city inspections. Its latest project, currently in pilot tests, analyses factors such as home inspection records and census data, and uses the results to guess which buildings are likely to cause lead poisoning in children – a problem that affects around 500,000 children in the US each year. The idea is to identify trouble spots before kids are exposed to dangerous lead levels.
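The article does not detail Chicago’s model itself; as a rough, hypothetical illustration of the general approach, the sketch below trains a simple classifier on building-level features of the kind mentioned (inspection history and census indicators) and ranks buildings by predicted risk. The file and column names are invented for the example.
```python
# Rough sketch of the general approach: predict which homes are at
# elevated risk of lead hazards from inspection and census features.
# The CSV file, column names and model choice are hypothetical,
# not Chicago's actual data or model.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("home_inspections.csv")   # hypothetical input file
features = ["building_age", "prior_violations", "pct_pre_1978_housing",
            "median_household_income", "children_under_6"]
X, y = df[features], df["lead_hazard_found"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Rank buildings by predicted risk so inspectors can visit the riskiest first.
df["risk_score"] = model.predict_proba(X)[:, 1]
priority_list = df.sort_values("risk_score", ascending=False).head(100)
print("Held-out accuracy:", model.score(X_test, y_test))
```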

“We are able to prevent problems instead of just respond to them,” says Jay Bhatt, chief innovation officer at the Chicago Department of Public Health. “These models are just the beginning of the use of predictive analytics in public health and we are excited to be at the forefront of these efforts.”

Chicago’s projects are based on the thinking that cities already have what they need to raise their municipal IQ: piles and piles of data. In 2012, city officials built WindyGrid, a platform that collected data like historical facts about buildings and up-to-date streams such as bus locations, tweets and 911 calls. The project was designed as a proof of concept and was never released publicly but it led to another, called Plenario, that allowed the public to access the data via an online portal.

The experience of building those tools has led to more practical applications. For example, one tool matches calls to the city’s municipal hotline complaining about rats with conditions that draw rats to a particular area, such as excessive moisture from a leaking pipe, or with an increase in complaints about garbage. This allows officials to proactively deploy sanitation crews to potential hotspots. It seems to be working: last year, resident requests for rodent control dropped by 15 per cent.
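As a hypothetical illustration of the kind of join involved, the sketch below combines 311 complaint categories per census block and flags blocks where rat complaints coincide with the conditions that attract rats; the file, column and category names are assumptions, not Chicago’s actual data.
```python
# Sketch of flagging likely rat hotspots by combining 311 complaint
# categories per census block. File, column and category names are
# hypothetical illustrations.
import pandas as pd

calls = pd.read_csv("311_calls.csv", parse_dates=["created_date"])
recent = calls[calls["created_date"] >= "2014-09-01"]

per_block = recent.pivot_table(index="census_block",
                               columns="complaint_type",
                               values="request_id",
                               aggfunc="count",
                               fill_value=0)

# Flag blocks where rat sightings coincide with conditions that draw rats,
# such as water leaks or uncollected garbage (hypothetical category labels).
hotspots = per_block[(per_block["rat_complaint"] >= 3) &
                     ((per_block["water_leak"] >= 2) |
                      (per_block["garbage_cart"] >= 5))]
print(hotspots.index.tolist())   # candidate blocks for proactive baiting crews
```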

Some predictions are trickier to get right. Charlie Catlett, director of the Urban Center for Computation and Data in Chicago, is investigating an old axiom among city cops: that violent crime tends to spike when there’s a sudden jump in temperature. But he’s finding it difficult to test its validity in the absence of a plausible theory for why it might be the case. “For a lot of things about cities, we don’t have that underlying theory that tells us why cities work the way they do,” says Catlett.
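Testing the axiom’s surface correlation is straightforward even without an underlying theory; a minimal sketch (with invented input files and column names) might correlate day-over-day temperature jumps with daily violent-crime counts:
```python
# Minimal sketch of testing the temperature-jump / violent-crime axiom.
# Input files and column names are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

weather = pd.read_csv("daily_weather.csv", parse_dates=["date"])
crime = pd.read_csv("violent_crime_counts.csv", parse_dates=["date"])

df = weather.merge(crime, on="date").sort_values("date")
df["temp_jump"] = df["max_temp"].diff()     # day-over-day temperature change

clean = df.dropna()
r, p = pearsonr(clean["temp_jump"], clean["incidents"])
print(f"correlation between temperature jumps and incidents: r={r:.2f}, p={p:.3f}")
```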

Still, predictive modelling is maturing, as other cities succeed in using it to tackle urban ills…. Such efforts can be a boon for cities, making them more productive, efficient and safe, says Rob Kitchin of Maynooth University in Ireland, who helped launch a real-time data site for Dublin last month called the Dublin Dashboard. But he cautions that there’s a limit to how far these systems can aid us. Knowing that a particular street corner is likely to be overrun with rats tomorrow doesn’t address what caused the infestation in the first place. “You might be able to create a sticking plaster or be able to manage it more efficiently, but you’re not going to be able to solve the deep structural problems….”

Innovation in Philanthropy is not a Hack-a-thon


Sam McAfee on Medium: “…Antiquated funding models and the lack of a rapid, data-driven evaluation process aren’t the only issues, though. Most of the big ideas in the technology-for-social-impact space either focus on incremental improvements to existing service models, perhaps leveraging online services or mobile applications to improve cost-efficiency marginally, or they solve only a very narrow niche problem for a small audience, often by applying a technology that was already in development and just happened to find an application in the field.

Innovation Requires Disruption

When you look at innovation in the commercial world, like the Ubers and AirBnBs of the world, what you see is a clear and substantive break from previous modes of thinking about transportation and accommodation. And it’s not the technology itself that is all that impressive. There is nothing ground-breaking technically under the hood of either of those products that wasn’t already lying around for a decade. What makes them different is that they created business models that stepped completely out of the existing taxi and hotel verticals, and simply used technology to leverage existing frustrations with those antiquated models and harness latent demands, to produce a new, vibrant commercial ecosystem.

Now, let’s imagine the same framework in the social sector, where there are equivalent long-standing traditional modes of providing resources. To find new ways of meeting human needs that disrupt those models requires both safe-to-fail experimentation and rapid feedback and iteration in the field, with clear success criteria. Such rapid development can only be accomplished by a sharp, nimble and multifaceted team of thinkers and doers who are passionate about the problem, yes, but also empowered and enabled to break a few institutional eggs on the way to the creative omelet.

Agile and Lean are Proven Methods

It turns out that there are proven working models for cultivating and fostering this kind of innovative thinking and experimentation. As I mentioned above, agile and lean are probably the single greatest contribution to the world by the tech sector, far more impactful than any particular technology produced by it. Small, cross-functional teams working on tight, iterative timeframes, using a data-informed methodology, can create new and disruptive solutions to big, difficult problems. They are able to do this precisely because they are unhindered by the hulking bureaucratic structures of the old guard. This is precisely why so many Fortune 500 companies are experimenting with innovation and R&D laboratories: they know their existing staff, structures, and processes cannot produce innovation within those constraints. Only the small, nimble teams can do it, and they can only do it if they are kept separate from, protected from even, the traditional production systems of the previous product cycle.

Yet big philanthropy has still barely experimented with this model, trying it only in a few isolated instances. Here at Neo, for example, we are working on a project for teachers funded by a forward-thinking foundation. What our client is trying to disrupt is no less than the entire US education system, and to do it with goals and measurements developed by teachers for teachers, not by Silicon Valley hotshots who have no clue how to fix education.


To start with, the project was funded in iterations of six weeks at a time, each with a distinct and measurable goal. We built a small cross-functional team to tackle some of the tougher issues faced by teachers trying to raise the level of excellence in their classrooms. The team was empowered to talk directly to teachers, and incorporate their feedback into new versions of the project, released on almost a daily basis. We have iterated the design more than sixteen times in less than four months, and it’s starting to really take shape.

We have no idea whether this particular project will be successful in the long run. But what we do know is that the client and their funder have had the courage to step out of the traditional project funding models and apply agile and lean thinking to a very tough problem. And we’re proud to be invited along for the ride.

The vast majority of the social sector is still trying to tackle social problems with program and funding models that were pioneered early in the last century. Agile and lean methods hold the key to finally breaking the mold of the old, traditional model of resourcing social change initiatives. The philanthropic community should be interested in the agile and lean methods produced by the technology sector, not the money produced by it, and start reorganizing project teams, resource allocation strategies and timelines in line with this proven innovation model.

Only then will we be in a position to really innovate for social change.”

The Power of Data Analytics to Transform Government


Hugo Moreno at Forbes: “It’s mind-boggling to consider the amount of information governments collect on their citizens. We often just expect them to manage and analyze it in a way that will benefit the general public and facilitate government transparency. However, it can be difficult to organize, manage and extract insights from these large, diverse data sets. According to “Analytics Paves the Way for Better Government,” a Forbes Insights case study sponsored by SAP, government leaders have called for investment in Big Data analytics capabilities to modernize government services and aid their economies. State and federal governments have begun to recognize the benefits of applying analytics. In fact, McKinsey & Co. estimates that by digitizing information, disseminating public data sets and applying analytics to improve decision making, governments around the world can act as catalysts for more than $3 trillion in economic value.
Governor Mike Pence of Indiana understands the importance of data and is keeping it at the center of his long-term vision for improving the management and effectiveness of government programs and making Indiana a leader in data-driven decision making. A year after taking office, he ordered state agencies to collaborate and share data to improve services. Data sharing is not a common practice among state agencies, but the governor recognized that sharing data would lead to a more successful enterprise.
Insights from analytics will help Indiana pursue six public policy goals: Increase private sector employment; attract new investment to the state; improve the quality of the state’s workforce; improve the health, safety and well-being of families; increase high school graduation rates; and improve the math and reading skills of elementary students….”

In an emergency, apps locate nearby first aiders


Springwise: “In the case of a sudden accident or health problem, a matter of seconds’ difference in the response times of emergency services can be the difference between saving a life and losing one. The irony is that there could even be trained first aiders nearby, but there’s usually no way of letting them know they’re needed. Now two apps — GoodSAM and PulsePoint — both let those with life-saving skills receive alerts when an emergency happens close by.
The GoodSAM app comes in two versions — the Alerter and Responder. Those facing an emergency can use the Alerter app to instantly send a call for help to any responders in the vicinity, along with their exact location. At the same time, the country’s emergency number is also called. Those who have first aid or medical experience can download the Responder app, which pushes a notification to their device whenever someone needs help. If they’re unavailable, they can choose to reject the request and the Alerter will be notified. If accepted, the app offers a map and directions to the location of the incident and the two parties can communicate through an in-app messaging service.
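Neither article describes the apps’ internals; the sketch below is a deliberately simplified, hypothetical version of the dispatch logic they describe — alert registered responders within a radius and move down the list if someone declines. It is not GoodSAM’s or PulsePoint’s actual code.
```python
# Hypothetical simplification of the dispatch flow described above:
# notify the nearest registered first aiders and move down the list
# if someone declines. Not GoodSAM's or PulsePoint's actual code.
import math

def distance_km(a, b):
    """Approximate great-circle distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(x))

def dispatch(alert_location, responders, accepts, radius_km=1.0):
    """responders: list of (responder_id, (lat, lon)) pairs.
    accepts: callable asking a responder to accept; returns True/False."""
    candidates = []
    for rid, loc in responders:
        d = distance_km(alert_location, loc)
        if d <= radius_km:
            candidates.append((d, rid))
    for d, rid in sorted(candidates):
        if accepts(rid):
            return rid, d      # app then shows a map, directions and messaging
    return None, None          # the alerter is told no responder accepted
```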
The PulsePoint Respond app works in much the same way, although it is mainly focused on sudden cardiac arrests and on bystanders who can perform CPR. Additionally, anyone in the community can use the app to track emergency activity in their neighborhood. Local authorities can also implement the PulsePoint system across their jurisdictions, with community outreach strategies and project management services included starting from USD 5,000….”

A new generation of Openaid.se


Openaid.se: “We are happy to launch the new version of the Swedish aid transparency tracker Openaid.se. This version brings a lot of new features, both visible and in the back-end system. We have focused more on the professional user and on making the site a quick but powerful tool for finding the data you need.
The top navigation is structured like a sentence which filters the data. It states from who, to whom, via which organisation, for what purpose, and in which year – giving you the basic tools needed to filter the data, which then can be grouped and sorted below. You can also make a comparison with another such sentence by adding comparative data.
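To make the idea concrete, here is a hypothetical sketch of that sentence-like filter as a simple data structure that can also be serialised into a query string; the field names are ours, not Openaid.se’s actual parameters.
```python
# Hypothetical sketch of the sentence-like filter ("from who, to whom,
# via which organisation, for what purpose, in which year") as a simple
# query structure. Field names are illustrative, not Openaid.se's API.
from dataclasses import dataclass, asdict
from urllib.parse import urlencode

@dataclass
class AidFilter:
    donor: str = "Sweden"
    recipient: str = "all"
    organisation: str = "all"
    purpose: str = "all"
    year: int = 2014

    def as_sentence(self) -> str:
        return (f"From {self.donor} to {self.recipient} via {self.organisation} "
                f"for {self.purpose} in {self.year}")

    def as_query_string(self) -> str:
        return urlencode(asdict(self))

f = AidFilter(recipient="Tanzania", purpose="health")
print(f.as_sentence())       # "From Sweden to Tanzania via all for health in 2014"
print(f.as_query_string())   # e.g. donor=Sweden&recipient=Tanzania&...
```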
The new Openaid.se presents either a graph that shows data over time or a map that gives a quick geographical overview. Depending on your choice of recipient (the “to whom”) and/or organisation you will also be presented with overview data on the recipient/organisation in the right column.
You can always choose to view a full list of all the activities, which can range from one to several thousand depending on your filtering choices. You can also, at any time, choose to download a CSV file containing all the activities.
More information, including a smaller graph, is shown when you dig into the details by clicking on an activity. You can also choose to look at the full activity sheet, where you will find any document links available at the activity level.”

Mapping the Next Frontier of Open Data: Corporate Data Sharing


Stefaan Verhulst at the GovLab (cross-posted at the UN Global Pulse Blog): “When it comes to data, we are living in the Cambrian Age. About ninety percent of the data that exists today has been generated within the last two years. We create 2.5 quintillion bytes of data on a daily basis—equivalent to a “new Google every four days.”
All of this means that we are certain to witness a rapid intensification in the process of “datafication” – already well underway. Use of data will grow increasingly critical. Data will confer strategic advantages; it will become essential to addressing many of our most important social, economic and political challenges.
This explains–at least in large part–why the Open Data movement has grown so rapidly in recent years. More and more, it has become evident that questions surrounding data access and use are emerging as one of the transformational opportunities of our time.
Today, it is estimated that over one million datasets have been made open or public. The vast majority of this open data is government data—information collected by agencies and departments in countries as varied as India, Uganda and the United States. But what of the terabyte after terabyte of data that is collected and stored by corporations? This data is also quite valuable, but it has been harder to access.
The topic of private sector data sharing was the focus of a recent conference organized by the Responsible Data Forum, Data and Society Research Institute and Global Pulse (see event summary). Participants at the conference, which was hosted by The Rockefeller Foundation in New York City, included representatives from a variety of sectors who converged to discuss ways to improve access to private data: the data held by private entities and corporations. The purpose for that access was rooted in a broad recognition that private data has the potential to foster much public good. At the same time, a variety of constraints—notably privacy and security, but also proprietary interests and data protectionism on the part of some companies—hold back this potential.
The framing for issues surrounding sharing private data has been broadly referred to under the rubric of “corporate data philanthropy.” The term refers to an emerging trend whereby companies have started sharing anonymized and aggregated data with third-party users who can then look for patterns or otherwise analyze the data in ways that lead to policy insights and other public good. The term was coined at the World Economic Forum meeting in Davos, in 2011, and has gained wider currency through Global Pulse, a United Nations data project that has popularized the notion of a global “data commons.”
Although the practice is still far from prevalent, some examples of corporate data sharing do exist….

Help us map the field

A more comprehensive mapping of the field of corporate data sharing would draw on a wide range of case studies and examples to identify opportunities and gaps, and to inspire more corporations to allow access to their data (consider, for instance, the GovLab Open Data 500 mapping for open government data). From a research point of view, the following questions would be important to ask:

  • What types of data sharing have proven most successful, and which ones least?
  • Who are the users of corporate shared data, and for what purposes?
  • What conditions encourage companies to share, and what are the concerns that prevent sharing?
  • What incentives can be created (economic, regulatory, etc.) to encourage corporate data philanthropy?
  • What differences (if any) exist between shared government data and shared private sector data?
  • What steps need to be taken to minimize potential harms (e.g., to privacy and security) when sharing data?
  • What’s the value created from using shared private data?

We (the GovLab, Global Pulse, and Data & Society) welcome your input to add to this list of questions, or to help us answer them by providing case studies and examples of corporate data philanthropy. Please add your examples below, use our Google Form, or email them to us at [email protected]