Selected Readings on Crowdsourcing Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing data was originally published in 2013.

As institutions seek to improve decision-making through data and put public data to use to improve the lives of citizens, new tools and projects are allowing citizens to play a role in both the collection and utilization of data. Participatory sensing and other citizen data collection initiatives, notably in the realm of disaster response, are allowing citizens to crowdsource important data, often using smartphones, that would be either impossible or burdensomely time-consuming for institutions to collect themselves. Civic hacking, on the other hand, often performed in hackathon events, is a growing trend in which governments encourage citizens to transform data from government and other sources into useful tools to benefit the public good.

Annotated Selected Reading List (in alphabetical order)

Baraniuk, Chris. “Power Politechs.” New Scientist 218, no. 2923 (June 29, 2013): 36–39. http://bit.ly/167ul3J.

  • In this article, Baraniuk discusses civic hackers, “an army of volunteer coders who are challenging preconceptions about hacking and changing the way your government operates. In a time of plummeting budgets and efficiency drives, those in power have realised they needn’t always rely on slow-moving, expensive outsourcing and development to improve public services. Instead, they can consider running a hackathon, at which tech-savvy members of the public come together to create apps and other digital tools that promise to enhance the provision of healthcare, schools or policing.”
  • While recognizing that “civic hacking has established a pedigree that demonstrates its potential for positive impact,” Baraniuk argues that a “more rigorous debate over how this activity should evolve, or how authorities ought to engage in it” is needed.

Barnett, Brandon, Muki Hansteen Izora, and Jose Sia. “Civic Hackathon Challenges Design Principles: Making Data Relevant and Useful for Individuals and Communities.” Hack for Change, https://bit.ly/2Ge6z09.

  • In this paper, researchers from Intel Labs offer “guiding principles to support the efforts of local civic hackathon organizers and participants as they seek to design actionable challenges and build useful solutions that will positively benefit their communities.”
  • The authors’ proposed design principles are:
    • Focus on the specific needs and concerns of people or institutions in the local community. Solve their problems and challenges by combining different kinds of data.
    • Seek out data far and wide (local, municipal, state, institutional, non-profits, companies) that is relevant to the concern or problem you are trying to solve.
    • Keep it simple! This can’t be overstated. Focus [on] making data easily understood and useful to those who will use your application or service.
    • Enable users to collaborate and form new communities and alliances around data.

Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-Quality, Data?” Perspectives on Psychological Science 6, no. 1 (January 1, 2011): 3–5. http://bit.ly/H56lER.

  • This article examines the capability of Amazon’s Mechanical Turk to act as a source of data for researchers, in addition to its traditional role as a microtasking platform.
  • The authors examine the demographics of MTurkers and find that “(a) MTurk participants are slightly more demographically diverse than are standard Internet samples and are significantly more diverse than typical American college samples; (b) participation is affected by compensation rate and task length, but participants can still be recruited rapidly and inexpensively; (c) realistic compensation rates do not affect data quality; and (d) the data obtained are at least as reliable as those obtained via traditional methods.”
  • The paper concludes that, just as MTurk can be a strong tool for crowdsourcing tasks, data derived from MTurk can be high quality while also being inexpensive and obtained rapidly.

Goodchild, Michael F., and J. Alan Glennon. “Crowdsourcing Geographic Information for Disaster Response: a Research Frontier.” International Journal of Digital Earth 3, no. 3 (2010): 231–241. http://bit.ly/17MBFPs.

  • This article examines issues of data quality in the face of the new phenomenon of geographic information being generated by citizens, in order to examine whether this data can play a role in emergency management.
  • The authors argue that “[d]ata quality is a major concern, since volunteered information is asserted and carries none of the assurances that lead to trust in officially created data.”
  • Because time is crucial during emergencies, the authors argue that “the risks associated with volunteered information are often outweighed by the benefits of its use.”
  • The paper examines four wildfires in Santa Barbara in 2007-2009 to discuss current challenges with volunteered geographical data, and concludes that further research is required to answer how volunteer citizens can be used to provide effective assistance to emergency managers and responders.

Hudson-Smith, Andrew, Michael Batty, Andrew Crooks, and Richard Milton. “Mapping for the Masses: Accessing Web 2.0 Through Crowdsourcing.” Social Science Computer Review 27, no. 4 (November 1, 2009): 524–538. http://bit.ly/1c1eFQb.

  • This article describes the way in which “we are harnessing the power of web 2.0 technologies to create new approaches to collecting, mapping, and sharing geocoded data.”
  • The authors examine GMapCreator and MapTube, which allow users to do a range of map-related functions such as create new maps, archive existing maps, and share or produce bottom-up maps through crowdsourcing.
  • They conclude that “these tools are helping to define a neogeography that is essentially ‘mapping for the masses,’” while noting that there are many issues of quality, accuracy, copyright, and trust that will influence the impact of these tools on map-based communication.

Kanhere, Salil S. “Participatory Sensing: Crowdsourcing Data from Mobile Smartphones in Urban Spaces.” In Distributed Computing and Internet Technology, edited by Chittaranjan Hota and Pradip K. Srimani, 19–26. Lecture Notes in Computer Science 7753. Springer Berlin Heidelberg, 2013. https://bit.ly/2zX8Szj.

  • This paper provides a comprehensive overview of participatory sensing — a “new paradigm for monitoring the urban landscape” in which “ordinary citizens can collect multi-modal data streams from the surrounding environment using their mobile devices and share the same using existing communications infrastructure.”
  • In addition to examining a number of innovative applications of participatory sensing, Kanhere outlines the following key research challenges:
    • Dealing with incomplete samples
    • Inferring user context
    • Protecting user privacy
    • Evaluating data trustworthiness
    • Conserving energy
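
One common response to the “evaluating data trustworthiness” challenge above is to weight each contribution by a reputation score for its contributor. A minimal Python sketch of that idea follows; the class, field names, and threshold are illustrative assumptions, not anything from Kanhere’s paper:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One crowdsourced sensor sample (all field names are illustrative)."""
    device_id: str
    value: float   # e.g. a noise level in dB measured by a phone
    trust: float   # 0.0-1.0 reputation score for the contributor

def trusted_mean(readings, min_trust=0.2):
    """Aggregate readings, discounting low-reputation contributors.

    Readings below min_trust are discarded outright; the rest are
    weighted by their trust score.
    """
    kept = [r for r in readings if r.trust >= min_trust]
    if not kept:
        raise ValueError("no sufficiently trusted readings")
    total_weight = sum(r.trust for r in kept)
    return sum(r.value * r.trust for r in kept) / total_weight

samples = [
    Reading("phone-a", 62.0, trust=0.9),
    Reading("phone-b", 60.0, trust=0.6),
    Reading("phone-c", 10.0, trust=0.1),  # implausible outlier, low trust
]
print(round(trusted_mean(samples), 2))  # → 61.2
```

Real participatory-sensing systems combine this kind of weighting with cross-validation against nearby readings and against official sensors where available.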

Data isn't a four-letter word


Speech by Neelie Kroes, Vice-President of the European Commission responsible for the Digital Agenda: “I want to talk about data too: the opportunity as well as the threat.
Making data the engine of the European economy: safeguarding fundamental rights, capturing the data boost, and strengthening our defences.
Data is at a crossroads. We have opportunities: open data, big data, data mining, cloud computing. Tim Berners-Lee, creator of the world wide web, saw the massive potential of open data. As he put it, if you put that data online, it will be used by other people to do wonderful things, in ways that you could never imagine.
On the other hand, we have threats: to our privacy and our values, and to the openness that makes it possible to innovate, trade and exchange.
Get it right and we can safeguard a better economic future. Get it wrong, and we cut competitiveness without protecting privacy. So we remain dependent on the digital developments of others: and just as vulnerable to them.
How do we find that balance? Not with hysteria; nor by paralysis. Not by stopping the wonderful things, simply to prevent the not-so-wonderful. Not by seeing data as a dirty word.
We are seeing a whole economy develop around data and cloud computing. Businesses are using them, whole industries depend on them, and data volumes are increasing exponentially. Data is not just an economic sideshow, it is a whole new asset class, requiring new skills and creating new jobs.
And with a huge range of applications. From decoding human genes to predicting the traffic, and even the economy. Whatever you’re doing these days, chances are you’re using big data (like translation, search, apps, etc).
There is increasing recognition of the data boost on offer. For example, open data can make public administrations more transparent and stimulate a rich innovative market. That is what the G8 Leaders recognised in June, with their Open Data Charter. For scientists too, open data and open access offer new ways to research and progress.
That is a philosophy the Commission has shared for some time. And that is what our ‘Open Data’ package of December 2011 is all about. With new EU laws to open up public administrations, and a new EU Open Data Portal. And all EU-funded scientific publications available under open access.
Now not just the G8 and the Commission are seeing this data opportunity: but the European Council too. Last October, they recognised the potential of big data innovation, the need for a single market in cloud computing; and the urgency of Europe capitalising on both.
We will be acting on that. Next spring, I plan a strategic agenda for research on data. Working with private partners and national research funders to shape that agenda, and get the most bang for our research euro.
And, beyond research, there is much we can do to align our work and support secure big data. From training skilled workers, to modernising copyright for data and text mining, to different actors in the value chain working together: for example through a public-private partnership.
…Empowering people is not always easy in this complex online world. I want to see technical solutions emerge that can do that: giving users control over their desired level of privacy and how their data will be used, and making it easier to verify that online rights are respected.
How can we do that? How can we ensure systems that are empowering, transparent, and secure? There are a number of subtleties in play. Here’s my take.
First, companies engaged in big data will need to start thinking about privacy protection at every stage: from system development to procedures and practices.
This is the principle of “privacy by design”, set out clearly in the proposed Data Protection Regulation. In other words, from now on new business ideas have two purposes: delivering a service and protecting privacy at the right level.
Second, also under the regulation, big data applications that might put fundamental rights at risk would require the company to carry out a “Privacy Impact Assessment”. This is another good way to combine innovation and privacy: ensuring you think about any risks from the start.
Third, sometimes, particularly for personal data, a company might realise they need user consent. Consent is a cornerstone of data protection rules, and should stay that way.
But we need to get smart, and apply common sense to consent. Users can’t be expected to know everything. Nor asked to consent to what they cannot realistically understand. Nor presented with false dilemmas, a black-and-white choice between consenting or getting shut out of services.
Fourth, we can also get smart when it comes to anonymisation. Sometimes, full anonymisation means losing important information, so you can no longer make the links between data. That could make the difference between progress and paralysis. But using pseudonyms can let you analyse large amounts of data: to spot, for example, that people with genetic pattern X also respond well to therapy Y.
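
A minimal sketch of the keyed-hash pseudonymisation technique the speech alludes to; the key handling, field names, and truncation length are illustrative assumptions, not anything Kroes or the proposed Regulation prescribes:

```python
import hmac
import hashlib

# Secret key held only by the data controller (illustrative value).
SECRET_KEY = b"controller-secret-key"

def pseudonym(identifier: str) -> str:
    """Map an identifier to a stable pseudonym via HMAC-SHA256.

    The same person always gets the same pseudonym, so records stay
    linkable across datasets, but without the key the mapping cannot
    be reversed or recomputed by an analyst.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

records = [
    {"patient": "alice", "genetic_pattern": "X", "therapy_Y_response": "good"},
    {"patient": "alice", "genetic_pattern": "X", "therapy_Y_response": "good"},
    {"patient": "bob",   "genetic_pattern": "Z", "therapy_Y_response": "poor"},
]
pseudonymised = [{**r, "patient": pseudonym(r["patient"])} for r in records]

# Both of alice's records share one pseudonym, so the link survives
# and pattern X can still be correlated with the therapy Y response.
assert pseudonymised[0]["patient"] == pseudonymised[1]["patient"]
assert pseudonymised[0]["patient"] != pseudonymised[2]["patient"]
```

This preserves linkability for analysis while keeping direct identifiers out of the analyst’s hands, which is exactly the trade-off between full anonymisation and raw identifiers that the speech describes.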
So it is understandable why the European Parliament has proposed a more flexible data protection regime for this type of data. Companies would be able to process the data on grounds of legitimate interest, rather than consent. That could make all the positive difference to big data: without endangering privacy.
Of course, in those cases, companies still need to minimise privacy risks. Their internal processes and risk assessments must show how they comply with the guiding principles of data protection law. And – if something does go wrong – the company remains accountable.
Indeed company accountability is another key element of our proposal. And here again we welcome the European Parliament’s efforts to reinforce that. Clearly, you might assure accountability in different ways for different companies. But standards for compliance and processes could make a real difference.
A single data protection law for Europe would be a big step forward. National fortresses and single market barriers just make it harder for Europe to lead in digital, harder for Europe to become the natural home of secure online services. Data protection cannot mean data protectionism. Rather, it means safeguarding privacy does not come at the expense of innovation: with laws both flexible and future proof, pragmatic and proportionate, for a changing world….
But data protection rules are really just the start. They are only part of our response to the Snowden revelations….”

Phone Apps Help Government, Others Counter Violence Against Women


NextGov: “Smart and mobile phones have helped authorities solve crimes from beatings that occurred during the London riots to the Boston Marathon bombing. A panel of experts gathered on Monday said the devices can also help reduce and combat rapes and other gender-based violence.
Smartphone apps and text messaging services proliferated in India following a sharp rise in reported gang rapes, including the brutal 2012 rape and murder of a 23-year-old medical student in Delhi, according to panelists at the Wilson Center event on gender-based violence and innovative technologies.
The apps fall into four main categories, said Alex Dehgan, chief data scientist at the United States Agency for International Development: apps that aid sexual assault and domestic violence victims, apps that empower women to fight back against gender-based violence, apps focused on advocacy and apps that crowdsource and map cases of sexual assault.
The final category of apps is largely built on the Ushahidi platform, which was developed to track reports of missing people following the 2010 Haiti earthquake.
One of the apps, Safecity, offers real-time alerts about sexual assaults across India to help women identify unsafe areas.
Similar apps have been launched in Egypt and Syria, Dehgan said. In lower-tech countries the systems often operate using text messages rather than smartphone apps so they’re more widely accessible.
One of the greatest impediments to using mobile technology to reduce gender violence lies in third world nations, where women often don’t have access to their own mobile or smartphones, and in rural areas in the U.S. and abroad, where there is limited service or broadband, said Christopher Burns, USAID’s team leader for mobile access.
Burns suggested international policymakers should align plans for expanding broadband and mobile service with crowdsourced reports of gender violence.
“One suggestion for policy makers to focus on is to take a look at the crowd maps we’ve talked about today and see where there are greater incidences of gender-based violence and violence against women,” he said. “In all likelihood, those pockets probably don’t have the connectivity, don’t have the infrastructure [and] don’t have the capacity in place for survivors to benefit from those tools.”
One tool that’s been used in the U.S. is Circle of 6, an app for women on college campuses to automatically draw on friends when they think they’re in danger. The app allows women to pick six friends they can automatically text if they think they’re in a dangerous situation, asking them to call with an excuse for them to leave.
The app is designed to look like a game so it isn’t clear women are using their phones to seek help, said Nancy Schwartzman, executive director of Tech 4 Good, which developed the app.
Schwartzman has heard reports of gay men on college campuses using the app as well, she said. The military has been in contact with Tech 4 Good about developing a version of the app to combat sexual assault on military bases, she said.”

The Decay of American Political Institutions


In The American Interest: “Many political institutions in the United States are decaying. This is not the same thing as the broader phenomenon of societal or civilization decline, which has become a highly politicized topic in the discourse about America. Political decay in this instance simply means that a specific political process—sometimes an individual government agency—has become dysfunctional. This is the result of intellectual rigidity and the growing power of entrenched political actors that prevent reform and rebalancing. This doesn’t mean that America is set on a permanent course of decline, or that its power relative to other countries will necessarily diminish. Institutional reform is, however, an extremely difficult thing to bring about, and there is no guarantee that it can be accomplished without a major disruption of the political order. So while decay is not the same as decline, neither are the two discussions unrelated.
There are many diagnoses of America’s current woes. In my view, there is no single “silver bullet” cause of institutional decay, or of the more expansive notion of decline. In general, however, the historical context of American political development is all too often given short shrift in much analysis. If we look more closely at American history as compared to that of other liberal democracies, we notice three key structural characteristics of American political culture that, however they developed and however effective they have been in the past, have become problematic in the present.
The first is that, relative to other liberal democracies, the judiciary and the legislature (including the roles played by the two major political parties) continue to play outsized roles in American government at the expense of Executive Branch bureaucracies. Americans’ traditional distrust of government thus leads to judicial solutions for administrative problems. Over time this has become a very expensive and inefficient way to manage administrative requirements.
The second is that the accretion of interest group and lobbying influences has distorted democratic processes and eroded the ability of the government to operate effectively. What biologists label kin selection and reciprocal altruism (the favoring of family and friends with whom one has exchanged favors) are the two natural modes of human sociability. It is to these types of relationships that people revert when modern, impersonal government breaks down.
The third is that under conditions of ideological polarization in a federal governance structure, the American system of checks and balances, originally designed to prevent the emergence of too strong an executive authority, has become a vetocracy. The decision system has become too porous—too democratic—for its own good, giving too many actors the means to stifle adjustments in public policy. We need stronger mechanisms to force collective decisions but, because of the judicialization of government and the outsized role of interest groups, we are unlikely to acquire such mechanisms short of a systemic crisis. In that sense these three structural characteristics have become intertwined….
In short, the problems of American government flow from a structural imbalance between the strength and competence of the state, on the one hand, and the institutions that were originally designed to constrain the state, on the other. There is too much law and too much “democracy”, in the form of legislative intervention, relative to American state capacity. Some history can make this assertion clearer….
In well-functioning governance systems, moreover, a great deal of deliberation occurs not just in legislatures but within bureaucracies. This is not a matter of bureaucrats simply talking to one another, but rather a complex series of consultations between government officials and businesses, outside implementers and service providers, civil society groups, the media and other sources of information about societal interests and opinions. The Congress wisely mandated consultation in the landmark 1946 Administrative Procedure Act, which requires regulatory agencies to publicly post proposed rule changes and to solicit comment about them. But these consultative procedures have become highly routinized and pro forma, with actual decisions being the outcome not of genuine deliberation, but of political confrontations between well organized interest groups….”

The United States Releases its Second Open Government National Action Plan


Nick Sinai and Gayle Smith at the White House: “Since his first full day in office, President Obama has prioritized making government more open and accountable and has taken substantial steps to increase citizen participation, collaboration, and transparency in government. Today, the Obama Administration released the second U.S. Open Government National Action Plan, announcing 23 new or expanded open-government commitments that will advance these efforts even further.
…In September 2011, the United States released its first Open Government National Action Plan, setting a series of ambitious goals to create a more open government. The United States has continued to implement and improve upon the open-government commitments set forth in the first Plan, along with many more efforts underway across government, including implementing individual Federal agency Open Government Plans. The second Plan builds on these efforts, in part through a series of key commitments highlighted in a preview report issued by the White House in October 2013, in conjunction with the Open Government Partnership Annual Summit in London.
Among the highlights of the second National Action Plan:

  • “We the People”: The White House will introduce new improvements to the We the People online petitions platform aimed at making it easier to collect and submit signatures and increase public participation in using this platform. Improvements will enable the public to perform data analysis on the signatures and petitions submitted to We the People, as well as include a more streamlined process for signing petitions and a new Application Programming Interface (API) that will allow third parties to collect and submit signatures from their own websites.
  • Freedom of Information Act (FOIA) Modernization: The FOIA encourages accountability through transparency and represents an unwavering national commitment to open government principles. Improving FOIA administration is one of the most effective ways to make the U.S. Government more open and accountable. Today, we announced five commitments to further modernize FOIA processes, including launching a consolidated online FOIA service to improve customers’ experience, creating and making training resources available to FOIA professionals and other Federal employees, and developing common FOIA standards for agencies across government.
  • The Global Initiative on Fiscal Transparency (GIFT): The United States will join GIFT, an international network of governments and non-government organizations aimed at enhancing financial transparency, accountability, and stakeholder engagement. The U.S. Government will actively participate in the GIFT Working Group and seek opportunities to collaborate with stakeholders and champion greater fiscal openness and transparency in domestic and global spending.
  • Open Data to the Public: Over the past few years, government data has been used by journalists to uncover variations in hospital billings, by citizens to learn more about the social services provided by charities in their communities, and by entrepreneurs building new software tools to help farmers plan and manage their crops.  Building on the U.S. Government’s ongoing open data efforts, new commitments will make government data even more accessible and useful for the public, including by reforming how Federal agencies manage government data as a strategic asset, launching a new version of Data.gov to make it even easier to discover, understand, and use open government data, and expanding access to agriculture and nutrition data to help farmers and communities.
  • Participatory Budgeting: The United States will promote community-led participatory budgeting as a tool for enabling citizens to play a role in identifying, discussing, and prioritizing certain local public spending projects, and for giving citizens a voice in how taxpayer dollars are spent in their communities. This commitment will include steps by the U.S. Government to help raise awareness of the fact that participatory budgeting may be used for certain eligible Federal community development grant programs.
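
The “We the People” commitment above envisions third-party sites collecting and submitting signatures through an API. A rough sketch of what such a client-side integration might look like; the endpoint, field names, and key parameter here are hypothetical placeholders, not taken from the actual API documentation:

```python
import json

# Hypothetical endpoint -- illustrative only, not the real
# We the People API contract.
API_URL = "https://example.gov/v1/signatures"

def build_signature_payload(petition_id, first_name, last_name, email, api_key):
    """Assemble the JSON body a third-party site might POST for one signature.

    Every field name here is an assumption for illustration; a real
    integration would follow the published API specification.
    """
    return json.dumps({
        "petition_id": petition_id,
        "first_name": first_name,
        "last_name": last_name,
        "email": email,
        "api_key": api_key,
    })

payload = build_signature_payload("abc123", "Ada", "Lovelace",
                                  "ada@example.org", "demo-key")
print(json.loads(payload)["petition_id"])  # → abc123
```

The payload would then be sent with an ordinary HTTPS POST to `API_URL`; separating payload construction from transport, as above, keeps the signature-collection logic testable offline.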

Other initiatives launched or expanded today include: increasing open innovation by encouraging expanded use of challenges, incentive prizes, citizen science, and crowdsourcing to harness American ingenuity, as well as modernizing the management of government records by leveraging technology to make records less burdensome to manage and easier to use and share. There are many other exciting open-government initiatives described in the second Plan — and you can view them all here.”

Index: Measuring Impact with Evidence


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on measuring impact with evidence and was originally published in 2013.

United States

  • Amount per $100 of government spending that is backed by evidence that the money is being spent wisely: less than $1
  • Share of healthcare treatments delivered in the U.S. that lack evidence of effectiveness: more than half
  • How much of total U.S. healthcare expenditure is spent to determine what works: less than 0.1 percent
  • Share of major U.S. federal social programs evaluated since 1990 using randomized experiments and found to have “weak or no positive effects”: 9 out of 10
  • Year the Coalition for Evidence-Based Policy was set up to work with federal policymakers to advance evidence-based reforms in major U.S. social programs: 2001
  • Year the Program Assessment Rating Tool (PART) was introduced by President Bush’s Office of Management and Budget (OMB): 2002
    • Percentage of the roughly 1,000 programs assessed that were found to be effective in 2008: 19%
    • Percentage of programs that could not be assessed due to insufficient data: 17%
    • Amount spent on the Even Start Family Literacy Program, rated ineffective by PART, over the life of the Bush administration: more than $1 billion
  • Year Washington State legislature began using Washington State Institute for Public Policy’s estimates on how “a portfolio of evidence-based and economically sound programs . . . could affect the state’s crime rate, the need to build more prisons, and total criminal-justice spending”: 2007
    • Amount invested by legislature in these programs: $48 million
    • Amount saved by the legislature: $250 million
  • Number of U.S. States in a pilot group working to adapt The Pew-MacArthur Results First Initiative, based on the Washington State model, to make performance-based policy decisions: 14
  • Net savings in health care expenditure by using the Transitional Care Model, which meets the Congressionally-based Top Tier Evidence Standard: $4,000 per patient
  • Number of states that conducted “at least some studies that evaluated multiple program or policy options for making smarter investments of public dollars” between 2008 and 2011: 29
  • Number of states that reported that their cost-benefit analysis influenced policy decisions or debate: 36
  • Year the Office of Management and Budget issued a memorandum proposing new evaluations and advising agencies to include details on determining effectiveness of their programs, link disbursement to evidence, and support evidence-based initiatives: 2007
  • Percentage increase in resources for innovation funds that use a tiered model for evidence, according to the President’s FY14 budget: 44%
  • Amount President Obama proposed in his FY 2013 budget to allocate in existing funding to Performance Partnerships “in which states and localities would be given the flexibility to propose better ways to combine federal resources in exchange for greater accountability for results”: $200 million
  • Amount of U.S. federal program funding that Harvard economist Jeffrey Liebman suggests be directed towards evaluations of outcomes: 1%
  • Amount of funding the City of New York has committed for evidence-based research and development initiatives through its Center for Economic Opportunity: $100 million a year

Internationally

  • How many of the 30 OECD countries in 2005-6 had a formal requirement by law that the benefits of regulation justify the costs: half
    • Number of 30 OECD member countries in 2008 that reported quantifying benefits to regulations: 16
    • Those who reported quantifying costs: 24
  • How many members make up the Alliance for Useful Evidence, a network that “champion[s] evidence, the opening up of government data for interrogation and use, alongside the sophistication in research methods and their applications”: over 1,000
  • Date the UK government, the ESRC and the Big Lottery Fund announced plans to create a network of ‘What Works’ evidence centres: March 2013
  • Core funding for the What Works Centre for Local Economic Growth: £1m p.a. over an initial three year term
  • How many SOLACE Summit members in 2012 were “very satisfied” with how Research and Intelligence resources support evidence-based decision-making: 4%
    • Number of areas they identified for improving evidence-based decision-making: 5
    • Evaluation of the impact of past decisions: 46%
    • Benchmarking data with other areas: 39%
    • Assessment of options available: 33%
    • How evidence is presented: 29%
    • Feedback on public engagement and consultation: 25%
  • Number of areas for improvement for Research and Intelligence staff development identified at the SOLACE Summit: 6
    • Strengthening customer insight and data analysis: 49%
    • Impact evaluation: 48%
    • Strategic/corporate thinking/awareness: 48%
    • Political acumen: 46%
    • Raising profile/reputation of the council for evidence-based decisions: 37%
    • Guidance/mentoring on use of research for other officers: 25%

Selected Readings on Smart Disclosure


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of smart disclosure was originally published in 2013.

While much attention is paid to open data, data transparency need not be managed by a simple On/Off switch: It’s often desirable to make specific data available to the public or individuals in targeted ways. A prime example is the use of government data in Smart Disclosure, which provides consumers with data they need to make difficult marketplace choices in health care, financial services, and other important areas. Governments collect two kinds of data that can be used for Smart Disclosure: First, governments collect information on services of high interest to consumers, and are increasingly releasing this kind of data to the public. In the United States, for example, the Department of Health and Human Services collects and releases online data on health insurance options, while the Department of Education helps consumers understand the true cost (after financial aid) of different colleges. Second, state, local, or national governments hold information on consumers themselves that can be useful to them. In the U.S., for example, the Blue Button program was launched to help veterans easily access their own medical records.

Selected Reading List (in alphabetical order)

Annotated Selected Reading List (in alphabetical order)

Better Choices: Better Deals Report on Progress in the Consumer Empowerment Strategy. Progress Report. Consumer Empowerment Strategy. United Kingdom: Department for Business Innovation & Skills, December 2012. http://bit.ly/17MqnL3.

  • The report details the progress made through the United Kingdom’s consumer empowerment strategy, Better Choices: Better Deals. The plan seeks to mitigate knowledge imbalances through information disclosure programs and targeted nudges.
  • The empowerment strategy’s four sections demonstrate the potential benefits of Smart Disclosure: 1. The power of information; 2. The power of the crowd; 3. Helping the vulnerable; and 4. A new approach to Government working with business.
Braunstein, Mark L. “Empowering the Patient.” In Health Informatics in the Cloud, 67–79. Springer Briefs in Computer Science. New York: Springer, 2013. https://bit.ly/2UB4jTU.
  • This book discusses the application of computing to healthcare delivery, public health and community-based clinical research.
  • Braunstein asks and seeks to answer critical questions such as: Who should make the case for smart disclosure when the needs of consumers are not being met? What role do non-profits play in the conversation on smart disclosure especially when existing systems (or lack thereof) of information provision do not work or are unsafe?

Brodi, Elisa. “Product-Attribute Information” and “Product-Use Information”: Smart Disclosure and New Policy Implications for Consumers’ Protection. SSRN Scholarly Paper. Rochester, NY: Social Science Research Network, September 4, 2012. http://bit.ly/17hssEK.

  • This paper from the Research Area of the Bank of Italy’s Law and Economics Department “surveys the literature on product use information and analyzes whether and to what extent Italian regulator is trying to ensure consumers’ awareness as to their use pattern.” Rather than focusing on the type of information governments can release to citizens, Brodi proposes that governments require private companies to provide valuable use pattern information to citizens to inform decision-making.
  • The form of regulation proposed by Brodi and other proponents “is based on a basic concept: consumers can be protected if companies are forced to disclose data on the customers’ consumption history through electronic files.”
National Science and Technology Council. Smart Disclosure and Consumer Decision Making: Report of the Task Force on Smart Disclosure. Task Force on Smart Disclosure: Information and Efficiency in Consumer Markets. Washington, DC: United States Government: Executive Office of the President, May 30, 2013. http://1.usa.gov/1aamyoT.
  • This inter-agency report offers a comprehensive description of smart disclosure approaches in use across the Federal Government. The report highlights the importance of making data available not only to consumers but also to innovators who can build better options for consumers.
  • In addition to providing context about government policies that guide smart disclosure initiatives, the report raises questions about what parties have influence in this space.

“Policies in Practice: The Download Capability.” Markle Connecting for Health Work Group on Consumer Engagement, August 2010. http://bit.ly/HhMJyc.

  • This report from the Markle Connecting for Health Work Group on Consumer Engagement — the creator of the Blue Button system for downloading personal health records — features a “set of privacy and security practices to help people download their electronic health records.”
  • To help make health information easily accessible for all citizens, the report lists a number of important steps:
    • Make the download capability a common practice
    • Implement sound policies and practices to protect individuals and their information
    • Collaborate on sample data sets
    • Support the download capability as part of Meaningful Use and qualified or certified health IT
    • Include the download capability in procurement requirements.
  • The report also describes the rationale for the development of the Blue Button — perhaps the best known example of Smart Disclosure currently in existence — and the targeted release of health information in general:
    • Individual access to information is rooted in fair information principles and law
    • Patients need and want the information
    • The download capability would encourage innovation
    • A download capability frees data sources from having to make many decisions about the user interface
    • A download capability would hasten the path to standards and interoperability.
Sayogo, Djoko Sigit, and Theresa A. Pardo. “Understanding Smart Data Disclosure Policy Success: The Case of Green Button.” In Proceedings of the 14th Annual International Conference on Digital Government Research, 72–81. New York: ACM, 2013. http://bit.ly/1aanf1A.
  • This paper from the Proceedings of the 14th Annual International Conference on Digital Government Research explores the implementation of the Green Button Initiative, analyzing qualitative data from interviews with experts involved in Green Button development and implementation.
  • Moving beyond the specifics of the Green Button initiative, the authors raise questions on the motivations and success factors facilitating successful collaboration between public and private organizations to support smart disclosure policy.

Thaler, Richard H., and Will Tucker. “Smarter Information, Smarter Consumers.” Harvard Business Review, January–February 2013. The Big Idea. http://bit.ly/18gimxw.

  • In this article, Thaler and Tucker make three key observations regarding the challenges related to smart disclosure:
    • “We are constantly confronted with information that is highly important but extremely hard to navigate or understand.”
    • “Repeated attempts to improve disclosure, including efforts to translate complex contracts into ‘plain English,’ have met with only modest success.”
    • “There is a fundamental difficulty of explaining anything complex in simple terms. Most people find it difficult to write instructions explaining how to tie a pair of shoelaces.”

DataViva: a Big Data Engine for the Brazilian Economy


Piece by André Victor dos Santos Barrence and Cesar A. Hidalgo: “The current Internet paradigm, in which one can search for anything and retrieve information, is absolutely empowering. We can browse files, websites and indexes and effortlessly reach a good amount of information. Google, for instance, was explicitly built on a library analogy available to everyone. However, it is a world where information that should be easily accessible is still hidden in unfriendly databases, and where the best-case scenario is finding a few snippets of information embedded within the paragraphs of a report. But is this the way it should be? Or is this just the world we are presently stuck with?
The last decade has been particularly marked by increasing hype around big data and analytics, fueled mainly by those who are interested in writing narratives on the topic but not necessarily coding with it, even when data itself is not the problem.
Let’s take the case of governments. Governments have plenty of data and in many cases it is actually public (at least in principle). Governments “know” how many people work in every occupation, in every industry and in every location; they know their salaries, previous employers and education history. From a pure data perspective, all of that is embedded in tax and social security records or annual registrations. From a more pragmatic perspective, it is still inaccessible and hidden even when it is legally open to the public. We live in a world where the data is there, but where the statistics and information are not.
The state government of Minas Gerais in Brazil (the country’s third-largest economy, with a territory larger than France and 20 million inhabitants) made an important step in that direction by releasing DataViva.info, a platform that opens data on exports and occupations for the entire formal sector of the Brazilian economy through more than 700 million interactive visualizations. Instead of poorly designed tables and interfaces, it guides users to answer questions or freely discover locations, industries and occupations in Brazil that are of interest to them. DataViva allows users to explore simple questions, such as the evolution of exports over the last decade for each of the 5,567 municipalities in the country, or highly specific queries, for instance, the average salaries paid to computer scientists working in the software development industry in Belo Horizonte, the state capital of Minas.
DataViva’s visualizations are built on the idea that the development of a location’s industries and economic activities is highly path-dependent. This means that locations are more likely to succeed at developing industries and activities related to the ones they already have, since existing activity indicates labor inputs, and other capabilities, that are specific and that can often be redeployed to a few related industries and activities. Thus, it informs the processes by which opportunities can be explored and prospective pathways to greater prosperity.
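The relatedness logic described above can be sketched with the "proximity" measure popularized in the economic-complexity literature by Hidalgo and colleagues. The toy code below is an illustration of that general technique, not DataViva's actual implementation: the presence matrix `M` and the function names are hypothetical, and real analyses would start from the kind of export and occupation records the platform opens up.

```python
import numpy as np

# Illustrative sketch: estimate how "related" two industries are from a
# binary location-by-industry presence matrix, then score how well
# positioned each location is to develop each industry.
# M[l, i] = 1 if location l has a significant presence of industry i.

def proximity(M):
    """Pairwise industry proximity: the minimum of the two conditional
    probabilities that industries i and j co-occur in a location."""
    M = np.asarray(M, dtype=float)
    co = M.T @ M                 # co[i, j] = locations having both i and j
    ubiquity = np.diag(co)       # number of locations having each industry
    with np.errstate(divide="ignore", invalid="ignore"):
        cond = co / ubiquity[None, :]     # cond[i, j] = P(i | j)
        phi = np.minimum(cond, cond.T)    # symmetric: min of both directions
    return np.nan_to_num(phi)

def relatedness_density(M, phi):
    """For each (location, industry) pair, the proximity-weighted share of
    'neighboring' industries the location already has. A high density
    suggests the location is well positioned to develop that industry."""
    denom = phi.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        dens = (np.asarray(M, dtype=float) @ phi) / denom[None, :]
    return np.nan_to_num(dens)

# Toy example: 3 locations, 4 industries.
M = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 0, 1, 1]])
phi = proximity(M)
dens = relatedness_density(M, phi)
```

On this toy matrix, industries 0 and 1 always co-occur, so their proximity is 1, while industries that never share a location get proximity 0; the density scores then favor industries close to a location's existing mix, which is the path-dependence argument in miniature.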
The idea that information is key for the functioning of economies is at least as old as Friedrich Hayek’s seminal 1945 paper The Use of Knowledge in Society. According to Hayek, prices help coordinate economic activities by providing information about the wants and needs of goods and services. Yet price information can only serve as a signal as long as people know those prices. Maybe the salaries for engineers in the municipality of Betim (Minas Gerais) are excellent and indicate a strong need for them. But who would have known how many engineers there are in Betim and what their average salaries are?
But the remaining question is: why is Minas Gerais making all of this public data easily available? More than resorting to the contemporary argument of open government, Minas understands that this is extremely valuable information for investors searching for business opportunities, entrepreneurs pursuing new ventures and workers looking for better career prospects. Lastly, the ultimate goal of DataViva is to provide a common ground for open discussions, moving away from the information-deprived idea of central planning and into a future where collaborative planning might become the norm. It is a highly creative attempt to renew public governance for the 21st century.
Despite Minas being a relatively unknown state outside of Brazil, by releasing a platform such as DataViva it is sending a strong signal about where in the world governments are really pushing innovation forward, rather than simply admiring and copying solutions from trendsetters in the developed world. It seems like real innovation isn’t necessarily taking place in Washington, Paris or London anymore.”
 

Power to the people: how open data is improving health service delivery


The Guardian: “…What’s really interesting is how this data can be utilised by citizens to enable them to make more informed choices and demand improved services in sectors such as health. A growing community of technologists and social activists is emerging across Africa, supported by a burgeoning network of technology innovation hubs. They’re beginning to explore the ways in which data can be utilised to improve health outcomes.
In Northern Uganda, the brutal Lord’s Resistance Army conflict displaced two million people, leaving the social infrastructure in tatters. In 2008, the government launched a Peace, Recovery and Development Plan, but progress has been limited. There are insufficient health centres to serve the population, a severe shortage of staff, drugs and equipment, and corruption is widespread.
Cipesa – an organisation that uses communication technologies to support poverty reduction and development – and Numec, a local media organisation, have launched the iParticipate project. A multimedia platform is being populated with baseline data outlining the current status of the health service across three districts….
In the same region, Wougnet is training women to use information technologies to tackle social challenges. Local officials and community members have formed voluntary social accountability committees and been trained in the use of an online platform to capture and store information relating to poor governance and corruption in the health sector, often via mobile phones.
The platform strengthened campaign efforts which resulted in the construction of a health centre in Aloni Parish. In Amuru district, five health workers were arrested following reports highlighting negligence.
In the village of Bagega in Nigeria, 400 children died and thousands suffered significant health problems as the result of lead poisoning caused by poor mining practices. The government pledged $5.3m (£3.23m) for remediation, but the funds never reached the affected region.
A local organisation, Follow the Money, created an infographic highlighting the government’s commitments and combined this with real life testimonies and photographs showing the actual situation on the ground. Within 48 hours of a targeted Twitter campaign, the president committed to releasing funds to the village and, in February this year, children started receiving long overdue medical attention.
All these initiatives depend on access to critical government data and on active citizens who feel empowered to effect change in their own lives and communities. At present, it’s often hard to access data that is sufficiently granular, particularly at district or local level. For citizens to engage with information from government, it also needs to be accessible in ways that are simple to understand and linked to campaigns that impact their daily lives.
Tracking expenditure can also operate across borders. Donors are beginning to open up aid data by publishing to the IATI registry. This transparency by donor governments should improve the effectiveness of aid spending and contribute towards improved health outcomes.
It’s hard to draw general conclusions about how technology can contribute towards improving health outcomes, particularly when context is so critical and the field is so new. Nonetheless, some themes are emerging which can maximise the chances of an intervention’s success.
It can at times be challenging to encourage citizens to report for an array of reasons, including a lack of belief in their ability to effect change, cultural norms, a lack of time and both perceived and real risks. Still, participation seems to increase when citizens receive feedback from reports submitted and when mechanisms are in place that enable citizens to take collective action. On-the-ground testimonies and evidence can also help shift public opinion and amplify critical messages.
Interventions are dramatically strengthened when integrated into wider programmes, implemented by organisations that have established a strong relationship with the communities in which they work. They need to be backed by at least one strong civil society organisation that can follow up on any reports, queries or challenges which may arise. Where possible, engagement from government and local leaders can make a real difference. Identifying champions within government can also significantly improve responsiveness.”

What do you know?


Article in the Financial Times by Eric Openshaw and John Hagel: “Talent holds the key to company performance. From business units to Finance to IT, getting the right skills to the right place at the right time is a constant challenge. Now consider that workplace technologies are becoming obsolete faster, and the useful life of many skills is shorter. Workers at all levels need to be able to learn and relearn rapidly to adapt to and anticipate changing demands. Recruitment and retention initiatives can’t address the need, nor can standardised training programs and knowledge management.
New technologies offer an opportunity to rethink both talent development and traditional knowledge management and integrate learning directly into the daily work experience. Virtual platforms enable workers to connect with each other to solve problems across distributed work settings and beyond organisational boundaries.
The preponderance of sensors and advanced analytics today makes it more possible than ever to collect and share individuals’ real-time performance in a variety of settings. Sensors and the integration of social platforms into work allow knowledge and experiences to be captured automatically, as they occur, rather than depending on compliance and coerced participation to populate reputation profiles and knowledge management databases. These technologies support rich, context-specific learning and participation driven by momentum as users discover and create value.
Training programs and knowledge management have a place, but they may not deliver the skilled workers needed to the right place and time in a rapidly changing environment. Pre-developed content quickly becomes obsolete or lacks the context to make it relevant to the individual. Perhaps more importantly, classroom training and knowledge databases tend to focus on the commonalities between work, the standard processes and practices, when in fact workers spend most of their days dealing with the exceptions that don’t fit into the standard processes and systems, whether it’s a one-off shipping request or a customer who can’t make your software work with their hardware. Workers typically get better at handling the non-standard aspects of their work through on-the-job experience…

In a recent paper, we detailed nine principles that help to create the type of environment that fosters learning and improvement. The following three principles demonstrate how technology can play an important role in enabling this on-the-job learning and in amplifying the learning, especially across a virtual workforce.

Real-time feedback for individuals and teams. For workers to improve performance and learn what works or doesn’t, they need a context-specific understanding of what is expected and how they are doing relative to others, in the moment rather than three or six months down the line.
At virtual call-centre LiveOps, the independent agents see customer and program-specific metrics on an online dashboard. These metrics define the level of performance necessary for agents to remain eligible to take calls for a program and are continuously updated, providing real-time feedback after each call so that agents can see how they are doing relative to their peer group and where they can improve. This level of performance data transparency creates a meritocracy, as agents are compared to their peers and rewarded based on their relative performance.
Smart capture and share. In any work setting, a great deal of information is generated and exchanged in meetings, conversations, instant messages, and email. Easy access to that information helps foster collaboration, solve problems, and improve business processes. At SAP Community Network (SCN), intelligent cataloguing of insights from discussion forums, tagged for searchability, helps make the right information available at the right time to those who need it without requiring the burdensome documentation associated with typical knowledge management. Other users can search for solutions in the context of the original problem posed, as well as through related discussions that may have led to an ultimate solution. Instead of days of internal debate or experimentation, the typical time to receive a response is 17 minutes.
Helping workers make relevant connections. In a typical organisation, physical or virtual, it can be difficult to know who everyone is and what their experience, expertise, and interests are, and the typical knowledge and resource management tools that require individuals to maintain profiles rarely see the level of continuing compliance and participation needed to make them useful. Instead, workers tend to fall back on relationships. They seek help and learn from those already known to them.
Now, virtual platforms, such as the one used by Odesk, a global online workplace, automatically generate detailed, up-to-date profiles. These profiles include the contractor’s cumulative and historical ratings and hourly wages for each completed project as well as the scores for any tests or certifications. The same type of automatic, action-based reputation profiles can be used internally to facilitate assessing and connecting with the right co-workers for the job at hand…”