Science to the People


David Lang on how citizen science bridges the gap between science and society: “It’s hard to find a silver lining in the water crisis in Flint, Michigan. The striking images of jugs of brown water being held high in protest are a symbol of institutional failure on a grand scale. It’s a disaster. But even as questions of accountability and remedy remain unanswered, there is already one lesson we can take away: Citizen science can be used as a powerful tool to build (or rebuild) the public’s trust in science.

Because the other striking image from Flint is this: Citizen-scientists sampling and testing their own water, from their homes and neighborhoods, and reporting the results as scientific data. Dr. Marc Edwards is the Virginia Tech civil engineering professor who led the investigation into the lead levels in Flint’s water supply, and in a February 2016 interview with The Chronicle of Higher Education, he gave an important answer about the methods his team used to obtain the data: “Normal people really appreciate good science that’s done in their interest. They stepped forward as citizen-scientists to explore what was happening to them and to their community, we provided some funding and the technical and analytical expertise, and they did all the work. I think that work speaks for itself.”

It’s a subtle but important message: The community is rising up and rallying by using science, not by reacting to it. Other scientists trying to highlight important issues and influence public opinion would do well to take note, because there’s a disconnect between what science reports and what the general public chooses to believe. For instance, 97 percent of scientists agree that the world’s climate is warming, likely due to human activities. Yet only 70 percent of Americans believe that global warming is real. Many of the most important issues of our time show the same growing gap between scientific and societal consensus: genetically modified foods, evolution, and vaccines are often widely distrusted or disputed despite strong, positive scientific evidence…

The good news is that we’re learning. Citizen science — the growing trend of involving non-professional scientists in the process of discovery — is proving to be a supremely effective tool. It now includes far more than birders and backyard astronomers, its first amateur champions. Over the past few years, the discipline has been gaining traction and popularity in academic circles too. Involving groups of amateur volunteers is now a proven strategy for collecting data over large geographic areas or over long periods of time. Online platforms like Zooniverse have shown that even an untrained human eye can spot anomalies in everything from wildebeest migrations to Martian surfaces. For certain types of research, citizen science just works.

While a long list of peer-reviewed papers now backs up the efficacy of citizen science, and a series of papers has shown its positive impact on students’ view of science, we’re just beginning to understand the impact of that participation on the wider perception of science. Truthfully, for now, most of what we know so far about its public impact is anecdotal, as in the work in Flint, or even on our online platform for explorers, OpenExplorer….

It makes sense that citizen science should affect public perception of science. The difference between “here are the results of a study” and “please help us in the process of discovery” is profound. It’s the difference between a rote learning moment and an immersive experience. And even if not everyone is getting involved, the fact that this is possible and that some members of a community are engaging makes science instantly more relatable. It creates what Tim O’Reilly calls an “architecture of participation.” Citizen scientists create the best interface for convincing the rest of the populace.

A recent article in Nature argued that the DIY biology community was, in fact, ahead of the scientific establishment in terms of proactively thinking about the safety and ethics of rapidly advancing biotechnology tools. They had to be. For those people opening up community labs so that anyone can come and participate, public health issues can’t be pushed aside or dealt with later. After all, they are the public that will be affected….(More)”

The Open Data Barometer (3rd edition)


The Open Data Barometer: “Once the preserve of academics and statisticians, data has become a development cause embraced by everyone from grassroots activists to the UN Secretary-General. There’s now a clear understanding that we need robust data to drive democracy and development — and a lot of it.

Last year, the world agreed the Sustainable Development Goals (SDGs) — seventeen global commitments that set an ambitious agenda to end poverty, fight inequality and tackle climate change by 2030. Recognising that good data is essential to the success of the SDGs, the Global Partnership for Sustainable Development Data and the International Open Data Charter were launched as the SDGs were unveiled. These alliances mean the “data revolution” now has over 100 champions willing to fight for it. Meanwhile, Africa adopted the African Data Consensus — a roadmap to improving data standards and availability in a region that has notoriously struggled to capture even basic information such as birth registration.

But while much has been made of the need for bigger and better data to power the SDGs, this year’s Barometer follows the lead set by the International Open Data Charter by focusing on how much of this data will be openly available to the public.

Open data is essential to building accountable and effective institutions, and to ensuring public access to information — both goals of SDG 16. It is also essential for meaningful monitoring of progress on all 169 SDG targets. Yet the promise and possibilities offered by opening up data to journalists, human rights defenders, parliamentarians, and citizens at large go far beyond even these….

At a glance, here are this year’s key findings on the state of open data around the world:

    • Open data is entering the mainstream. The majority of the countries in the survey (55%) now have an open data initiative in place and a national data catalogue providing access to datasets available for re-use. Moreover, new open data initiatives are getting underway or are promised for the near future in a number of countries, including Ecuador, Jamaica, St. Lucia, Nepal, Thailand, Botswana, Ethiopia, Nigeria, Rwanda and Uganda. Demand is high: civil society and the tech community are using government data in 93% of countries surveyed, even in countries where that data is not yet fully open.
    • Despite this, there’s been little to no progress on the number of truly open datasets around the world. Even with the rapid spread of open government data plans and policies, too much critical data remains locked in government filing cabinets. For example, only two countries publish acceptable detailed open public spending data. Of all 1,380 government datasets surveyed, almost 90% are still closed — roughly the same as in the last edition of the Open Data Barometer (when only 130 out of 1,290 datasets, or 10%, were open). What is more, much of the approximately 10% of data that meets the open definition is of poor quality, making it difficult for potential data users to access, process and work with it effectively.
    • “Open-washing” is jeopardising progress. Many governments have advertised their open data policies as a way to burnish their democratic and transparent credentials. But open data, while extremely important, is just one component of a responsive and accountable government. Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and supported by a legal framework. Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries. Until all these factors are in place, open data cannot be a true SDG accelerator.
    • Implementation and resourcing are the weakest links. Progress on the Barometer’s implementation and impact indicators has stalled or even gone into reverse in some cases. Open data can result in net savings for the public purse, but getting individual ministries to allocate the budget and staff needed to publish their data is often an uphill battle, and investment in building user capacity (both inside and outside of government) is scarce. Open data is not yet entrenched in law or policy, and the legal frameworks supporting most open data initiatives are weak. This is a symptom of the tendency of governments to view open data as a fad or experiment with little to no long-term strategy behind its implementation. This results in haphazard implementation, weak demand and limited impact.
    • The gap between data haves and have-nots needs urgent attention. Twenty-six of the top 30 countries in the ranking are high-income countries. Half of open datasets in our study are found in just the top 10 OECD countries, while almost none are in African countries. As the UN pointed out last year, such gaps could create “a whole new inequality frontier” if allowed to persist. Open data champions in several developing countries have launched fledgling initiatives, but too often those good open data intentions are not adequately resourced, resulting in weak momentum and limited success.
    • Governments at the top of the Barometer are being challenged by a new generation of open data adopters. Traditional open data stalwarts such as the USA and UK have seen their rate of progress on open data slow, signalling that new political will and momentum may be needed as more difficult elements of open data are tackled. Fortunately, a new generation of open data adopters, including France, Canada, Mexico, Uruguay, South Korea and the Philippines, are starting to challenge the ranking leaders and are adopting a leadership attitude in their respective regions. The International Open Data Charter could be an important vehicle to sustain and increase momentum in challenger countries, while also stimulating renewed energy in traditional open data leaders….(More)”

Foreign Policy has lost its creativity. Design thinking is the answer.


Elizabeth Radziszewski at The Wilson Quarterly: “Although the landscape of threats has changed in recent years, U.S. strategies bear striking resemblance to the ways policymakers dealt with crises in the past. Whether it involves diplomatic overtures, sanctions, bombing campaigns, or the use of special ops and covert operations, the range of responses suffers from an innovation deficit. Even the use of drones, while a new tool of warfare, is still part of the limited categories of responses that focus mainly on whether or not to kill, cooperate, or do nothing. To meet the evolving nature of threats posed by nonstate actors such as ISIS, the United States needs a strategy makeover — a creative lift, so to speak.

Sanctions, diplomacy, bombing campaigns, special ops, covert operations — the range of our foreign policy responses suffers from an innovation deficit.

Enter the business world. Today’s top companies face an increasingly competitive marketplace where innovative approaches to product and service development are a necessity. Just as the market has changed for companies since the forces of globalization and the digital economy took over, so has the security landscape evolved for the world’s leading hegemon. Yet the responses of top businesses to these changes stand in stark contrast to the United States’ stagnant approaches to current national security threats. Many of today’s thriving businesses have embraced design thinking (DT), an innovative process that identifies consumer needs through immersive ethnographic experiences that are melded with creative brainstorming and quick prototyping.

What would happen if U.S. policymakers took cues from the business world and applied DT in policy development? Could the United States prevent the threats from metastasizing with more proactive rather than reactive strategies — by discovering, for example, how ideas from biology, engineering, and other fields could help analysts inject fresh perspective into tired solutions? Put simply, if U.S. policymakers want to succeed in managing future threats, then they need to start thinking more like business innovators who integrate human needs with technology and economic feasibility.

In his 1969 book The Sciences of the Artificial, Herbert Simon made the first connection between design and a way of thinking. But it was not until the 1980s and 1990s that Stanford scientists began to see the benefits of design practices used by industrial designers as a method for creative thinking. At the core of DT is the idea that solving a challenge requires a deeper understanding of the problem’s true nature and the processes and people involved. This approach contrasts greatly with more standard innovation styles, where a policy solution is developed and then resources are used to fit the solution to the problem. DT reverses the order.

DT encourages divergent thinking, the process of generating many ideas before converging to select the most feasible ones, including making connections between different-yet-related worlds. Finally, the top ideas are quickly prototyped and tested so that early solutions can be modified without investing many resources and risking the biggest obstacle to real innovation: the impulse to try fitting an idea, product, or policy to the people, rather than the other way around…

If DT has reenergized the innovative process in the business and nonprofit sector, a systematic application of its methodology could just as well revitalize U.S. national security policies. Innovation in security and foreign policy is often framed around the idea of technological breakthroughs. Thanks to the Defense Advanced Research Projects Agency (DARPA), the Department of Defense has been credited with such groundbreaking inventions as GPS, the Internet, and stealth fighters — all of which have created rich opportunities to explore new military strategies. Reflecting this infatuation with technology, but with a new edge, is Defense Secretary Ashton Carter’s unveiling of the Defense Innovation Unit Experimental (DIUx), an initiative to scout for new technologies, improve outreach to startups, and form deeper relationships between the Pentagon and Silicon Valley. The new DIUx effort signals what businesses have already noticed: the need to be more flexible in establishing linkages with people outside of the government in search of new ideas.

Yet because the primary objective of DIUx remains technological prowess, the effort alone is unlikely to drastically improve the management of national security. Technology is not a substitute for an innovative process. When new invention is prized as the sole focus of innovation, it can, paradoxically, paralyze innovation. Once an invention is adopted, it is all too tempting to mold subsequent policy development around emergent technology, even if other solutions could be more appropriate….(More)”

E-Regulation and the Rule of Law: Smart Government, Institutional Information Infrastructures, and Fundamental Values


Rónán Kennedy in Information Polity: “Information and communications technology (ICT) is increasingly used in bureaucratic and regulatory processes. With the development of the ‘Internet of Things’, some researchers speak enthusiastically of the birth of the ‘Smart State’. However, there are few theoretical or critical perspectives on the role of ICT in these routine decision-making processes and the mundane work of government regulation of economic and social activity. This paper therefore makes an important contribution by putting forward a theoretical perspective on smartness in government and developing a values-based framework for the use of ICT as a tool in the internal machinery of government.

It critically reviews the protection of the rule of law in digitized government. As an addition to work on e-government, a new field of study, ‘e-regulation’ is proposed, defined, and critiqued, with particular attention to the difficulties raised by the use of models and simulation. The increasing development of e-regulation could compromise fundamental values by embedding biases, software errors, and mistaken assumptions deeply into government procedures. The article therefore discusses the connections between the ‘Internet of Things’, the development of ‘Ambient Law’, and how the use of ICT in e-regulation can be a support for or an impediment to the operation of the rule of law. It concludes that e-government research should give more attention to the processes of regulation, and that law should be a more central discipline for those engaged in this activity….(More)

The Alberta CoLab Story: Redesigning the policy development process in government


Alex Ryan at Medium: “Alberta CoLab is an evolving experiment built on three counter-intuitive ideas:

1. Culture shifts faster through collaborative project work than through a culture change initiative.

2. The way to accelerate policy development is to engage more perspectives and more complexity.

3. The best place to put a cross-ministry design team is in a line ministry.

I want to explain what CoLab is and why it has evolved the way it has. We don’t view CoLab as a best practice to be replicated, since our model is tailored to the specific culture and context of Alberta. Perhaps you are also trying to catalyze innovation inside a large bureaucratic organization. I hope you can learn something from our journey so far….

….Both the successes and frustrations of Alberta CoLab are consequences of the way that we have mediated some key tensions and tradeoffs involved with setting up a public sector innovation lab. Practitioners in other labs will likely recognize these tensions and tradeoffs, although your successes and frustrations will be different depending on how your business model reconciles them.

  1. Where should the lab be? Public innovation labs can exist inside, outside, or on the edge of government. Dubai’s Model Centre and Alberta CoLab operate inside government. Inside labs have the best access to senior decision makers and the authority to convene whole-of-government collaborations, but may find it harder to engage openly with citizens and stakeholders. UNICEF Innovation Labs and NouLab exist outside of government. Outside labs have more freedom in who they convene, the kind of container they can create, and timelines to impact, but find it harder to connect with and affect policy change. MindLab and MaRS Solutions Lab are examples of labs on the edge of government. This positioning can offer the best of both worlds. However, edge labs are vulnerable to fluctuations in their relationship with government. Surviving and thriving on the edge means continually walking a tightrope between autonomy and integration. Labs can change their positioning. Alberta CoLab began as an external consulting project. The Behavioural Insights Team is a social purpose company that was spun off from a lab inside the U.K. government. The location of the lab is unlikely to change often, so it is an important strategic choice.
  2. How deep should the lab go? Here the tension is between taking on small, tactical improvement projects that deliver tangible results, or tackling the big, strategic systems changes that will take years to manifest. Public sector innovation labs are a reaction to the almost total failure of traditional approaches to move the needle on systems change. Therefore, most labs have aspirations to the strategic and the systemic. Yet most labs are also operating in a dominant culture that demands quick wins and measures success by linear progress against a simple logic model theory of change. We believe that operating at either extreme of this spectrum is equally misguided. We use a portfolio approach and a barbell strategy to mediate this tension. Having a portfolio of projects allows us to invest energy in systems change and generate immediate value. It allows us to balance our projects across three horizons of innovation: sustaining innovations; disruptive innovations; and transformative innovations. A barbell strategy means avoiding the middle of the bell curve. We maintain a small number of long-term, flagship initiatives, combined with a rapid turnover of quick-win projects. This allows us to remind the organization of our immediate value without sacrificing long-term commitment to systems change.
  3. What relationship should the lab have with government? Even an inside lab must create some distance between itself and the broader government culture if it is to provide a safe space for innovation. There is a tension between being separate and being integrated. Developing novel ideas that get implemented requires the lab to be both separate and integrated at the same time. You need to decouple from regular policy cycles to enable divergence and creativity, yet provide input into key decisions at the right time. Sometimes these decision points are known in advance, but more often this means sensing and responding to a dynamic decision landscape. Underneath any effective lab is a powerful social network, which needs to cut across government silos and strata and draw in external perspectives. I think of a lab as having a kind of respiratory rhythm. It starts by bringing fresh ideas into the organization, like a deep breath that provides the oxygen for new thinking. But new ideas are rarely welcome in old organizations. When the lab communicates outwards, these new ideas should be translated into familiar language and concepts, and then given a subtle twist. Often labs believe they have to differentiate their innovations — to emphasize novelty — to justify their existence as an innovation lab. But the more the output of the lab resembles the institutional culture, the more it appears obvious and familiar, the more likely it will be accepted and integrated into the mainstream.
  4. What relationship should the lab have with clients? Alberta CoLab is a kind of in-house consultancy that provides services to clients across all ministries. There is a tension in the nature of the relationship, which can span from consulting problem-solver to co-design facilitator to teacher. The main problem with a consulting model is it often builds dependency rather than capacity. The challenge with an educational relationship is that clients struggle to apply theory that is disconnected from practice. We often use facilitation as a ‘cover’ for our practice, because it allows us to design a process that enables both reflective practice and situated learning. By teaching systemic design and strategic foresight approaches through taking on live projects, we build capacity while doing the work our clients need to do anyway. This helps to break down barriers between theory and practice, learning and doing. Another tension is between doing what the client says she wants and what she needs but does not articulate. Unlike a customer, who is always right, the designer has a duty of care to their client. This involves pushing back when the client demands are unreasonable, reframing the challenge when the problem received is a symptom of a deeper issue, and clearly communicating the risks and potential side effects of policy options. As Denys Lasdun has said about designers: “Our job is to give the client, on time and on cost, not what he wants, but what he never dreamed he wanted; and when he gets it, he recognizes it as something he wanted all the time.”

Lessons Learned

These are our top lessons learned from our journey to date that may have broader applicability.

  1. Recruit outsiders and insiders. Bringing in outside experts elevates the lab’s status. Outsiders are essential to question and challenge organizational patterns that insiders take as given. Insiders bring an understanding of organizational culture. They know how to move files through the bureaucracy and they know where the landmines are.
  2. Show don’t tell. As lab practitioners, we tend to be process geeks with a strong belief in the superiority of our own methods. There is a temptation to cast oneself in the role of the missionary bringing the good word to the unwashed masses. Not only is this arrogant, it’s counter-productive. It’s much more effective to show your clients how your approach adds value by starting with a small collaborative project. If your approach really is as good as you believe it is, the results will speak for themselves. Once people are envious of the results you have achieved, they will be curious and open to learning how you did it, and they will demand more of it.
  3. Be a catalyst, not a bottleneck. Jess McMullin gave us this advice when we founded CoLab. It’s why we developed a six-day training course to train over 80 systemic designers across the government. It’s why we run communities of practice on systemic design and strategic foresight. And it’s why we publish about our experiences and share the toolkits we develop. If the innovation lab is an ivory tower, it will not change the way government works. Think instead of the lab as the headquarters of a democratic grassroots movement.
  4. Select projects based on the potential for reframing. There are many criteria we apply when we decide whether to take on a new project. Is it a strategic priority? Is there commitment to implement? Are the client expectations realistic? Can our contribution have a positive impact? These are useful but apply to almost any service offering. The unique value a social innovation lab offers is discontinuous improvement. The source of discontinuous improvement is reframing — seeing a familiar challenge with new eyes, from a different perspective that opens up new potential for positive change. If a project ticks all the boxes, except that the client is certain they already know what the problem is, then that already limits the kind of solutions they will consider. Unless they are open to reframing, they will likely be frustrated by a lab approach, and would be better served by traditional facilitation or good project management.
  5. Prototyping is just the end of the beginning. After one year, we went around and interviewed the first 40 clients of Alberta CoLab. We wanted to know what they had achieved since our co-design sessions. Unfortunately, for most of them, the answer was “not much.” They were very happy with the quality of the ideas and prototypes generated while working with CoLab and were hopeful that the ideas would eventually see the light of day. But they also noted that once participants left the lab and went back to their desks, they found it difficult to sustain the momentum and excitement of the lab, and easy to snap back to business as usual. We had to pivot our strategy to take on fewer projects, but take on a greater stewardship role through to implementation.
  6. Find a rhythm. It’s not useful to create a traditional project plan with phases and milestones for a non-linear and open-ended discovery process like a lab. Yet without some kind of structure, it’s easy to lose momentum or become lost. The best projects I have participated in create a rhythm: an alternating movement between open collaboration and focused delivery. The lab opens up every few months to engage widely on what needs to be done and why. A core team then works between collaborative workshops on how to make it happen. Each cycle allows the group to frame key challenges, make progress, and receive feedback, which builds momentum and commitment.
  7. Be a good gardener. Most of the participants of our workshops arrive with a full plate. They are already 100% committed in their day jobs. Even when they are enthusiastic to ideate, they will be reluctant to take on any additional work. If we want our organizations to innovate, first we have to create the space for new work. We need to prune those projects that we have kept on life support — not yet declared dead but not priorities. This often means making difficult decisions. The flip side of pruning is to actively search for positive deviance and help it to grow. When you find something that’s already working, you just need to turn up the good…..(More)”

Innovation and Its Enemies: Why People Resist New Technologies


Book by Calestous Juma: “The rise of artificial intelligence has rekindled a long-standing debate regarding the impact of technology on employment. This is just one of many areas where exponential advances in technology signal both hope and fear, leading to public controversy. This book shows that many debates over new technologies are framed in the context of risks to moral values, human health, and environmental safety. But it argues that behind these legitimate concerns often lie deeper, but unacknowledged, socioeconomic considerations. Technological tensions are often heightened by perceptions that the benefits of new technologies will accrue only to small sections of society while the risks will be more widely distributed. Similarly, innovations that threaten to alter cultural identities tend to generate intense social concern. As such, societies that exhibit great economic and political inequities are likely to experience heightened technological controversies.

Drawing from nearly 600 years of technology history, Innovation and Its Enemies identifies the tension between the need for innovation and the pressure to maintain continuity, social order, and stability as one of today’s biggest policy challenges. It reveals the extent to which modern technological controversies grow out of distrust in public and private institutions. Using detailed case studies of coffee, the printing press, margarine, farm mechanization, electricity, mechanical refrigeration, recorded music, transgenic crops, and transgenic animals, it shows how new technologies emerge, take root, and create new institutional ecologies that favor their establishment in the marketplace. The book uses these lessons from history to contextualize contemporary debates surrounding technologies such as artificial intelligence, online learning, 3D printing, gene editing, robotics, drones, and renewable energy. It ultimately makes the case for shifting greater responsibility to public leaders to work with scientists, engineers, and entrepreneurs to manage technological change, make associated institutional adjustments, and expand public engagement on scientific and technological matters….(More)”

What Should We Do About Big Data Leaks?


Paul Ford at the New Republic: “I have a great fondness for government data, and the government has a great fondness for making more of it. Federal election financial data, for example, with every contribution identified, connected to a name and address. Or the results of the census. I don’t know if you’ve ever had the experience of downloading census data, but it’s pretty exciting. You can hold America on your hard drive! Meditate on the miracles of zip codes, the way the country is held together and addressable by arbitrary sets of digits.
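To make that ease concrete: much of the census is now reachable through a public API at api.census.gov. Here is a minimal sketch in Python; the endpoint and the population variable code are standard ACS 5-year values as far as we know, but verify both against the Census Bureau’s current API documentation before relying on them.

```python
# Minimal sketch: pull one table from the U.S. Census Bureau's public API.
# Treat the endpoint and variable code as assumptions to verify against the docs.
import requests

URL = "https://api.census.gov/data/2019/acs/acs5"
params = {
    "get": "NAME,B01003_001E",  # B01003_001E = total population estimate (ACS 5-year)
    "for": "state:*",           # one row per state
}

rows = requests.get(URL, params=params, timeout=30).json()
header, data = rows[0], rows[1:]   # first row of the response is the header
for name, population, fips in data[:5]:
    print(f"{name} (FIPS {fips}): {int(population):,} people")
```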

You can download whole books, in PDF format, about the foreign policy of the Reagan Administration as it related to Russia. Negotiations over which door the Soviet ambassador would use to enter a building. Gigabytes and gigabytes of pure joy for the ephemeralist. The government is the greatest creator of ephemera ever.

Consider the Financial Crisis Inquiry Commission, or FCIC, created in 2009 to figure out exactly how the global economic pooch was screwed. The FCIC has made so much data, and has done an admirable job (caveats noted below) of arranging it. So much stuff. There are reams of treasure on a single FCIC web site, hosted at Stanford Law School: Hundreds of MP3 files, for example, with interviews with Jamie Dimon of JPMorgan Chase and Lloyd Blankfein of Goldman Sachs. I am desperate to find time to write some code that automatically extracts random audio snippets from each and puts them on top of a slow ambient drone with plenty of reverb, so that I can relax to the dulcet tones of the financial industry explaining away its failings. (There’s a Paul Krugman interview that I assume is more critical.)
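Ford’s audio-collage daydream is concrete enough to sketch. A rough, hypothetical version using the pydub library (which requires ffmpeg); the file names are placeholders, not the actual FCIC archive layout:

```python
# Rough sketch of the collage Ford describes: random snippets from interview
# MP3s layered over a quiet drone. File names below are placeholders.
import random
from pydub import AudioSegment

interviews = ["dimon.mp3", "blankfein.mp3", "krugman.mp3"]  # hypothetical names
drone = AudioSegment.from_mp3("drone.mp3") - 12             # drone, 12 dB quieter

mix = drone
for path in interviews:
    audio = AudioSegment.from_mp3(path)
    start = random.randint(0, max(0, len(audio) - 10_000))  # offsets in milliseconds
    snippet = audio[start:start + 10_000].fade_in(500).fade_out(500)
    position = random.randint(0, max(0, len(drone) - len(snippet)))
    mix = mix.overlay(snippet, position=position)

mix.export("dulcet_tones_of_finance.mp3", format="mp3")
```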

The recordings are just the beginning. They’ve released so many documents, and with the documents, a finding aid that you can download in handy PDF format, which will tell you where to, well, find things, pointing to thousands of documents. That aid alone is 1,439 pages.

Look, it is excellent that this exists, in public, on the web. But it also presents a very contemporary problem: What is transparency in the age of massive database drops? The data is available, but locked in MP3s and PDFs and other documents; it’s not searchable in the way a web page is searchable, not easy to comment on or share.
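The gap described here — available but not searchable — is largely a tooling problem. As a minimal sketch of one way to close it, assuming the PDFs have first been reduced to plain-text files (with any PDF-to-text tool), SQLite’s built-in FTS5 module can turn a document dump into a ranked full-text index:

```python
# Minimal sketch: make a document dump searchable the way a web page is,
# using a local full-text index built with SQLite's FTS5. Assumes the PDFs
# have already been converted to .txt files; the directory is a placeholder.
import pathlib
import sqlite3

db = sqlite3.connect("fcic.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS docs USING fts5(path, body)")

for txt in pathlib.Path("fcic_text").glob("*.txt"):
    db.execute("INSERT INTO docs VALUES (?, ?)",
               (str(txt), txt.read_text(errors="ignore")))
db.commit()

# Ranked search across the whole corpus.
for (path,) in db.execute(
        "SELECT path FROM docs WHERE docs MATCH ? ORDER BY rank LIMIT 10",
        ("credit default swaps",)):
    print(path)
```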

Consider the WikiLeaks release of State Department cables. They were exhausting, there were so many of them, they were in all caps. Or the trove of data Edward Snowden gathered on a USB drive, or Chelsea Manning on CD. And the Ashley Madison leak, spread across database files and logs of credit card receipts. The massive and sprawling Sony leak, complete with whole email inboxes. And with the just-released Panama Papers, we see two exciting new developments: First, the consortium of media organizations that managed the leak actually came together and collectively, well, branded the papers, down to a hashtag (#panamapapers), informational website, etc. Second, the size of the leak itself—2.5 terabytes!—became a talking point, even though exactly what was contained within those terabytes was harder to understand. This, said the consortium of journalists that notably did not include The New York Times, The Washington Post, etc., is the big one. Stay tuned. And we are. But the fact remains: These artifacts are not accessible to any but the most assiduous amateur conspiracist; they’re the domain of professionals with the time and money to deal with them. Who else could be bothered?

If you watched the movie Spotlight, you saw journalists at work, pawing through reams of documents, going through, essentially, phone books. I am an inveterate downloader of such things. I love what they represent. And I’m also comfortable with many-gigabyte corpora spread across web sites. I know how to fetch data, how to consolidate it, and how to search it. I share this skill set with many data journalists, and these capacities have, in some ways, become the sole province of the media. Organs of journalism are among the only remaining cultural institutions that can fund investigations of this size and tease the data apart, identifying linkages and thus constructing informational webs that can, with great effort, be turned into narratives, yielding something like what we call “a story” or “the truth.” 

Spotlight was set around 2001, and it features a lot of people looking at things on paper. The problem has changed greatly since then: The data is everywhere. The media has been forced into a new cultural role, that of the arbiter of the giant and semi-legal database. ProPublica, a nonprofit that does a great deal of data gathering and data journalism and then shares its findings with other media outlets, is one example; it funded a project called DocumentCloud with other media organizations that simplifies the process of searching through giant piles of PDFs (e.g., court records, or the results of Freedom of Information Act requests).

At some level the sheer boredom and drudgery of managing these large data leaks make them immune to casual interest; even the Ashley Madison leak, which I downloaded, was basically an opaque pile of data and really quite boring unless you had some motive to poke around.

If this is the age of the citizen journalist, or at least the citizen opinion columnist, it’s also the age of the data journalist, with the news media acting as product managers of data leaks, making the information usable, browsable, attractive. There is an uneasy partnership between leakers and the media, just as there is an uneasy partnership between the press and the government, which would like some credit for its efforts, thank you very much, and wouldn’t mind if you gave it some points for transparency while you’re at it.

Pause for a second. There’s a glut of data, but most of it comes to us in ugly formats. What would happen if the things released in the interest of transparency were released in actual transparent formats?…(More)”

Technology for Transparency: Cases from Sub-Saharan Africa


From the Harvard Political Review: “Over the last decade, Africa has experienced previously unseen levels of economic growth and market vibrancy. Developing countries can only achieve equitable growth and reduce poverty rates, however, if they are able to make the most of their available resources. To do this, they must maximize the impact of aid from donor governments and NGOs and ensure that domestic markets continue to diversify, add jobs, and generate tax revenues. Yet, in most developing countries, there is a dearth of information available about industry profits, government spending, and policy outcomes that prevents efficient action.

ONE, an international advocacy organization, has estimated that $68.6 billion was lost in sub-Saharan Africa in 2012 due to a lack of transparency in government budgeting….

The Importance of Technology

Increased visibility of problems exerts pressure on politicians and other public sector actors to adjust their actions. This process is known as social monitoring, and it relies on citizens or public agencies using digital tools, such as mobile phones, Facebook, and other social media sites to spot public problems. In sub-Saharan Africa, however, traditional media companies and governments have not shown consistency in reporting on transparency issues.

New technologies offer a solution to this problem. Philip Thigo, the creator of an online and SMS platform that monitors government spending, said in an interview with Technology for Transparency, “All we are trying to do is enhance the work that [governments] do. We thought that if we could create a clear channel where communities could actually access data, then the work of government would be easier.” Networked citizen media platforms that rely on the volunteer contributions of citizens have become increasingly popular. Given that in most African countries less than 10 percent of the population has Internet access, mobile-device-based programs have proven the logical solution. About 30 percent of the population continent-wide has access to cell phones.

Lova Rakotomalala, a co-founder of an NGO in Madagascar that promotes online exposure of social grassroots projects, told the HPR, “most Malagasies will have a mobile phone and an FM radio because it helps them in their daily lives.” Rakotomalala works to provide workshops and IT training to people in regions of Madagascar where Internet access has been recently introduced. According to him, “the amount of data that we can collect from social monitoring and transparency projects will only grow in the near future. There is much room for improvement.”

Kenyan Budget Tracking Tool

The Kenyan Budget Tracking Tool is a prominent example of how social media technology can help obviate traditional transparency issues. Despite increased development assistance and foreign aid, the number of Kenyans classified as poor grew from 29 percent in the 1970s to almost 60 percent in 2000. Noticing this trend, Philip Thigo created an online and SMS platform called the Kenyan Budget Tracking Tool. The platform specifically focuses on the Constituencies Development Fund, through which members of the Kenyan parliament are able to allocate resources towards various projects, such as physical infrastructure, government offices, or new schools.

This social monitoring technology has exposed real government abuses. …

Another mobile tool, Question Box, allows Ugandans to call or message operators who have access to a database full of information on health, agriculture, and education.

But tools like Medic Mobile and the Kenyan Budget Tracking Tool are only the first steps in solving the problems that plague corrupt governments and underdeveloped communities. Improved access to information is no substitute for good leadership. However, as Rakotomalala argued, it is an important stepping-stone. “While legally binding actions are the hammer to the nail, you need to put the proverbial nail in the right place first. That nail is transparency.”…(More)

Data to the Rescue: Smart Ways of Doing Good


Nicole Wallace in the Chronicle of Philanthropy: “For a long time, data served one purpose in the nonprofit world: measuring program results. But a growing number of charities are rejecting the idea that data equals evaluation and only evaluation.

Of course, many nonprofits struggle even to build the simplest data system. They have too little money, too few analysts, and convoluted data pipelines. Yet some cutting-edge organizations are putting data to work in new and exciting ways that drive their missions. A prime example: The Polaris Project is identifying criminal networks in the human-trafficking underworld and devising strategies to fight back by analyzing its data storehouse along with public information.

Other charities dive deep into their data to improve services, make smarter decisions, and identify measures that predict success. Some have such an abundance of information that they’re even pruning their collection efforts to allow for more sophisticated analysis.

The groups highlighted here are among the best nationally. In their work, we get a sneak peek at how the data revolution might one day achieve its promise.

House Calls: Living Goods

Living Goods launched in eastern Africa in 2007 with an innovative plan to tackle health issues in poor families and reduce deaths among children. The charity provides loans, training, and inventory to locals in Uganda and Kenya — mostly women — to start businesses selling vitamins, medicine, and other health products to friends and neighbors.

Founder Chuck Slaughter copied the Avon model and its army of housewives-turned-sales agents. But in recent years, Living Goods has embraced a 21st-century data system that makes its entrepreneurs better health practitioners. Armed with smartphones, they confidently diagnose and treat major illnesses. At the same time, they collect information that helps the charity track health across communities and plot strategy….

Unraveling Webs of Wickedness: Polaris Project

Calls and texts to the Polaris Project’s national human-trafficking hotline are often heartbreaking, terrifying, or both.

Relatives fear that something terrible has happened to a missing loved one. Trafficking survivors suffering from their ordeal need support. The most harrowing calls are from victims in danger and pleading for help.

Last year more than 5,500 potential cases of exploitation for labor or commercial sex were reported to the hotline. Since it got its start in 2007, the total is more than 24,000.

As it helps victims and survivors get the assistance they need, the Polaris Project, a Washington nonprofit, is turning those phone calls and texts into an enormous storehouse of information about the shadowy world of trafficking. By analyzing this data and connecting it with public sources, the nonprofit is drawing detailed pictures of how trafficking networks operate. That knowledge, in turn, shapes the group’s prevention efforts, its policy work, and even law-enforcement investigations….

Too Much Information: Year Up

Year Up has a problem that many nonprofits can’t begin to imagine: It collects too much data about its program. “Predictive analytics really start to stink it up when you put too much in,” says Garrett Yursza Warfield, the group’s director of evaluation.

What Mr. Warfield describes as the “everything and the kitchen sink” problem started soon after Year Up began gathering data. The group, which fights poverty by helping low-income young adults land entry-level professional jobs, first got serious about measuring its work nearly a decade ago. Though challenged at first to round up even basic information, the group over time began tracking virtually everything it could: the percentage of young people who finish the program, their satisfaction, their paths after graduation through college or work, and much more.

Now the nonprofit is diving deeper into its data to figure out which measures can predict whether a young person is likely to succeed in the program. And halfway through this review, it’s already identified and eliminated measures that it’s found matter little. A small example: Surveys of participants early in the program asked them to rate their proficiency at various office skills. Those self-evaluations, Mr. Warfield’s team concluded, were meaningless: How can novice professionals accurately judge their Excel spreadsheet skills until they’re out in the working world?…
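The pruning Year Up describes is, in machine-learning terms, feature selection. A toy sketch on synthetic data (not Year Up’s), using scikit-learn to rank tracked measures by predictive importance and flag the ones that carry little signal:

```python
# Illustrative only: synthetic data standing in for tracked program measures.
# Fit a model, rank each measure's contribution to predictions, and drop the
# ones below the median importance (the "self-rated Excel skills" of the set).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# 500 "participants", 12 tracked measures, only 4 of which actually predict success.
X, y = make_classification(n_samples=500, n_features=12, n_informative=4,
                           n_redundant=2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

cutoff = np.median(model.feature_importances_)
for i, importance in enumerate(model.feature_importances_):
    verdict = "keep" if importance > cutoff else "drop"
    print(f"measure_{i:02d}: importance={importance:.3f} -> {verdict}")
```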

On the Wild Side: Wilderness Society

…Without room to roam, wild animals and plants breed among themselves and risk losing genetic diversity. They also fall prey to disease. And that’s in the best of times. As wildlife adapt to climate change, the chance to migrate becomes vital even to survival.

National parks and other large protected areas are part of the answer, but they’re not enough if wildlife can’t move between them, says Travis Belote, lead ecologist at the Wilderness Society.

“Nature needs to be able to shuffle around,” he says.

Enter the organization’s Wildness Index. It’s a national map that shows the parts of the country most touched by human activity as well as wilderness areas best suited for wildlife. Mr. Belote and his colleagues created the index by combining data on land use, population density, road location and size, water flows, and many other factors. It’s an important tool to help the nonprofit prioritize the locations it fights to protect.
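Mechanically, an index like this is a weighted overlay: normalize each human-influence layer, weight it, sum, and invert. A toy sketch with made-up layers and weights (not the Society’s actual data or methodology):

```python
# Toy composite "wildness" overlay. Layers and weights are random placeholders,
# not the Wilderness Society's actual inputs; real indexes use national raster grids.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)  # a small grid standing in for a national raster

layers = {                      # higher value = more human influence
    "land_use":     rng.random(shape),
    "population":   rng.random(shape),
    "road_density": rng.random(shape),
}
weights = {"land_use": 0.4, "population": 0.3, "road_density": 0.3}

def normalize(a):
    """Rescale a layer to the 0-1 range so layers are comparable."""
    return (a - a.min()) / (a.max() - a.min())

footprint = sum(weights[k] * normalize(v) for k, v in layers.items())
wildness = 1.0 - footprint      # invert: least-touched cells score highest

print("threshold for the wildest 1% of cells:", np.quantile(wildness, 0.99))
```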

In Idaho, for example, the nonprofit compares the index with information about known wildlife corridors and federal lands that are unprotected but meet the criteria for conservation designation. The project’s goal: determine which areas in the High Divide — a wild stretch that connects Greater Yellowstone with other protected areas — the charity should advocate to legally protect….(More)”

Open data and the API economy: when it makes sense to give away data


From ZDNet: “Open data is one of those refreshing trends that flows in the opposite direction of the culture of fear that has developed around data security. Instead of putting data under lock and key, surrounded by firewalls and sandboxes, some organizations see value in making data available to all comers — especially developers.

The GovLab.org, a nonprofit advocacy group, published an overview of the benefits governments and organizations are realizing from open data, as well as some of the challenges. The group defines open data as “publicly available data that can be universally and readily accessed, used and redistributed free of charge. It is structured for usability and computability.”…

For enterprises, an open-data stance may be the fuel to build a vibrant ecosystem of developers and business partners. Scott Feinberg, API architect for The New York Times, is one of the people helping to lead the charge to open-data ecosystems. In a recent CXOTalk interview with ZDNet colleague Michael Krigsman, he explains how through the NYT APIs program, developers can sign up for access to 165 years’ worth of content.

But it requires a lot more than simply throwing some APIs out into the market. Establishing such a comprehensive effort across APIs requires a change in mindset that many organizations may not be ready for, Feinberg cautions. “You can’t be stingy,” he says. “You have to just give it out. When we launched our developer portal there’s a lot of questions like, are people going to be stealing our data, questions like that. Just give it away. You don’t have to give it all but don’t be stingy, and you will find that first off not that many people are going to use it at first. you’re going to find that out, but the people who do, you’re going to find those passionate people who are really interested in using your data in new ways.”

Feinberg clarifies that the NYT’s APIs are not giving out articles for free. Rather, he explains, “what we give is everything but article content. You can search for articles. You can find out what’s trending. You can almost do anything you want with our data through our APIs with the exception of actually reading all of the content. It’s really about giving people the opportunity to really interact with your content in ways that you’ve never thought of, and empowering your community to figure out what they want. You know while we don’t give our actual article text away, we give pretty much everything else and people build a lot of really cool stuff on top of that.”
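The Article Search endpoint Feinberg is describing is public. A minimal sketch of a query; it assumes a free key from developer.nytimes.com, and the response fields shown should be verified against the current v2 documentation:

```python
# Minimal sketch of the access Feinberg describes: metadata search over the
# archive, no article body. Requires an API key from developer.nytimes.com.
import os
import requests

URL = "https://api.nytimes.com/svc/search/v2/articlesearch.json"
params = {
    "q": "open data",
    "api-key": os.environ["NYT_API_KEY"],  # your key, kept out of the source
}

docs = requests.get(URL, params=params, timeout=30).json()["response"]["docs"]
for doc in docs[:5]:
    print(doc["pub_date"][:10], doc["headline"]["main"], doc["web_url"])
```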

Open data sets, of course, have to be worthy of the APIs that offer them. In his post, data scientist Kirk Borne outlines the seven qualities open data needs to have to be of value to developers and consumers. (Yes, they’re also “Vs” like big data.)

  1. Validity: It’s “critical to pay attention to these data validity concerns when your organization’s data are exposed to scrutiny and inspection by others,” Borne states.
  2. Value: The data needs to be the font of new ideas, new businesses, and innovations.
  3. Variety: Exposing the wide variety of data available can be “a scary proposition for any data scientist,” Borne observes, but nonetheless is essential.
  4. Voice: Remember that “your open data becomes the voice of your organization to your stakeholders.”
  5. Vocabulary: “The semantics and schema (data models) that describe your data are more critical than ever when you provide the data for others to use,” says Borne. “Search, discovery, and proper reuse of data all require good metadata, descriptions, and data modeling.” (A minimal example follows this list.)
  6. Vulnerability: Accept that open data, because it is so open, will be subjected to “misuse, abuse, manipulation, or alteration.”
  7. proVenance: This is the governance requirement behind open data offerings. “Provenance includes ownership, origin, chain of custody, transformations that been made to it, processing that has been applied to it (including which versions of processing software were used), the data’s uses and their context, and more,” says Borne….(More)”
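Of the seven, “Vocabulary” is the easiest to act on immediately: ship a machine-readable data dictionary next to the data itself. A minimal sketch, loosely modeled on the Frictionless Data Table Schema idea (the dataset and field names here are illustrative, not a formal standard):

```python
# A minimal data dictionary to publish alongside an open dataset. Structure is
# loosely inspired by Frictionless Data's Table Schema; names are illustrative.
import json

schema = {
    "name": "parking-citations",
    "provenance": "City of Example DOT, export v2, 2016-04-01",  # the seventh V
    "fields": [
        {"name": "issued_at", "type": "datetime", "description": "UTC timestamp of citation"},
        {"name": "zip_code",  "type": "string",   "description": "5-digit USPS ZIP code"},
        {"name": "fine_usd",  "type": "number",   "description": "Fine amount in US dollars"},
    ],
}

with open("parking-citations.schema.json", "w") as f:
    json.dump(schema, f, indent=2)

# Sanity check: a published row should carry exactly the documented fields.
row = {"issued_at": "2016-04-01T13:05:00Z", "zip_code": "10027", "fine_usd": 65.0}
documented = {field["name"] for field in schema["fields"]}
undocumented, missing = set(row) - documented, documented - set(row)
assert not undocumented and not missing, (undocumented, missing)
```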