Harnessing the Data Revolution for Sustainable Development


US State Department Fact Sheet on “U.S. Government Commitments and Collaboration with the Global Partnership for Sustainable Development Data”: “On September 27, 2015, the member states of the United Nations agreed to a set of Sustainable Development Goals (Global Goals) that define a common agenda to achieve inclusive growth, end poverty, and protect the environment by 2030. The Global Goals build on tremendous development gains made over the past decade, particularly in low- and middle-income countries, and set actionable steps with measurable indicators to drive progress. The availability and use of high-quality data is essential to measuring and achieving the Global Goals. By harnessing the power of technology, mobilizing new and open data sources, and partnering across sectors, we will achieve these goals faster and make their progress more transparent.

Harnessing the data revolution is a critical enabler of the Global Goals—not only to monitor progress, but also to inclusively engage stakeholders at all levels—local, regional, national, and global—to advance evidence-based policies and programs that reach those who need them most. Data can show us where girls are at greatest risk of violence so we can better prevent it; where forests are being destroyed in real time so we can protect them; and where HIV/AIDS is enduring so we can focus our efforts and finish the fight. Data can catalyze private investment; build modern and inclusive economies; and support transparent and effective investment of resources for social good….

The Global Partnership for Sustainable Development Data (Global Data Partnership), launched on the sidelines of the 70th United Nations General Assembly, is mobilizing a range of data producers and users—including governments, companies, civil society, data scientists, and international organizations—to harness the data revolution to achieve and measure the Global Goals. Working together, signatories to the Global Data Partnership will address the barriers to accessing and using development data, delivering outcomes that no single stakeholder can achieve working alone….The United States, through the U.S. President’s Emergency Plan for AIDS Relief (PEPFAR), is joining a consortium of funders to seed this initiative. The U.S. Government has many initiatives that are harnessing the data revolution for impact domestically and internationally. Highlights of our international efforts are found below:

Health and Gender

Country Data Collaboratives for Local Impact – PEPFAR and the Millennium Challenge Corporation (MCC) are partnering to invest $21.8 million in Country Data Collaboratives for Local Impact in sub-Saharan Africa that will use data on HIV/AIDS, global health, gender equality, and economic growth to improve programs and policies. Initially, the Country Data Collaboratives will align with and support the objectives of DREAMS, a PEPFAR, Bill & Melinda Gates Foundation, and Girl Effect partnership to reduce new HIV infections among adolescent girls and young women in high-burden areas.

Measurement and Accountability for Results in Health (MA4Health) Collaborative – USAID is partnering with the World Health Organization, the World Bank, and over 20 other agencies, countries, and civil society organizations to establish the MA4Health Collaborative, a multi-stakeholder partnership focused on reducing fragmentation and better aligning support to country health-system performance and accountability. The Collaborative will provide a vehicle to strengthen country-led health information platforms and accountability systems by improving data and increasing capacity for better decision-making; facilitating greater technical collaboration and joint investments; and developing international standards and tools for better information and accountability. In September 2015, partners agreed to a set of common strategic and operational principles, including a strong focus on 3–4 pathfinder countries where all partners will initially come together to support country-led monitoring and accountability platforms. Global actions will focus on promoting open data, establishing common norms and standards, and monitoring progress on data and accountability for the Global Goals. A more detailed operational plan will be developed through the end of the year, and implementation will start on January 1, 2016.

Data2X: Closing the Gender Gap – Data2X is a platform for partners to work together to identify innovative sources of data, including “big data,” that can provide an evidence base on gender to guide development policy and investment. As part of its commitment to Data2X—an initiative of the United Nations Foundation, Hewlett Foundation, Clinton Foundation, and Bill & Melinda Gates Foundation—PEPFAR and MCC are working with partners to sponsor an open data challenge to incentivize the use of gender data to improve gender policy and practice….(More)”

See also: Data matters: the Global Partnership for Sustainable Development Data. Speech by UK International Development Secretary Justine Greening at the launch of the Global Partnership for Sustainable Development Data.

Researchers wrestle with a privacy problem


Erika Check Hayden at Nature: “The data contained in tax returns, health and welfare records could be a gold mine for scientists — but only if they can protect people’s identities….In 2011, six US economists tackled a question at the heart of education policy: how much does great teaching help children in the long run?

They started with the records of more than 11,500 Tennessee schoolchildren who, as part of an experiment in the 1980s, had been randomly assigned to high- and average-quality teachers between the ages of five and eight. Then they gauged the children’s earnings as adults from federal tax returns filed in the 2000s. The analysis showed that the benefits of a good early education last for decades: each year of better teaching in childhood boosted an individual’s annual earnings by some 3.5% on average. Other data showed the same individuals besting their peers on measures such as university attendance, retirement savings, marriage rates and home ownership.

The economists’ work was widely hailed in education-policy circles, and US President Barack Obama cited it in his 2012 State of the Union address when he called for more investment in teacher training.

But for many social scientists, the most impressive thing was that the authors had been able to examine US federal tax returns: a closely guarded data set that was then available to researchers only with tight restrictions. This has made the study an emblem for both the challenges and the enormous potential power of ‘administrative data’ — information collected during routine provision of services, including tax returns, records of welfare benefits, data on visits to doctors and hospitals, and criminal records. Unlike Internet searches, social-media posts and the rest of the digital trails that people establish in their daily lives, administrative data cover entire populations with minimal self-selection effects: in the US census, for example, everyone sampled is required by law to respond and tell the truth.

This puts administrative data sets at the frontier of social science, says John Friedman, an economist at Brown University in Providence, Rhode Island, and one of the lead authors of the education study. “They allow researchers to not just get at old questions in a new way,” he says, “but to come at problems that were completely impossible before.”….

But there is also concern that the rush to use these data could pose new threats to citizens’ privacy. “The types of protections that we’re used to thinking about have been based on the twin pillars of anonymity and informed consent, and neither of those hold in this new world,” says Julia Lane, an economist at New York University. In 2013, for instance, researchers showed that they could uncover the identities of supposedly anonymous participants in a genetic study simply by cross-referencing their data with publicly available genealogical information.
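
The mechanics of such re-identification are easy to see in miniature. The sketch below is a toy linkage attack in Python: an “anonymized” table still carries quasi-identifiers (ZIP code, birth year, sex), and a simple join against a hypothetical public roster re-attaches names. All values are fabricated for illustration.

```python
import pandas as pd

# "Anonymized" research records: names removed, but quasi-identifiers
# (ZIP code, birth year, sex) retained. All values are fabricated.
study = pd.DataFrame({
    "zip":        ["02139", "02139", "10001"],
    "birth_year": [1965, 1983, 1970],
    "sex":        ["F", "M", "F"],
    "diagnosis":  ["X", "Y", "Z"],
})

# A public dataset (think voter rolls or genealogy sites) pairing the
# same quasi-identifiers with names.
public = pd.DataFrame({
    "name":       ["A. Smith", "B. Jones", "C. Lee"],
    "zip":        ["02139", "02139", "10001"],
    "birth_year": [1965, 1983, 1970],
    "sex":        ["F", "M", "F"],
})

# A plain join on the shared columns is enough to re-attach identities.
linked = study.merge(public, on=["zip", "birth_year", "sex"])
print(linked[["name", "diagnosis"]])
```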

Many people are looking for ways to address these concerns without inhibiting research. Suggested solutions include policy measures, such as an international code of conduct for data privacy, and technical methods that allow the use of the data while protecting privacy. Crucially, notes Lane, although preserving privacy sometimes complicates researchers’ lives, it is necessary to uphold the public trust that makes the work possible.

“Difficulty in access is a feature, not a bug,” she says. “It should be hard to get access to data, but it’s very important that such access be made possible.” Many nations collect administrative data on a massive scale, but only a few, notably in northern Europe, have so far made it easy for researchers to use those data.

In Denmark, for instance, every newborn child is assigned a unique identification number that tracks his or her lifelong interactions with the country’s free health-care system and almost every other government service. In 2002, researchers used data gathered through this identification system to retrospectively analyse the vaccination and health status of almost every child born in the country from 1991 to 1998 — 537,000 in all. At the time, it was the largest study ever to disprove the now-debunked link between measles vaccination and autism.

Other countries have begun to catch up. In 2012, for instance, Britain launched the unified UK Data Service to facilitate research access to data from the country’s census and other surveys. A year later, the service added a new Administrative Data Research Network, which has centres in England, Scotland, Northern Ireland and Wales to provide secure environments for researchers to access anonymized administrative data.

In the United States, the Census Bureau has been expanding its network of Research Data Centers, which currently includes 19 sites around the country at which researchers with the appropriate permissions can access confidential data from the bureau itself, as well as from other agencies. “We’re trying to explore all the available ways that we can expand access to these rich data sets,” says Ron Jarmin, the bureau’s assistant director for research and methodology.

In January, a group of federal agencies, foundations and universities created the Institute for Research on Innovation and Science at the University of Michigan in Ann Arbor to combine university and government data and measure the impact of research spending on economic outcomes. And in July, the US House of Representatives passed a bipartisan bill to study whether the federal government should provide a central clearing house of statistical administrative data.

Yet vast swathes of administrative data are still inaccessible, says George Alter, director of the Inter-university Consortium for Political and Social Research based at the University of Michigan, which serves as a data repository for approximately 760 institutions. “Health systems, social-welfare systems, financial transactions, business records — those things are just not available in most cases because of privacy concerns,” says Alter. “This is a big drag on research.”…

Many researchers argue, however, that there are legitimate scientific uses for such data. Jarmin says that the Census Bureau is exploring the use of data from credit-card companies to monitor economic activity. And researchers funded by the US National Science Foundation are studying how to use public Twitter posts to keep track of trends in phenomena such as unemployment.

….Computer scientists and cryptographers are experimenting with technological solutions. One, called differential privacy, adds a small amount of distortion to a data set, so that querying the data gives a roughly accurate result without revealing the identity of the individuals involved. The US Census Bureau uses this approach for its OnTheMap project, which tracks workers’ daily commutes….In any case, although synthetic data potentially solve the privacy problem, there are some research applications that cannot tolerate any noise in the data. A good example is the work showing the effect of neighbourhood on earning potential, which was carried out by Raj Chetty, an economist at Harvard University in Cambridge, Massachusetts. Chetty needed to track specific individuals to show that the areas in which children live their early lives correlate with their ability to earn more or less than their parents. In subsequent studies, Chetty and his colleagues showed that moving children from resource-poor to resource-rich neighbourhoods can boost their earnings in adulthood, proving a causal link.
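
To make the idea concrete, here is a minimal sketch of the Laplace mechanism, the textbook construction behind differential privacy. The data, query, and epsilon value are illustrative; this is not the Census Bureau’s actual OnTheMap implementation.

```python
import numpy as np

def private_count(records, predicate, epsilon):
    """Differentially private count of records matching a predicate.

    Adding or removing one person changes a count by at most 1
    (sensitivity = 1), so Laplace noise with scale 1/epsilon yields
    epsilon-differential privacy. Smaller epsilon means more privacy
    and more distortion.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# Illustrative data: commute distances (km) for eight workers.
commutes = [5, 12, 31, 44, 7, 58, 23, 35]

# Roughly 4 (the true count) plus noise of a few units.
print(private_count(commutes, lambda km: km > 30, epsilon=0.5))
```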

Secure multiparty computation is a technique that attempts to address this issue by allowing multiple data holders to analyse parts of the total data set, without revealing the underlying data to each other. Only the results of the analyses are shared….(More)”
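
The flavour of the approach shows in its simplest building block, additive secret sharing, in which several data holders jointly compute a sum while every individual value stays hidden. The sketch below is a toy illustration with made-up values; production protocols involve considerably more machinery.

```python
import random

PRIME = 2**61 - 1  # all arithmetic is modulo a large prime

def share(secret, n_parties):
    """Split a value into n random shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Three data holders with private values (made up for illustration).
private_values = [42, 17, 99]

# Each holder hands one share of its value to each of three parties.
all_shares = [share(v, n_parties=3) for v in private_values]

# Each party sums the shares it received; no single share or partial
# sum reveals anything about any individual input.
partial_sums = [sum(column) % PRIME for column in zip(*all_shares)]

# Only the combined result is disclosed: the total across all inputs.
total = sum(partial_sums) % PRIME
print(total)  # 158 == 42 + 17 + 99
```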

TurboVote


TurboVote is a software platform and an implementation program developed by Democracy Works, a nonpartisan, nonprofit 501(c)(3) organization that works to simplify the voting process.

The TurboVote tool is an online service that helps students vote in every election — local, state, and national — by helping them register where they want to vote, and by keeping them engaged with the elections in their local communities.

What does TurboVote give students?

  • helps them register to vote
  • helps them vote by sending election reminders via text and email – that way they can stay in touch with local elections even when they are away
  • helps them vote by mail via absentee ballot request forms

What benefits are there for administrators, faculty, and other implementers?
TurboVote makes it possible for colleges and universities to conduct voter engagement efforts efficiently. Students have an array of personalized voting needs, from registration to ballot request requirements to deadlines – and it’s logistically prohibitive for an institution to meet these needs for every student. With TurboVote you can promote and monitor student registration and engagement by encouraging students to complete a short online process….(More)”

Sustainable Value of Open Government Data


PhD thesis by Thorhildur Jetzek: “The impact of the digital revolution on our societies can be compared to the ripples caused by a stone thrown in water: spreading outwards and affecting a larger and larger part of our lives with every year that passes. One of the many effects of this revolution is the emergence of an already unprecedented amount of digital data that is accumulating exponentially. Moreover, a central affordance of digitization is the ability to distribute, share and collaborate, and we have thus seen an “open theme” gaining currency in recent years. These trends are reflected in the explosion of Open Data Initiatives (ODIs) around the world. However, while hundreds of national and local governments have established open data portals, there is a general feeling that these ODIs have not yet lived up to their true potential. This feeling is not without good reason; the recent Open Data Barometer report highlights that strong evidence on the impacts of open government data is almost universally lacking (Davies, 2013). This lack of evidence is disconcerting for government organizations that have already expended money on opening data, and might even result in the termination of some ODIs. This lack of evidence also raises some relevant questions regarding the nature of value generation in the context of free data and sharing of information over networks. Do we have the right methods, the right intellectual tools, to understand and reflect the value that is generated in such ecosystems?

This PhD study addresses the question “How is value generated from open data?” through a mixed-methods, macro-level approach. For the qualitative analysis, I have conducted two longitudinal case studies in two different contexts. The first is the case of the Basic Data Program (BDP), which is a Danish ODI. For this case, I studied the supply side of open data publication, from the creation of an open data strategy through to the dissemination and use of data. The second case is a demand-side study on the energy tech company Opower. Opower has been an open data user for many years and has used open data to create and disseminate personalized information on energy use. This information has already contributed to a measurable worldwide reduction in CO2 emissions as well as monetary savings. Furthermore, to complement the insights from these two cases, I analyzed quantitative data from 76 countries over the years 2012 and 2013. I have used these diverse sources of data to uncover the most important relationships, or mechanisms, that can explain how open data are used to generate sustainable value….(More)”

EveryPolitician


“The clue’s in the name. EveryPolitician aims to provide data about, well, every politician. In the world. It’s a simple but ambitious project to collect and share that data, in a consistent, open format that anyone can use.

Why? Because this resource doesn’t yet exist. And it would be incredibly useful, for a huge number of people and organisations all around the world.

When data is in a consistent, structured format, it can be reused by developers everywhere. You don’t have to waste time scraping data and converting it into a format you can work with; instead, you can simply concentrate on making tools. And those tools can more easily be picked up, used and adapted to local needs anywhere in the world, saving everyone time and effort.

The data

The long term aim is to include every elected official in the world, but let’s start simple. Our first goal is to have data for all present-day national-level legislators.

To see how far we’ve got, pick a country.

There’s more to this data than you’ll see there, though. For most datasets there is richer information available, including contact details, photos, gender, and more.

If you want to use that data, you can download it in two useful formats:

  • CSV format (great for spreadsheets)
  • JSON in Popolo format (ideal for developers)

A note about the Popolo standard: it’s a rich, expressive format that, like a language, is used in many different ways by different authors. However, when we add data to EveryPolitician we always use Popolo according to the same, defined principles. It’s because of this consistency that the tools you build will work with EveryPolitician data from any country, for any country.
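
As a sketch of what that consistency buys, the snippet below walks a Popolo-style JSON file, which keeps persons, organizations, and memberships in separate arrays linked by identifiers. The file name is illustrative, and the fields shown are the standard Popolo ones rather than a guarantee of any particular download’s contents.

```python
import json

# Illustrative file name: one country's legislature in Popolo JSON.
with open("legislature-popolo.json") as f:
    data = json.load(f)

# Popolo separates people, organizations, and memberships, joined by IDs.
people = {p["id"]: p for p in data.get("persons", [])}
orgs = {o["id"]: o for o in data.get("organizations", [])}

# Print each legislator alongside the group they belong to.
for m in data.get("memberships", []):
    person = people.get(m.get("person_id"), {})
    org = orgs.get(m.get("organization_id"), {})
    print(person.get("name", "?"), "/", org.get("name", "?"))
```

Because the structure is the same for every country, a loop like this needs no per-country changes.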

Want more detail? Interested in using this data in a web application or tool you’re building? See the technical overview of EveryPolitician.”

(US) Administration Announces New “Smart Cities” Initiative to Help Communities Tackle Local Challenges and Improve City Services


Factsheet from the White House: “Today, the Administration is announcing a new “Smart Cities” Initiative that will invest over $160 million in federal research and leverage more than 25 new technology collaborations to help local communities tackle key challenges such as reducing traffic congestion, fighting crime, fostering economic growth, managing the effects of a changing climate, and improving the delivery of city services. The new initiative is part of this Administration’s overall commitment to target federal resources to meet local needs and support community-led solutions.

Over the past six years, the Administration has pursued a place-based approach to working with communities as they tackle a wide range of challenges, from investing in infrastructure and filling open technology jobs to bolstering community policing. Advances in science and technology have the potential to accelerate these efforts. An emerging community of civic leaders, data scientists, technologists, and companies is joining forces to build “Smart Cities” – communities that are building an infrastructure to continuously improve the collection, aggregation, and use of data to improve the lives of their residents – by harnessing the growing data revolution, low-cost sensors, and research collaborations, and doing so securely to protect safety and privacy.

As part of the initiative, the Administration is announcing:

  • More than $35 million in new grants and over $10 million in proposed investments to build a research infrastructure for Smart Cities by the National Science Foundation and National Institute of Standards and Technology.
  • Nearly $70 million in new spending and over $45 million in proposed investments to unlock new solutions in safety, energy, climate preparedness, transportation, health and more, by the Department of Homeland Security, Department of Transportation, Department of Energy, Department of Commerce, and the Environmental Protection Agency.
  • More than 20 cities participating in major new multi-city collaborations that will help city leaders effectively collaborate with universities and industry.

Today, the Administration is also hosting a White House Smart Cities Forum, coinciding with Smart Cities Week hosted by the Smart Cities Council, to highlight new steps and brainstorm additional ways that science and technology can support municipal efforts.

The Administration’s Smart Cities Initiative will begin with a focus on key strategies:

  • Creating test beds for “Internet of Things” applications and developing new multi-sector collaborative models: Technological advancements and the diminishing cost of IT infrastructure have created the potential for an “Internet of Things,” a ubiquitous network of connected devices, smart sensors, and big data analytics. The United States has the opportunity to be a global leader in this field, and cities represent strong potential test beds for development and deployment of Internet of Things applications. Successfully deploying these and other new approaches often depends on new regional collaborations among a diverse array of public and private actors, including industry, academia, and various public entities.
  • Collaborating with the civic tech movement and forging intercity collaborations: There is a growing community of individuals, entrepreneurs, and nonprofits interested in harnessing IT to tackle local problems and work directly with city governments. These efforts can help cities leverage their data to develop new capabilities. Collaborations across communities are likewise indispensable for replicating what works in new places.
  • Leveraging existing Federal activity: From research on sensor networks and cybersecurity to investments in broadband infrastructure and intelligent transportation systems, the Federal government has an existing portfolio of activities that can provide a strong foundation for a Smart Cities effort.
  • Pursuing international collaboration: Fifty-four percent of the world’s population live in urban areas. Continued population growth and urbanization will add 2.5 billion people to the world’s urban population by 2050. The associated climate and resource challenges demand innovative approaches. Products and services associated with this market present a significant export opportunity for the U.S., since almost 90 percent of this increase will occur in Africa and Asia.

Complementing this effort, the President’s Council of Advisors on Science and Technology is examining how a variety of technologies can enhance the future of cities and the quality of life for urban residents. The Networking and Information Technology Research and Development (NITRD) Program is also announcing the release of a new framework to help coordinate Federal agency investments and outside collaborations that will guide foundational research and accelerate the transition into scalable and replicable Smart City approaches. Finally, the Administration’s growing work in this area is reflected in the Science and Technology Priorities Memo, issued by the Office of Management and Budget and Office of Science and Technology Policy in preparation for the President’s 2017 budget proposal, which includes a focus on cyber-physical systems and Smart Cities….(More)”

Community-based Participatory Science is Changing the Way Research Happens—and What Happens Next


Judy Robinson at The Equation: “…Whereas in the past the public seemed content to hear about scientific progress from lab-coat-clad researchers on private crusades to advance their field, now people want science to improve their lives directly. They want progress faster, and a more democratic, participatory role in deciding what needs to change and which research questions will fuel a movement for those changes….

Coming Clean is a network of community, state, national and technical organizations focused on environmental health and justice. Often we’ve been at the forefront of community-based participatory science efforts to support healthier environments, less toxic products, and a more just and equitable society: all issues that deeply matter to the non-expert public.

….For instance, with environmental justice advocacy organizations in the lead, residents of low-income, minority communities collected products at neighborhood dollar stores to see what unnecessary and dangerous chemical exposures could occur as a result of product purchases. In laboratory results we found over 80% of the products tested contained toxic chemicals at potentially hazardous levels (as documented in our report, “A Day Late and a Dollar Short”). That information, along with their organizing around it, has since attracted over 146,700 people to support the national Campaign for Healthier Solutions. That’s local science at work.

As documented in Coming Clean’s report, “Warning Signs: Toxic Pollution Identified at Oil and Gas Development Sites,” and in the peer-reviewed journal Environmental Health, 38% of the samples collected by community volunteers contained concentrations of volatile compounds exceeding federal standards for health risks, some at levels thousands of times higher than what federal health and environmental agencies consider to be “safe.” Seven air samples from Wyoming contained hydrogen sulfide at levels between two and 660 times the concentration that is immediately dangerous to human life. Beyond the astonishing numbers, the research helped educate and engage the public on the problem and the solutions communities seek, filled critical gaps in our understanding of the threat oil and gas development poses to public health, and was among the reasons cited in Governor Cuomo’s decision to ban fracking in New York State.

For Coming Clean and others across the country, this kind of community-based participatory science is changing the way science is conducted and, most importantly, what comes after the data collection and analysis is complete. In both the dollar store research and the oil and gas science, the effect of the science was to strengthen existing organizing campaigns for community-based solutions. The “good old days” when we waited for scientific proof to change the world are over, if they ever existed. Now science and citizen organizing together are changing the rules of the game, the outcome, and who gets to play….(More)”

Hacking the Obesity Epidemic


Press Release: “The de Beaumont Foundation, in collaboration with the Health Data Consortium and the Department of Health and Human Services (HHS), is pleased to announce the winners of the U.S. Obesity Data Challenge at NHS England’s Health and Care Innovation Expo 2015. The challenge is part of a joint U.S.-England initiative designed to harness the power of health data in tackling the epidemic of adult obesity in both countries….

The winning entries are:

  • Healthdata+Obesity (1st Place) — This simple, curated dashboard helps health officials tell a powerful story about the root causes of obesity. The dashboard provides customizable data visualizations at the national, state, and local level as well as an interactive map, national benchmarks, and written content to contextualize the data. Developed by HealthData+, a partnership between the Public Health Institute and LiveStories.
  • The Neighborhood Map of U.S. Obesity (2nd Place) — This highly detailed, interactive map incorporates obesity data with a GIS database to provide a localized, high-resolution visualization of the prevalence of obesity. Additional data sources can also be added to the map to allow researchers and health officials greater flexibility in customizing the map to support analysis and decision-making on a community level. Developed by RTI International.
  • The Health Demographic Analysis Tool – Visualizing the Cross-Sector Relationship Between Obesity and Social Determinants (3rd Place) — This interactive database maps the relationship between the social determinants of health (factors like educational attainment, income, and lifestyle choices) and health outcomes in order to illustrate what plays a role in community health. The powerful images generated by this tool provide compelling material for new health interventions as well as a way to look retrospectively at the impact of existing public health campaigns. Developed by GeoHealth Innovations and Community Health Solutions….(More)”

A data revolution is underway. Will NGOs miss the boat?


Opinion by Sophia Ayele at Oxfam: “The data revolution has arrived….The UN has even launched a Data Revolution Group (to ensure that the revolution penetrates into international development). The Group’s 2014 report suggests that harnessing the power of newly available data could ultimately lead to “more empowered people, better policies, better decisions and greater participation and accountability, leading to better outcomes for people and the planet.”

But where do NGOs fit in?

Over the last two decades, NGOs have been collecting increasing amounts of research and evaluation data, largely driven by donor demands for more rigorous evaluations of programs. The quality and efficiency of data collection have also been enhanced by mobile data collection. However, a quick scan of UK development NGOs reveals that few, if any, are sharing the data that they collect. This means that NGOs are generating dozens (if not hundreds) of datasets every year that aren’t being fully exploited and analysed. Working on tight budgets, with limited capacity, it’s not surprising that NGOs often shy away from sharing data without a clear mandate.

But change is in the air. Several donors have begun requiring NGOs to publicise data and others appear to be moving in that direction. Last year, USAID launched its Open Data Policy, which requires that grantees “submit any dataset created or collected with USAID funding…” Not only does USAID stipulate this requirement, it also hosts this data on its Development Data Library (DDL) and provides guidance on anonymisation to depositors. Similarly, the Gates Foundation’s 2015 Open Access Policy stipulates that “Data underlying published research results will be accessible and open immediately.” However, they are allowing a two-year transition period….Here at Oxfam, we have been exploring ways to begin sharing research and evaluation data. We aren’t being required to do this – yet – but we realise that the data that we collect is a public good with the potential to improve lives through more effective development programmes and to raise the voices of those with whom we work. Moreover, organizations like Oxfam can play a crucial role in highlighting issues facing women and other marginalized communities that aren’t always captured in national statistics. Sharing data is also good practice and would increase our transparency and accountability as an organization.

However, Oxfam also bears a huge responsibility to protect the rights of the communities that we work with. This involves ensuring informed consent when gathering data, so that communities are fully aware that their data may be shared, and de-identifying data to a level where individuals and households cannot be easily identified.

As Oxfam has outlined in our recently adopted Responsible Data Policy, “Using data responsibly is not just an issue of technical security and encryption but also of safeguarding the rights of people to be counted and heard, ensuring their dignity, respect and privacy, enabling them to make an informed decision and protecting their right to not be put at risk… (More)”

The Art of Managing Complex Collaborations


Eric Knight, Joel Cutcher-Gershenfeld, and Barbara Mittleman at MIT Sloan Management Review: “It’s not easy for stakeholders with widely varying interests to collaborate effectively in a consortium. The experience of the Biomarkers Consortium offers five lessons on how to successfully navigate the challenges that arise….

Society’s biggest challenges are also its most complex. From shared economic growth to personalized medicine to global climate change, few of our most pressing problems are likely to have simple solutions. Perhaps the only way to make progress on these and other challenges is by bringing together the important stakeholders on a given issue to pursue common interests and resolve points of conflict.

However, it is not easy to assemble such groups or to keep them together. Many initiatives have stumbled and disbanded. The Biomarkers Consortium might have been one of them, but this consortium beat the odds, in large part due to the founding parties’ determination to make it work. Nine years after it was founded, this public-private partnership, which is managed by the Foundation for the National Institutes of Health and based in Bethesda, Maryland, is still working to advance the availability of biomarkers (biological indicators for disease states) as tools for drug development, including applications at the frontiers of personalized medicine.

The Biomarkers Consortium’s mandate — to bring together, in the group’s words, “the expertise and resources of various partners to rapidly identify, develop, and qualify potential high-impact biomarkers particularly to enable improvements in drug development, clinical care, and regulatory decision-making” — may look simple. However, the reality has been quite complex. The negotiations that led to the consortium’s formation in 2006 were complicated, and the subsequent balancing of common and competing interests remains challenging….

Many in the biomedical sector had seen the need to tackle drug discovery costs for a long time, with multiple companies concurrently spending millions, sometimes billions, of dollars only to hit common dead ends in the drug development process. In 2004 and 2005, then National Institutes of Health director Elias Zerhouni convened key people from the U.S. Food and Drug Administration, the NIH, and the Pharmaceutical Research and Manufacturers of America to create a multistakeholder forum.

Every member knew from the outset that their fellow stakeholders represented many divergent and sometimes opposing interests: large pharmaceutical companies, smaller entrepreneurial biotechnology companies, FDA regulators, NIH science and policy experts, university researchers and nonprofit patient advocacy organizations….(More)”