HeroX enables breakthroughs


About HeroX, “The World’s Problem Solver Community”: “On October 4, 2004, Scaled Composites’ SpaceShipOne reached the edge of space, an altitude of 100km, becoming the first privately built spacecraft to perform this feat, twice within two weeks.

In so doing, they won the $10 million Ansari XPRIZE, ushering in a new era of commercial space exploration and applications.

It was the inaugural incentive prize competition of the XPRIZE Foundation, which has gone on to create an incredible array of incentive prizes to solve the world’s Grand Challenges — ocean health, literacy, space exploration, among many others.

In 2011, City Light Capital partnered with XPRIZE to envision a platform that would make the power of incentive challenges available to anyone. The result was the spin-off of HeroX in 2013.

HeroX was co-founded in 2013 by XPRIZE founder Peter Diamandis, challenge designer Emily Fowler and entrepreneur Christian Cotichini as a means to democratize the innovation model of XPRIZE.

HeroX exists to enable anyone, anywhere in the world, to create a challenge that addresses any problem or opportunity, build a community around that challenge and activate the circumstances that can lead to a breakthrough innovation.

This innovation model has existed for centuries.

The Ansari XPRIZE was inspired by the Orteig Prize, which Charles Lindbergh won in 1927 by crossing the Atlantic in the Spirit of St. Louis. The $25,000 prize had been offered by hotelier Raymond Orteig to spur tourism. Lindbergh’s flight led to a boom in air travel the world over.

A similar challenge had launched 200 years earlier with the 1716 Longitude Prize, which sought a technology to more accurately measure longitude at sea. Nearly 60 years later, a British clockmaker named John Harrison invented the chronometer, which spurred Trans-Atlantic migration on a massive scale.

In 1795, Napoleon offered a 12,000 franc prize for a better method of preserving food, which was often spoiled by the time it reached the front lines of his armies. The breakthrough innovation to Napoleon’s prize led to the creation of the canning industry.

HeroX incentive prize challenges are designed to do the same — to harness the collective mind power of a community to innovate upon any problem or opportunity. Anyone can change the world. HeroX can help.

The only question is, “What do you want to solve?”… (More)

Do Universities, Research Institutions Hold the Key to Open Data’s Next Chapter?


Ben Miller at Government Technology: “Government produces a lot of data — reams of it, roomfuls of it, rivers of it. It comes in from citizen-submitted forms, fleet vehicles, roadway sensors and traffic lights. It comes from utilities, body cameras and smartphones. It fills up servers and spills into the cloud. It’s everywhere.

And often, all that data sits there not doing much. A governing entity might have robust data collection and it might have an open data policy, but that doesn’t mean it has the computing power, expertise or human capital to turn those efforts into value.

The amount of data available to government and the computing public promises to continue to multiply — the growing smart cities trend, for example, installs networks of sensors on everything from utility poles to garbage bins.

As all this happens, a movement — a new spin on an old concept — has begun to take root: partnerships between government and research institutes. Usually housed within universities and laboratories, these partnerships aim to match strength with strength. Where government has raw data, professors and researchers have expertise and analytics programs.

Several leaders in such partnerships, spanning some of the most tech-savvy cities in the country, see increasing momentum toward the concept. For instance, the John D. and Catherine T. MacArthur Foundation in September helped launch the MetroLab Network, an organization of more than 20 cities that have partnered with local universities and research institutes for smart-city-oriented projects….

Two recurring themes in projects that universities and research organizations take on in cooperation with government are project evaluation and impact analysis. That’s at least partially driven by the very nature of the open data movement: One reason to open data is to get a better idea of how well the government is operating….

Open data may have been part of the impetus for city-university partnerships, in that the availability of more data lured researchers wanting to work with it and extract value. But those partnerships have, in turn, led to government officials opening more data than ever before for useful applications.

Sort of.

“I think what you’re seeing is not just open data, but kind of shades of open — the desire to make the data open to university researchers, but not necessarily the broader public,” said Beth Noveck, co-founder of New York University’s GovLab.



GOVLAB: DOCKER FOR DATA 

Much of what GovLab does is about opening up access to data, and that is the whole point of Docker for Data. The project aims to simplify and speed up the process of extracting and loading large data sets so they will respond to Structured Query Language (SQL) commands by moving the computing power of that process to the cloud. The tool can be installed with a single line of code, and its website plays host to already-extracted data sets. Since its inception, the website has grown to include more than 100 gigabytes of data from more than 8,000 data sets. From Baltimore, for example, one can easily find information on public health, water sampling, arrests, senior centers and more.
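The core idea — load a data set once so that it can then be queried with plain SQL — can be illustrated with a minimal, self-contained sketch. This uses standard Python and an in-memory SQLite database, not Docker for Data’s actual tooling, and the table and rows are invented placeholders rather than real Baltimore records:

```python
import sqlite3

# Invented placeholder rows standing in for an extracted open data set.
rows = [
    ("senior center", "Hatton Senior Center"),
    ("senior center", "Oliver Senior Center"),
    ("water sampling", "Herring Run station"),
]

# "Extract and load": create a table and bulk-insert the records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE open_data (category TEXT, name TEXT)")
conn.executemany("INSERT INTO open_data VALUES (?, ?)", rows)

# Once loaded, the data set answers ordinary SQL queries.
count = conn.execute(
    "SELECT COUNT(*) FROM open_data WHERE category = 'senior center'"
).fetchone()[0]
print(count)  # → 2
```

Docker for Data’s contribution, per the description above, is doing this loading step once in the cloud so every user can skip straight to the query.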


That’s partially because researchers are a controlled group who can be forced to sign memorandums of understanding and trained to protect privacy and prevent security breaches when government hands over sensitive data. That’s a top concern of agencies that manage data, and it shows in the GovLab’s work.

It was something Noveck found to be very clear when she started working on a project she simply calls “Arnold” because of project support from the Laura and John Arnold Foundation. The project involves building a better understanding of how different criminal justice jurisdictions collect, store and share data. The motivation is to help bridge the gaps between people who manage the data and people who should have easy access to it. When Noveck’s center conducted a survey among criminal justice record-keepers, the researchers found big differences between participants.

“There’s an incredible disparity of practices that range from some jurisdictions that have a very well established, formalized [memorandum of understanding] process for getting access to data, to just — you send an email to a guy and you hope that he responds, and there’s no organized way to gain access to data, not just between [researchers] and government entities, but between government entities,” she said….(More)

Beyond behaviour change: Key issues, interdisciplinary approaches and future directions


Book edited by Fiona Spotswood: “‘Behaviour change’ has become a buzz phrase of growing importance to policymakers and researchers. There is an increasing focus on exploring the relationship between social organisation and individual action, and on intervening to influence societal outcomes like population health and climate change. Researchers continue to grapple with methodologies, intervention strategies and ideologies around ‘social change’. Multidisciplinary in approach, this important book draws together insights from a selection of the principal thinkers in fields including public health, transport, marketing, sustainability and technology. The book explores the political and historical landscape of behaviour change, and trends in academic theory, before examining new innovations in both practice and research. It will be a valuable resource for academics, policy makers, practitioners, researchers and students wanting to locate their thinking within this rapidly evolving field….(More)”

Ideas Help No One on a Shelf. Take Them to the World


Tina Rosenberg at The New York Times: “Have you thought of a clever product to mitigate climate change? Did you invent an ingenious gadget to light African villages at night? Have you come up with a new kind of school, or new ideas for lowering the rate of urban shootings?

Thanks, but we have lots of those.

Whatever problem possesses you, we already have plenty of ways to solve it. Many have been rigorously tested and have a lot of evidence behind them — and yet they’re sitting on a shelf.

So don’t invent something new. If you want to make a contribution, choose one of those ideas — and spread it.

Spreading an idea can mean two different things. One is to take something that’s working in one place and introduce it somewhere else. If you want to reduce infant mortality in Cleveland, why not try what’s working in Baltimore?

Well, you might not know about what’s working because there’s no quick system for finding it.

Even when a few people do search out the answer, innovative ideas don’t spread by themselves. To become well known, they require effort from their originators. For example, a Bogotá, Colombia, maternity hospital invented Kangaroo Care — a method of keeping premature babies warm by strapping them 24/7 to Mom’s chest. It saved a lot of lives in Bogotá. But what allowed it to save lives around the world was a campaign to spread it to other countries.

The Colombians established Fundación Canguro and got grants from wealthy countries to bring groups of doctors and nurses from all over to visit Bogotá for two or three weeks.  Once the visitors had gone back and set up a program in their hospital, the foundation loaned them a doctor and nurse to help get them started. Save the Children now leads a global partnership to spread Kangaroo Care, with the goal of reaching half the world.

In short, this work requires dedicated organizations, a smart program and lots of money.

The other meaning of spreading an idea is creating ways to get new inventions out to people who need them.

“When I talk to college students or anyone who’s thinking about entrepreneurship or targeting global poverty, the gadget is where 99 percent of people start thinking,” said Nicholas Fusso, the director of D-Prize (its slogan: “Distribution is development”).  “That’s important — but the biggest problems in the poverty world aren’t a lack of gadgets or new products. It’s figuring out how people can have access to them.” So D-Prize gives seed money, in chunks of $10,000 to $20,000, to tiny new organizations that have good ideas for how to distribute useful things.

This analysis may be familiar to regular readers of Fixes. Indeed, the first Fixes column, more than five years ago, focused on distribution: getting health care to people in rural Africa by putting health care workers on motorcycles and keeping the bikes running….

Philanthropists and government aid agencies are only starting to get interested in the challenges of distribution — one new philanthropy that does have this focus is Good Ventures. As for academia, it still rewards invention almost exclusively. “There’s a lot of attention and award-giving and prize-giving and credit to people who come up with fancy new ideas instead of reaching people and having impact,” said Brodbar. “The incentives aren’t aligned. The culture of social entrepreneurship needs to change.”

Recognizing the true value of spreading an idea would also allow people who aren’t inventors (which is most of us) to get involved in social change. “The notion that if you want to engage in [social entrepreneurship] you have to have the big idea does a disservice to this space and people who want to play a role in it,” said Brodbar. “It’s a much wider front door.”…(More)

Citizen Science and the Flint Water Crisis


The Wilson Center’s Commons Lab: “In April 2014, the city of Flint, Michigan decided to switch its water supply source from the Detroit water system to a cheaper alternative, the Flint River. But in exchange for the cheaper price tag, the Flint residents paid a greater price with one of the worst public health crises of the past decade.

Despite concerns from Flint citizens about the quality of the water, the Michigan Department of Environmental Quality repeatedly attributed the problem to the plumbing system. It was LeeAnne Walters, a 37-year-old mother of four, who, after noticing physical and behavioral changes in her children and herself, set off a chain of events that exposed the national scandal. Eventually, with the support of Dr. Marc Edwards, an environmental engineering professor at Virginia Tech (VT), Walters discovered lead concentration levels of 13,200 parts per billion in her water, 880 times the maximum concentration allowed by law and more than twice the level the Environmental Protection Agency considers to be hazardous waste.
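The multiples cited above can be checked back-of-the-envelope, assuming the EPA action level for lead in drinking water of 15 ppb and the hazardous-waste characteristic level for lead of 5,000 ppb (5 mg/L) — thresholds not stated in the excerpt itself:

```python
# Assumed regulatory thresholds (not given in the excerpt):
#   EPA action level for lead in drinking water: 15 ppb
#   EPA hazardous-waste characteristic level for lead: 5,000 ppb
measured_ppb = 13_200
action_level_ppb = 15
hazardous_ppb = 5_000

times_action_level = measured_ppb / action_level_ppb
times_hazardous = measured_ppb / hazardous_ppb
print(times_action_level)  # 880.0 — "880 times the maximum allowed by law"
print(times_hazardous)     # 2.64 — "more than twice" the hazardous-waste level
```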

Citizen science emerged as an important piece of combating the Flint water crisis. Alarmed by the government’s neglect and the health issues spreading all across Flint, Edwards and Walters began the Flint Water Study, a collaboration between the Flint residents and research team from VT. Using citizen science, the VT researchers provided the Flint residents with kits to sample and test their homes’ drinking water and then analyzed the results to unearth the truth behind Flint’s water quality.

The citizen-driven project illustrates the capacity for nonprofessional scientists to use science in order to address problems that directly affect themselves and their community. While the VT team needed the Flint residents to provide water samples, the Flint residents in turn needed the VT team to conduct the analysis. In short, both parties achieved mutually beneficial results and the partnership helped expose the scandal. Surprisingly, the “traditional” problems associated with citizen science, including the inability to mobilize the local constituent base and the lack of collaboration between citizens and professional scientists, were not the obstacles in Flint….(More)”

Ebola: A Big Data Disaster


Study by Sean Martin McDonald: “…undertaken with support from the Open Society Foundation, Ford Foundation, and Media Democracy Fund, explores the use of Big Data in the form of Call Detail Record (CDR) data in humanitarian crisis.

It discusses the challenges of digital humanitarian coordination in health emergencies like the Ebola outbreak in West Africa, and the marked tension in the debate around experimentation with humanitarian technologies and the impact on privacy. McDonald’s research focuses on the two primary legal and human rights frameworks, privacy and property, to question the impact of unregulated use of CDRs on human rights. It also highlights how the diffusion of data science to the realm of international development constitutes a genuine opportunity to bring powerful new tools to fight crises and emergencies.

Analysing the risks of using CDRs to perform migration analysis and contact tracing without user consent, as well as the application of big data to disease surveillance, is an important entry point into the debate around the use of Big Data for development and humanitarian aid. The paper also raises crucial questions of legal significance about access to information, the limits of data sharing, and the concept of proportionality when privacy is invaded for the public good. These issues are highly relevant today, when big data’s emerging role in development — its actual and potential uses as well as its harms — is under consideration across the world.

The paper highlights the absence of a dialogue around the significant legal risks posed by the collection, use, and international transfer of personally identifiable data and humanitarian information, and the grey areas around assumptions of public good. It calls for a critical discussion of the experimental nature of data modelling in emergency response, where mismanagement of information can undermine the protection of human rights….

See Sean Martin McDonald – “Ebola: A Big Data Disaster” (PDF).

 

A machine intelligence commission for the UK


Geoff Mulgan at NESTA: “This paper makes the case for creating a Machine Intelligence Commission – a new public institution to help the development of new generations of algorithms, machine learning tools and uses of big data, ensuring that the public interest is protected.

I argue that new institutions of this kind – which can interrogate, inspect and influence technological development – are a precondition for growing informed public trust. That trust will, in turn, be essential if we are to reap the full potential public and economic benefits from new technologies. The proposal draws on lessons from fields such as human fertilisation, biotech and energy, which have shown how trust can be earned, and how new industries can be grown.  It also draws on lessons from the mistakes made in fields like GM crops and personal health data, where lack of trust has impeded progress….(More)”

Meet your Matchmaker: New crowdsourced sites for rare diseases


Carina Storrs at CNN: “Angela’s son Jacob was born with a number of concerning traits. He had an extra finger, and a foot and hip that were abnormally shaped. The doctors called in geneticists to try to diagnose his unusual condition. “That started our long, 12-year journey,” said Angela, who lives in the Baltimore area.

As geneticists do, they studied Jacob’s genes, looking for mutations in specific regions of the genome that could point to a problem. But there were no leads.

In the meantime, Jacob developed just about every kind of health problem there is. He has cognitive delays, digestive problems, muscle weakness, osteoporosis and other ailments.

“It was extremely frustrating, it was like being on a roller coaster. You wait six to eight weeks for the (gene) test and then it comes back as showing nothing,” recalled Angela, who asked that their last name not be used to protect her son’s privacy. “How do we go about treating until we get at what it is?”

Finally a test last year, which was able to take a broad look at all of Jacob’s genes, revealed a possible genetic culprit, but it still did not shed any light on his condition. “Nothing was known about the gene,” said Dr. Antonie Kline, director of pediatric genetics at the Greater Baltimore Medical Center, who had been following Jacob since birth.

Fortunately, Kline knew about an online program called GeneMatcher, which launched in December 2013. It would allow her to enter the new mystery gene into a database and search for other clinicians in the world who work with patients who have mutations in the same gene….

…the search for “someone else on the planet” can be hard, Hamosh said. The diseases in GeneMatcher are rare, affecting fewer than 200,000 people in the United States, and it can be difficult for clinicians with similar patients to find each other just through word of mouth and professional connections. Au, the Canadian researcher with a patient similar to Jacob, is actually a friend of Kline’s, but the two had never realized their patients’ similarities.

It was not just Hamosh and her colleagues who were struck by the need for something like GeneMatcher. At the same time they were developing their program, researchers in Canada and the UK were creating PhenomeCentral and Decipher, respectively.

The three are collectively known as matchmaker programs. They connect patients who have rare diseases that clinicians may never have seen before. In the case of PhenomeCentral, however, clinicians do not have to have a genetic culprit and can search only for other patients with similar traits or symptoms.

In the summer of 2015, it got much easier for clinicians all over the world to use these programs, when a clearinghouse site called Matchmaker Exchange was launched. They can now enter the patient information one time and search all three databases….(More)

Facebook Is Making a Map of Everyone in the World


Robinson Meyer at The Atlantic: “Americans inhabit an intricately mapped world. Type “Burger King” into an online box, and Google will cough up a dozen nearby options, each keyed to a precise latitude and longitude.

But throughout much of the world, local knowledge stays local. While countries might conduct censuses, the data doesn’t go much deeper than the county or province level.

Take population data, for instance: More than 7.4 billion humans sprawl across this planet of ours. They live in dense urban centers, in small towns linked by farms, and alone on the outskirts of jungles. But no one’s sure where, exactly, many of them live.

Now, Facebook says it has mapped almost 2 billion people better than any previous project. The company’s Connectivity Labs announced this week that it created new, high-resolution population-distribution maps of 20 countries, most of which are developing. It won’t release most of the maps until later this year, but if they’re accurate, they will be the best-quality population maps ever made for most of those places.

The maps will be notable for another reason, too: If they’re accurate, they’ll signal the arrival of a new, AI-aided age of cartography.

In the rich world, reliable population information is taken for granted.  But elsewhere, population-distribution maps have dozens of applications in different fields. Urban planners need to estimate city density so they can place and improve roads. Epidemiologists and public-health workers use them to track outbreaks or analyze access to health care. And after a disaster, population maps can be used (along with crisis mapping) to prioritize where emergency aid gets sent….(More)

Data Collaboratives: Matching Demand with Supply of (Corporate) Data to solve Public Problems


Blog by Stefaan G. Verhulst, Iryna Susha and Alexander Kostura: “Data Collaboratives refer to a new form of collaboration, beyond the public-private partnership model, in which participants from different sectors (private companies, research institutions, and government agencies) share data to help solve public problems. Several of society’s greatest challenges — from climate change to poverty — require greater access to big (but not always open) data sets, more cross-sector collaboration, and increased capacity for data analysis. Participants at the workshop and breakout session explored the various ways in which data collaboratives can help meet these needs.

Matching supply and demand of data emerged as one of the most important and overarching issues facing the big and open data communities. Participants agreed that more experimentation is needed so that new, innovative and more successful models of data sharing can be identified.

How to discover and enable such models? When asked how the international community might foster greater experimentation, participants indicated the need to develop the following:

· A responsible data framework that serves to build trust in sharing data. It would be based upon existing frameworks but would also accommodate emerging technologies and practices, and it would need to be sensitive to public opinion and perception.

· Increased insight into different business models that may facilitate the sharing of data. As experimentation continues, the data community should map emerging practices and models of sharing so that successful cases can be replicated.

· Capacity to tap into the potential value of data. On the demand side, capacity refers to the ability to pose good questions, understand current data limitations, and seek new data sets responsibly. On the supply side, this means seeking shared value in collaboration, thinking creatively about public use of private data, and establishing norms of responsibility around security, privacy, and anonymity.

· Transparent stock of available data supply, including an inventory of what corporate data exist that can match multiple demands and that is shared through established networks and new collaborative institutional structures.

· Mapping emerging practices and models of sharing. Corporate data offers value not only for humanitarian action (which was a particular focus at the conference) but also for a variety of other domains, including science, agriculture, health care, urban development, environment, media and arts, and others. Gaining insight into the practices that emerge across sectors could broaden the spectrum of what is feasible and how.

In general, it was felt that understanding the business models underlying data collaboratives is of utmost importance in order to achieve win-win outcomes for both private and public sector players. Moreover, issues of public perception and trust were raised as important concerns of government organizations participating in data collaboratives….(More)”