The Opportunity Project: Utilizing Open Data to Build Stronger Ladders of Opportunity for All


White House Factsheet: “In the lead-up to the President’s historic visit to SXSW, today the Administration is announcing the launch of “The Opportunity Project,” a new open data effort to improve economic mobility for all Americans. As the President said in his State of the Union address, we must harness 21st century technology and innovation to expand access to opportunity and tackle our greatest challenges.

The Opportunity Project will put data and tools in the hands of civic leaders, community organizations, and families to help them navigate information about critical resources such as access to jobs, housing, transportation, schools, and other neighborhood amenities. This project is about unleashing the power of data to help our children and our children’s children access the resources they need to thrive. Today, the Administration is releasing a unique package of Federal and local datasets in an easy-to-use format and accelerating a new way for the federal government to collaborate with local leaders, technologists, and community members to use data and technology to tackle inequities and strengthen their communities.

Key components of this announcement include:

·         The launch of “The Opportunity Project” and Opportunity.Census.gov to provide easy access to the new package of Opportunity Project data, a combination of Federal and local data, on key assets that determine access to opportunity at the neighborhood level. This data can now be used by technologists, community groups, and local governments to help families find affordable housing, help businesses identify services they need, and help policymakers see inequities in their communities and make investments to expand fair housing and increase economic mobility (see the data-access sketch at the end of this entry).

·         The release of a dozen new private sector and non-profit digital tools, built in collaboration with eight cities using the Opportunity Project data, to help families, local leaders, advocates, and the media navigate information about access to jobs, housing, transportation, schools, neighborhood amenities, and other critical resources. Participating cities include Baltimore, Detroit, Kansas City, MO, New Orleans, New York, Philadelphia, San Francisco, and Washington, D.C., as well as organizations and companies such as Redfin, Zillow, GreatSchools, PolicyLink and Streetwyze.

·         More than thirty additional non-profits, community organizations, coding boot camps, academic institutions, and local governments have already committed to use the Opportunity Project data to build stronger ladders of opportunity in communities across the country.

·         The Administration is issuing a Call to Action to the public to develop new tools, offer additional sources of data, deepen community engagement through the use of the data, and take other actions. We want to hear about what new steps you are taking or programs you are implementing to address these topics.

This project represents an important continuation of how the Federal government is working with communities and technologists to enhance the power of open data by making it more accessible to a wide variety of users across the country, and by facilitating collaborations between software developers and community members to build digital tools that make it easier for communities and families to solve their greatest challenges….(More)”
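The factsheet does not specify how the Opportunity Project package is served. Purely as an illustrative sketch of the kind of neighborhood-level query a technologist might run, the snippet below pulls median household income by census tract from the Census Bureau’s public data API; the endpoint is real, but the vintage, variable code, and geography codes are assumptions chosen for illustration.

```python
# Hedged sketch: query the Census Bureau's public ACS API for a
# neighborhood-level indicator. Vintage, variable, and FIPS codes below
# are illustrative assumptions, not the Opportunity Project's own format.
import requests

BASE = "https://api.census.gov/data/2014/acs5"  # ACS 5-year estimates (assumed vintage)
params = {
    "get": "NAME,B19013_001E",     # B19013_001E: median household income
    "for": "tract:*",              # every census tract...
    "in": "state:24 county:510",   # ...in Baltimore city, Maryland (FIPS 24/510)
}

rows = requests.get(BASE, params=params).json()
header, data = rows[0], rows[1:]   # first row is the column header
for name, income, *_ in data[:5]:
    print(f"{name}: ${income}")
```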

Ebola: A Big Data Disaster


Study by Sean Martin McDonald: “…undertaken with support from the Open Society Foundation, Ford Foundation, and Media Democracy Fund, explores the use of Big Data in the form of Call Detail Record (CDR) data in humanitarian crises.

It discusses the challenges of digital humanitarian coordination in health emergencies like the Ebola outbreak in West Africa, and the marked tension in the debate around experimentation with humanitarian technologies and the impact on privacy. McDonald’s research focuses on the two primary legal and human rights frameworks, privacy and property, to question the impact of unregulated use of CDRs on human rights. It also highlights how the diffusion of data science to the realm of international development constitutes a genuine opportunity to bring powerful new tools to fight crises and emergencies.

Analysing the risks of using CDRs to perform migration analysis and contact tracing without user consent, as well as the application of big data to disease surveillance, is an important entry point into the debate around the use of Big Data for development and humanitarian aid. The paper also raises crucial questions of legal significance about access to information, the limitation of data sharing, and the concept of proportionality in privacy invasion for the public good. These issues hold great relevance today, when big data and its emerging role in development, including its actual and potential uses as well as harms, is under consideration across the world.

The paper highlights the absence of a dialogue around the significant legal risks posed by the collection, use, and international transfer of personally identifiable data and humanitarian information, and the grey areas around assumptions of public good. The paper calls for a critical discussion around the experimental nature of data modelling in emergency response, where the mismanagement of information can undermine the very human rights such efforts aim to protect….

See Sean Martin McDonald – “Ebola: A Big Data Disaster” (PDF).
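The stakes are easier to weigh with a concrete picture of what CDR-based migration analysis involves. The sketch below is a minimal illustration, not McDonald’s method: it reduces pseudonymized call records (the field layout is an assumption; operator schemas vary) to tower-to-tower movement flows, the kind of derived data whose unregulated use the study questions.

```python
# Minimal sketch of CDR-based migration analysis: collapse pseudonymized
# call detail records into origin-destination flows between cell towers.
# Record layout (subscriber, timestamp, tower) is assumed for illustration.
# Note: even aggregated flows like these can re-identify individuals,
# which is precisely the privacy risk the paper highlights.
from collections import Counter

# (subscriber_id, unix_timestamp, tower_id) -- hypothetical records
cdrs = [
    ("u1", 1_000, "tower_A"), ("u1", 2_000, "tower_B"),
    ("u2", 1_500, "tower_A"), ("u2", 3_000, "tower_C"),
    ("u1", 4_000, "tower_B"),
]

# Order each subscriber's sightings in time, then count tower transitions.
by_user = {}
for user, ts, tower in sorted(cdrs):
    by_user.setdefault(user, []).append(tower)

flows = Counter()
for towers in by_user.values():
    for origin, dest in zip(towers, towers[1:]):
        if origin != dest:            # skip repeat pings at the same tower
            flows[(origin, dest)] += 1

print(flows)  # Counter({('tower_A', 'tower_B'): 1, ('tower_A', 'tower_C'): 1})
```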


Data Collaboratives: Matching Demand with Supply of (Corporate) Data to solve Public Problems


Blog by Stefaan G. Verhulst, Iryna Susha and Alexander Kostura: “Data Collaboratives refer to a new form of collaboration, beyond the public-private partnership model, in which participants from different sectors (private companies, research institutions, and government agencies) share data to help solve public problems. Several of society’s greatest challenges — from climate change to poverty — require greater access to big (but not always open) data sets, more cross-sector collaboration, and increased capacity for data analysis. Participants at the workshop and breakout session explored the various ways in which data collaboratives can help meet these needs.

Matching supply and demand of data emerged as one of the most important and overarching issues facing the big and open data communities. Participants agreed that more experimentation is needed so that new, innovative and more successful models of data sharing can be identified.

How to discover and enable such models? When asked how the international community might foster greater experimentation, participants indicated the need to develop the following:

· A responsible data framework that serves to build trust in sharing data. It would be based upon existing frameworks but also accommodate emerging technologies and practices, and it would need to be sensitive to public opinion and perception.

· Increased insight into different business models that may facilitate the sharing of data. As experimentation continues, the data community should map emerging practices and models of sharing so that successful cases can be replicated.

· Capacity to tap into the potential value of data. On the demand side, capacity refers to the ability to pose good questions, understand current data limitations, and seek new data sets responsibly. On the supply side, this means seeking shared value in collaboration, thinking creatively about public use of private data, and establishing norms of responsibility around security, privacy, and anonymity.

· A transparent stock of available data supply, including an inventory of what corporate data exist that can match multiple demands, shared through established networks and new collaborative institutional structures (a sketch of such an inventory record follows this list).

· Mapping emerging practices and models of sharing. Corporate data offers value not only for humanitarian action (which was a particular focus at the conference) but also for a variety of other domains, including science, agriculture, health care, urban development, environment, media and arts, and others. Gaining insight into the practices that emerge across sectors could broaden the spectrum of what is feasible and how.
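As a purely illustrative sketch of what one entry in such a transparent stock might record (the workshop did not propose a schema, and every field name here is an assumption):

```python
# Hypothetical inventory record for a corporate dataset offered to a data
# collaborative. A sketch only: no standard schema was agreed on, and all
# field names and values are invented for illustration.
dataset_record = {
    "title": "Aggregated mobile-network presence counts",
    "provider": "Example Telecom Co.",           # hypothetical supplier
    "domains": ["humanitarian", "urban development"],
    "granularity": "cell-tower area, hourly",
    "access_model": "trusted third party",       # one of the sharing models discussed
    "privacy_constraints": "pre-aggregated; no individual-level records leave the provider",
    "contact": "data-partnerships@example.com",
}
```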

In general, it was felt that understanding the business models underlying data collaboratives is of utmost importance in order to achieve win-win outcomes for both private and public sector players. Moreover, issues of public perception and trust were raised as important concerns of government organizations participating in data collaboratives….(More)”

Public-Private Partnerships for Statistics: Lessons Learned, Future Steps


Report by Nicholas Robin, Thilo Klein and Johannes Jütting for PARIS21: “Non-official sources of data, big data in particular, are currently attracting enormous interest in the world of official statistics. An impressive body of work focuses on how different types of big data (telecom data, social media, sensors, etc.) can be used to fill specific data gaps, especially with regard to the post-2015 agenda and the associated technology challenges. The focus of this paper is on a different aspect, but one that is of crucial importance: what are the perspectives of the commercial operations and national statistical offices which respectively produce and might use this data, and which incentives, business models and protocols are needed in order to leverage non-official data sources within the official statistics community?

Public-private partnerships (PPPs) offer significant opportunities such as cost effectiveness, timeliness, granularity and new indicators, but also present a range of challenges that need to be surmounted. These comprise technical difficulties, risks related to data confidentiality, as well as a lack of incentives.

Nevertheless, a number of collaborative projects have already emerged and can be classified into four ideal types: namely the in-house production of statistics by the data provider, the transfer of private data sets to the end user, the transfer of private data sets to a trusted third party for processing and/or analysis, and the outsourcing of national statistical office functions (the only model which is not centred around a data-sharing dimension). In developing countries, a severe lack of resources and particular statistical needs (to adopt a system-wide approach within national statistical systems and fill statistical gaps which are relevant to national development plans) highlight the importance of harnessing the private sector’s resources and point to the most holistic models (in-house and third party) in which the private sector contributes to the processing and analysis of data. The following key lessons are drawn from four case studies….(More)”

The Geography of Cultural Ties and Human Mobility: Big Data in Urban Contexts


Wenjie Wu, Jianghao Wang & Tianshi Dai in Annals of the American Association of Geographers: “A largely unexplored big data application in urban contexts is how cultural ties affect human mobility patterns. This article explores China’s intercity human mobility patterns from social media data to contribute to our understanding of this question. Exposure to human mobility patterns is measured by a big data computational strategy for identifying hundreds of millions of individuals’ space–time footprint trajectories. Linguistic data are coded as a proxy for cultural ties from a unique geographically coded atlas of dialect distributions. We find that cultural ties are associated with human mobility flows between city pairs, contingent on commuting costs and geographical distances. Such effects are not distributed evenly over time and space, however. These findings present useful insights in support of the cultural mechanism that can account for the rise, decline, and dynamics of human mobility between regions….(More)”
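The abstract does not give the estimating equation. As a concrete anchor only, relationships of this kind are often written as a gravity-style model; the stylized specification below, with a cultural-ties term added, is an assumption for illustration rather than the authors’ actual model.

```latex
% Stylized gravity-style model of intercity mobility (illustrative only):
%   F_{ij} : mobility flow between cities i and j
%   P_i, P_j : city populations;  d_{ij} : geographical distance
%   C_{ij} : cultural-tie proxy (e.g., dialect similarity)
\[
  F_{ij} = K \,\frac{P_i^{\alpha} P_j^{\beta}}{d_{ij}^{\gamma}}\, e^{\delta C_{ij}}
  \quad\Longleftrightarrow\quad
  \ln F_{ij} = \ln K + \alpha \ln P_i + \beta \ln P_j - \gamma \ln d_{ij} + \delta C_{ij}
\]
% A positive estimate of \delta would indicate that stronger cultural ties
% are associated with larger flows, conditional on city size and distance.
```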

Civic hacking as data activism and advocacy: A history from publicity to open government data


Andrew R Schrock in New Media and Society: “The civic hacker tends to be described as anachronistic, an ineffective “white hat” compared to more overtly activist cousins. By contrast, I argue that civic hackers’ politics emerged from a distinct historical milieu and include potentially powerful modes of political participation. The progressive roots of civic data hacking can be found in early 20th-century notions of “publicity” and the right to information movement. Successive waves of activists saw the Internet as a tool for transparency. The framing of openness shifted in meaning from information to data, weakening mechanisms for accountability even as it opened up new forms of political participation. Drawing on a year of interviews and participant observation, I suggest civic data hacking can be framed as a form of data activism and advocacy: requesting, digesting, contributing to, modeling, and contesting data. I conclude that civic hackers are utopian realists involved in the crafting of algorithmic power and discussing ethics of technology design. They may be misunderstood because open data remediates previous forms of openness. In the process, civic hackers transgress established boundaries of political participation….(More)”

Hoaxmap: Debunking false rumours about refugee ‘crimes’


Teo Kermeliotis at Al Jazeera: “Back in the summer of 2015, at the height of the ongoing refugee crisis, Karolin Schwarz started noticing a disturbing pattern.

Just as refugee arrivals in her town of Leipzig, eastern Germany, began to rise, so did the frequency of rumours over supposed crimes committed by those men, women and children who had fled war and hardship to reach Europe.

As months passed by, the allegations became even more common, increasingly popping up in social media feeds and often reproduced by mainstream news outlets.

The online map featured some 240 incidents in its first week [Source: Hoaxmap/Al Jazeera]


“The stories seemed to be [orchestrated] by far-right parties and organisations and I wanted to try to find some way to help organise this – maybe find patterns and give people a tool to look up these stories [when] they were being confronted with new ones.”

And so she did.

Along with 35-year-old developer Lutz Helm, Schwarz last week launched Hoaxmap, an online platform that allows people to separate fact from fiction by debunking false rumours about supposed crimes committed by refugees.

Using an interactive system of popping dots, the map documents and categorises where those “crimes” allegedly took place. It then counters that false information with official statements from the police and local authorities, as well as news reports in which the allegations have been disproved. The debunked cases marked on the map range from thefts and assaults to manslaughter – but one of the most common topics is rape, Schwarz said….(More)”
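The article does not describe Hoaxmap’s implementation. Purely as a sketch of the “dots with debunking pop-ups” pattern it describes, something like the following, using the folium mapping library with hypothetical incident fields, would produce a comparable interactive map:

```python
# Sketch of a Hoaxmap-style debunking map. Not Hoaxmap's actual code:
# the library choice and incident fields are assumptions for illustration.
import folium

incidents = [
    # (lat, lon, category, debunking) -- invented example data
    (51.34, 12.37, "theft", "Police confirm no such report was ever filed"),
    (52.52, 13.40, "assault", "Local authority statement disproves the claim"),
]

m = folium.Map(location=[51.16, 10.45], zoom_start=6)  # centered on Germany
for lat, lon, category, debunking in incidents:
    folium.CircleMarker(
        location=[lat, lon],
        radius=6,
        popup=f"{category}: {debunking}",  # the rebuttal shown on click
    ).add_to(m)

m.save("hoaxmap_sketch.html")  # writes a self-contained interactive map
```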

Global fact-checking up 50% in past year


Mark Stencel at Duke Reporters’ Lab: “The high volume of political truth-twisting is driving demand for political fact-checkers around the world, with the number of fact-checking sites up 50 percent since last year.

The Duke Reporters’ Lab annual census of international fact-checking currently counts 96 active projects in 37 countries. That’s up from 64 active fact-checkers in the 2015 count. (Map and List)

A bumper crop of new fact-checkers across the Western Hemisphere helped increase the ranks of journalists and government watchdogs who verify the accuracy of public statements and track political promises. The new sites include 14 in the United States and two in Canada, as well as seven additional fact-checkers in Latin America. There also were new projects in 10 other countries, from North Africa to Central Europe to East Asia…..

The growing numbers have even spawned a new global association, the International Fact-Checking Network hosted by the Poynter Institute, a media training center in St. Petersburg, Florida.

Promises, Promises

Some of the growth has come in the form of promise-tracking. Since January 2015, fact-checkers launched six sites in five countries devoted to tracking the status of pledges candidates and party leaders made in political campaigns. In Tunisia, there are two new sites dedicated to promise-tracking — one devoted to the country’s president and the other to its prime minister.

There are another 20 active fact-checkers elsewhere that track promises,…

Nearly two-thirds of the active fact-checkers (61 of 96, or 64 percent) are directly affiliated with a news organization. However, this breakdown reflects the dominant business structure in the United States, where 90 percent of fact-checkers are part of a news organization. That includes nine of 11 national projects and 28 of 30 state/local fact-checkers…The story is different outside the United States, where less than half of the active fact-checking projects (24 of 55, or 44 percent) are affiliated with news organizations.

The other fact-checkers are typically associated with non-governmental, non-profit and activist groups focused on civic engagement, government transparency and accountability. A handful are partisan, especially in conflict zones and in countries where the lines between independent media, activists and opposition parties are often blurry and where those groups are aligned against state-controlled media or other governmental and partisan entities….(More)

How Citizen Science Changed the Way Fukushima Radiation is Reported


Ari Beser at National Geographic: “It appears the world-changing event didn’t change anything, and it’s disappointing,” said Pieter Franken, a researcher at Keio University in Japan (Wide Project), the MIT Media Lab (Civic Media Centre), and co-founder of Safecast, a citizen-science network dedicated to the measurement and distribution of accurate levels of radiation around the world, especially in Fukushima. “There was a chance after the disaster for humanity to innovate our thinking about energy, and that doesn’t seem like it’s happened. But what we can change is the way we measure the environment around us.”

Franken and his founding partners found a way to turn their email chain, spurred by the tsunami, into Safecast, an open-source network that allows everyday people to contribute to radiation monitoring.

“We literally started the day after the earthquake happened,” revealed Pieter. “A friend of mine, Joi Ito, the director of MIT Media Lab, and I were basically talking about what Geiger counter to get. He was in Boston at the time and I was here in Tokyo, and like the rest of the world, we were worried, but we couldn’t get our hands on anything. There’s something happening here, we thought. Very quickly as the disaster developed, we wondered how to get the information out. People were looking for information, so we saw that there was a need. Our plan became: get information, put it together and disseminate it.”

An e-mail thread between Franken, Ito, and Sean Bonner (co-founder of CRASH Space, a group that bills itself as Los Angeles’ first hackerspace) evolved into a network of minds, including members of Tokyo Hackerspace, Dan Sythe, who produced high-quality Geiger counters, and Ray Ozzie, Microsoft’s former Chief Technical Officer. On April 15, the group that was to become Safecast sat down together for the first time. Ozzie conceived the plan to strap a Geiger counter to a car and somehow log measurements in motion. This would become the bGeigie, Safecast’s future model of the do-it-yourself Geiger counter kit.

Armed with a few Geiger counters donated by Sythe, the newly formed team retrofitted their radiation-measuring devices to the outside of a car.  Safecast’s first volunteers drove up to the city of Koriyama in Fukushima Prefecture, and took their own readings around all of the schools. Franken explained, “If we measured all of the schools, we covered all the communities; because communities surround schools. It was very granular, the readings changed a lot, and the levels were far from academic, but it was our start. This was April 24, 6 weeks after the disaster. Our thinking changed quite a bit through this process.”

With the DIY kit available online, all anyone needs to make their own Geiger counter is a soldering iron and the suggested directions.

Since their first tour of Koriyama, with the help of a successful Kickstarter campaign, Safecast’s team of volunteers has developed the bGeigie handheld radiation monitor, which anyone can buy on Amazon.com and construct with the suggested instructions available online. So far over 350 users have contributed 41 million readings, using around a thousand fixed, mobile, and crowd-sourced devices….(More)
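For context on what those 41 million readings contain: a bGeigie-style log reduces to geotagged counts-per-minute (CPM). The sketch below converts CPM to an approximate dose rate; the conversion factor is a commonly cited value for pancake-style tubes like Safecast’s, but treat it as an assumption, since the factor is specific to the tube and the radiation energies it is calibrated against.

```python
# Hedged sketch: convert geotagged Geiger counter readings (CPM) to an
# approximate ambient dose rate. The ~334 CPM per microsievert/hour factor
# is an assumption (often cited for pancake-type tubes); it is not a
# universal constant and varies by tube model and source energy.
CPM_PER_USV_H = 334.0  # assumed tube-specific conversion factor

readings = [
    # (latitude, longitude, counts_per_minute) -- hypothetical samples
    (37.40, 140.38, 105),
    (37.41, 140.39, 3360),
]

for lat, lon, cpm in readings:
    usv_per_h = cpm / CPM_PER_USV_H
    print(f"({lat:.2f}, {lon:.2f})  {cpm:5d} CPM  ~ {usv_per_h:.2f} uSv/h")
```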

Big data’s big role in humanitarian aid


Mary K. Pratt at Computerworld: “Hundreds of thousands of refugees streamed into Europe in 2015 from Syria and other Middle Eastern countries. Some estimates put the number at nearly a million.

The sheer volume of people overwhelmed European officials, who not only had to handle the volatile politics stemming from the crisis, but also had to find food, shelter and other necessities for the migrants.

Sweden, like many of its European Union counterparts, was taking in refugees. The Swedish Migration Board, which usually sees 2,500 asylum seekers in an average month, was accepting 10,000 per week.

“As you can imagine, with that number, it requires a lot of buses, food, registration capabilities to start processing all the cases and to accommodate all of those people,” says Andres Delgado, head of operational control, coordination and analysis at the Swedish Migration Board.

Despite the dramatic spike in refugees coming into the country, the migration agency managed the intake — hiring extra staff, starting the process of procuring housing early, getting supplies ready. Delgado credits a good part of that success to his agency’s use of big data and analytics that let him predict, with a high degree of accuracy, what was heading his way.

“Without having that capability, or looking at the tool every day, to assess every need, this would have crushed us. We wouldn’t have survived this,” Delgado says. “It would have been chaos, actually — nothing short of that.”

The Swedish Migration Board has been using big data and analytics for several years, as it seeks to gain visibility into immigration trends and what those trends will mean for the country…./…
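The article does not say which models the Swedish Migration Board uses. As a hedged illustration of the kind of short-horizon forecasting such planning requires, a minimal exponential-smoothing pass over weekly arrival counts (numbers invented) looks like this:

```python
# Illustrative one-step-ahead forecast of weekly asylum applications using
# simple exponential smoothing. Not the Migration Board's actual model;
# the arrival counts and smoothing weight are invented for the example.
def exp_smooth_forecast(series, alpha=0.4):
    """Smooth the series and return the one-step-ahead forecast."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level  # blend new data into the level
    return level

weekly_arrivals = [2500, 3100, 4200, 6800, 9500, 10000]  # hypothetical counts
print(f"Next-week forecast: {exp_smooth_forecast(weekly_arrivals):.0f} applications")
```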

“Can big data give us peace? I think the short answer is we’re starting to explore that. We’re at the very early stages, where there are shining examples of little things here and there. But we’re on that road,” says Kalev H. Leetaru, creator of the GDELT Project, or the Global Database of Events, Language and Tone, which describes itself as a comprehensive “database of human society.”

The topic is gaining traction. A 2013 report, “New Technology and the Prevention of Violence and Conflict,” from the International Peace Institute, highlights uses of telecommunications technology, including data, in several crisis situations around the world. The report emphasizes the potential these technologies hold in helping to ease tensions and address problems.

The report’s conclusion offers this idea: “Big data can be used to identify patterns and signatures associated with conflict — and those associated with peace — presenting huge opportunities for better-informed efforts to prevent violence and conflict.”

That’s welcome news to Noel Dickover. He’s the director of PeaceTech Data Networks at the PeaceTech Lab, which was created by the U.S. Institute of Peace (USIP) to advance USIP’s work on how technology, media and data help reduce violent conflict around the world.

Such work is still in the nascent stages, Dickover says, but people are excited about its potential. “We have unprecedented amounts of data on human sentiment, and we know there’s value there,” he says. “The question is how to connect it.”

Dickover is working on ways to do just that. One example is the Open Situation Room Exchange (OSRx) project, which aims to “empower greater collective impact in preventing or mitigating serious violent conflicts in particular arenas through collaboration and data-sharing.”…(More)