All of Us Research Program Expands Data Collection Efforts with Fitbit


NIH Press Release: “The All of Us Research Program has launched the Fitbit Bring-Your-Own-Device (BYOD) project. Now, in addition to providing health information through surveys, electronic health records, and biosamples, participants can choose to share data from their Fitbit accounts to help researchers make discoveries. The project is a key step for the program in integrating digital health technologies for data collection.

Digital health technologies, like mobile apps and wearable devices, can gather data outside of a hospital or clinic. This data includes information about physical activity, sleep, weight, heart rate, nutrition, and water intake, which can give researchers a more complete picture of participants’ health. The All of Us Research Program is now gathering this data in addition to surveys, electronic health record information, physical measurements, and blood and urine samples, working to make the All of Us resource one of the largest and most diverse data sets of its kind for health research.

“Collecting real-world, real-time data through digital technologies will become a fundamental part of the program,” said Eric Dishman, director of the All of Us Research Program. “This information, in combination with many other data types, will give us an unprecedented ability to better understand the impact of lifestyle and environment on health outcomes and, ultimately, develop better strategies for keeping people healthy in a very precise, individualized way.”…

All of Us is developing additional plans to incorporate digital health technologies. A second project with Fitbit is expected to launch later in the year; it will provide devices to a limited number of randomly invited All of Us participants, enabling them to share wearable data with the program. And All of Us will add connections to other devices and apps in the future to further expand data collection efforts and engage participants in new ways….(More)”.

Swarm AI Outperforms in Stanford Medical Study


Press Release: “Stanford University School of Medicine and Unanimous AI presented a new study today showing that a small group of doctors, connected by intelligence algorithms that enable them to work together as a “hive mind,” could achieve higher diagnostic accuracy than the individual doctors or machine learning algorithms alone.  The technology used is called Swarm AI and it empowers networked human groups to combine their individual insights in real-time, using AI algorithms to converge on optimal solutions.

As presented at the 2018 SIIM Conference on Machine Intelligence in Medical Imaging, the study tasked a group of experienced radiologists with diagnosing the presence of pneumonia in chest X-rays. This is one of the most widely performed imaging procedures in the US, with more than 1 million adults hospitalized with pneumonia each year. But, despite this prevalence, accurately diagnosing X-rays is highly challenging with significant variability across radiologists. This makes it both an optimal task for applying new AI technologies, and an important problem to solve for the medical community.

When generating diagnoses using Swarm AI technology, the average error rate was reduced by 33% compared to traditional diagnoses by individual practitioners.  This is an exciting result, showing the potential of AI technologies to amplify the accuracy of human practitioners while maintaining their direct participation in the diagnostic process.

Swarm AI technology was also compared to the state-of-the-art in automated diagnosis using software algorithms that do not employ human practitioners.  Currently, the best system in the world for the automated diagnosing of pneumonia from chest X-rays is the CheXNet system from Stanford University, which made headlines in 2017 by significantly outperforming individual practitioners using deep-learning derived algorithms.

The Swarm AI system, which combines real-time human insights with AI technology, was 22% more accurate in binary classification than the software-only CheXNet system.  In other words, by connecting a group of radiologists into a medical “hive mind”, the hybrid human-machine system was able to outperform individual human doctors as well as the state-of-the-art in deep-learning derived algorithms….(More)”.
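Unanimous's Swarm AI platform is proprietary, but the basic statistical intuition behind these results — that aggregating many independent, better-than-chance diagnoses drives the error rate down — can be sketched with a simple majority vote. This is a much cruder aggregation than real-time swarming, and the reader count and per-reader accuracy below are illustrative assumptions, not figures from the study:

```python
import random

random.seed(0)

def simulate(n_cases=10_000, n_readers=5, p_correct=0.80):
    """Compare a single simulated reader against a majority vote of readers."""
    ind_errors, vote_errors = 0, 0
    for _ in range(n_cases):
        # votes[i] == 1 means reader i diagnosed this case correctly;
        # each reader is an independent classifier right with prob. p_correct.
        votes = [1 if random.random() < p_correct else 0 for _ in range(n_readers)]
        ind_errors += votes[0] == 0                  # a lone reader's mistakes
        vote_errors += sum(votes) <= n_readers // 2  # the majority got it wrong
    return ind_errors / n_cases, vote_errors / n_cases

ind, vote = simulate()
print(f"individual error: {ind:.3f}, majority-vote error: {vote:.3f}")
```

With five independent 80%-accurate readers, the majority's error rate falls well below any individual's — a toy version of why connecting practitioners into a "hive mind" can outperform each of them alone.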

On International Day of Democracy, International Leaders Call for More Open Public Institutions


Press Release: “As the United Nations celebrates the International Day of Democracy on September 15 with its theme of “Democracy Under Strain,” The Governance Lab (The GovLab) at the NYU Tandon School of Engineering will unveil its CrowdLaw Manifesto to strengthen public participation in lawmaking by encouraging citizens to help build, shape, and influence the laws and policies that affect their daily lives.

Among its 12 calls to action to individuals, legislatures, researchers, and technology designers, the manifesto encourages the public to demand, and institutions to create, new mechanisms that harness collective intelligence to improve the quality of lawmaking, along with more research on what works, in order to build a global movement for participatory democracy.

The CrowdLaw Manifesto emerged from a collaborative effort of 20 international experts and CrowdLaw community members. At a convening held earlier this year by The GovLab at The Rockefeller Foundation Bellagio Center in Italy, government leaders, academics, NGOs, and technologists formulated the CrowdLaw Manifesto to detail the initiative’s foundational principles and to encourage greater implementation of CrowdLaw practices to improve governance through 21st century technology and tools….

“The successes of the CrowdLaw concept – and its remarkably rapid adoption across the world by citizens seeking to effect change – exemplify the powerful force that academia can exert when working in concert with government and citizens,” said NYU Tandon Dean Jelena Kovačević. “On behalf of the NYU Tandon School of Engineering, I proudly sign the CrowdLaw Manifesto and congratulate The GovLab and its collaborators for creating these digital tools and momentum for good government.”…(More)”.

How Mobile Network Operators Can Help Achieve the Sustainable Development Goals Profitably


Press Release: “Today, the Digital Impact Alliance (DIAL) released its second paper in a series focused on the promise of data for development (D4D). The paper, Leveraging Data for Development to Achieve Your Triple Bottom Line: Mobile Network Operators with Advanced Data for Good Capabilities See Stronger Impact to Profits, People and the Planet, will be presented at GSMA’s Mobile 360 Africa in Kigali.

“The mobile industry has already taken a driving seat in helping reach the Sustainable Development Goals by 2030 and this research reinforces the role mobile network operators in lower-income economies can play to leverage their network data for development and build a new data business safely and securely,” said Kate Wilson, CEO of the Digital Impact Alliance. “Mobile network operators (MNOs) hold unique data on customers’ locations and behaviors that can help development efforts. They have been reluctant to share data because there are inherent business risks and to do so has been expensive and time consuming.  DIAL’s research illustrates a path forward for MNOs on which data is useful to achieve the SDGs and why acting now is critical to building a long-term data business.”

DIAL worked with Altai Consulting on both primary and secondary research to inform this latest paper.  Primary research included one-on-one in-depth interviews with more than 50 executives across the data for development value chain, including government officials, civil society leaders, mobile network operators and other private sector representatives from both developed and emerging markets. These interviews help inform how operators can best tap into the shared value creation opportunities data for development provides.

Key findings from the in-depth interviews include:

  • There are several critical barriers that have prevented scaled use of mobile data for social good – including 1) unclear market opportunities, 2) not enough collaboration among MNOs, governments and non-profit stakeholders and 3) regulatory and privacy concerns;
  • While it may be an ideal time for MNOs to increase their involvement in D4D efforts given the unique data they have that can inform development, market shifts suggest the window of opportunity to implement large-scale D4D initiatives will likely not remain open for much longer;
  • Mobile Network Operators with advanced data for good capabilities will have the most success in establishing sustainable D4D efforts and, as a result, in achieving triple bottom line mandates; and
  • Mobile Network Operators should focus on providing value-added insights and services rather than raw data and drive pricing and product innovation to meet the sector’s needs.

“Private sector data availability to drive public sector decision-making is a critical enabler for meeting SDG targets,” said Syed Raza, Senior Director of the Data for Development Team at the Digital Impact Alliance.  “Our data for development paper series aims to elevate the efforts of our industry colleagues with the information, insights and tools they need to help drive ethical innovation in this space….(More)”.

Research Shows Political Acumen, Not Just Analytical Skills, is Key to Evidence-Informed Policymaking


Press Release: “Results for Development (R4D) has released a new study unpacking how evidence translators play a key and somewhat surprising role in ensuring policymakers have the evidence they need to make informed decisions. Translators — who can be evidence producers, policymakers, or intermediaries such as journalists, advocates and expert advisors — identify, filter, interpret, adapt, contextualize and communicate data and evidence for the purposes of policymaking.

The study, Translators’ Role in Evidence-Informed Policymaking, provides a better understanding of who translators are and how different factors influence translators’ ability to promote the use of evidence in policymaking. This research shows translation is an essential function and that, absent individuals or organizations taking up the translator role, evidence translation and evidence-informed policymaking often do not take place.

“We began this research assuming that translators’ technical skills and analytical prowess would prove to be among the most important factors in predicting when and how evidence made its way into public sector decision making,” Nathaniel Heller, executive vice president for integrated strategies at Results for Development, said. “Surprisingly, that turned out not to be the case, and other ‘soft’ skills play a far larger role in translators’ efficacy than we had imagined.”

Key findings include:

  • Translator credibility and reputation are crucial to the ability to gain access to policymakers and to promote the uptake of evidence.
  • Political savvy and stakeholder engagement are among the most critical skills for effective translators.
  • Conversely, analytical skills and the ability to adapt, transform and communicate evidence were identified as being less important stand-alone translator skills.
  • Evidence translation is most effective when initiated by those in power or when translators place those in power at the center of their efforts.

The study includes a definitional and theoretical framework as well as a set of research questions about key enabling and constraining factors that might affect evidence translators’ influence. It also focuses on two cases in Ghana and Argentina to validate or debunk some of the intellectual frameworks around policy translators that R4D and others in the field have already developed. The first case focuses on Ghana’s blue-ribbon commission formed by the country’s president in 2015, which was tasked with reviewing Ghana’s national health insurance scheme. The second case looks at Buenos Aires’ 2016 government-led review of the city’s right to information regime….(More)”.

Unlocking of government’s mapping and location data to boost economy by £130m a year


UK Government Press Release: “…the government has announced that key parts of the OS MasterMap will be made openly available for the public and businesses to use.

It is estimated that this will boost the UK economy by at least £130m each year, as innovative companies and startups use the data.

The release of OS MasterMap data is one of the first projects to be delivered by the new Geospatial Commission, in conjunction with Ordnance Survey. The aim is to continue to drive forward the UK as a world leader in location data, helping to grow the UK’s digital economy by an estimated £11bn each year.

This is a step on a journey towards more open geospatial data infrastructure for the UK.

Chancellor of the Duchy of Lancaster and Minister for the Cabinet Office, David Lidington, said:

Opening up OS MasterMap underlines this Government’s commitment to ensuring the UK continues to lead the way in digital innovation. Releasing this valuable government data for free will help stimulate innovation in the economy, generate jobs and improve public services.

Location-aware technologies – using geospatial data – are revolutionising our economy. From navigating public transport to tracking supply chains and planning efficient delivery routes, these digital services are built on location data that has become part of everyday life and business.

The newly available data should be particularly useful to small firms and entrepreneurs to realise their ideas and compete with larger organisations, encouraging greater competition and innovation….(More)”.

New Repository of Government Data Visualizations and Maps


Press Release: “Data-Smart City Solutions, a program of Harvard Kennedy School’s Ash Center for Democratic Governance and Innovation, today launched a searchable public database comprising cutting-edge examples of public sector data use. The “Solutions Search” indexes interactive maps and visualizations, spanning civic issue areas such as transportation, public health, and housing, that are helping data innovators more accurately understand and illustrate challenges, leading to optimized solutions.

The new user-friendly public database includes 200 data-driven models for civic technologists, community organizations, and government employees. “By showcasing successful data-driven initiatives from across the country, we have the opportunity to help city leaders learn from each other and avoid reinventing the wheel,” noted Stephen Goldsmith, Daniel Paul Professor of the Practice of Government and faculty director of the Innovations in Government Program at the Ash Center, who also leads the Civic Analytics Network, a national network of municipal chief data officers.

This new Harvard database spans city, county, state, and federal levels, and features a wide variety of interventions and initiatives, including maps, data visualizations, and dashboards. Examples include the California Report Card and GradeDC.gov, dashboards that measure community health and run on citizen input, allowing residents to rank various city services and agencies. Users can also find Redlining Louisville: The History of Race, Class, and Real Estate, a visualization that explores the impact of disinvestment in Louisville neighborhoods….(More)”.

Linux Foundation Debuts Community Data License Agreement


Press Release: “The Linux Foundation, the nonprofit advancing professional open source management for mass collaboration, today announced the Community Data License Agreement (CDLA) family of open data agreements. In an era of expansive and often underused data, the CDLA licenses are an effort to define a licensing framework to support collaborative communities built around curating and sharing “open” data.

Inspired by the collaborative software development models of open source software, the CDLA licenses are designed to enable individuals and organizations of all types to share data as easily as they currently share open source software code. Soundly drafted licensing models can help people form communities to assemble, curate and maintain vast amounts of data, measured in petabytes and exabytes, to bring new value to communities of all types, to build new business opportunities and to power new applications that promise to enhance safety and services.

The growth of big data analytics, machine learning and artificial intelligence (AI) technologies has allowed people to extract unprecedented levels of insight from data. Now the challenge is to assemble the critical mass of data for those tools to analyze. The CDLA licenses are designed to help governments, academic institutions, businesses and other organizations open up and share data, with the goal of creating communities that curate and share data openly.

For instance, if automakers, suppliers and civil infrastructure services can share data, they may be able to improve safety, decrease energy consumption and improve predictive maintenance. Self-driving cars are heavily dependent on AI systems for navigation, and need massive volumes of data to function properly. Once on the road, they can generate nearly a gigabyte of data every second. For the average car, that means two petabytes of sensor, audio, video and other data each year.
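The per-car figures can be sanity-checked with quick arithmetic: at roughly a gigabyte per second, two petabytes a year corresponds to about an hour and a half of driving per day. This is a back-of-the-envelope sketch; the daily driving time is inferred here, not stated in the release:

```python
GB = 10**9
PB = 10**15

rate = 1 * GB    # bytes generated per second of driving (press-release figure)
annual = 2 * PB  # claimed yearly data volume for an average car

driving_seconds = annual / rate
hours_per_day = driving_seconds / 3600 / 365
print(f"{driving_seconds:,.0f} s of driving/year ≈ {hours_per_day:.1f} h/day")
```

Two million seconds of driving per year works out to roughly 1.5 hours per day, which is consistent with typical commuting patterns, so the release's two figures hang together.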

Similarly, climate modeling can integrate measurements captured by government agencies with simulation data from other organizations and then use machine learning systems to look for patterns in the information. It’s estimated that a single model can yield a petabyte of data, a volume that challenges standard computer algorithms, but is useful for machine learning systems. This knowledge may help improve agriculture or aid in studying extreme weather patterns.

And if government agencies share aggregated data on building permits, school enrollment figures, and sewer and water usage, their citizens benefit from the ability of commercial entities to anticipate future needs and respond with infrastructure and facilities before demand materializes.

“An open data license is essential for the frictionless sharing of the data that powers both critical technologies and societal benefits,” said Jim Zemlin, Executive Director of The Linux Foundation. “The success of open source software provides a powerful example of what can be accomplished when people come together around a resource and advance it for the common good. The CDLA licenses are a key step in that direction and will encourage the continued growth of applications and infrastructure.”…(More)”.

TfL’s free open data boosts London’s economy


Press Release by Transport for London: “Research by Deloitte shows that the release of open data by TfL is generating annual economic benefits and savings of up to £130m a year…

TfL has worked with a wide range of professional and amateur developers, ranging from start-ups to global innovators, to deliver new products in the form that customers want. This has led to more than 600 apps now powered by TfL’s open data feeds, used by 42 per cent of Londoners.

The report found that TfL’s data provides the following benefits:

  • Saved time for passengers. TfL’s open data allows customers to plan journeys more accurately using apps with real-time information and advice on how to adjust their routes. This provides greater certainty on when the next bus/Tube will arrive and saves time – estimated at between £70m and £90m per year.
  • Better information to plan journeys, travel more easily and take more journeys. Customers can use apps to better plan journeys, enabling them to use TfL services more regularly and access other services. Conservatively, the value of these journeys is estimated at up to £20m per year.
  • Creating commercial opportunities for third party developers. A wide range of companies now use TfL’s open data commercially to help generate revenue, many of whom are based in London. Having free and up-to-date access to this data increases the ‘Gross Value Add’ (analogous to GDP) that these companies contribute to the London economy, both directly and across the supply chain and wider economy, by between £12m and £15m per year.
  • Leveraging value and savings from partnerships with major customer facing technology platform owners. TfL receives back significant data in areas where it does not collect data itself (e.g. crowdsourced traffic data). This allows TfL to get an even better understanding of journeys in London and improve its operations….(More)”.
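As a concrete illustration of how apps consume such real-time feeds, the sketch below processes a hand-made JSON sample shaped loosely like an arrivals prediction. The field names and values are assumptions for illustration, not TfL's exact API schema:

```python
import json

# Hand-made sample shaped loosely like a real-time arrivals feed
# (illustrative only; not TfL's actual response format).
sample = json.loads("""
[
  {"lineName": "Victoria", "stationName": "Oxford Circus", "timeToStation": 95},
  {"lineName": "Victoria", "stationName": "Oxford Circus", "timeToStation": 260},
  {"lineName": "Victoria", "stationName": "Oxford Circus", "timeToStation": 430}
]
""")

def next_arrivals(predictions, limit=2):
    """Return the soonest predicted arrivals, in minutes, sorted ascending."""
    times = sorted(p["timeToStation"] for p in predictions)  # seconds
    return [round(t / 60, 1) for t in times[:limit]]

print(next_arrivals(sample))  # the soonest predicted arrivals, in minutes
```

A journey-planning app layering this kind of transformation over a live feed is what turns raw open data into the passenger time savings the report quantifies.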

A Guide to Tactical Data Engagement


Sunlight Foundation Press Release: “A Guide to Tactical Data Engagement is a brand new resource released today designed to help city leaders and residents collaborate on increasing the social impact of open government data. Based on the core concepts of human-centered design and tactical urbanism, this approach challenges city halls to make open data programs more transparent, accountable, and participatory by actively helping residents use open government data to improve their communities.

The new guide outlines a four-step process to help readers complete a resident-informed project, product, or tool that addresses a specific community need:

  • Find a focus area by observing the community
  • Refine use cases by interviewing stakeholders
  • Design a plan by coordinating with target users
  • Implement an intervention by collaborating with actual users

Readers can carry out each of these steps “tactically” — using lightweight, adaptable, and inexpensive tactics that can realistically fit within a city hall’s or community’s unique constraints and capacities. The tactics are drawn from examples of good resident engagement around the country, and ensure that in every step, residents are collaborators in determining promising opportunities for impact through the community use of open government data.

Read the new guide to see the full process, including specific ideas at each step of the way to help your community come together and use open data to solve problems, together….”