Using Crowdsourcing to Track the Next Viral Disease Outbreak


The Takeaway: “Last year’s Ebola outbreak in West Africa killed more than 11,000 people. The epidemic may have diminished, but public health officials think that another major outbreak of infectious disease is fast approaching, and they’re busy preparing for it.

Boston public radio station WGBH recently partnered with The GroundTruth Project and NOVA Next on a series called “Next Outbreak.” As part of the series, they reported on an innovative global online monitoring system called HealthMap, which uses the power of the internet and crowdsourcing to detect and track emerging infectious diseases, and also more common ailments like the flu.

Researchers at Boston Children’s Hospital are the ones behind HealthMap, and they use it to tap into tens of thousands of sources of online data, including social media, news reports, and blogs to curate information about outbreaks. Dr. John Brownstein, chief innovation officer at Boston Children’s Hospital and co-founder of HealthMap, says that smarter data collection can help to quickly detect and track emerging infectious diseases, fatal or not.
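
HealthMap’s own pipeline is proprietary and far more sophisticated, but the core idea — scanning streams of informal online text for mentions of diseases and places — can be sketched in a few lines. The headlines, keyword lists, and matching rule below are purely illustrative assumptions, not HealthMap’s implementation:

```python
# Illustrative sketch (not HealthMap's code): flag news headlines that pair a
# disease keyword with a place keyword, the core of keyword-based outbreak scanning.

DISEASES = {"ebola", "cholera", "measles", "influenza", "flu"}
PLACES = {"monrovia", "freetown", "conakry", "lagos", "boston"}

headlines = [
    "Health ministry confirms new Ebola cases near Monrovia",
    "Traffic congestion worsens in Lagos ahead of holidays",
    "Cholera outbreak feared in Freetown after flooding",
]

def scan(headline: str):
    # Whole-word matching on a lowercased, lightly de-punctuated headline.
    words = {w.strip(".,").lower() for w in headline.split()}
    diseases = words & DISEASES
    places = words & PLACES
    if diseases and places:
        return {"diseases": sorted(diseases), "places": sorted(places), "text": headline}
    return None

alerts = [a for a in map(scan, headlines) if a]
for alert in alerts:
    print(alert)
```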

“Traditional public health is really slowed down by the communication process: People get sick, they’re seen by healthcare providers, they get laboratory confirmed, information flows up the channels to state and local health [agencies], national governments, and then to places like the WHO,” says Dr. Brownstein. “Each one of those stages can take days, weeks, or even months, and that’s the problem if you’re thinking about a virus that can spread around the world in a matter of days.”

The HealthMap team looks at a variety of communication channels to undo the existing hierarchy of health information.

“We make everyone a stakeholder when it comes to data about outbreaks, including consumers,” says Dr. Brownstein. “There’s a suite of different tools that public health officials have at their disposal. What we’re trying to do is think about how to communicate and empower individuals to really understand what the risks are, what the true information is about a disease event, and what they can do to protect themselves and their families. It’s all about trying to demystify outbreaks.”

In addition to the map itself, the HealthMap team has a number of interactive tools that individuals can both use and contribute to. Dr. Brownstein hopes these resources will enable the public to care more about disease outbreaks that may be happening around them—it’s a way to put the “public” back in “public health,” he says.

“We have an app called Outbreaks Near Me that allows people to know what disease outbreaks are happening in their neighborhood,” Dr. Brownstein says. “Flu Near You is an app that people use to self-report symptoms; Vaccine Finder is a tool that allows people to know what vaccines are available to them and their community.”

In addition to developing its own apps, the HealthMap team has partnered with existing tech firms like Uber to spread the word about public health.

“We worked closely with Uber last year and actually put nurses in Uber cars and delivered vaccines to people,” Dr. Brownstein says. “The closest vaccine location might still be only a block away for people, but people are still hesitant to get it done.”…(More)”

Mobile data: Made to measure


Neil Savage in Nature: “For decades, doctors around the world have been using a simple test to measure the cardiovascular health of patients. They ask them to walk on a hard, flat surface and see how much distance they cover in six minutes. This test has been used to predict the survival rates of lung transplant candidates, to measure the progression of muscular dystrophy, and to assess overall cardiovascular fitness.

The walk test has been studied in many trials, but even the biggest rarely top a thousand participants. Yet when Euan Ashley launched a cardiovascular study in March 2015, he collected test results from 6,000 people in the first two weeks. “That’s a remarkable number,” says Ashley, a geneticist who heads Stanford University’s Center for Inherited Cardiovascular Disease. “We’re used to dealing with a few hundred patients, if we’re lucky.”

Numbers on that scale, he hopes, will tell him a lot more about the relationship between physical activity and heart health. The reason they can be achieved is that millions of people now have smartphones and fitness trackers with sensors that can record all sorts of physical activity. Health researchers are studying such devices to figure out what sort of data they can collect, how reliable those data are, and what they might learn when they analyse measurements of all sorts of day-to-day activities from many tens of thousands of people and apply big-data algorithms to the readings.

By July, more than 40,000 people in the United States had signed up to participate in Ashley’s study, which uses an iPhone application called MyHeart Counts. He expects the numbers to surge as the app becomes more widely available around the world. The study — designed by scientists, approved by institutional review boards, and requiring informed consent — asks participants to answer questions about their health and risk factors, and to use their phone’s motion sensors to collect data about their activities for seven days. They also do a six-minute walk test, and the phone measures the distance they cover. If their own doctors have ordered blood tests, users can enter information such as cholesterol or glucose measurements. Every three months, the app checks back to update their data.

Physicians know that physical activity is a strong predictor of long-term heart health, Ashley says. But it is less clear what kind of activity is best, or whether different groups of people do better with different types of exercise. MyHeart Counts may open a window on such questions. “We can start to look at subgroups and find differences,” he says.

It is the volume of the data that makes such studies possible. In traditional studies, there may not be enough data to find statistically significant results for such subgroups. And rare events may not occur in the smaller samples, or may produce a signal so weak that it is lost in statistical noise. Big data can overcome those problems, and if the data set is big enough, small errors can be smoothed out. “You can take pretty noisy data, but if you have enough of it, you can find a signal,” Ashley says….(More)”.
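
Ashley’s point about volume is, at bottom, the law of large numbers: a small real effect that is invisible in a few hundred noisy measurements becomes clear across tens of thousands. A minimal simulation (illustrative only — the group sizes, means, and noise level are invented, and this is not the MyHeart Counts analysis) makes the idea concrete:

```python
# Illustrative simulation: a small true difference in six-minute-walk distance
# between two groups is buried in noise at n=100 but clear at n=50,000.
import random
import statistics

random.seed(0)

def estimated_difference(n, true_diff=5.0, sd=80.0):
    # Group A walks on average 5 m further than group B; individual noise is large.
    a = [random.gauss(500 + true_diff, sd) for _ in range(n)]
    b = [random.gauss(500, sd) for _ in range(n)]
    return statistics.mean(a) - statistics.mean(b)

for n in (100, 1_000, 50_000):
    print(f"n={n:>6}: estimated difference = {estimated_difference(n):6.2f} m")
# With n=100 the estimate swings widely around the true 5 m difference;
# with n=50,000 it settles close to it.
```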

How big data and The Sims are helping us to build the cities of the future


The Next Web: “By 2050, the United Nations predicts that around 66 percent of the world’s population will be living in urban areas. It is expected that the greatest expansion will take place in developing regions such as Africa and Asia. Cities in these parts will be challenged to meet the needs of their residents, and provide sufficient housing, energy, waste disposal, healthcare, transportation, education and employment.

So, understanding how cities will grow – and how we can make them smarter and more sustainable along the way – is a high priority among researchers and governments the world over. We need to get to grips with the inner mechanisms of cities, if we’re to engineer them for the future. Fortunately, there are tools to help us do this. And even better, using them is a bit like playing SimCity….

Cities are complex systems. Increasingly, scientists studying cities have gone from thinking about “cities as machines” to approaching “cities as organisms”. Viewing cities as complex, adaptive organisms – similar to natural systems like termite mounds or slime mould colonies – allows us to gain unique insights into their inner workings.

…So, if cities are like organisms, it follows that we should examine them from the bottom up, and seek to understand how unexpected large-scale phenomena emerge from individual-level interactions. Specifically, we can simulate how the behaviour of individual “agents” – whether they are people, households, or organisations – affects the urban environment, using a set of techniques known as “agent-based modelling”.

…These days, increases in computing power and the proliferation of big data give agent-based modelling unprecedented power and scope. One of the most exciting developments is the potential to incorporate people’s thoughts and behaviours. In doing so, we can begin to model the impacts of people’s choices on present circumstances, and the future.

For example, we might want to know how changes to the road layout might affect crime rates in certain areas. By modelling the activities of individuals who might try to commit a crime, we can see how altering the urban environment influences how people move around the city, the types of houses that they become aware of, and consequently which places have the greatest risk of becoming the targets of burglary.
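
Agent-based models of this kind are conceptually simple: give each agent a position, a movement rule, and a decision rule, then watch the aggregate pattern emerge. The toy sketch below — with an invented grid, movement rule, and exposure measure, not any published crime model — shows the basic mechanics:

```python
# Toy agent-based sketch: offender agents wander a grid; cells they pass through
# accumulate "exposure" (how well known they become to potential burglars).
# Changing the movement rule is the stand-in for changing the road layout.
import random
from collections import Counter

random.seed(1)
SIZE, STEPS, AGENTS = 20, 200, 50
exposure = Counter()

def neighbours(x, y):
    # Movement rule: 4-connected grid; a real model would encode the road network.
    return [(nx, ny) for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
            if 0 <= nx < SIZE and 0 <= ny < SIZE]

for _ in range(AGENTS):
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    for _ in range(STEPS):
        x, y = random.choice(neighbours(x, y))
        exposure[(x, y)] += 1  # this cell becomes known to the agent

print("Most-exposed cells:", exposure.most_common(5))
```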

To fully realise the goal of simulating cities in this way, models need a huge amount of data. For example, to model the daily flow of people around a city, we need to know what kinds of things people spend their time doing, where they do them, who they do them with, and what drives their behaviour.

Without good-quality, high-resolution data, we have no way of knowing whether our models are producing realistic results. Big data could offer researchers a wealth of information to meet these twin needs. The kinds of data that are exciting urban modellers include:

  • Electronic travel cards that tell us how people move around a city.
  • Twitter messages that provide insight into what people are doing and thinking.
  • The density of mobile telephones that hints at the presence of crowds.
  • Loyalty and credit-card transactions to understand consumer behaviour.
  • Participatory mapping of hitherto unknown urban spaces, such as OpenStreetMap.

These data can often be refined to the level of a single person. As a result, models of urban phenomena no longer need to rely on assumptions about the population as a whole – they can be tailored to capture the diversity of a city full of individuals, who often think and behave differently from one another….(More)

The Human Face of Big Data


A film by Sandy Smolan [56 minutes]: “Big Data is defined as the real-time collection, analysis, and visualization of vast amounts of information. In the hands of data scientists, this raw information is fueling a revolution which many people believe may have as big an impact on humanity going forward as the Internet has had over the past two decades. It is enabling us to sense, measure, and understand aspects of our existence in ways never before possible.

The Human Face of Big Data captures an extraordinary revolution sweeping, almost invisibly, through business, academia, government, healthcare, and everyday life. It’s already enabling us to provide a healthier life for our children. To provide our seniors with independence while keeping them safe. To help us conserve precious resources like water and energy. To alert us to tiny changes in our health, weeks or years before we develop a life-threatening illness. To peer into our own individual genetic makeup. To create new forms of life. And soon, as many predict, to re-engineer our own species. And we’ve barely scratched the surface…

This massive gathering and analyzing of data in real time is allowing us to address some of humanity’s biggest challenges. Yet, as Edward Snowden and the release of the NSA documents have shown, the accessibility of all this data can come at a steep price….(More)”

Strengthening the Connective Links in Government


John M. Kamensky at the IBM Center for The Business of Government: “Over the past five years, the Obama administration has pursued a host of innovation-fostering initiatives that work to strengthen the connective links among and within federal agencies.

Many factors contribute to the rise of such efforts, including presidential support, statutory encouragement, and an ongoing evolution in the way government does its business. The challenge now is how to solidify the best of them so they remain in place beyond the upcoming 2017 presidential transition.

Increased Use of Collaborative Governance

Dr. Rosemary O’Leary, an astute observer of trends in government, describes how government has steadily increased its use of collaborative approaches in lieu of the traditional hierarchical, bureaucratic approach. According to O’Leary, there are several explanations for this shift:

  • First, “most public challenges are larger than one organization, requiring new approaches to addressing public issues” such as housing, pollution, transportation, and healthcare.
  • Second, collaboration helps to improve the effectiveness and performance of programs “by encouraging new ways of providing services.”
  • Third, technology advances in recent years have helped “organizations and their employees to share information in a way that is integrative and interoperable.”
  • Finally, “citizens are seeking additional avenues for engaging in governance, resulting in new and different forms of collaborative problem solving and decision making.”

Early in his administration, President Barack Obama publicly placed a premium on the use of collaboration. One of his first directives to federal agencies set the tone for how he envisioned his administration would govern, directing agencies to be “collaborative” and “use innovative tools, methods, and systems to cooperate among themselves, across levels of government, and with nonprofits, businesses and individuals.” To that end, the Obama administration undertook a series of supporting actions, including establishing cross-agency priority goals around issues such as reducing veteran homelessness, data sharing, and streamlining the sharing of social media licenses between agencies. Tackling many of these issues successfully involved the transformative intersection of innovation and technology.

In 2010, when Congress passed a series of amendments to the Government Performance and Results Act (GPRA), it provided the statutory basis for a broader, more consistent use of collaboration as a way of implementing policies and programs. These changes put in place a series of administrative processes:

  • The designation of agency and cross-agency priority goals
  • The naming of goal leaders
  • The convening of a set of regular progress reviews

Taken together, these legislative changes embedded the value of collaboration into the administrative fabric of the governing bureaucracy. In addition, the evolution of technology tools and the advances in the use of social media have dramatically lowered the technical and bureaucratic barriers to working in a more collaborative environment….(More)”

The Federal Advisory Committee Act: Analysis of Operations and Costs


Wendy Ginsberg at CRS: “Federal advisory committees are established to allow experts from outside the federal government to provide advice and recommendations to executive branch agencies or the President. Federal advisory committees can be created either by Congress, the President, or an executive branch agency. The Federal Advisory Committee Act (FACA) requires agencies to report on the structure, operations, and costs of qualifying federal advisory committees. The General Services Administration (GSA) is authorized to collect, retain, and verify the reported information, and does so using an online tool called the FACA Database.

This report provides an overview of the data that populates the FACA Database, which details the costs and operations of all active federal advisory committees. This report examines the data from FY2004-FY2014, with additional in-depth analysis of FY2014. Generally, the data show that the number of active FACA committees has remained relatively stable over time, hovering around 1,000 committees in any given fiscal year. The Department of Health and Human Services consistently operates the most federal advisory committees, with 264 active committees in FY2014. The Department of Agriculture had the second most active committees in FY2014 with 166. In any given year, around half of the active FACA committees were required to be established by statute. In FY2014, Congress established 10 new FACA committees by statute.

Generally, around 70,000 people serve as members on FACA committees and subcommittees in any given year. In FY2014, 68,179 members served. In FY2014, 825 federal advisory committees held 7,173 meetings and cost more than $334 million to operate. The report provides an in-depth examination of FACA committee operations, using the data collected by GSA. The report concludes by providing a list of policy options that Congress can consider when deliberating current or future legislation to amend FACA….(More)”
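
The FY2014 figures quoted above support some quick per-committee averages; the short calculation below uses only the numbers reported in the summary and is meant purely as a back-of-the-envelope illustration:

```python
# Back-of-the-envelope averages from the FY2014 figures quoted above
# (825 active committees, 7,173 meetings, "more than $334 million" in costs,
# 68,179 members serving on committees and subcommittees).
committees = 825
meetings = 7_173
cost_usd = 334_000_000  # treated here as a lower bound
members = 68_179

print(f"Meetings per committee: {meetings / committees:.1f}")    # ~8.7
print(f"Cost per committee:     ${cost_usd / committees:,.0f}")  # ~$404,848
# Members served on committees *and* subcommittees, so this overstates
# average membership of a single committee.
print(f"Members per committee:  {members / committees:.1f}")     # ~82.6
```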

Interdisciplinary Perspectives on Trust


Book edited by Shockley, E., Neal, T.M.S., PytlikZillig, L.M., and Bornstein, B.H.:  “This timely collection explores trust research from many angles while ably demonstrating the potential of cross-discipline collaboration to deepen our understanding of institutional trust. Citing, among other things, current breakdowns of trust in prominent institutions, the book presents a multilevel model identifying universal aspects of trust as well as domain- and context-specific variations deserving further study. Contributors analyze similarities and differences in trust across public domains from politics and policing to medicine and science, and across languages and nations. Innovative strategies for measuring and assessing trust also shed new light on this essentially human behavior.

Highlights of the coverage:

  • Consensus on conceptualizations and definitions of trust: are we there yet?
  • Differentiating between trust and legitimacy in public attitudes towards legal authority.
  • Examining the relationship between interpersonal and institutional trust in political and health care contexts.
  • Trust as a multilevel phenomenon across contexts.
  • Institutional trust across cultures.
  • The “dark side” of institutional trust….(More)”

Cleaning Up Lead Poisoning One Tweet at a Time


World Policy Blog: “At first, no one knew why the children of Bagega in Zamfara state were dying. In the spring of 2010, hundreds of kids in and around the northern Nigerian village were falling ill, having seizures and going blind, many of them never to recover. A Médecins Sans Frontières team soon discovered the causes: gold and lead.

With the global recession causing the price of precious metals to soar, impoverished villagers had turned to mining the area’s gold deposits. But the gold veins were mingled with lead, and as a result the villagers’ low-tech mining methods were sending clouds of lead-laced dust into the air. The miners, unknowingly carrying the powerful toxin on their clothes and skin, brought it into their homes where their children breathed it in.

The result was perhaps the worst outbreak of lead poisoning in history, killing over 400 children in Bagega and neighboring villages. In response, the Nigerian government pledged to clean up the lead-contaminated topsoil and provide medical care to the stricken children. But by mid-2012, there was no sign of the promised funds. Digitally savvy activists with the organization Connected Development (CODE) stepped in to make sure that the money was disbursed.

A group of young Nigerians founded CODE in 2010 in the capital Abuja, with the mission of empowering local communities to hold the government to account by improving their access to information and helping their voices to be heard. “In 2010, we were working to connect communities with data for advocacy programs,” says CODE co-founder Oludotun Babayemi, a former country director of a World Wildlife Fund project in Nigeria. “When we heard about Bagega, we thought this was an opportunity for us.”

In 2012, CODE launched a campaign dubbed ‘Follow the Money Nigeria’ aimed at applying pressure on the government to release the promised funds. “Eighty percent of the less developed parts of Nigeria have zero access to Twitter, let alone Facebook, so it’s difficult for them to convey their stories,” says Babayemi. “We collect all the videos and testimonies and take it global.”

CODE members travelled to the lead-afflicted area to gather information. They then posted their findings online, and publicized them with a #SaveBagega hashtag, which they tweeted to members of the government, local and international organizations, and the general public. CODE hosted a 48-hour ‘tweet-a-thon’, joined by a senator, to support the campaign….

By July 2014, CODE reported that the clean-up was complete and that over 1,000 children had been screened and enrolled in lead treatment programs. Bagega’s health center has also been refurbished and the village’s roads improved. “There are thousands of communities like Bagega,” says Babayemi. “They just need someone to amplify their voice.”….

Key lessons

  • Revealing information is not enough; change requires a real-world campaign driven by that information and civil society champions who can leverage their status and networks to draw international attention to the issues and maintain pressure.
  • Building relationships with sympathetic members of government is key.
  • Targeted online campaigns can help amplify the message of marginalized communities offline to achieve impact (More)”

Advancing Open and Citizen-Centered Government


The White House: “Today, the United States released our third Open Government National Action Plan, announcing more than 40 new or expanded initiatives to advance the President’s commitment to an open and citizen-centered government….

In the third Open Government National Action Plan, the Administration both broadens and deepens efforts to help government become more open and more citizen-centered. The plan includes new and impactful steps the Administration is taking to openly and collaboratively deliver government services and to support open government efforts across the country. These efforts prioritize a citizen-centric approach to government, including improved access to publicly available data to provide everyday Americans with the knowledge and tools necessary to make informed decisions.

One example is the College Scorecard, which shares data through application programming interfaces (APIs) to help students and families make informed choices about education. Open APIs help create an ecosystem around government data in which civil society can provide useful visual tools that make this data more accessible, and commercial developers can extract even more value to further empower students and their families. In addition to these newer approaches, the plan also highlights significant longstanding open government priorities such as access to information, fiscal transparency, and records management, and continues to push for greater progress in that work.
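
The College Scorecard data is served through a public REST API hosted on api.data.gov. The sketch below shows roughly what a query looks like; the endpoint, parameter, and field names reflect the public documentation at the time and should be treated as illustrative — verify them against the current docs before relying on them:

```python
# Hedged sketch of querying the College Scorecard API (endpoint, parameters,
# and field names per the public documentation at the time; verify before use).
import requests  # third-party: pip install requests

API_KEY = "YOUR_API_KEY"  # free key issued via api.data.gov signup
URL = "https://api.data.gov/ed/collegescorecard/v1/schools"

params = {
    "api_key": API_KEY,
    "school.name": "University of Michigan",  # filter on school name
    "fields": "id,school.name",               # limit the response to a few fields
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for school in resp.json().get("results", []):
    print(school)
```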

The plan also focuses on supporting implementation of the landmark 2030 Agenda for Sustainable Development, which sets out a vision and priorities for global development over the next 15 years and was adopted last month by 193 world leaders including President Obama. The plan includes commitments to harness open government and progress toward the Sustainable Development Goals (SDGs) both in the United States and globally, including in the areas of education, health, food security, climate resilience, science and innovation, and justice and law enforcement. It also includes a commitment to take stock of existing U.S. government data that relates to the 17 SDGs, and to create and use data to support progress toward the SDGs.

Some examples of open government efforts newly included in the plan:

  • Promoting employment by unlocking workforce data, including training, skill, job, and wage listings.
  • Enhancing transparency and participation by expanding available Federal services to the Open311 platform currently available to cities, giving the public a seamless way to report problems and request assistance.
  • Releasing public information from the electronically filed tax forms of nonprofit and charitable organizations (990 forms) as open, machine-readable data.
  • Expanding access to justice through the White House Legal Aid Interagency Roundtable.
  • Promoting open and accountable implementation of the Sustainable Development Goals….(More)”

Statactivism: Forms of Action between Disclosure and Affirmation


Paper by Isabelle Bruno, Emmanuel Didier and Tommaso Vitale: “This article introduces the special issue on statactivism, a particular form of action within the repertoire used by contemporary social movements: the mobilization of statistics. Traditionally, statistics have been used by the workers’ movement within class conflicts. But in the current configuration of state restructuring, new accumulation regimes, and changes in work organization in capitalist societies, the activist use of statistics is shifting. This first article seeks to show the use of statistics and quantification in contentious performances connected with state restructuring, the main transformations of the varieties of capitalism, and changes in work organization regimes. The double role of statistics in representing as well as criticizing reality is considered. After showing how important statistical tools are in producing a shared reading of reality, we discuss the two main dimensions of statactivism – disclosure and affirmation. In other words, we consider the role of stat-activists in denouncing a certain state of reality, and then the efforts to use statistics to create equivalence among disparate conditions and to cement emerging social categories. Finally, we present the main contributions of the various research papers in this special issue regarding the use of statistics as a form of action within a larger repertoire of contentious action. Six empirical papers focus on statactivism against the penal machinery in the early 1970s (Grégory Salle), on the mobilization around the price index in Guadeloupe in 2009 (Boris Samuel) and in Argentina in 2007 (Celia Lury and Ana Gross), on the mobilization of experts to consolidate a link between working conditions and health issues (Marion Gilles), on the production of activity data for disability policy in France (Pierre-Yves Baudot), and on the use of statistics in social mobilizations for gender equality (Eugenia De Rosa). Alain Desrosières wrote the last paper, addressing mobilizations that propose innovations in the way of measuring inflation, unemployment, poverty, GDP, and climate change. This special issue is dedicated to him, in order to honor his everlasting intellectual legacy….(More)”