Citizens’ voices for better health and social policies


Olivia Biermann et al at PLOS Blog Speaking of Medicine: “Citizen engagement is important to make health and social policies more inclusive and equitable, and to contribute to learning and responsive health and social systems. It is also valuable in understanding the complexities of the social structure and how to adequately respond to them with policies. By engaging citizens, we ensure that their tacit knowledge feeds into the policy-making process. What citizens know can be valuable in identifying feasible policy options, understanding contextual factors, and putting policies into practice. In addition, the benefit of citizen engagement extends much beyond improving health policy-making processes by making them more participatory and inclusive; being engaged in policy-making processes can build patients’ capacity and empower them to speak up for their own and their families’ health and social needs, and to hold policy-makers accountable. Moreover, apart from being involved in their own care, citizen-patients can contribute to quality improvement, research and education.

Most studies on citizen engagement to date originate from high-income countries. The engagement methods used are not necessarily applicable in low- and middle-income countries, and even the political support, the culture of engagement and established citizen engagement processes might be different. Still, published processes of engaging citizens can be helpful in identifying key components across different settings, e.g. in terms of levels of engagement, communication channels and methods of recruitment. Contextualizing the modes of engagement between and within countries is a must.

Examples of citizen engagement

There are many examples of ad hoc citizen engagement initiatives at local, national and international levels. Participedia, a repository of public participation initiatives around the globe, showcases that the field of citizen engagement is extremely vibrant. In the United Kingdom, the Citizens’ Council of the National Institute for Health and Clinical Excellence (NICE) provides NICE with a public perspective on overarching moral and ethical issues that NICE has to take into account when producing guidance. In the United States of America, the National Issues Forum supports the implementation of deliberative forums on pressing national policy issues. Yet few such initiatives are long-standing programs that engage citizens in evidence-informed policymaking.

A pioneer in engaging citizens in health policy-making processes is the McMaster Health Forum in Hamilton, Canada. The citizens who are invited to engage in a “citizen panel” first receive a pre-circulated, plain-language briefing document to spark deliberation about a pressing health and social-system issue. During the panels, citizens then discuss the problem and its causes, options to address it and implementation considerations. The values that they believe should underpin action to address the issue are captured in a panel summary which is used to inform a policy dialogue on the same topic, also organized by the McMaster Health Forum….(More)”.

‘Digital colonialism’: why some countries want to take control of their people’s data from Big Tech


Jacqueline Hicks at the Conversation: “There is a global standoff going on about who stores your data. At the close of June’s G20 summit in Japan, a number of developing countries refused to sign an international declaration on data flows – the so-called Osaka Track. Part of the reason why countries such as India, Indonesia and South Africa boycotted the declaration was because they had no opportunity to put their own interests about data into the document.

With 50 other signatories, the declaration still stands as a statement of future intent to negotiate further, but the boycott represents an ongoing struggle by some countries to assert their claim over the data generated by their own citizens.

Back in the dark ages of 2016, data was touted as the new oil. Although the metaphor was quickly debunked, it’s still a helpful way to understand the global digital economy. Now, as international negotiations over data flows intensify, the oil comparison helps explain the economics of what’s called “data localisation” – the bid to keep citizens’ data within their own country.

Just as oil-producing nations pushed for oil refineries to add value to crude oil, so governments today want the world’s Big Tech companies to build data centres on their own soil. The cloud that powers much of the world’s tech industry is grounded in vast data centres located mainly around northern Europe and the US coasts. Yet, at the same time, US Big Tech companies are increasingly turning to markets in the global south for expansion as enormous numbers of young, tech-savvy people come online….(More)”.

How cities can leverage citizen data while protecting privacy


MIT News: “India is on a path with dual — and potentially conflicting — goals related to the use of citizen data.

To improve the efficiency of their municipal services, many Indian cities have started enabling government-service requests, which involves collecting and sharing citizen data with government officials and, potentially, the public. But there’s also a national push to protect citizen privacy, potentially restricting data usage. Cities are now beginning to question how much citizen data, if any, they can use to track government operations.

In a new study, MIT researchers find that there is, in fact, a way for Indian cities to preserve citizen privacy while using their data to improve efficiency.

The researchers obtained and analyzed data from more than 380,000 government service requests by citizens across 112 cities in one Indian state for an entire year. They used the dataset to measure each city government’s efficiency based on how quickly it completed each service request. Based on field research in three of these cities, they also identified the citizen data that’s necessary, useful (but not critical), or unnecessary for improving efficiency when delivering the requested service.
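The efficiency measure described above can be sketched in a few lines. This is a minimal illustration, assuming hypothetical city names and a simple (city, days-to-complete) record format, not the actual fields of the MIT dataset:

```python
from statistics import median

# Hypothetical service-request records: (city, days to complete).
# Field names and values are invented for illustration.
requests = [
    ("Alpha", 2), ("Alpha", 4), ("Alpha", 3),
    ("Beta", 10), ("Beta", 14), ("Beta", 12),
]

def efficiency_by_city(records):
    """Median completion time (days) per city; lower means more efficient."""
    by_city = {}
    for city, days in records:
        by_city.setdefault(city, []).append(days)
    return {city: median(times) for city, times in by_city.items()}

print(efficiency_by_city(requests))  # e.g. {'Alpha': 3, 'Beta': 12}
```

A real analysis would also flag, for each request type, which citizen-data fields were necessary, useful, or unnecessary, so that cities can drop the unnecessary ones without losing the efficiency signal.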

In doing so, they identified “model” cities that performed very well in both categories, meaning they maximized privacy and efficiency. Cities worldwide could use similar methodologies to evaluate their own government services, the researchers say. …(More)”.

Community Data Dialogues


Sunlight Foundation: “Community Data Dialogues are in-person events designed to share open data with community members in the most digestible way possible to start a conversation about a specific issue. The main goal of the event is to give residents who may not have technical expertise but have local experience a chance to participate in data-informed decision-making. Doing this work in-person can open doors and let facilitators ask a broader range of questions. To achieve this, the event must be designed to be inclusive of people without a background in data analysis and/or using statistics to understand local issues. Carrying out this event will let decision-makers in government use open data to talk with residents, whose stories of lived experience relevant to local issues add to the data’s value.

These events can take several forms, and groups both in and outside of government have designed creative and innovative events tailored to engage community members who are actively interested in helping solve local issues but are unfamiliar with using open data. This guide will help clarify how exactly to make Community Data Dialogues non-technical, interactive events that are inclusive to all participants….

A number of groups both in and outside of government have facilitated accessible open data events to great success. Here are just a few examples from the field of what data-focused events tailored for a nontechnical audience can look like:

Data Days Cleveland

Data Days Cleveland is an annual one-day event designed to make data accessible to all. Programs are designed with inclusivity and learning in mind, making it a more welcoming space for people new to data work. Data experts and practitioners direct novices on the fundamentals of using data: making maps, reading spreadsheets, creating data visualizations, etc….

The Urban Institute’s Data Walks

The Urban Institute’s Data Walks are an innovative example of presenting data in an interactive and accessible way to communities. Data Walks are events gathering community residents, policymakers, and others to jointly review and analyze data presentations on specific programs or issues and collaborate to offer feedback based on their individual experiences and expertise. This feedback can be used to improve current projects and inform future policies….(More)”.

How Nontraditional Innovation is Rejuvenating Public Housing


Blog by Jamal Gauthier: “The crisis of affordable public housing can be felt across America on a large scale. Many poor and impoverished families that reside in public housing projects are consistently unable to pay rent for their dwellings while dealing with a host of other social complications that make living in public housing even more difficult. Creating affordable public housing involves the use of innovative processes that reduce construction cost and maximize livable square footage so that rents can remain affordable. Through the rising popularity of nontraditional approaches to innovation, many organizations tasked with addressing these difficult housing challenges are adopting such methods to uncover previously unthought-of solutions.

The concept of crowdsourcing especially is paving the way for federal agencies (such as HUD), nonprofits, and private housing companies alike to gain new perspectives and approaches to complex public housing topics from unlikely and/or underutilized sources. Crowdsourcing proponents and stakeholders hope to add fresh ideas and new insights to the shared pool of public knowledge, augmenting innovation and productivity in the current public housing landscape.

The federal government could particularly benefit from these nontraditional forms of innovation by implementing these practices into standard government processes. The struggling affordable public housing system in America, for example, points to a glaring flaw in standard government process that makes it virtually impossible for government to put the best ideas into real-world practice….(More)”.

Traffic Data Is Good for More than Just Streets, Sidewalks


Skip Descant at Government Technology: “The availability of highly detailed daily traffic data is clearly an invaluable resource for traffic planners, but it can also help officials overseeing natural lands or public works understand how to better manage those facilities.

The Natural Communities Coalition, a conservation nonprofit in southern California, began working with the traffic analysis firm StreetLight Data in early 2018 to study the impacts from the thousands of annual visitors to 22 parks and natural lands. StreetLight Data’s use of de-identified cellphone data held promise for the project, which will continue into early 2020.

“You start to see these increases,” Milan Mitrovich, science director for the Natural Communities Coalition, said of the uptick in visitor activity the data showed. “So being able to have this information, and share it with our executive committee… these folks, they’re seeing it for the first time.”…

Officials with the Natural Communities Coalition were able to use the StreetLight data to gain insights into patterns of use not only per day, but at different times of the day. The data also told researchers where visitors were traveling from, a detail park officials found “jaw-dropping.”

“What we were able to see is, these resources, these natural areas, cast an incredible net across southern California,” said Mitrovich, noting visitors come from not only Orange County, but Los Angeles, San Bernardino and San Diego counties as well, a region of more than 20 million residents.

The data also allows officials to predict traffic levels during certain parts of the week, times of day or even holidays….(More)”.

Real-time flu tracking. By monitoring social media, scientists can monitor outbreaks as they happen.


Charles Schmidt at Nature: “Conventional influenza surveillance describes outbreaks of flu that have already happened. It is based on reports from doctors, and produces data that take weeks to process — often leaving the health authorities to chase the virus around, rather than get on top of it.

But every day, thousands of unwell people pour details of their symptoms and, perhaps unknowingly, locations into search engines and social media, creating a trove of real-time flu data. If such data could be used to monitor flu outbreaks as they happen and to make accurate predictions about its spread, that could transform public-health surveillance.

Powerful computational tools such as machine learning and a growing diversity of data streams — not just search queries and social media, but also cloud-based electronic health records and human mobility patterns inferred from census information — are making it increasingly possible to monitor the spread of flu through the population by following its digital signal. Now, models that track flu in real time and forecast flu trends are making inroads into public-health practice.
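As a toy illustration of the kind of model involved, the sketch below fits an ordinary least-squares line mapping weekly search-query volume to reported influenza-like-illness (ILI) rates, then uses it to nowcast the current week. All numbers are invented, and real systems such as those described here combine many more data streams and far richer models:

```python
# Toy flu "nowcast": fit a least-squares line from weekly search-query
# volume to reported influenza-like-illness (ILI) rates, then estimate
# the current week's rate from the latest query volume.
# All figures are illustrative, not real surveillance data.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx  # (slope, intercept)

query_volume = [120, 150, 200, 260, 310]  # past weeks, arbitrary units
ili_rate = [1.1, 1.4, 1.9, 2.5, 3.0]      # % of doctor visits, same weeks

slope, intercept = fit_line(query_volume, ili_rate)
nowcast = slope * 350 + intercept  # this week's query volume: 350
print(f"estimated ILI rate this week: {nowcast:.2f}%")
```

The appeal is latency: the query signal is available immediately, while the clinical ILI reports it is calibrated against arrive weeks later.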

“We’re becoming much more comfortable with how these models perform,” says Matthew Biggerstaff, an epidemiologist who works on flu preparedness at the US Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia.

In 2013–14, the CDC launched the FluSight Network, a website informed by digital modelling that predicts the timing, peak and short-term intensity of the flu season in ten regions of the United States and across the whole country. According to Biggerstaff, flu forecasting helps responders to plan ahead, so they can be ready with vaccinations and communication strategies to limit the effects of the virus. Encouraged by progress in the field, the CDC announced in January 2019 that it will spend US$17.5 million to create a network of influenza-forecasting centres of excellence, each tasked with improving the accuracy and communication of real-time forecasts.

The CDC is leading the way on digital flu surveillance, but health agencies elsewhere are following suit. “We’ve been working to develop and apply these models with collaborators using a range of data sources,” says Richard Pebody, a consultant epidemiologist at Public Health England in London. The capacity to predict flu trajectories two to three weeks in advance, Pebody says, “will be very valuable for health-service planning.”…(More)”.

The Internet Relies on People Working for Free


Owen Williams at OneZero: “When you buy a product like Philips Hue’s smart lights or an iPhone, you probably assume the people who wrote their code are being paid. While that’s true for those who directly author a product’s software, virtually every tech company also relies on thousands of bits of free code, made available through “open-source” projects on sites like GitHub and GitLab.

Often these developers are happy to work for free. Writing open-source software allows them to sharpen their skills, gain perspectives from the community, or simply help the industry by making innovations available at no cost. According to Google, which maintains hundreds of open-source projects, open source “enables and encourages collaboration and the development of technology, solving real-world problems.”

But when software used by millions of people is maintained by a community of people, or a single person, all on a volunteer basis, sometimes things can go horribly wrong. The catastrophic Heartbleed bug of 2014, which compromised the security of hundreds of millions of sites, was caused by a problem in an open-source library called OpenSSL, which relied on a single full-time developer not making a mistake as they updated and changed that widely used code. Other times, developers grow bored and abandon their projects, which can be breached while they aren’t paying attention.

It’s hard to demand that programmers who are working for free troubleshoot problems or continue to maintain software that they’ve lost interest in for whatever reason — though some companies certainly try. Not adequately maintaining these projects, on the other hand, makes the entire tech ecosystem weaker. So some open-source programmers are asking companies to pay, not for their code, but for their support services….(More)”.

The promise and peril of a digital ecosystem for the planet


Blog post by Jillian Campbell and David E Jensen: “A range of frontier and digital technologies have dramatically boosted the ways in which we can monitor the health of our planet and sustain our future on it (Figure 1).

Figure 1. A range of frontier and digital technologies can be combined to monitor our planet and the sustainable use of natural resources (1)

If we can leverage this technology effectively, we will be able to assess and predict risks, increase transparency and accountability in the management of natural resources and inform markets as well as consumer choice. These actions are all required if we are to stand a better chance of achieving the Sustainable Development Goals (SDGs).

However, for this vision to become a reality, public and private sector actors must take deliberate action and collaborate to build a global digital ecosystem for the planet — one consisting of data, infrastructure, rapid analytics, and real-time insights. We are now at a pivotal moment in the history of our stewardship of this planet. A “tipping point” of sorts. And in order to guide the political action which is required to counter the speed, scope and severity of the environmental and climate crises, we must acquire and deploy these data sets and frontier technologies. Doing so can fundamentally change our economic trajectory and underpin a sustainable future.

This article shows how such a global digital ecosystem for the planet can be achieved — as well as what we risk if we do not take decisive action within the next 12 months….(More)”.

How big data can affect your bank account – and life


Alena Buyx, Barbara Prainsack and Aisling McMahon at The Conversation: “Mustafa loves good coffee. In his free time, he often browses high-end coffee machines that he cannot currently afford but is saving for. One day, travelling to a friend’s wedding abroad, he gets to sit next to another friend on the plane. When Mustafa complains about how much he paid for his ticket, it turns out that his friend paid less than half of what he paid, even though they booked around the same time.

He looks into possible reasons for this and concludes that it must be related to his browsing of expensive coffee machines and equipment. He is very angry about this and complains to the airline, who send him a lukewarm apology that refers to personalised pricing models. Mustafa feels that this is unfair but does not challenge it. Pursuing it any further would cost him time and money.

This story – which is hypothetical, but can and does occur – demonstrates the potential for people to be harmed by data use in the current “big data” era. Big data analytics involves using large amounts of data from many sources which are linked and analysed to find patterns that help to predict human behaviour. Such analysis, even when perfectly legal, can harm people.

Mustafa, for example, has likely been affected by personalised pricing practices whereby his search for high-end coffee machines has been used to make certain assumptions about his willingness to pay or buying power. This in turn may have led to his higher priced airfare. While this has not resulted in serious harm in Mustafa’s case, instances of serious emotional and financial harm are, unfortunately, not rare, including the denial of mortgages for individuals and risks to a person’s general creditworthiness based on associations with other individuals. This might happen if an individual shares some similar characteristics to other individuals who have poor repayment histories….(More)”.