Five ways to ensure that models serve society: a manifesto


Andrea Saltelli et al. in Nature: “The COVID-19 pandemic illustrates perfectly how the operation of science changes when questions of urgency, stakes, values and uncertainty collide — in the ‘post-normal’ regime.

Well before the coronavirus pandemic, statisticians were debating how to prevent malpractice such as p-hacking, particularly when it could influence policy1. Now, computer modelling is in the limelight, with politicians presenting their policies as dictated by ‘science’2. Yet there is no substantial aspect of this pandemic for which any researcher can currently provide precise, reliable numbers. Known unknowns include the prevalence, fatality rate and reproduction rate of the virus in populations. There are few estimates of the number of asymptomatic infections, and they are highly variable. We know even less about the seasonality of infections and how immunity works, not to mention the impact of social-distancing interventions in diverse, complex societies.

Mathematical models produce highly uncertain numbers that predict future infections, hospitalizations and deaths under various scenarios. Rather than using models to inform their understanding, political rivals often brandish them to support predetermined agendas. To make sure predictions do not become adjuncts to a political cause, modellers, decision makers and citizens need to establish new social norms. Modellers must not be permitted to project more certainty than their models deserve; and politicians must not be allowed to offload accountability to models of their choosing2,3.
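The point about uncertain numbers can be made concrete with a deliberately minimal sketch (our illustration, not the manifesto authors'): a textbook discrete-time SIR model run under three candidate reproduction numbers. Every value below (population size, recovery time, the R0 candidates) is a hypothetical assumption, chosen only to show how a single uncertain input drives very different projections.

```python
# Illustrative only: a textbook discrete-time SIR model, not any of the
# models discussed in the manifesto. All parameter values are hypothetical.

def sir_peak_infected(r0, recovery_days=10.0, population=1_000_000, days=365):
    """Return the peak number of simultaneous infections for a given R0."""
    gamma = 1.0 / recovery_days   # daily recovery rate
    beta = r0 * gamma             # daily transmission rate implied by R0
    s, i = population - 1.0, 1.0  # susceptible and infected counts
    peak = i
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        peak = max(peak, i)
    return peak

# Three plausible-sounding reproduction numbers yield very different peaks.
for r0 in (1.5, 2.5, 3.5):
    print(f"R0 = {r0}: peak infections ~ {sir_peak_infected(r0):,.0f}")
```

Even under these toy assumptions, the projected peak differs severalfold across the three reproduction numbers, which is exactly why point predictions presented without their uncertainty invite political misuse.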

This is important because, when used appropriately, models serve society extremely well: perhaps the best known are those used in weather forecasting. These models have been honed by testing millions of forecasts against reality. So, too, have ways to communicate results to diverse users, from the Digital Marine Weather Dissemination System for ocean-going vessels to the hourly forecasts accumulated by weather.com. Picnickers, airline executives and fishers alike understand both that the modelling outputs are fundamentally uncertain, and how to factor the predictions into decisions.

Here we present a manifesto for best practices for responsible mathematical modelling. Many groups before us have described the best ways to apply modelling insights to policies, including for diseases4 (see also Supplementary information). We distil five simple principles to help society demand the quality it needs from modelling….(More)”.

UN Data Strategy


United Nations: “As structural UN reforms consolidate, we are focused on building the data, digital, technology and innovation capabilities that the UN needs to succeed in the 21st century. The Secretary-General’s “Data Strategy for Action by Everyone, Everywhere” is our agenda for the data-driven transformation.

Data permeates all aspects of our work, and its power—harnessed responsibly—is critical to the global agendas we serve. The UN family’s footprint, expertise and connectedness create unique opportunities to advance global “data action” with insight, impact and integrity. To help unlock more potential, 50 UN entities jointly designed this Strategy as a comprehensive playbook for data-driven change based on global best practice…

Our strategy pursues a simple idea: we focus not on process, but on learning, iteratively, to deliver data use cases that add value for stakeholders based on our vision, outcomes and principles. Use cases – purposes for which data is used – already permeate our organization. We will systematically identify and deliver them through dedicated data action portfolios. While new capabilities will in part emerge through “learning by doing”, we will also strengthen organizational enablers to deliver on our vision, including shifts in people and culture, partnerships, data governance and technology….(More)”.


Opportunities of Artificial Intelligence


Report for the European Parliament: “A vast range of AI applications are being implemented by European industry, which can be broadly grouped into two categories: i) applications that enhance the performance and efficiency of processes through mechanisms such as intelligent monitoring, optimisation and control; and ii) applications that enhance human-machine collaboration.

At present, such applications are being implemented across a broad range of European industrial sectors. However, some sectors (e.g. automotive, telecommunications, healthcare) are more advanced in AI deployment than others (e.g. paper and pulp, pumps, chemicals). The types of AI applications implemented also differ across industries. In less digitally mature sectors, clear barriers to adoption have been identified, including both internal (e.g. cultural resistance, lack of skills, financial considerations) and external (e.g. lack of venture capital) barriers. For the most part, and especially for SMEs, barriers to the adoption of AI are similar to those hindering digitalisation.

The adoption of such AI applications is anticipated to deliver a wide range of positive impacts for individual firms, across value chains, and at the societal and macroeconomic levels. AI applications can bring efficiency, environmental and economic benefits related to increased production output and quality, reduced maintenance costs, improved energy efficiency, better use of raw materials and reduced waste. In addition, AI applications can add value through product personalisation, improve customer service and contribute to the development of new product classes, business models and even sectors. Workforce benefits (e.g. improved workplace safety) are also being delivered by AI applications.

Alongside these firm-level benefits and opportunities, significant positive societal and economy-wide impacts are envisaged. More specifically, substantial increases in productivity, innovation, growth and job creation have been forecast. For example, one estimate anticipates labour productivity increases of 11-37% by 2035. In addition, AI is expected to contribute positively to the UN Sustainable Development Goals, and the capabilities of AI and machine learning to address major health challenges, such as the COVID-19 pandemic, are also noteworthy. For instance, AI systems have the potential to shorten the lead times for the development of vaccines and drugs.

However, AI adoption brings a range of challenges…(More)”.

Digital Technology and the Resurrection of Trust


Report by the Select Committee on Democracy and Digital Technologies (UK Parliament): “Democracy faces a daunting new challenge. The age when electoral activity was conducted through traditional print media, canvassing and door-knocking is rapidly vanishing. Instead, electoral activity is dominated by digital and social media, which are now the source from which voters get most of their information and political messaging.

The digital and social media landscape is dominated by two behemoths: Facebook and Google. They largely pass under the radar, operating outside the rules that govern electoral politics. This has become acutely obvious in the current COVID-19 pandemic, where online misinformation poses a real and present danger not only to our democracy but also to our lives. Governments have been dilatory in adjusting regulatory regimes to capture these new realities. The result is a crisis of trust.

Yet our profound belief is that this can change. Technology is not a force of nature. Online platforms are not inherently ungovernable. They can and should be bound by the same restraints that we apply to the rest of society. If this is done well, in the ways we spell out in this Report, technology can become a servant of democracy rather than its enemy. There is a need for Government leadership and regulatory capacity to match the scale and pace of challenges and opportunities that the online world presents.

The Government’s Online Harms programme presents a significant first step towards this goal. It needs to happen; it needs to happen fast; and the necessary draft legislation must be laid before Parliament for scrutiny without delay. The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation.

Well drafted Online Harms legislation can do much to protect our democracy. Issues such as misinformation and disinformation must be included in the Bill. The Government must make sure that online platforms bear ultimate responsibility for the content that their algorithms promote. Where harmful content spreads virally on their service or where it is posted by users with a large audience, they should face sanctions over their output as other broadcasters do.

Individual users need greater protection. They must have redress against large platforms through an ombudsman tasked with safeguarding the rights of citizens.

Transparency of online platforms is essential if democracy is to flourish. Platforms like Facebook and Google seek to hide behind ‘black box’ algorithms which choose what content users are shown. They take the position that they are not responsible for harms that may result from online activity. This is plain wrong. The decisions platforms make in designing and training these algorithmic systems shape the conversations that happen online. For this reason, we recommend that platforms be mandated to conduct audits to show how, in creating these algorithms, they have ensured, for example, that they are not discriminating against certain groups. Regulators must have the powers to oversee these decisions, with the right to acquire from platforms the information they need to exercise those powers….(More)”.

The Data Assembly


Press Release: “The Governance Lab (The GovLab), an action research center at New York University Tandon School of Engineering, with the support of the Henry Luce Foundation, announced the creation of The Data Assembly. Beginning in New York City, the effort will explore how communities perceive the risks and benefits of data re-use for COVID-19. Understanding that policymakers often lack information about the concerns of different stakeholders, The Data Assembly’s deliberations will inform the creation of a responsible data re-use framework to guide the use of data and technology at the city and state level to fight COVID-19’s many consequences.

The Data Assembly will hold deliberations with civil rights organizations, key data holders and policymakers, and the public at large. Consultations with these stakeholders will take place through a series of remote engagements, including surveys and an online town hall meeting. This work will allow the project to consider the perspectives of people from different strata of society and how they might exercise some control over the flow of data.

After the completion of these data re-use deliberations, The Data Assembly will create a path forward for using data responsibly to solve public challenges. The first phases of the project will commence in New York City, seeking to engage with city residents and their leaders on data governance issues. 

“Data is increasingly the primary format for sharing information to understand crises and plan recovery efforts; empowering everyone to better understand how data is collected and how it should be used is paramount,” said Adrienne Schmoeker, Director of Civic Engagement & Strategy and Deputy Chief Analytics Officer at the NYC Mayor’s Office of Data Analytics. “We look forward to learning from the insights gathered by the GovLab through The Data Assembly work they are conducting in New York City.”…(More)”.

COVID Response Alliance for Social Entrepreneurs


Article by François Bonnici: “…Social innovators and social entrepreneurs have been working to solve market failures and demonstrate more sustainable models to build inclusive economies for years. The Schwab Foundation 2020 Impact Report “Two Decades of Impact” demonstrated how the network of 400 leading social innovators and entrepreneurs it supports have improved the lives of more than 622 million people, protecting livelihoods, driving movements for social inclusion and environmental sustainability, and providing improved access to health, sanitation, education and energy.

From providing reliable information, services and care for the most vulnerable, to developing community tracing initiatives or mental health support through mobile phones, the work of social entrepreneurs is even more critical during the COVID-19 pandemic, as they reach those whom the market and governments are unable to account for.

But right now, these front-line organizations face severe constraints or even bankruptcy. Decades of work in the impact sector are at stake.

Over the past four decades, a sophisticated impact ecosystem has emerged to support the work of social innovators and impact enterprises. This includes funding from capital sources ranging from philanthropy to impact investing; intermediaries providing certification and standards; peer learning networks; and the policy and regulation of this new “social economy”, which seeks to embed inclusive and sustainable organizational approaches imbued with principles of equality, justice and respect for our planet.

From this ecosystem, 40 leading global organizations collectively supporting more than 15,000 social entrepreneurs have united to launch the COVID Response Alliance for Social Entrepreneurs. The aim is to share knowledge, experience and resources to coordinate and amplify social entrepreneurs’ response to COVID-19….(More)”.

The Data Dividend Project


About: “The Data Dividend Project is a movement dedicated to taking back control of our personal data: our data is our property, and if we allow companies to use it, we should get paid for it. The DDP is the brainchild of former presidential candidate Andrew Yang. Its primary objective is to establish and enforce data property rights under laws such as the California Consumer Privacy Act (CCPA), which went into effect on January 1, 2020.

Every day, people are generating data simply by going about the business of living in an ever-connected and digital world. Unbeknownst to most people, technology companies are tracking their every move online, extracting this data, and then buying and selling it for big money. The sale and resale of consumer data is called data brokering, which is itself a $200 billion industry.

For example, technology companies can extract location data from your mobile phone and sell it to advertisers who can then turn around and post local ads to you in real time. Until recently, the data collector – in this case, the technology company – was deemed to own the data. As the owner, the technology company could sell that data and profit handsomely. Meanwhile, you generated the data but received no share of those profits. DDP plans to change that.

Until this year, you, as the American consumer, had little recourse against technology companies that were profiting off your data without your consent or knowledge. Now, under the CCPA, Californians are endowed with a collection of unalienable data rights: the right to know what information is being collected on you, the right to delete that information, and the right to opt out of technology companies collecting your data. These rights, however, are ignored and abused by technology companies. And unfortunately, individual consumers don’t have the leverage to go up against these companies. That’s where DDP comes in….(More)”.

Best Practices to Cover Ad Information Used for Research, Public Health, Law Enforcement & Other Uses


Press Release: “The Network Advertising Initiative (NAI) released privacy Best Practices for its members to follow if they use data collected for Tailored Advertising or Ad Delivery and Reporting for non-marketing purposes, such as sharing with research institutions, public health agencies, or law enforcement entities.

“Ad tech companies have data that can be a powerful resource for the public good if they follow this set of best practices for consumer privacy,” said Leigh Freund, NAI President and CEO. “During the COVID-19 pandemic, we’ve seen the opportunity for substantial public health benefits from sharing aggregate and de-identified location data.”

The NAI Code of Conduct – the industry’s premier self-regulatory framework for privacy, transparency, and consumer choice – covers data collected and used for Tailored Advertising or Ad Delivery and Reporting. The NAI Code has long addressed certain non-marketing uses of data collected for Tailored Advertising and Ad Delivery and Reporting by prohibiting any eligibility uses of such data, including uses for credit, insurance, healthcare, and employment decisions.

The NAI has always firmly believed that data collected for advertising purposes should not have a negative effect on consumers in their daily lives. However, over the past year, novel data uses have been introduced, especially during the recent health crisis. In the case of opted-in data such as Precise Location Information, a company may determine that a user would benefit from more detailed disclosure in a just-in-time notice about non-marketing uses of the data being collected….(More)”.

Changing Citizen Behaviour: An Investigation on Nudge Approach in Developing Society


Paper by Dimas Budi Prasetyo: “It is widely recognized that the ability to think and act logically and reflectively in a social context correlates positively with cognitive skill. In most developing societies, people are preoccupied with the problems they face daily (e.g. working overtime), which limits the cognitive capacity they can devote to properly processing a social stimulus that calls for a thoughtful response. Better-designed social stimuli to tackle problematic behaviours, such as littering, therefore become more important. During the last decade, the nudge approach has become known for changing behaviour subtly; however, relatively little is known about the method as applied in developing societies. The current article reviews the nudge approach to changing human behaviour from two perspectives: cognitive science and consumer psychology. The article concludes that interventions using the nudge approach could be beneficial for addressing current problematic behaviour…(More)”.

What Nobel Laureate Elinor Ostrom’s early work tells us about defunding the police


Blog by Aaron Vansintjan: “…As she concluded in her autobiographical reflections published two years before she died in 2012, “For policing, increasing the size of governmental units consistently had a negative impact on the level of output generated as well as on efficiency of service provision… smaller police departments… consistently outperformed their better trained and better financed larger neighbors.”

But why did this happen? To explain this, Ostrom showed how, in small communities with small police forces, citizens are more active in monitoring their neighborhoods. Officers in smaller police forces also have more knowledge of the local area and better connections with the community. 

She also found that larger, more centralized police forces have a negative effect on other public services. With a larger police bureaucracy, other local frontline professionals with less funding — social workers, mental health support centers, clinics, youth support services — have less of a say in how to respond to a community’s issues such as drug use or domestic violence. The bigger the police department, the less say citizens — especially those who are already marginalized, like migrants or Black communities — have in how policing should be conducted.

This finding became a crucial step in Ostrom’s groundbreaking work on how communities manage their resources sustainably without outside help — through deliberation, resolving conflict and setting clear community agreements. This is what she ended up becoming famous for, and what won her the Nobel Memorial Prize in Economic Sciences, placing her next to some of the foremost economists in the world.

But her research on policing shouldn’t be forgotten: It shows that, when it comes to safer communities, having more funding or larger services is not important. What’s important is the connections and trust between the community and the service provider….(More)”.