Trusted smart statistics: Motivations and principles


Paper by Fabio Ricciato et al : “In this contribution we outline the concept of Trusted Smart Statistics as the natural evolution of official statistics in the new datafied world. Traditional data sources, namely survey and administrative data, represent nowadays a valuable but small portion of the global data stock, much thereof being held in the private sector. The availability of new data sources is only one aspect of the global change that concerns official statistics. Other aspects, more subtle but not less important, include the changes in perceptions, expectations, behaviours and relations between the stakeholders. The environment around official statistics has changed: statistical offices are not any more data monopolists, but one prominent species among many others in a larger (and complex) ecosystem. What was established in the traditional world of legacy data sources (in terms of regulations, technologies, practices, etc.) is not guaranteed to be sufficient any more with new data sources.

Trusted Smart Statistics is not about replacing existing sources and processes, but augmenting them with new ones. Such augmentation however will not be only incremental: the path towards Trusted Smart Statistics is not about tweaking some components of the legacy system but about building an entirely new system that will coexist with the legacy one. In this position paper we outline some key design principles for the new Trusted Smart Statistics system. Taken collectively they picture a system where the smart and trust aspects enable and reinforce each other. A system that is more extrovert towards external stakeholders (citizens, private companies, public authorities) with whom Statistical Offices will be sharing computation, control, code, logs and of course final statistics, without necessarily sharing the raw input data….(More)”.

Towards adaptive governance in big data health research: implementing regulatory principles


Chapter by Alessandro Blasimme and Effy Vayena: “While data-enabled health care systems are in their infancy, biomedical research is rapidly adopting the big data paradigm. Digital epidemiology, for example, already employs data generated outside the public health care system – that is, data generated without the intent of using them for epidemiological research – to understand and prevent patterns of diseases in populations (Salathé 2018). Precision medicine – pooling together genomic, environmental and lifestyle data – also represents a prominent example of how data integration can drive both fundamental and translational research in important medical domains such as oncology (D. C. Collins et al. 2017). All of this requires the collection, storage, analysis and distribution of massive amounts of personal information as well as the use of state-of-the-art data analytics tools to uncover health- and disease-related patterns.


The realization of the potential of big data in health evokes a necessary commitment to a sense of “continuity” articulated in three distinct ways: a) from data generation to use (as in the data-enabled learning health care); b) from research to clinical practice, e.g. discovery of new mutations in the context of diagnostics; c) from health data strictly speaking (Vayena and Gasser 2016), e.g. clinical records, to less so, e.g. tweets used in digital epidemiology. These continuities face the challenge of regulatory and governance approaches that were designed for clear data taxonomies, for a less blurred boundary between research and clinical practice, and for rules that focused mostly on data generation and less on their eventual and multiple uses.

The result is significant uncertainty about how responsible use of such large amounts of sensitive personal data could be fostered. In this chapter we focus on the uncertainties surrounding the use of biomedical big data in the context of health research. Are new criteria needed to review biomedical big data research projects? Do current mechanisms, such as informed consent, offer sufficient protection to research participants’ autonomy and privacy in this new context? Do existing oversight mechanisms ensure transparency and accountability in data access and sharing? What monitoring tools are available to assess how personal data are used over time? Is the equitable distribution of benefits accruing from such data uses considered, or can it be ensured? How is the public being involved – if at all – with decisions about creating and using large data repositories for research purposes? What is the role that IT (information technology) players, and especially big ones, acquire in research? And what regulatory instruments do we have to ensure that such players do not undermine the independence of research?…(More)”.

Responsible data sharing in a big data-driven translational research platform: lessons learned


Paper by S. Kalkman et al: “The sharing of clinical research data is increasingly viewed as a moral duty [1]. Particularly in the context of making clinical trial data widely available, editors of international medical journals have labeled data sharing a highly efficient way to advance scientific knowledge [2,3,4]. The combination of even larger datasets into so-called “Big Data” is considered to offer even greater benefits for science, medicine and society [5]. Several international consortia have now promised to build grand-scale, Big Data-driven translational research platforms to generate better scientific evidence regarding disease etiology, diagnosis, treatment and prognosis across various disease areas [6,7,8].

Despite anticipated benefits, large-scale sharing of health data is charged with ethical questions. Stakeholders have been urged to consider how to manage privacy and confidentiality issues, ensure valid informed consent, and determine who gets to decide about data access [9]. More fundamentally, new data sharing activities prompt questions about social justice and public trust [10]. To balance potential benefits and ethical considerations, data sharing platforms require guidance for the processes of interaction and decision-making. In the European Union (EU), legal norms specified for the sharing of personal data for health research, most notably those set out in the General Data Protection Regulation (GDPR) (EU 2016/679), remain open to interpretation and offer limited practical guidance to researchers [11,12,13]. Striking in this regard is that the GDPR itself stresses the importance of adherence to ethical standards, when broad consent is put forward as a legal basis for the processing of personal data. For example, Recital 33 of the GDPR states that data subjects should be allowed to give “consent to certain areas of scientific research when in keeping with recognised ethical standards for scientific research” [14]. In fact, the GDPR actually encourages data controllers to establish self-regulating mechanisms, such as a code of conduct. To foster responsible and sustainable data sharing in translational research platforms, ethical guidance and governance are therefore necessary. Here, we define governance as ‘the processes of interaction and decision-making among the different stakeholders that are involved in a collective problem that lead to the creation, reinforcement, or reproduction of social norms and institutions’…(More)”.

Peopling Europe through Data Practices


Introduction to Special Issue of Science, Technology & Human Values by Baki Cakici, Evelyn Ruppert and Stephan Scheel: “Politically, Europe has been unable to address itself to a constituted polity and people as more than an agglomeration of nation-states. From the resurgence of nationalisms to the crisis of the single currency and the unprecedented decision of a member state to leave the European Union (EU), core questions about the future of Europe have been rearticulated: Who are the people of Europe? Is there a European identity? What does it mean to say, “I am European”? Where does Europe begin and end? And who can legitimately claim to be part of a “European” people?

The special issue (SI) seeks to contest dominant framings of the question “Who are the people of Europe?” as only a matter of government policies, electoral campaigns, or parliamentary debates. Instead, the contributions start from the assumption that answers to this question exist in data practices where people are addressed, framed, known, and governed as European. The central argument of this SI is that it is through data practices that the EU seeks to simultaneously constitute its population as a knowable, governable entity, and as a distinct form of peoplehood where common personhood is more important than differences….(More)”.

Dissent in Consensusland: An Agonistic Problematization of Multi-stakeholder Governance


Paper by Martin Fougère and Nikodemus Solitander in the Journal of Business Ethics: “Multi-stakeholder initiatives involve actors from several spheres of society (market, civil society and state) in collaborative arrangements to reach objectives typically related to sustainable development. In political CSR literature, these arrangements have been framed as improvements to transnational governance and as being somehow democratic.

We draw on Mouffe’s works on agonistic pluralism to problematize the notion that consensus-led multi-stakeholder initiatives bring more democratic control on corporate power. We examine two initiatives which address two very different issue areas: the Roundtable on Sustainable Palm Oil (RSPO) and the Bangladesh Accord on Fire and Building Safety (The Accord).

We map the different kinds of adversarial relations involved in connection with the issues meant to be governed by the two initiatives, and find those adversarial relations to take six main shapes, affecting the initiatives in different ways: (1) competing regulatory initiatives; (2) pressure-response relations within multi-stakeholder initiatives; (3) pressure-response relations between NGOs and states through multi-stakeholder initiatives; (4) collaboration and competition between multi-stakeholder initiatives and states; (5) pressure-response relations between civil society actors and multi-stakeholder initiatives; and (6) counter-hegemonic movements against multi-stakeholder initiatives as hegemonic projects.

We conclude that multi-stakeholder initiatives cannot be democratic by themselves, and we argue that business and society researchers should not look at democracy or politics only internally to these initiatives, but rather study how issue areas are regulated through interactions between a variety of actors—both within and without the multi-stakeholder initiatives—who get to have a legitimate voice in this regulation….(More)”.

Handbook of Democratic Innovation and Governance


Book edited by Stephen Elstub and Oliver Escobar: “Democracies are currently undergoing a period of both challenge and renewal. Democratic innovations are proliferating in politics, governance, policy, and public administration. This Handbook of Democratic Innovation and Governance advances understanding of democratic innovations by critically reviewing their importance throughout the world. The overarching themes are a focus on citizens and their relationship to these innovations, and the resulting effects on political equality and policy impact.

The Handbook covers different types of democratic innovations; their potential to combat current problems with democracy; the various actors involved; their use in different areas of policy and governance; their application in different parts of the world; and the methods used to research them. Contributors therefore offer a definitive overview of existing research on democratic innovations, while also setting the agenda for future research and practice.

Featuring a critical combination of theoretical, empirical and methodological work on democratic innovations, this insightful Handbook balances depth, originality and accessibility to make it an ideal research companion for scholars and students of democratic governance alike. Public administrators and participation practitioners will also benefit from its guidance on citizen engagement processes….(More)”.

Open Science, Open Data, and Open Scholarship: European Policies to Make Science Fit for the Twenty-First Century


Paper by Jean-Claude Burgelman et al: “Open science will make science more efficient, reliable, and responsive to societal challenges. The European Commission has sought to advance open science policy from its inception in a holistic and integrated way, covering all aspects of the research cycle from scientific discovery and review to sharing knowledge, publishing, and outreach. We present the steps taken with a forward-looking perspective on the challenges lying ahead, in particular the necessary change of the rewards and incentives system for researchers (for which various actors are co-responsible and which goes beyond the mandate of the European Commission). Finally, we discuss the role of artificial intelligence (AI) within an open science perspective….(More)”.

Why the Global South should nationalise its data


Ulises Ali Mejias at AlJazeera: “The recent coup in Bolivia reminds us that poor countries rich in resources continue to be plagued by the legacy of colonialism. Anything that stands in the way of a foreign corporation’s ability to extract cheap resources must be removed.

Today, apart from minerals and fossil fuels, corporations are after another precious resource: personal data. As with natural resources, data too has become the target of extractive corporate practices.

As sociologist Nick Couldry and I argue in our book, The Costs of Connection: How Data is Colonizing Human Life and Appropriating It for Capitalism, there is a new form of colonialism emerging in the world: data colonialism. By this, we mean a new resource-grab whereby human life itself has become a direct input into economic production in the form of extracted data.

We acknowledge that this term is controversial, given the extreme physical violence and structures of racism that historical colonialism employed. However, our point is not to say that data colonialism is the same as historical colonialism, but rather to suggest that it shares the same core function: extraction, exploitation, and dispossession.

Like classical colonialism, data colonialism violently reconfigures human relations to economic production. Things like land, water, and other natural resources were valued by native people in the precolonial era, but not in the same way that colonisers (and later, capitalists) came to value them: as private property. Likewise, we are experiencing a situation in which things that were once primarily outside the economic realm – things like our most intimate social interactions with friends and family, or our medical records – have now been commodified and made part of an economic cycle of data extraction that benefits a few corporations.

So what could countries in the Global South do to avoid the dangers of data colonialism?…(More)”.

Exploring digital government transformation in the EU


Analysis of the state of the art and review of literature by Gianluca Misuraca et al: “This report presents the… results of the review of literature, based on almost 500 academic and grey literature sources, as well as the analysis of digital government policies in the EU Member States, to provide a synthetic overview of the main themes and topics of the digital government discourse.

The report depicts the variety of existing conceptualisations and definitions of the digital government phenomenon, the measured and expected effects of the application of more disruptive innovations and emerging technologies in government, as well as the key drivers of and barriers to transforming the public sector. Overall, the literature review shows that many sources appear overly optimistic with regard to the impact of digital government transformation, although the majority of them are based on normative views or expectations, rather than empirically tested insights.

The authors therefore caution that digital government transformation should be researched empirically and with a due differentiation between evidence and hope. In this respect, the report paves the way to in-depth analysis of the effects that can be generated by digital innovation in public sector organisations. A digital transformation that implies the redesign of the tools and methods used in the machinery of government will require in fact a significant change in the institutional frameworks that regulate and help coordinate the governance systems in which such changing processes are implemented…(More)”.

Dreamocracy – Collective Intelligence for the Common Good


About: “Dreamocracy is a think-and-do-tank that fosters collective intelligence and creativity for the common good through analysis, advice to organisations, and the development and implementation of innovative stakeholder management experiments.

Dreamocracy aims to contribute to democracy’s reinvention and future. As Harvard scholar Yascha Mounk stresses, democracy in many parts of the world is at risk of “deconsolidation.” Possible collapse is signalled by the convergence of people’s dissatisfaction with democracy; their willingness to consider non-democratic forms of government as possible alternatives; and the rise in populist parties, anti-system movements and demagogues in government.

In order to ensure a bright future for democracy in service to society, Dreamocracy believes that collective intelligence, done well, is essential to address the following three terms of our proposed “trust-in-government equation”:

TRUST = Process legitimacy + Output legitimacy + Emotions legitimacy….(More)”.