Digital rights as a security objective: New gateways for attacks


Yannic Blaschke at EDRi: “Violations of human rights online, most notably of the right to data protection, can pose a real threat to electoral security and fuel societal polarisation. In this series of blogposts, we explain how and why digital rights must be treated as a security objective. The second part of the series explains how encroaching on digital rights could create new gateways for attacks against our security.

In the first part of this series, we analysed the failure of the Council of the European Union to connect the obvious dots between ePrivacy and disinformation online, leaving open a security vulnerability through a lack of protection of citizens. However, a failure to act is not the only front on which the EU is potentially weakening our security on- and offline: on the contrary, some of the EU’s more actively pursued digital policies could have unintended yet serious consequences in the future. Nowhere is this trend more visible than in the recent faith in filtering algorithms, which have become the new “censorship machine”, proposed as a solution to almost everything, from copyright infringement to terrorist content online.

Article 13 of the Copyright Directive proposal and the Terrorist Content Regulation proposal are two examples of the attempt to regulate the online world via algorithms. While they have different motivations, both share the logic of outsourcing the accountability and enforcement of public rules to private entities, who will be the ones deciding on the availability of speech online. They, explicitly or implicitly, advocate the introduction of technologies that detect and remove certain types of content: upload filters. They empower internet companies to decide which content stays online, based on their terms of service (and not the law). In a nutshell, public institutions are encouraging Google, Facebook and other platform giants to become the judge and the police of the internet. In turn, they undermine the presumption that it should be democratically legitimised states, not private entities, who are tasked with the heavy burden of balancing the right to freedom of expression.

Even more chilling is the prospect of upload filters creating new entry points for forces that seek to influence societal debates in their favour. If algorithms are to be the judges of what can or cannot be published, they could become the target of the next wave of election interference campaigns, with attackers manipulating them into taking down critical or liberal voices to influence debates on the internet. Despite continuous warnings about the misuse of personal data on Facebook, it took only a few years to arrive at Cambridge Analytica. How long will it take us to arrive at a similar point of election interference through upload filters on online platforms?

If we let this pre-emptive and extra-judicial censorship happen, it would likely result in severe detriment to the freedom of expression and right to information of European citizens, and the free flow of information would, in consequence, be stifled. The societal effects of this could be further aggravated by the introduction of a press publishers’ right (Article 11 of the Copyright Directive), which is vigorously opposed by the academic world, as it will concentrate the power over what appears in the news in ever fewer hands. Especially in Member States where media plurality and the independence of bigger outlets from state authorities are no longer guaranteed, a decline in societal resilience to authoritarian tendencies is unfortunately easy to imagine.

We have to be very clear about what machines are good at and what they are bad at: algorithms are incredibly well suited to detecting patterns and trends, but they cannot and will not be able to perform the delicate act of balancing our rights and freedoms in accordance with the law any time soon….(More)”
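
To make the excerpt’s closing point concrete, here is a deliberately naive sketch of how a fingerprint-based upload filter works. It is a hypothetical illustration, not any platform’s actual system: the filter matches content fingerprints against a blocklist and has no notion of context, quotation, parody or news reporting, which is exactly the balancing the law requires.

```python
# Deliberately naive sketch of a fingerprint-based upload filter.
# Hypothetical illustration only; not any platform's real implementation.
import hashlib

# Blocklist of fingerprints of content previously flagged for removal
# (hypothetical entry).
BLOCKED_FINGERPRINTS = {
    hashlib.sha256(b"previously flagged clip").hexdigest(),
}

def should_block(upload: bytes) -> bool:
    """Pure pattern matching: compares a fingerprint against the blocklist.
    Nothing here can weigh context, intent, quotation or public interest."""
    return hashlib.sha256(upload).hexdigest() in BLOCKED_FINGERPRINTS

# The same bytes are blocked whether they appear as propaganda or inside a
# journalist's report about that propaganda; the filter cannot tell them apart.
print(should_block(b"previously flagged clip"))       # True
print(should_block(b"news report quoting the clip"))  # False, only because the bytes differ
```

Production filters use perceptual hashes or machine-learning classifiers rather than exact hashes, which widens the matching and, with it, the false positives; the structural point of the excerpt stands either way: matching patterns is not the same as exercising legal judgment.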

IBM aims to use crowdsourced sensor data to improve local weather forecasting globally


Larry Dignan at ZDNet: “IBM is hoping that mobile barometric sensors from individuals opting in, supercomputing, and the Internet of Things can make weather forecasting more local on a global scale.

Big Blue, which owns The Weather Company, will outline the IBM Global High-Resolution Atmospheric Forecasting System (GRAF). GRAF incorporates IoT data in its weather models via crowdsourcing.

While hyperlocal weather forecasts are available in the US, Japan, and some parts of Western Europe, many regions of the world lack an accurate picture of their weather.

Mary Glackin, senior vice president of The Weather Company, said the company is “trying to fill in the blanks.” She added, “In a place like India, weather stations are kilometers away. We think this can be as significant as bringing satellite data into models.”

For instance, the developing world gets forecasts based on global data that are updated every six hours, at resolutions of 10 km to 15 km. By using GRAF, IBM said it can offer forecasts for the day ahead that are updated hourly on average and have a 3 km resolution….(More)”.
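
To illustrate the kind of pipeline the article describes, the following is a hedged sketch of how crowdsourced barometric readings might be binned onto a roughly 3 km grid before assimilation into a forecast model. IBM has not published GRAF’s ingestion code; every name, threshold and formula below is an assumption made for illustration.

```python
# Hedged sketch: binning crowdsourced smartphone pressure readings onto a
# ~3 km grid for model assimilation. Hypothetical, not IBM's actual GRAF code.
from collections import defaultdict

GRID_KM = 3.0
KM_PER_DEGREE = 111.0  # rough km per degree of latitude; ignores longitude convergence

def grid_cell(lat: float, lon: float) -> tuple:
    """Map a reading's coordinates to an index on a ~3 km grid."""
    return (int(lat * KM_PER_DEGREE / GRID_KM),
            int(lon * KM_PER_DEGREE / GRID_KM))

def grid_average(readings):
    """Average plausible pressure readings (hPa) per grid cell.

    readings: iterable of (lat, lon, pressure_hpa) tuples from opted-in phones.
    """
    cells = defaultdict(list)
    for lat, lon, pressure in readings:
        if 870.0 <= pressure <= 1085.0:  # discard physically implausible values
            cells[grid_cell(lat, lon)].append(pressure)
    return {cell: sum(p) / len(p) for cell, p in cells.items()}

# Two nearby Paris readings collapse into one cell; a Delhi reading gets its own.
sample = [(48.85, 2.35, 1013.2), (48.85, 2.35, 1012.8), (28.61, 77.21, 1002.4)]
print(grid_average(sample))
```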

A Study of the Implications of Advanced Digital Technologies (Including AI Systems) for the Concept of Responsibility Within a Human Rights Framework


Report by Karen Yeung: “This study was commissioned by the Council of Europe’s Committee of experts on human rights dimensions of automated data processing and different forms of artificial intelligence (MSI-AUT). It was prompted by concerns about the potential adverse consequences of advanced digital technologies (including artificial intelligence (‘AI’)), particularly their impact on the enjoyment of human rights and fundamental freedoms. This draft report seeks to examine the implications of these technologies for the concept of responsibility, and this includes investigating where responsibility should lie for their adverse consequences. In so doing, it seeks to understand (a) how human rights and fundamental freedoms protected under the ECHR may be adversely affected by the development of AI technologies and (b) how responsibility for those risks and consequences should be allocated. 

Its methodological approach is interdisciplinary, drawing on concepts and academic scholarship from the humanities, the social sciences and, to a more limited extent, from computer science. It concludes that, if we are to take human rights seriously in a hyperconnected digital age, we cannot allow the power of our advanced digital technologies and systems, and those who develop and implement them, to be accrued and exercised without responsibility. Nations committed to protecting human rights must therefore ensure that those who wield and derive benefits from developing and deploying these technologies are held responsible for their risks and consequences. This includes obligations to ensure that there are effective and legitimate mechanisms that will operate to prevent and forestall the human rights violations these technologies may give rise to, and to attend to the health of the larger collective and shared socio-technical environment in which human rights and the rule of law are anchored….(More)”.

Societal costs and benefits of high-value open government data: a case study in the Netherlands


Paper by F.M. Welle Donker and B. van Loenen: “Much research has emphasised the benefits of open government data, and especially high-value data. The G8 Open Data Charter defines high-value data as data that improve democracy and encourage the innovative reuse of the particular data. Thus, governments worldwide invest resources to identify potential high-value datasets and to publish these data as open data. However, while the benefits of open data are well researched, the costs of publishing data as open data are less researched. This research examines the relationship between the costs of making data suitable for publication as (linked) open data and the societal benefits thereof. A case study of five high-value datasets was carried out in the Netherlands to provide a societal cost-benefit analysis of open high-value data. Different options were investigated, ranging from not publishing the dataset at all to publishing the dataset as linked open data.

In general, it can be concluded that the societal benefits of (linked) open data are higher than the costs. The case studies show that there are differences between the datasets. In many cases, costs for open data are an integral part of general data management costs and hardly lead to additional costs. In certain cases, however, the costs to anonymize/aggregate the data are high compared to the potential value of an open data version of the dataset. Although, for these datasets, this leads to a less favourable relationship between costs and benefits, the societal benefits would still be higher than without an open data version….(More)”.
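
As a toy illustration of the comparison the paper makes, the sketch below scores each publication option by net societal benefit, from not publishing at all to linked open data. All figures are invented placeholders, not the paper’s data; the paper’s actual analysis is specific to its five Dutch case-study datasets.

```python
# Toy sketch of a societal cost-benefit comparison across publication options.
# All numbers are invented placeholders; the paper's real figures are
# dataset-specific and expressed in euros, not arbitrary units.
options = {
    "not published":    {"benefits": 0.0,  "costs": 0.0},
    "open data":        {"benefits": 10.0, "costs": 2.0},
    "linked open data": {"benefits": 12.0, "costs": 4.0},
}

for name, v in options.items():
    net = v["benefits"] - v["costs"]
    print(f"{name}: net societal benefit = {net:+.1f} (arbitrary units)")
```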

Index: Open Data


By Alexandra Shaw, Michelle Winowatan, Andrew Young, and Stefaan Verhulst

The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on open data and was originally published in 2018.

Value and Impact

  • The projected year by which all EU28+ countries will have a fully operating open data portal: 2020

  • Expected growth of the European open data market between 2016 and 2020: 36.9%, reaching EUR 75.7 billion by 2020

Public Views on and Use of Open Government Data

  • Share of Americans who do not trust the federal government or social media sites to protect their data: approximately 50%

  • Key findings from The Economist Intelligence Unit report on Open Government Data Demand:

    • Percentage of respondents who say the key reason why governments open up their data is to create greater trust between the government and citizens: 70%

    • Percentage of respondents who say OGD plays an important role in improving lives of citizens: 78%

    • Percentage of respondents who say OGD helps with daily decision making especially for transportation, education, environment: 53%

    • Percentage of respondents who cite lack of awareness about OGD and its potential use and benefits as the greatest barrier to usage: 50%

    • Percentage of respondents who say they lack access to usable and relevant data: 31%

    • Percentage of respondents who think they don’t have sufficient technical skills to use open government data: 25%

    • Percentage of respondents who feel the number of OGD apps available is insufficient, indicating an opportunity for app developers: 20%

    • Percentage of respondents who say OGD has the potential to generate economic value and new business opportunity: 61%

    • Percentage of respondents who say they don’t trust governments to keep data safe, protected, and anonymized: 19%

Efforts and Involvement

  • Time that’s passed since open government advocates convened to create a set of principles for open government data – the event that started the open government data movement: 10 years

  • Countries participating in the Open Government Partnership today: 79 OGP participating countries and 20 subnational governments

  • Percentage of “open data readiness” in Europe according to European Data Portal: 72%

    • Open data readiness comprises four indicators: presence of policy, national coordination, licensing norms, and use of data.

  • Number of U.S. cities with Open Data portals: 27

  • Number of governments who have adopted the International Open Data Charter: 62

  • Number of non-state organizations endorsing the International Open Data Charter: 57

  • Number of countries analyzed by the Open Data Index: 94

  • Number of Latin American countries that do not have open data portals as of 2017: 4 total – Belize, Guatemala, Honduras and Nicaragua

  • Number of cities participating in the Open Data Census: 39

Demand for Open Data

  • Open data demand measured by frequency of open government data use according to The Economist Intelligence Unit report:

    • Australia

      • Monthly: 15% of respondents

      • Quarterly: 22% of respondents

      • Annually: 10% of respondents

    • Finland

      • Monthly: 28% of respondents

      • Quarterly: 18% of respondents

      • Annually: 20% of respondents

    •  France

      • Monthly: 27% of respondents

      • Quarterly: 17% of respondents

      • Annually: 19% of respondents

    • India

      • Monthly: 29% of respondents

      • Quarterly: 20% of respondents

      • Annually: 10% of respondents

    • Singapore

      • Monthly: 28% of respondents

      • Quarterly: 15% of respondents

      • Annually: 17% of respondents 

    • UK

      • Monthly: 23% of respondents

      • Quarterly: 21% of respondents

      • Annually: 15% of respondents

    • US

      • Monthly: 16% of respondents

      • Quarterly: 15% of respondents

      • Annually: 20% of respondents

  • Number of FOIA requests received in the US for fiscal year 2017: 818,271

  • Number of FOIA requests processed in the US for fiscal year 2017: 823,222

  • Distribution of FOIA requests in 2017 among the top 5 agencies with the highest number of requests:

    • DHS: 45%

    • DOJ: 10%

    • NARA: 7%

    • DOD: 7%

    • HHS: 4%

Examining Datasets

  • Country with highest index score according to ODB Leaders Edition: Canada (76 out of 100)

  • Country with lowest index score according to ODB Leaders Edition: Sierra Leone (22 out of 100)

  • Proportion of datasets that are open in the top 30 governments according to ODB Leaders Edition: fewer than 1 in 5

  • Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition: 19%

  • Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition by sector/subject:

    • Budget: 30%

    • Companies: 13%

    • Contracts: 27%

    • Crime: 17%

    • Education: 13%

    • Elections: 17%

    • Environment: 20%

    • Health: 17%

    • Land: 7%

    • Legislation: 13%

    • Maps: 20%

    • Spending: 13%

    • Statistics: 27%

    • Trade: 23%

    • Transport: 30%

  • Percentage of countries that release data on government spending according to ODB Leaders Edition: 13%

  • Percentage of government data that is updated at regular intervals according to ODB Leaders Edition: 74%

  • Percentage of datasets classed as “open” in 94 places worldwide analyzed by the Open Data Index: 11%

  • Percentage of open datasets in the Caribbean, according to Open Data Census: 7%

  • Number of companies whose data is available through OpenCorporates: 158,589,950

City Open Data

  • Singapore

    • Number of datasets published in Singapore: 1,480

    • Percentage of datasets with standardized format: 35%

    • Percentage of datasets made as raw as possible: 25%

  • Barcelona

    • Number of datasets published in Barcelona: 443

    • Open data demand in Barcelona measured by:

      • Number of unique sessions in the month of September 2018: 5,401

    • Quality of datasets published in Barcelona according to Tim Berners-Lee’s 5-star Open Data scheme: 3 stars

  • London

    • Number of datasets published in London: 762

    • Number of data requests since October 2014: 325

  • Bandung

    • Number of datasets published in Bandung: 1,417

  • Buenos Aires

    • Number of datasets published in Buenos Aires: 216

  • Dubai

    • Number of datasets published in Dubai: 267

  • Melbourne

    • Number of datasets published in Melbourne: 199

Sources

  • About OGP, Open Government Partnership. 2018.  

The Future of Government 2030+ : A Citizen Centric Perspective on New Government Models


EU Policy Lab: “The Future of Government scenarios were developed through a bottom-up process on the basis of open dialogue workshops in Europe with about 130 citizens and 25 civil society and think tank representatives. The Joint Research Centre then reviewed these discussions and synthesised them into four scenarios. Together, they highlight some of the key uncertainties about the relationships between citizens, governments and business, and explore, through the eyes of European citizens, how government might look in the future….(More)”.

Citizen science for environmental policy: Development of an EU-wide inventory and analysis of selected practices


EU Science Hub: “Citizen science is the non-professional involvement of volunteers in the scientific process, whether in the data collection phase or in other phases of the research.

It can be a powerful tool for environmental management that has the potential to inform an increasingly complex environmental policy landscape and to meet the growing demands from society for more participatory decision-making.

While there is growing interest from international bodies and national governments in citizen science, the evidence that it can successfully contribute to environmental policy development, implementation, evaluation or compliance remains scant.

Central to elucidating this question is a better understanding of the benefits delivered by citizen science: determining to what extent these benefits can contribute to environmental policy, and establishing whether projects that provide policy support also co-benefit science and encourage meaningful citizen engagement.

EU-wide inventory 

In order to get an evidence base of citizen science activities that can support environmental policies in the European Union (EU), the European Commission (DG ENV, with the support of DG JRC) contracted Bio Innovation Service (FR), in association with Fundacion Ibercivis (ES) and The Natural History Museum (UK), to perform a “Study on an inventory of citizen science activities for environmental policies”.

The first objective was to develop an inventory of citizen science projects relevant for environmental policy and assess how these projects contribute to the Sustainable Development Goals (SDGs) set by the United Nations (UN) General Assembly.

To this end, a desk-research and an EU-wide survey were used to identify 503 citizen science projects of relevance to environmental policy.

The study demonstrates the breadth of citizen science that can be of relevance to environmental policy…. Three salient features were found:

  • Government support, not only in the funding, but also through active participation in the design and implementation of the project appears to be a key factor for the successful uptake of citizen science in environmental policy.
  • When the engagement process is easy for citizens, that is, when projects require limited effort and few a priori skills, their policy uptake is facilitated.
  • Scientific aspects on the other hand did not appear to affect the policy uptake of the analysed projects, but they were a strong determinant of how well the project could serve policy: projects with high scientific standards and endorsed by scientists served more phases of the environmental policy cycle.

In conclusion, this study demonstrates that citizen science has the potential to be a cost-effective way to contribute to policy and highlights the importance of fostering a diversity of citizen science activities and their innovativeness….(More)”.

Data scores


Data-scores.org: “Data scores that combine data from a variety of online and offline activities are becoming a way to categorize citizens, allocate services, and predict future behavior. Yet little is still known about the implementation of data-driven systems and algorithmic processes in public services, and about how citizens are increasingly ‘scored’ based on the collection and combination of data.

As part of our project ‘Data Scores as Governance’ we have developed a tool to map and investigate the uses of data analytics and algorithms in public services in the UK. This tool is designed to facilitate further research and investigation into this topic and to advance public knowledge and understanding.

The tool is made up of a collection of documents from different sources that can be searched and mapped according to different categories. The database consists of more than 5,300 unverified documents that have been scraped based on a number of search terms relating to data systems in government. This is an incomplete and ongoing dataset. You can read more in our Methodology section….(More)”.
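
As a rough illustration of what such a tool does, here is a hedged sketch of keyword search over a scraped document collection. The field names and example records are hypothetical; the project’s actual schema and scraper are not shown here.

```python
# Hedged sketch of keyword search over a scraped document collection, in the
# spirit of the Data Scores tool. Records and field names are hypothetical.
documents = [
    {"title": "Council risk-scoring pilot", "source": "local authority",
     "text": "predictive analytics for the troubled families programme"},
    {"title": "Police data analytics review", "source": "police",
     "text": "algorithmic risk assessment in custody decisions"},
]

def search(docs, term, source=None):
    """Return documents whose text mentions the term, optionally filtered by source."""
    term = term.lower()
    return [d for d in docs
            if term in d["text"].lower()
            and (source is None or d["source"] == source)]

for hit in search(documents, "risk", source="police"):
    print(hit["title"])  # -> Police data analytics review
```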

Draft Ethics guidelines for trustworthy AI


Working document by the European Commission’s High-Level Expert Group on Artificial Intelligence (AI HLEG): “…Artificial Intelligence (AI) is one of the most transformative forces of our time, and is bound to alter the fabric of society. It presents a great opportunity to increase prosperity and growth, which Europe must strive to achieve. Over the last decade, major advances were realised due to the availability of vast amounts of digital data, powerful computing architectures, and advances in AI techniques such as machine learning. Major AI-enabled developments in autonomous vehicles, healthcare, home/service robots, education or cybersecurity are improving the quality of our lives every day. Furthermore, AI is key for addressing many of the grand challenges facing the world, such as global health and wellbeing, climate change, reliable legal and democratic systems and others expressed in the United Nations Sustainable Development Goals.

Having the capability to generate tremendous benefits for individuals and society, AI also gives rise to certain risks that should be properly managed. Given that, on the whole, AI’s benefits outweigh its risks, we must ensure that we follow the road that maximises the benefits of AI while minimising its risks. To ensure that we stay on the right track, a human-centric approach to AI is needed, reminding us that the development and use of AI should not be seen as an end in itself, but as a means to increase human well-being. Trustworthy AI will be our north star, since human beings will only be able to confidently and fully reap the benefits of AI if they can trust the technology.

Trustworthy AI has two components: (1) it should respect fundamental rights, applicable regulation and core principles and values, ensuring an “ethical purpose” and (2) it should be technically robust and reliable since, even with good intentions, a lack of technological mastery can cause unintentional harm.

These Guidelines therefore set out a framework for Trustworthy AI:

  • Chapter I deals with ensuring AI’s ethical purpose, by setting out the fundamental rights, principles and values that it should comply with.
  • From those principles, Chapter II derives guidance on the realisation of Trustworthy AI, tackling both ethical purpose and technical robustness. This is done by listing the requirements for Trustworthy AI and offering an overview of technical and non-technical methods that can be used for its implementation.
  • Chapter III subsequently operationalises the requirements by providing a concrete but non-exhaustive assessment list for Trustworthy AI. This list is then adapted to specific use cases. …(More)”

The Yellow Vests movement and the urge to update democracy


Paula Forteza at OGP: “…The Yellow Vests movement in France is a complex social movement that points to social injustices produced by a political system that has excluded voices for decades. The movement shows the negative effects of the lack of participatory mechanisms in our institutional architecture. If the Yellow Vests are protesting in the streets today, it is certainly because institutional dialogue was not possible, because their claims did not find an official channel of communication to reach decision makers.

The inception of this movement is also symptomatic of the need to update our democracies. Organized through Facebook groups, the Yellow Vests is a leaderless movement that is challenging the hierarchical and vertical organization of the decision-making process. We need a more horizontal, agile and decentralized democracy to match the way civil society is getting organized on the internet. Social media platforms are not made for political mobilisation, as the rise of fake news, polarisation and foreign intervention has shown. Learning from these social media flaws, we can back an institutional change with the creation of dedicated platforms for political expression that are transparent, accountable and democratically governed.

Our reaction to this crisis needs to match the expectations. It is urgent to revitalise our democracies through a robust and impactful set of participatory initiatives. We have in our hands the future of the social contract and, in a way, the future of our democracy. Some initiatives have emerged in France: citizen questions to the government, legislative consultations, a collaborative space in the Parliament, more than 80 local participatory budgets and dozens of participatory experimentations. We need to scale up many local initiatives and include impactful and continuous participatory mechanisms into the institutional decision-making process. A constitutional reform is expected in France next January – let’s take this opportunity to institutionalize these mechanisms….(More)”.