Open Access Book edited by C. Kuner, L.A. Bygrave, C. Docksey et al.: “…provides an update for selected articles of the GDPR Commentary published in 2020 by Oxford University Press. It covers developments between the last date of coverage of the Commentary (1 August 2019) and 1 January 2021 (with a few exceptions where later developments are taken into account). Edited by Christopher Kuner, Lee A. Bygrave, Chris Docksey, Laura Drechsler, and Luca Tosoni, it covers 49 articles of the GDPR and is being made freely accessible with the kind permission of Oxford University Press. It also includes two appendices that cover the same period as the rest of the update: the first deals with judgments of the European courts and selected judgments of particular importance from national courts, and the second with EDPB papers…(More)”
Jer Thorp at Literary Hub: “Public” is a word that has, in the last decade, become bound tightly to data. Loosely defined, any data that is available in the public domain falls into this category, but the term is most often used to describe data that might serve some kind of civic purpose: census data or environmental data or health data, along with transparency-focused data like government budgets and reports. Often sidled up to “public” is the word “open.” Although the Venn diagram between the two words has ample overlap (public data is often open, and vice versa), the word “open” typically refers to if and how the data is accessible, rather than the ends to which it might be put.
Both words—“public” and “open”—invite a question: For whom? Despite the efforts of Mae and Gareth, and Tom Grundner and many others, the internet as it exists is hardly a public space. Many people still find themselves excluded from full participation. Access to anything posted on a city web page or on a .gov domain is restricted by barriers of cost and technical ability. Getting this data can be particularly hard for communities that are already marginalized, and both barriers—financial and technical—can be nearly impassable in places with limited resources and literacies.
Data.gov, the United States’ “open data portal,” lists nearly 250,000 data sets, an apparent bounty of free information. Spend some time on data.gov and other portals, though, and you’ll find that public data as it exists is messy and often confusing. Many hosted “data sets” are links to URLs that are no longer active. Trying to access data about Native American communities from the American Community Survey on data.gov brought me first to a census site with an unlabeled list of file folders. Downloading a zip file and unpacking it resulted in 64,086 cryptically named text files, each containing zero kilobytes of data. As someone who has spent much of the last decade working with these kinds of data, I can tell you that this is not an uncommon experience. All too often, working with public data feels like assembling particularly complicated Ikea furniture with no tools, no instructions, and an unknown number of missing pieces.
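The sanity check Thorp describes — unpacking a downloaded archive and discovering thousands of empty files — is easy to automate. A minimal sketch (the function name and structure are illustrative, not Thorp's actual workflow):

```python
import os
import zipfile

def audit_archive(zip_path, out_dir):
    """Unpack a zip archive and report how many extracted files are empty."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out_dir)
    total, empty = 0, 0
    for root, _dirs, files in os.walk(out_dir):
        for name in files:
            total += 1
            # A zero-byte file is a strong hint of a broken export.
            if os.path.getsize(os.path.join(root, name)) == 0:
                empty += 1
    return total, empty
```

Running a check like this before any analysis at least tells you up front how many of the "missing pieces" you are dealing with.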
Today’s public data serves a particular type of person and a specific type of purpose. Mostly, it supports technically adept entrepreneurs. Civic data initiatives haven’t been shy about this; on data.gov’s impact page you’ll find a kind of hall-of-fame list of companies that are “public data success stories”: Kayak, Trulia, Foursquare, LinkedIn, Realtor.com, Zillow, Zocdoc, AccuWeather, Carfax. All of these corporations have, in some fashion, built profit models around public data, often charging for access to the very information that the state touts as “accessible, discoverable, and usable.”…(More)”.
The Data Stewards Academy…A self-directed learning program from the Open Data Policy Lab (The GovLab): “Communities across the world face unprecedented challenges. Strained by climate change, crumbling infrastructure, growing economic inequality, and the continued costs of the COVID-19 pandemic, institutions need new ways of solving public problems and improving how they operate.
In recent years, data has been increasingly used to inform policies and interventions targeted at these issues. Yet, many of these data projects, data collaboratives, and open data initiatives remain scattered. As we enter a new age of data use and re-use, a third wave of open data, it is more important than ever to be strategic and purposeful, and to find new ways to connect the demand for data with its supply to meet institutional objectives in a socially responsible way.
This self-directed learning program, adapted from a selective executive education course, will help data stewards (and aspiring data stewards) develop a data re-use strategy to solve public problems. Noting the ways data resources can inform their day-to-day and strategic decision-making, the course provides learners with ways they can use data to improve how they operate and pursue goals in the public’s interests. By working differently—using agile methods and data analytics—public, private, and civil sector leaders can promote data re-use and reduce data access inequities in ways that advance their institution’s goals.
In this self-directed learning program, we will teach participants how to develop a 21st century data strategy. Participants will learn:
- Why It Matters: A discussion of the three waves of open data and how data re-use has proven to be transformative;
- The Current State of Play: Current practice around data re-use, including deficits of current approaches and the need to shift from ad hoc engagements to more systematic, sustainable, and responsible models;
- Defining Demand: Methodologies for how organizations can formulate questions that data can answer; and make data collaboratives more purposeful;
- Mapping Supply: Methods for organizations to discover and assess the open and private data that may be available to them to answer the questions at hand;
- Matching Supply with Demand: Operational models for connecting and meeting the needs of supply- and demand-side actors in a sustainable way;
- Identifying Risks: Overview of the risks that can emerge in the course of data re-use;
- Mitigating Risks and Other Considerations: Technical, legal and contractual issues that can be leveraged or may arise in the course of data collaboration and other data work; and
- Institutionalizing Data Re-use: Suggestions for how organizations can incorporate data re-use into their organizational structure and foster future collaboration and data stewardship.
The Data Stewardship Executive Education Course was designed and implemented by program leads Stefaan Verhulst, co-founder and chief research development officer at the GovLab, and Andrew Young, The GovLab’s knowledge director, in close collaboration with a global network of expert faculty and advisors. It aims to….(More)”.
Paper by Sofia Ranchordas: “Recent EU legislative and policy initiatives aim to offer flexible, innovation-friendly, and future-proof regulatory frameworks. Key examples are the EU Coordinated Plan on AI and the recently published EU AI Regulation Proposal which refer to the importance of experimenting with regulatory sandboxes so as to balance innovation in AI against its potential risks. Originally developed in the Fintech sector, regulatory sandboxes create a testbed for a selected number of innovative projects, by waiving otherwise applicable rules, guiding compliance, or customizing enforcement. Despite the burgeoning literature on regulatory sandboxes and the regulation of AI, the legal, methodological, and ethical challenges of regulatory sandboxes have remained understudied. This exploratory article delves into some of the benefits and intricacies of employing experimental legal instruments in the context of the regulation of AI. This article’s contribution is twofold: first, it contextualizes the adoption of regulatory sandboxes in the broader discussion on experimental approaches to regulation; second, it offers a reflection on the steps ahead for the design and implementation of AI regulatory sandboxes….(More)”.
Article by Lucrezia Lozza: “Marta has lived with a bad smell lingering in her hometown in central Spain, Villanueva del Pardillo, for a long time. Fed up, in 2017 she and her neighbors decided to pursue the issue. “The smell is disgusting,” Marta says, pointing a finger at a local yeast factory.
Originally, she thought of recording the “bad smell days” on a spreadsheet. When this didn’t work out, after some research she found Odour Collect, a crowdsourced map that allows users to enter a geolocalized timestamp of bad smells in their neighborhood.
Odor nuisances are the second most common cause of environmental complaints, after noise. Odor regulations vary among countries and there’s little legislation about how to manage smells. For instance, in Spain some municipalities regulate odors, but others do not. In the United States, the Environmental Protection Agency does not regulate odor as a pollutant, so states and local jurisdictions are in charge of the issue.
Only after Marta started using Odour Collect to record the unpleasant smells in her town did she discover that the map was part of ‘D-NOSES’, a European project aimed at bringing citizens, industries and local authorities together to monitor and minimize odor nuisances. D-NOSES relies heavily on citizen science: Affected communities gather odor observations through two maps — Odour Collect and Community Maps — with the goal of implementing new policies in their area. D-NOSES launched several pilots in Europe — in Spain, Greece, Bulgaria, and Portugal — and two outside the continent in Uganda and in Chile.
“Citizen science promotes transparency between all the actors,” said Nora Salas Seoane, Social Sciences Researcher at Fundación Ibercivis, one of the partners of D-NOSES…(More)”.
Article by Mary Blankenship and Carol Graham: “Mass shootings that result in mass casualties are almost a weekly occasion in the United States, which—not coincidentally—also has the most guns per capita in the world. Viewed from outside the U.S., it seems that Americans are not bothered by the constant deadly gun violence and have simply adapted to it. Yet, our analysis of the well-being costs of gun violence—using Twitter data to track real-time responses throughout the course of these appalling events—suggests that is not necessarily the case. We focus on the two March 2021 shootings in Atlanta and Boulder, and compare to similar data for the “1 October” (Las Vegas) and El Paso shootings a few years prior. (Details on our methodology can be found at the end of this blog.)
A reason for the one-sided debate on guns is that beyond the gruesome body counts, we do not have many tools for assessing the large—but unobservable—effects of this violence on family members, friends, and neighbors of the victims, as well as on society in general. By assessing how emotions evolve over time in Twitter messages, we can see real changes. Our analysis shows that society is increasingly angered by gun violence, rather than simply adapting to it.
A striking characteristic of the response to the 1 October shooting is the immediate influx of users sending their thoughts and prayers to the victims and the Las Vegas community. Figure 1 shows the top emoji usage, with “praying hands” the most frequently used emoji. Although that is still the most used emoji in response to the other shootings, the margin between “praying hands” and other emojis has substantially decreased in recent responses to Atlanta and Boulder. Our focus is on the “yellow face” emojis, which can correlate to six primary emotion categories: surprise, sadness, disgust, fear, anger, and neutral. While the majority of face emojis reflect emotions of sadness in the 1 October and El Paso shootings, new emojis like the “red angry face” show greater feelings of anger in the Atlanta and Boulder shootings shown in Figure 3….(More)”.
Figure 1. Top 10 emojis used in response to the 1 October shooting
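The tallying step the authors describe — mapping “yellow face” emojis to the six emotion categories and counting their frequency across tweets — can be sketched as follows. The specific emoji-to-category assignments below are illustrative assumptions; the excerpt names the categories but not the exact mapping the authors used:

```python
from collections import Counter

# Hypothetical mapping from a few "yellow face" emojis to the six
# emotion categories named in the article (surprise, sadness, disgust,
# fear, anger, neutral); the authors' actual assignments are not given.
EMOJI_EMOTION = {
    "\U0001F62E": "surprise",  # 😮
    "\U0001F622": "sadness",   # 😢
    "\U0001F62D": "sadness",   # 😭
    "\U0001F922": "disgust",   # 🤢
    "\U0001F628": "fear",      # 😨
    "\U0001F620": "anger",     # 😠
    "\U0001F621": "anger",     # 😡 ("red angry face")
    "\U0001F610": "neutral",   # 😐
}

def emotion_counts(tweets):
    """Tally emotion categories from face emojis appearing in tweets."""
    counts = Counter()
    for text in tweets:
        for char in text:
            if char in EMOJI_EMOTION:
                counts[EMOJI_EMOTION[char]] += 1
    return counts
```

Run over tweets collected in the hours after an event, counts like these can be binned by time to show how, for example, anger overtakes sadness in the more recent responses.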
Report by Deloitte: “…This “digital divide” was first noted more than 25 years ago as consumer communications needs shifted from landline voice to internet access. The economics of broadband spawned availability, adoption, and affordability disparities between rural and urban geographies and between lower- and higher-income segments. Today, the digital divide remains significant, even after the more than $100 billion of infrastructure investment allocated by the US government over the past decade to address this issue. The current debate regarding additional funds for broadband deployment implies that further examination is warranted regarding how to get to broadband for all and achieve the resulting economic prosperity.
Quantifying the economic impact of bridging the digital divide clearly shows the criticality of broadband infrastructure to the US economy. Deloitte developed economic models to evaluate the relationship between broadband and economic growth. Our models indicate that a 10-percentage-point increase of broadband penetration in 2016 would have resulted in more than 806,000 additional jobs in 2019, or an average annual increase of 269,000 jobs. Moreover, we found a strong correlation between broadband availability and jobs and GDP growth. A 10-percentage-point increase of broadband access in 2014 would have resulted in more than 875,000 additional US jobs and $186B more in economic output in 2019. The analysis also showed that higher broadband speeds drive noticeable improvements in job growth, albeit with diminishing returns. As an example, the gain in jobs from 50 to 100 Mbps is more than the gain in jobs from 100 to 150 Mbps….(More)”.
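The report's “average annual increase” follows from spreading the cumulative 2019 jobs gain evenly over the three years between the penetration increase and the measured outcome. A quick check of that arithmetic, assuming simple division:

```python
def average_annual_increase(total_jobs, start_year, end_year):
    """Spread a cumulative job gain evenly over the intervening years."""
    return total_jobs / (end_year - start_year)

# Deloitte's figures: 806,000+ additional jobs by 2019 from a
# 10-percentage-point penetration increase in 2016.
avg = average_annual_increase(806_000, 2016, 2019)
# 806,000 / 3 years is roughly 269,000 jobs per year, matching the report.
```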
Book edited by Stefan Berger, Susanne Fengler, Dimitrij Owetschkin, and Julia Sittmann: “This volume addresses the major questions surrounding a concept that has become ubiquitous in the media and in civil society as well as in political and economic discourses in recent years, and which is demanded with increasing frequency: transparency.
How can society deal with increasing and often diverging demands and expectations of transparency? What role can different political and civil society actors play in processes of producing, or preventing, transparency? Where are the limits of transparency and how are these boundaries negotiated? What is the relationship of transparency to processes of social change, as well as systems of social surveillance and control? Engaging with transparency as an interrelated product of law, politics, economics and culture, this interdisciplinary volume explores the ambiguities and contradictions, as well as the social and political dilemmas, that the age of transparency has unleashed….(More)”.
Press Release: “The World Health Organization (WHO) and the Federal Republic of Germany will establish a new global hub for pandemic and epidemic intelligence, data, surveillance and analytics innovation. The Hub, based in Berlin and working with partners around the world, will lead innovations in data analytics across the largest network of global data to predict, prevent, detect, prepare for, and respond to pandemic and epidemic risks worldwide.
H.E. German Federal Chancellor Dr Angela Merkel said: “The current COVID-19 pandemic has taught us that we can only fight pandemics and epidemics together. The new WHO Hub will be a global platform for pandemic prevention, bringing together various governmental, academic and private sector institutions. I am delighted that WHO chose Berlin as its location and invite partners from all around the world to contribute to the WHO Hub.”
The WHO Hub for Pandemic and Epidemic Intelligence is part of WHO’s Health Emergencies Programme and will be a new collaboration of countries and partners worldwide, driving innovations to increase availability and linkage of diverse data; develop tools and predictive models for risk analysis; and to monitor disease control measures, community acceptance and infodemics. Critically, the WHO Hub will support the work of public health experts and policy-makers in all countries with insights so they can take rapid decisions to prevent and respond to future public health emergencies.
“We need to identify pandemic and epidemic risks as quickly as possible, wherever they occur in the world. For that aim, we need to strengthen the global early warning surveillance system with improved collection of health-related data and inter-disciplinary risk analysis,” said Jens Spahn, German Minister of Health. “Germany has consistently been committed to support WHO’s work in preparing for and responding to health emergencies, and the WHO Hub is a concrete initiative that will make the world safer.”
Working with partners globally, the WHO Hub will drive a scale-up in innovation for existing forecasting and early warning capacities in WHO and Member States. At the same time, the WHO Hub will accelerate global collaborations across public and private sector organizations, academia, and international partner networks. It will help them to collaborate and co-create the necessary tools for managing and analyzing data for early warning surveillance. It will also promote greater access to data and information….(More)”.
Lynne Parker at the AI.gov website: “Artificial intelligence (AI) has become one of the most impactful technologies of the twenty-first century. Nearly every sector of the economy and society has been affected by the capabilities and potential of AI. AI is enabling farmers to grow food more efficiently, medical researchers to better understand and treat COVID-19, scientists to develop new materials, transportation professionals to deliver more goods faster and with less energy, weather forecasters to more accurately predict the tracks of hurricanes, and national security protectors to better defend our Nation.
At the same time, AI has raised important societal concerns. What is the impact of AI on the changing nature of work? How can we ensure that AI is used appropriately, and does not result in unfair discrimination or bias? How can we guard against uses of AI that infringe upon human rights and democratic principles?
These dual perspectives on AI have led to the concept of “trustworthy AI”. Trustworthy AI is AI that is designed, developed, and used in a manner that is lawful, fair, unbiased, accurate, reliable, effective, safe, secure, resilient, understandable, and with processes in place to regularly monitor and evaluate the AI system’s performance and outcomes.
Achieving trustworthy AI requires an all-of-government and all-of-Nation approach, combining the efforts of industry, academia, government, and civil society. The Federal government is doing its part through a national strategy, codified in the National AI Initiative Act of 2020 (NAIIA). The National AI Initiative (NAII) builds upon several years of impactful AI policy actions, many of which were outcomes from EO 13859 on Maintaining American Leadership in AI.
Six key pillars define the Nation’s AI strategy:
- prioritizing AI research and development;
- strengthening AI research infrastructure;
- advancing trustworthy AI through technical standards and governance;
- training an AI-ready workforce;
- promoting international AI engagement; and
- leveraging trustworthy AI for government and national security.
Coordinating all of these efforts is the National AI Initiative Office, which the NAIIA charges with coordinating and supporting the NAII. This Office serves as the central point of contact for exchanging technical and programmatic information on AI activities at Federal departments and agencies, as well as related Initiative activities in industry, academia, nonprofit organizations, professional societies, State and tribal governments, and others.
The AI.gov website provides a portal for exploring in more depth the many AI actions, initiatives, strategies, programs, reports, and related efforts across the Federal government. It serves as a resource for those who want to learn more about how to take full advantage of the opportunities of AI, and to learn how the Federal government is advancing the design, development, and use of trustworthy AI….(More)”