Sarah Wray at Cities Today: “Private companies, research institutions and government organisations in Amsterdam are now obliged to report sensors deployed in public spaces.
The information is being displayed via an online map to give residents more insight into how, where and what data is collected from sources such as cameras, air quality and traffic sensors, Wi-Fi counters and smart billboards. The map shows the type of sensor, the owner and whether personal data is processed.
A statement from the city said: “Amsterdam believes that residents have the right to know where and when data is collected. The sensor register and the reporting obligation help to create awareness. It is one of the 18 actions from the Amsterdam Data Strategy.”
The requirement applies to new sensors and those that are already installed in the city, including mobile sensors.
So far, only sensors from the City of Amsterdam have been included in the register. Other owners are now urged to report their sensors and have until 1 June 2022 to do so before enforcement action is taken.
If there is no response even after warnings, the municipality can remove the sensor at the owner’s expense, the city said.
The obligation to report sensors is part of a regulation update recently passed by the City Council…(More)”.
Paper by Noémi Bontridder and Yves Poullet: “Artificial intelligence (AI) systems are playing an overarching role in the disinformation phenomenon our world is currently facing. Such systems boost the problem not only by increasing opportunities to create realistic AI-generated fake content, but also, and essentially, by facilitating the dissemination of disinformation to a targeted audience and at scale by malicious stakeholders. This situation entails multiple ethical and human rights concerns, in particular regarding human dignity, autonomy, democracy, and peace. In reaction, other AI systems are developed to detect and moderate disinformation online. Such systems do not escape from ethical and human rights concerns either, especially regarding freedom of expression and information. Having originally started with ascending co-regulation, the European Union (EU) is now heading toward descending co-regulation of the phenomenon. In particular, the Digital Services Act proposal provides for transparency obligations and external audit for very large online platforms’ recommender systems and content moderation. While with this proposal, the Commission focusses on the regulation of content considered as problematic, the EU Parliament and the EU Council call for enhancing access to trustworthy content. In light of our study, we stress that the disinformation problem is mainly caused by the business model of the web that is based on advertising revenues, and that adapting this model would reduce the problem considerably. We also observe that while AI systems are inappropriate to moderate disinformation content online, and even to detect such content, they may be more appropriate to counter the manipulation of the digital ecosystem….(More)”.
Report by Laurie Laybourn-Langton, Harry Quilter-Pinner, and Nicolas Treloar: “Movements change the world. Throughout history, loosely organised networks of individuals and organisations have sought changes to societies – and won. From the abolitionist struggle and campaigns for voting rights to #MeToo and #BlackLivesMatter, the impact of movements can be seen everywhere.
Over the last year, IPPR and the Runnymede Trust have sought to understand what we can learn from movements that have made change – as well as those that have fallen short – for our efforts to create change today.
We did this by exploring what worked and didn’t work for four movements from recent decades. These were: LGBTQ+ rights, race equality, climate action, and health inequality….(More)”.
Book by Rebekah Dowd on “Digitized Data Governance as a Human Rights Issue in the EU”: “…This book considers contested responsibilities between the public and private sectors over the use of online data, detailing exactly how digital human rights evolved in specific European states and gradually became a part of the European Union framework of legal protections. The author uniquely examines why and how European lawmakers linked digital data protection to fundamental human rights, something heretofore not explained in other works on general data governance and data privacy. In particular, this work examines the utilization of national and European Union institutional arrangements as a location for activism by legal and academic consultants and by first-mover states who legislated digital human rights beginning in the 1970s. By tracing the way that EU Member States and non-state actors utilized the structure of EU bodies to create the new norm of digital human rights, readers will learn about the process of expanding the scope of human rights protections within multiple dimensions of European political space. The project will be informative to scholars, students, and laypersons alike, as it examines a new and evolving area of technology governance – the human rights of digital data use by the public and private sectors….(More)”.
Paper by Lisa Schmidthuber, Dennis Hilgers, and Krithika Randhawa: “Government organizations increasingly use crowdsourcing platforms to interact with citizens and integrate their requests in designing and delivering public services. Government usually provides feedback to individual users on whether the request can be considered. Drawing on attribution theory, this study asks how the causal attributions of the government response affect continued participation in crowdsourcing platforms. To test our hypotheses, we use a 7-year dataset of both online requests from citizens to government and government responses to citizen requests. We focus on citizen requests that are denied by government, and find that stable and uncontrollable attributions of the government response have a negative effect on future participation behavior. Also, a local government’s locus of causality negatively affects continued participation. This study contributes to research on the role of responsiveness in digital interaction between citizens and government and highlights the importance of rationale transparency to sustain citizen participation…(More)”.
Book edited by Elise Poillot, Gabriele Lenzini, Giorgio Resta, and Vincenzo Zeno-Zencovich: “The volume presents the results of a research project (named “Legafight”) funded by the Luxembourg Fond National de la Recherche in order to verify if and how digital tracing applications could be implemented in the Grand-Duchy in order to counter and abate the Covid-19 pandemic. This inevitably led to a deep comparative overview of the various existing models, starting from that of the European Union and those put into practice by Belgium, France, Germany, and Italy, with attention also to some Anglo-Saxon approaches (the UK and Australia). Not surprisingly, the main issue which had to be tackled was that of the protection of the personal data collected through the tracing applications, their use by public health authorities and the trust placed in tracing procedures by citizens. Over the last 18 months tracing apps have registered a rise, a fall, and a sudden rebirth as media devoted not so much to collecting data as to distributing real-time information which should allow informed decisions and be used as repositories of health certifications…(More)”.
Paper by Roel Heijlen and Joep Crompvoets: “Governments around the world own multiple datasets related to the policy domain of health. Datasets range from vaccination rates to the availability of health care practitioners in a region to the outcomes of certain surgeries. Health is believed to be a promising subject in the case of open government data policies. However, the specific properties of health data, such as its sensitivities regarding privacy, ethics, and ownership, create particular conditions that either enable datasets to become freely and easily accessible to everyone or prevent them from doing so…
This paper aims to map the ecosystem of open health data. Through a literature analysis of the foundations of health data and the commonalities of open data ecosystems, we construct the socio-technical environment in which health data managed by governments are opened up or potentially stay closed. After its theoretical development, the open health data ecosystem is tested via a case study concerning the Data for Better Health initiative from the government of Belgium…
The policy domain of health includes de-identification activities, bioethical assessments, and the specific role of data providers within its open data ecosystem. However, the concept of open data does not always fully apply to the topic of health. For instance, several health datasets may be findable via government portals but not directly accessible. Differentiation among types of health data and data user capacities is recommended for future research….(More)”
Policy Brief by UserCentriCities project: “…looks critically at the need for putting citizens at the heart of digital government – and analyses six successful projects in key European cities: Bologna (Emilia Romagna Region), Espoo, Milan, Murcia, Rotterdam and Tallinn. Building on lessons learned in a year of structured interviews with leading officials in the UserCentriCities project, the policy brief looks at key trends driving breakthroughs in digital-service delivery – in the public and private sector – and proposes a five-point roadmap for greater European-national-local collaboration in the service of citizens. The policy brief will launch at The 2021 UserCentriCities Summit, in the presence of Boštjan Koritnik, minister for public administration of Slovenia, which currently holds the Presidency of the Council of the European Union….(More)”.
Article by Talib Visram: “In October, New York City’s three public library systems announced they would permanently drop fines on late book returns. Comprising the Brooklyn, Queens, and New York public libraries, the City’s system is the largest in the country to remove fines. It’s a reversal of a long-held policy intended to ensure shelves stayed stacked, but an outdated one that many major cities, including Chicago, San Francisco, and Dallas, had already scrapped without any discernible downsides. Though a source of revenue—in 2013, for instance, Brooklyn Public Library (BPL) racked up $1.9 million in late fees—the fee system also created a barrier to library access that disproportionately touched the low-income communities that most need the resources.
That’s just one thing Brooklyn’s library system has done to try to make its services more equitable. In 2017, well before the move to eliminate fines, BPL on its own embarked on a partnership with Nudge, a behavioral science lab at the University of West Virginia, to find ways to reduce barriers to access and increase engagement with the book collections. In the first-of-its-kind collaboration, the two tested behavioral science interventions via three separate pilots, all of which led to the library’s long-term implementation of successful techniques. Those involved in the project say the steps can be translated to other library systems, though it takes serious investment of time and resources….(More)”.
Paper by Michiel Bijlsma, Carin van der Cruijsen and Nicole Jonker: “The COVID-19 pandemic has increased our online presence and unleashed a new discussion on sharing sensitive personal data. Upcoming European legislation will facilitate data sharing in several areas, following the lead of the revised payments directive (PSD2), which enables payments data sharing with third parties. However, little is known about what drives consumers’ preferences for sharing different types of data, as preferences may differ according to the type of data, type of usage or type of firm using the data.
Using a discrete-choice survey approach among a representative group of Dutch consumers, we find that, in addition to health data, people are hesitant to share their financial data on payments, wealth and pensions, compared to other types of consumer data. Second, consumers are especially cautious about sharing their data when the data are not used anonymously. Third, consumers are more hesitant to share their data with BigTechs, webshops and insurers than they are with banks. Fourth, a financial reward can trigger data sharing by consumers. Last, we show that attitudes towards data usage depend on personal characteristics, consumers’ digital skills, online behaviour and their trust in the firms using the data…(More)”