Book edited by Stan McClellan: “This book explores categories of applications and driving factors surrounding the Smart City phenomenon. The contributing authors provide perspective on Smart Cities, covering numerous applications and classes of applications. The book uses a top-down exploration of the driving factors in Smart Cities, with focal areas including “Smart Healthcare,” “Public Safety & Policy Issues,” and “Science, Technology, & Innovation.” Contributors have direct and substantive experience with important aspects of Smart Cities and discuss issues with technologies & standards, roadblocks to implementation, innovations that create new opportunities, and other factors relevant to emerging Smart City infrastructures….(More)”.
Review into bias in algorithmic decision-making
Interim Report by the Centre for Data Ethics and Innovation (UK): The use of algorithms has the potential to improve the quality of decision-making by increasing the speed and accuracy with which decisions are made. If designed well, they can reduce human bias in decision-making processes. However, as the volume and variety of data used to inform decisions increases, and the algorithms used to interpret the data become more complex, concerns are growing that without proper oversight, algorithms risk entrenching and potentially worsening bias.
The way in which decisions are made, the potential biases to which they are subject, and the impact these decisions have on individuals are all highly context-dependent. Our Review focuses on exploring bias in four key sectors: policing, financial services, recruitment and local government. These have been selected because they all involve significant decisions being made about individuals, because there is evidence of growing uptake of machine learning algorithms in these sectors, and because there is evidence of historic bias in decision-making within them. This Review seeks to answer three sets of questions:
- Data: Do organisations and regulators have access to the data they require to adequately identify and mitigate bias?
- Tools and techniques: What statistical and technical solutions are available now, or will be required in future, to identify and mitigate bias, and which represent best practice?
- Governance: Who should be responsible for governing, auditing and assuring these algorithmic decision-making systems?
Our work to date has led to some emerging insights that respond to these three sets of questions and will guide our subsequent work….(More)”.
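As a purely illustrative aside on the “tools and techniques” question above, the sketch below computes one widely used bias-identification measure, the disparate impact ratio, which compares decision rates across groups defined by a protected attribute. The data, group labels and function names are invented for the example and are not drawn from the Review.

```python
# Minimal sketch: one common way to *identify* bias is to compare
# positive-decision rates across groups. All data below are made up.
from collections import defaultdict

def selection_rates(records):
    """Return the positive-decision rate for each group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, decision in records:
        totals[group] += 1
        positives[group] += int(decision)
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates, reference_group):
    """Ratio of each group's rate to the reference group's rate (1.0 = parity)."""
    ref = rates[reference_group]
    return {g: rate / ref for g, rate in rates.items()}

# Hypothetical loan decisions: (group label, approved?)
decisions = ([("A", True)] * 80 + [("A", False)] * 20 +
             [("B", True)] * 60 + [("B", False)] * 40)

rates = selection_rates(decisions)
print(rates)                               # {'A': 0.8, 'B': 0.6}
print(disparate_impact_ratio(rates, "A"))  # {'A': 1.0, 'B': 0.75}
```

Measures of this kind only flag disparities; deciding whether a disparity constitutes unfair bias, and how to mitigate it, remains a contextual judgement of the sort the Review examines.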
Studying Crime and Place with the Crime Open Database
M. P. J. Ashby in Research Data Journal for the Humanities and Social Sciences: “The study of spatial and temporal crime patterns is important both for academic understanding of crime-generating processes and for policies aimed at reducing crime. However, studying crime and place is often made more difficult by restrictions on access to appropriate crime data. This means that understanding of many spatio-temporal crime patterns is limited to data from a single geographic setting, and there are few attempts at replication. This article introduces the Crime Open Database (CODE), a database of 16 million offenses from 10 of the largest United States cities over 11 years, covering more than 60 offense types. Open crime data were obtained from each city, having been published in multiple incompatible formats. The data were processed to harmonize geographic co-ordinates, dates and times, offense categories and location types, as well as adding census and other geographic identifiers. The resulting database allows the wider study of spatio-temporal patterns of crime across multiple US cities, allowing greater understanding of variations in the relationships between crime and place across different settings, as well as facilitating replication of research….(More)”.
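To illustrate the harmonisation step the article describes, the sketch below maps two hypothetical city exports, with incompatible column names and date formats, onto a single common schema. The schemas and field names are invented for the example and are not CODE's actual formats.

```python
# Illustrative only: harmonising incompatible city crime exports into one layout.
# Column names, date formats and the common schema are hypothetical.
import csv
from datetime import datetime

CITY_SCHEMAS = {
    "city_a": {"date": ("OCCURRED_ON", "%m/%d/%Y %H:%M"),
               "lat": "LAT", "lon": "LON", "offense": "CRIME_TYPE"},
    "city_b": {"date": ("incident_datetime", "%Y-%m-%dT%H:%M:%S"),
               "lat": "latitude", "lon": "longitude", "offense": "offence_category"},
}

def harmonise(city, rows):
    """Convert one city's raw rows into the common record layout."""
    schema = CITY_SCHEMAS[city]
    date_col, date_fmt = schema["date"]
    for row in rows:
        yield {
            "city": city,
            "datetime": datetime.strptime(row[date_col], date_fmt).isoformat(),
            "latitude": float(row[schema["lat"]]),
            "longitude": float(row[schema["lon"]]),
            "offense_type": row[schema["offense"]].strip().lower(),
        }

def load_city_csv(city, path):
    """Stream harmonised records from a city's published CSV file."""
    with open(path, newline="") as f:
        yield from harmonise(city, csv.DictReader(f))
```

The real database additionally reconciles offense categories and location types across cities and attaches census identifiers, which is where most of the harmonisation effort lies.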
Governing Smart Data in the Public Interest: Lessons from Ontario’s Smart Metering Entity
Paper by Teresa Scassa and Merlynda Vilain: “The collection of vast quantities of personal data from embedded sensors is increasingly an aspect of urban life. This type of data collection is a feature of so-called smart cities, and it raises important questions about data governance. This is particularly the case where the data may be made available for reuse by others and for a variety of purposes.
This paper focuses on the governance of data captured through “smart” technologies and uses Ontario’s smart metering program as a case study. Ontario rolled out mandatory smart metering for electrical consumption in the early 2000s largely to meet energy conservation goals. In doing so, it designed a centralized data governance system overseen by the Smart Metering Entity to manage smart meter data and to protect consumer privacy. As interest in access to the data grew among third parties, and as new potential applications for the data emerged, the regulator sought to develop a model for data sharing that would protect privacy in relation to these new uses and that would avoid uses that might harm the public interest…(More)”.
Stop Surveillance Humanitarianism
Mark Latonero at The New York Times: “A standoff between the United Nations World Food Program and Houthi rebels in control of the capital region is threatening the lives of hundreds of thousands of civilians in Yemen.
Alarmed by reports that food is being diverted to support the rebels, the aid program is demanding that Houthi officials allow it to deploy biometric technologies like iris scans and digital fingerprints to monitor suspected fraud during food distribution.
The Houthis have reportedly blocked food delivery, painting the biometric effort as an intelligence operation, and have demanded access to the personal data of aid beneficiaries. The impasse led the aid organization to decide last month to suspend food aid to parts of the starving population, a step once thought of as a last resort, unless the Houthis allow biometrics.
With program officials saying their staff is prevented from doing its essential jobs, turning to a technological solution is tempting. But biometrics deployed in crises can lead to a form of surveillance humanitarianism that can exacerbate risks to privacy and security.
By surveillance humanitarianism, I mean the enormous data collection systems deployed by aid organizations that inadvertently increase the vulnerability of people in urgent need….(More)”.
The Governance Turn in Information Privacy Law
Paper by Jane K. Winn: “The governance turn in information privacy law is a turn away from a model of bureaucratic administration of individual control rights and toward a model of collaborative governance of shared interests in information. Collaborative information governance has roots in the American pragmatic philosophy of Peirce, James and Dewey and the 1973 HEW Report that rejected unilateral individual control rights, recognizing instead the essential characteristic of mutuality of shared purposes that are mediated through information governance. America’s current information privacy law regime consists of market mechanisms supplemented by sector-specific, risk-based laws designed to foster a culture of compliance. Prior to the GDPR, data protection law compliance in Europe was more honored in the breach than the observance, so the EU’s strengthening of its bureaucratic individual control rights model reveals more about the EU’s democratic deficit than a commitment to compliance.
The conventional “Europe good, America bad” wisdom about information privacy law obscures a paradox: if the focus shifts from what “law in the books” says to what “law in action” does, it quickly becomes apparent that American businesses lead the world with their efforts to comply with information privacy law, so “America good, Europe bad” might be more accurate. Creating a federal legislative interface through which regulators and voluntary, consensus standards organizations can collaborate could break the current political stalemate triggered by California’s 2018 EU-style information privacy law. Such a pragmatic approach to information governance can safeguard Americans’ continued access to the benefits of innovation and economic growth as well as providing risk-based protection from harm. America can preserve its leadership of the global information economy by rejecting EU-style information privacy laws and building instead a flexible, dynamic framework of information governance capable of addressing both privacy and disclosure issues simultaneously….(More)”.
Betting on biometrics to boost child vaccination rates
Ben Parker at The New Humanitarian: “Thousands of children between the ages of one and five are due to be fingerprinted in Bangladesh and Tanzania in the largest biometric scheme of its kind ever attempted, the Geneva-based vaccine agency, Gavi, announced recently.
Although the scheme includes data protection safeguards – and its sponsors are cautious not to promise immediate benefits – it is emerging during a widening debate on data protection, technology ethics, and the risks and benefits of biometric ID in development and humanitarian aid.
Gavi, a global vaccine provider, is teaming up with Japanese and British partners in the venture. It is the first time such a trial has been done on this scale, according to Gavi spokesperson James Fulker.
Being able to track a child’s attendance at vaccination centres, and replace “very unreliable” paper-based records, can help target the 20 million children who are estimated to miss key vaccinations, most in poor or remote communities, Fulker said.
Up to 20,000 children will have their fingerprints taken and linked to their records in existing health projects. That collection effort will be managed by Simprints, a UK-based not-for-profit enterprise specialising in biometric technology in international development, according to Christine Kim, the company’s head of strategic partnerships….
Ethics and legal safeguards
Kim said Simprints would apply data protection standards equivalent to the EU’s General Data Protection Regulation (GDPR), even if national legislation did not demand it. Families could opt out without any penalties, and informed consent would apply to any data gathering. She added that the fieldwork would be approved by national governments, and oversight would also come from institutional review boards at universities in the two countries.
Fulker said Gavi had also commissioned a third-party review to verify Simprints’ data protection and security methods.
For critics of biometrics use in humanitarian settings, however, any such plan raises red flags….
Data protection analysts have long argued that gathering digital ID and biometric data carries particular risks for vulnerable groups who face conflict or oppression: their data could be shared or leaked to hostile parties who could use it to target them.
In a recent commentary on biometrics and aid, Linda Raftree told The New Humanitarian that “the greatest burden and risk lies with the most vulnerable, whereas the benefits accrue to [aid] agencies.”
And during a panel discussion on “Digital Do No Harm” held last year in Berlin, humanitarian professionals and data experts discussed a range of threats and unintended consequences of new technologies, noting that they are as yet hard to predict….(More)”.
Blockchain and Public Record Keeping: Of Temples, Prisons, and the (Re)Configuration of Power
Paper by Victoria L. Lemieux: “This paper discusses blockchain technology as a public record-keeping system, linking record keeping to the power of authority, veneration (temples), and control (prisons) that configures and reconfigures social, economic, and political relations. It discusses blockchain technology as being constructed as a mechanism to counter institutions and social actors that currently hold power, but who are nowadays often viewed with mistrust. It explores claims for blockchain as a record-keeping force of resistance to those powers, using an archival theoretic analytic lens. The paper evaluates claims that blockchain technology can support the creation and preservation of trustworthy records able to serve as alternative sources of evidence of rights, entitlements and actions, with the potential to unseat the institutional power of the nation-state….(More)”.
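For readers unfamiliar with the mechanism behind the “trustworthy records” claim the paper evaluates, the minimal sketch below shows how a hash chain makes tampering with earlier records detectable. It is a simplification, omitting consensus, digital signatures and distribution, and it is not drawn from the paper itself.

```python
# Minimal sketch of tamper-evident record keeping: each record stores a hash of
# the previous one, so altering any earlier entry breaks every later link.
import hashlib
import json

def make_record(prev_hash, payload):
    """Create a record whose hash covers both its payload and its back-link."""
    body = {"prev_hash": prev_hash, "payload": payload}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify_chain(chain):
    """Recompute each record's hash and check every back-link."""
    for i, rec in enumerate(chain):
        body = {"prev_hash": rec["prev_hash"], "payload": rec["payload"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != expected:
            return False
        if i > 0 and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_record("0" * 64, {"title": "deed", "owner": "A"})]
chain.append(make_record(chain[-1]["hash"], {"title": "deed", "owner": "B"}))
print(verify_chain(chain))          # True
chain[0]["payload"]["owner"] = "C"
print(verify_chain(chain))          # False: tampering is detectable
```

Tamper evidence of this kind is only one part of the paper's argument; whether such records actually displace existing institutional sources of trust is the question it interrogates.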
Secrecy, Privacy and Accountability: Challenges for Social Research
Book by Mike Sheaff: “Public mistrust of those in authority and failings of public organisations frame disputes over the attribution of responsibility between individuals and systems. Illustrated with examples, including the Aberfan disaster, the death of Baby P, and Mid Staffs Hospital, this book explores parallel conflicts over access to information and privacy.
The Freedom of Information Act (FOIA) allows access to information about public organisations, but can conflict with the Data Protection Act, which protects personal information. Exploring the use of the FOIA as a research tool, Sheaff offers a unique contribution to the development of sociological research methods, and to debates connected to privacy and secrecy in the information age. This book will provide sociologists and social scientists with a fresh perspective on contemporary issues of power and control….(More)”.
How can Indigenous Data Sovereignty (IDS) be promoted and mainstreamed within open data movements?
OD Mekong Blog: “Considering Indigenous rights in the open data and technology space is a relatively new concept. Called “Indigenous Data Sovereignty” (IDS), it is defined as “the right of Indigenous peoples to govern the collection, ownership, and application of data about Indigenous communities, peoples, lands, and resources”, regardless of where the data is held or by whom. By default, this broad and all-encompassing framework bucks fundamental concepts of open data, and asks traditional open data practitioners to critically consider how open data can be used as a tool of transparency that also upholds equal rights for all…
Four main areas of concern and relevant barriers identified by participants were:
Self-determination to identify their membership
- National governments in many states, particularly across Asia and South America, still do not allow for self-determination under the law. Even where legislation offers some recognition, it is scarcely enforced, and mainstream discourse demonises Indigenous self-determination.
- However, because Indigenous and ethnic minorities face hardship and persecution on a daily basis, there were concerns about the applicability of data sovereignty at the local level.
Intellectual Property Protocols
- It has become the norm for big tech companies to extract excessive amounts of data from people’s everyday lives. How do disenfranchised communities combat this?
- Indigenous data is often misappropriated to the detriment of Indigenous peoples.
- Intellectual property concepts, such as copyright, are not an ideal approach for protecting Indigenous knowledge and intellectual property rights, because they are rooted in commercial ideals that are difficult to apply to Indigenous contexts, especially as many groups do not practice commercialization in the globalized context. Also, as a concept based on exclusivity (i.e., when licenses expire, knowledge passes into the public domain), copyright does not take into account the collectivist ideals of Indigenous peoples.
Data Governance
- Ultimately, data protection is about protecting lives. Having the ability to use data to direct decisions on Indigenous development places greater control in the hands of Indigenous peoples.
- National governments are barriers due to conflicts in sovereignty interests. Nation-state legal systems are often contradictory to customary laws, and thus don’t often reflect rights-based approaches.
Consent — Free Prior and Informed Consent (FPIC)
- FPIC, a well-known set of principles defining the processes and mechanisms that apply specifically to Indigenous peoples in relation to the exercise of their collective rights, is intended to ensure that Indigenous peoples are treated as sovereign peoples with their own decision-making power, customary governance systems, and collective decision-making processes. It is questionable, however, to what extent true FPIC can be ensured in the Indigenous context.²
- It remains a question as to how effectively due diligence can be applied to research protocols, so as to ensure that the rights associated with FPIC and the UNDRIP framework are upheld….(More)”.