Research Handbook on Human Rights and Digital Technology


Book edited by Ben Wagner, Matthias C. Kettemann and Kilian Vieth: “In a digitally connected world, the question of how to respect, protect and implement human rights has become unavoidable. This contemporary Research Handbook offers new insights into well-established debates by framing them in terms of human rights. It examines the issues posed by the management of key Internet resources, the governance of its architecture, the role of different stakeholders, the legitimacy of rule making and rule-enforcement, and the exercise of international public authority over users. Highly interdisciplinary, its contributions draw on law, political science, international relations and even computer science and science and technology studies…(More)”.

Contracts for Data Collaboration


The GovLab: “The road to achieving the Sustainable Development Goals is complex and challenging. Policymakers around the world need both new solutions and new ways to become more innovative. This includes evidence-based policy and program design, as well as improved monitoring of progress made.

Unlocking privately processed data through data collaboratives — a new form of public-private partnership in which private industry, government and civil society work together to release previously siloed data — has become essential to address the challenges of our era.

Yet while research has proven its promise and value, several barriers to scaling data collaboration exist.

Ensuring trust and shared responsibility in how the data will be handled and used proves particularly challenging, because of the high transaction costs involved in drafting data-sharing contracts and agreements.

Ensuring Trust in Data Collaboration

The goal of the Contracts for Data Collaboration (C4DC) initiative is to address the inefficiencies of developing contractual agreements for public-private data collaboration.

The intent is to inform and guide those seeking to establish a data collaborative by developing and making available a shared repository of contractual clauses (taken from existing data sharing agreements) that covers a host of issues, including (non-exclusive):

  • The provenance, quality and purpose of data;
  • Security and privacy concerns;
  • Roles and responsibilities of participants;
  • Access provisions and use limitations;
  • Governance mechanisms;
  • Other contextual mechanisms

In addition to the searchable library of contractual clauses, the repository will house use cases, guides and other information that analyse common patterns, language and best practices.

Help Us Scale Data Collaboration

Contracts for Data Collaboration builds on efforts from member organizations that have experience in developing and managing data collaboratives and that have documented the legal challenges and opportunities of data collaboration.

The initiative is an open collaborative with charter members from the GovLab at NYU, UN SDSN Thematic Research Network on Data and Statistics (TReNDS), University of Washington and the World Economic Forum.

Organizations interested in joining the initiative should contact the individuals noted below or share any agreements they have used for data sharing activities (without any sensitive or identifiable information): Stefaan Verhulst, GovLab ([email protected]) …(More)

“Giving something back”: A systematic review and ethical enquiry into public views on the use of patient data for research in the United Kingdom and the Republic of Ireland


Paper by Jessica Stockdale, Jackie Cassell and Elizabeth Ford: “The use of patients’ medical data for secondary purposes such as health research, audit, and service planning is well established in the UK, and technological innovation in analytical methods for new discoveries using these data resources is developing quickly. Data scientists have developed, and are improving, many ways to extract and process information in medical records. This continues to lead to an exciting range of health-related discoveries, improving population health and saving lives. Nevertheless, as the development of analytic technologies accelerates, the decision-making and governance environment, as well as public views and understanding about this work, has been lagging behind [1].

Public opinion and data use

A range of small studies canvassing patient views, mainly in the USA, have found an overall positive orientation to the use of patient data for societal benefit [2–7]. However, recent case studies, like NHS England’s ill-fated Care.data scheme, indicate that certain schemes for secondary data use can prove unpopular in the UK. Launched in 2013, Care.data aimed to extract and upload the whole population’s general practice patient records to a central database for prevalence studies and service planning [8]. Despite the stated intention of Care.data to “make major advances in quality and patient safety” [8], this programme was met with a widely reported public outcry leading to its suspension and eventual closure in 2016. Several factors may have been involved in this failure, from poor public communication about the project and a lack of social licence [9] to, as pressure group MedConfidential suggests, dislike of selling data to profit-making companies [10]. However, beyond these specific explanations for the project’s failure, what ignited public controversy was a concern with the impact that its aim to collect and share data on a large scale might have on patient privacy. The case of Care.data indicates a reluctance on the part of the public to share their patient data, and it is still not wholly clear whether the public are willing to accept future attempts at extracting and linking large datasets of medical information. The picture of mixed opinion makes taking an evidence-based position, drawing on social consensus, difficult for legislators, regulators, and data custodians, who may respond to personal or media-generated perceptions of public views. However, despite the differing results of studies canvassing public views, we hypothesise that there may be underlying ethical principles that could be extracted from the literature on public views, which may provide guidance to policy-makers for future data-sharing….(More)”.

EU negotiators agree on new rules for sharing of public sector data


European Commission Press Release: “Negotiators from the European Parliament, the Council of the EU and the Commission have reached an agreement on a revised directive that will facilitate the availability and re-use of public sector data.

Data is the fuel that drives the growth of many digital products and services. Making sure that high-quality, high-value data from publicly funded services is widely and freely available is a key factor in accelerating European innovation in highly competitive fields such as artificial intelligence, which require access to vast amounts of high-quality data.

In full compliance with the EU General Data Protection Regulation, the new Directive on Open Data and Public Sector Information (PSI) – which can cover anything from anonymised personal data on household energy use to general information about national education or literacy levels – updates the framework setting out the conditions under which public sector data should be made available for re-use, with a particular focus on the increasing amounts of high-value data that are now available.

Vice-President for the Digital Single Market Andrus Ansip said: “Data is increasingly the lifeblood of today’s economy and unlocking the potential of public open data can bring significant economic benefits. The total direct economic value of public sector information and data from public undertakings is expected to increase from €52 billion in 2018 to €194 billion by 2030. With these new rules in place, we will ensure that we can make the most of this growth.”

Commissioner for Digital Economy and Society Mariya Gabriel said: “Public sector information has already been paid for by the taxpayer. Making it more open for re-use benefits the European data economy by enabling new innovative products and services, for example based on artificial intelligence technologies. But beyond the economy, open data from the public sector is also important for our democracy and society because it increases transparency and supports a facts-based public debate.”

As part of the EU Open Data policy, rules are in place to encourage Member States to facilitate the re-use of data from the public sector with minimal or no legal, technical and financial constraints. But the digital world has changed dramatically since they were first introduced in 2003.

What do the new rules cover?

  • All public sector content that can be accessed under national access to documents rules is in principle freely available for re-use. Public sector bodies will not be able to charge more than the marginal cost for the re-use of their data, except in very limited cases. This will allow more SMEs and start-ups to enter new markets in providing data-based products and services.
  • A particular focus will be placed on high-value datasets such as statistics or geospatial data. These datasets have a high commercial potential, and can speed up the emergence of a wide variety of value-added information products and services.
  • Public service companies in the transport and utilities sector generate valuable data. The decision on whether or not their data has to be made available is covered by different national or European rules, but when their data is available for re-use, they will now be covered by the Open Data and Public Sector Information Directive. This means they will have to comply with the principles of the Directive and ensure the use of appropriate data formats and dissemination methods, while still being able to set reasonable charges to recover related costs.
  • Some public bodies strike complex data deals with private companies, which can potentially lead to public sector information being ‘locked in’. Safeguards will therefore be put in place to reinforce transparency and to limit the conclusion of agreements which could lead to exclusive re-use of public sector data by private partners.
  • More real-time data, available via Application Programming Interfaces (APIs), will allow companies, especially start-ups, to develop innovative products and services, e.g. mobility apps. Publicly-funded research data is also being brought into the scope of the directive: Member States will be required to develop policies for open access to publicly funded research data while harmonised rules on re-use will be applied to all publicly-funded research data which is made accessible via repositories….(More)”.

Info We Trust: How to Inspire the World with Data


Book by R.J. Andrews: “How do we create new ways of looking at the world? Join award-winning data storyteller RJ Andrews as he pushes beyond the usual how-to, and takes you on an adventure into the rich art of informing.

Creating Info We Trust is a craft that puts the world into forms that are strong and true.  It begins with maps, diagrams, and charts — but must push further than dry defaults to be truly effective. How do we attract attention? How can we offer audiences valuable experiences worth their time? How can we help people access complexity?

Dark and mysterious, but full of potential, data is the raw material from which new understanding can emerge. Become a hero of the information age as you learn how to dip into the chaos of data and emerge with new understanding that can entertain, improve, and inspire. Whether you call the craft data storytelling, data visualization, data journalism, dashboard design, or infographic creation — what matters is that you are courageously confronting the chaos of it all in order to improve how people see the world. Info We Trust is written for everyone who straddles the domains of data and people: data visualization professionals, analysts, and all who are enthusiastic for seeing the world in new ways.

This book draws from the entirety of human experience, quantitative and poetic. It teaches advanced techniques, such as visual metaphor and data transformations, in order to create more human presentations of data. It also shows how we can learn from print advertising, engineering, museum curation, and mythology archetypes. This human-centered approach works with machines to design information for people. Advance your understanding by learning from a broad tradition of putting things “in formation” to create new and wonderful ways of opening our eyes to the world….(More)”.

Artificial Unintelligence: How Computers Misunderstand the World


Book by Meredith Broussard where she “…argues that our collective enthusiasm for applying computer technology to every aspect of life has resulted in a tremendous amount of poorly designed systems. We are so eager to do everything digitally—hiring, driving, paying bills, even choosing romantic partners—that we have stopped demanding that our technology actually work. Broussard, a software developer and journalist, reminds us that there are fundamental limits to what we can (and should) do with technology. With this book, she offers a guide to understanding the inner workings and outer limits of technology—and issues a warning that we should never assume that computers always get things right.

Making a case against technochauvinism—the belief that technology is always the solution—Broussard argues that it’s just not true that social problems would inevitably retreat before a digitally enabled Utopia. To prove her point, she undertakes a series of adventures in computer programming. She goes for an alarming ride in a driverless car, concluding “the cyborg future is not coming any time soon”; uses artificial intelligence to investigate why students can’t pass standardized tests; deploys machine learning to predict which passengers survived the Titanic disaster; and attempts to repair the U.S. campaign finance system by building AI software. If we understand the limits of what we can do with technology, Broussard tells us, we can make better choices about what we should do with it to make the world better for everyone….(More)”.

AI is sending people to jail—and getting it wrong


Karen Hao at MIT Technology Review: “Using historical data to train risk assessment tools could mean that machines are copying the mistakes of the past. …

AI might not seem to have a huge personal impact if your most frequent brush with machine-learning algorithms is through Facebook’s news feed or Google’s search rankings. But at the Data for Black Lives conference last weekend, technologists, legal experts, and community activists snapped things into perspective with a discussion of America’s criminal justice system. There, an algorithm can determine the trajectory of your life.

The US imprisons more people than any other country in the world. At the end of 2016, nearly 2.2 million adults were being held in prisons or jails, and an additional 4.5 million were in other correctional facilities. Put another way, 1 in 38 adult Americans was under some form of correctional supervision. The nightmarishness of this situation is one of the few issues that unite politicians on both sides of the aisle. Under immense pressure to reduce prison numbers without risking a rise in crime, courtrooms across the US have turned to automated tools in attempts to shuffle defendants through the legal system as efficiently and safely as possible. This is where the AI part of our story begins….(More)”.

Machine Learning and the Rule of Law


Paper by Daniel L. Chen: “Predictive judicial analytics holds the promise of increasing the fairness of law. Much empirical work observes inconsistencies in judicial behavior. By predicting judicial decisions—with more or less accuracy depending on judicial attributes or case characteristics—machine learning offers an approach to detecting when judges are most likely to allow extra-legal biases to influence their decision making. In particular, low predictive accuracy may identify cases of judicial “indifference,” where case characteristics (interacting with judicial attributes) do not strongly dispose a judge in favor of one or another outcome. In such cases, biases may hold greater sway, implicating the fairness of the legal system….(More)”
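To make the abstract's idea concrete, here is a minimal, hypothetical sketch (Python with scikit-learn, on synthetic data) of the general approach it describes: fit a classifier on case and judge features, then flag held-out cases whose predicted outcome probability sits near 0.5, i.e. cases the model cannot call either way. The features, model choice and 0.1-wide band are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch only: synthetic "cases" stand in for real case
# characteristics and judge attributes; the outcome is the observed ruling.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature matrix (columns might encode case and judge features).
X = rng.normal(size=(1000, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Predicted probability of the positive outcome for each held-out case.
p = model.predict_proba(X_test)[:, 1]

# Cases the model finds nearly undecidable from observed characteristics:
# candidates for closer scrutiny, since extra-legal factors may weigh more.
indifferent = np.abs(p - 0.5) < 0.1
print(f"{indifferent.mean():.1%} of held-out cases fall in the low-confidence band")
```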

Can I Trust the Data I See? A Physician’s Concern on Medical Data in IoT Health Architectures


Conference Paper by Fariha Tasmin Jaigirdar, Carsten Rudolph, and Chris Bain: “With the increasing advancement of Internet of Things (IoT) enabled systems, smart medical devices open numerous opportunities for the healthcare sector. The success of using such devices in the healthcare industry depends strongly on secure and reliable medical data transmission. Physicians make diagnoses from that data and prescribe medicines and/or give guidelines/instructions/treatment plans for the patients. Therefore, a physician is always concerned about the trustworthiness of the medical data, because if it is not guaranteed, a savior can become an involuntary foe! This paper analyses two different scenarios to understand the real-life consequences in IoT-based healthcare (IoT-Health) applications. Appropriate sequence diagrams for both scenarios show data movement as a basis for determining necessary security requirements in each layer of IoT-Health.

We analyse the individual entities of the overall system and develop a system-wide view of trust in IoT-Health. The security analysis pinpoints the research gap in end-to-end trust and indicates the necessity of treating the whole IoT-Health system as an integrated entity. This study highlights the importance of integrated cross-layer security solutions that can deal with the heterogeneous security architectures of IoT healthcare systems, and finally identifies a possible solution for the open question raised in the security analysis, with appropriate future research directions….(More)”.
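As one deliberately simplified illustration of the kind of end-to-end trust the paper is concerned with, the sketch below shows a device attaching an HMAC tag to each reading and the clinician-facing system verifying it before the data is used for diagnosis. The key handling, field names and overall design are assumptions for illustration only; the paper itself argues for integrated, cross-layer security rather than any single mechanism.

```python
# Minimal sketch (not the paper's architecture): per-reading integrity check
# between a hypothetical device and the system a physician reads from.
import hmac
import hashlib
import json

SHARED_KEY = b"device-provisioning-secret"  # hypothetical pre-shared key

def sign_reading(reading: dict) -> dict:
    """Device side: attach an HMAC tag to a sensor reading."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": reading, "tag": tag}

def verify_reading(message: dict) -> bool:
    """Receiving side: check the tag before the data reaches a dashboard."""
    payload = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_reading({"patient_id": "p-001", "spo2": 97, "ts": "2019-03-01T10:15:00Z"})
assert verify_reading(msg)       # untampered reading passes
msg["payload"]["spo2"] = 80      # tampering in transit...
assert not verify_reading(msg)   # ...is detected before diagnosis
```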

The Urban Commons: How Data and Technology Can Rebuild Our Communities


Book by Daniel T. O’Brien: “The future of smart cities has arrived, courtesy of citizens and their phones. To prove it, Daniel T. O’Brien explains the transformative insights gleaned from years researching Boston’s 311 reporting system, a sophisticated city management tool that has revolutionized how ordinary Bostonians use and maintain public spaces. Through its phone service, mobile app, website, and Twitter account, 311 catalogues complaints about potholes, broken street lights, graffiti, litter, vandalism, and other issues that are no one citizen’s responsibility but affect everyone’s quality of life. The Urban Commons offers a pioneering model of what modern digital data and technology can do for cities like Boston that seek both prosperous growth and sustainability.

Analyzing a rich trove of data, O’Brien discovers why certain neighborhoods embrace the idea of custodianship and willingly invest their time to monitor the city’s common environments and infrastructure. On the government’s side of the equation, he identifies best practices for implementing civic technologies that engage citizens, for deploying public services in collaborative ways, and for utilizing the data generated by these efforts.

Boston’s 311 system has narrowed the gap between residents and their communities, and between constituents and local leaders. The result, O’Brien shows, has been the creation of more effective policy and practices that reinvigorate the way citizens and city governments approach their mutual interests. By unpacking when, why, and how the 311 system has worked for Boston, The Urban Commons reveals the power and potential of this innovative system, and the lessons learned that other cities can adapt…(More)”.