Will Democracy Survive Big Data and Artificial Intelligence?


Dirk Helbing, Bruno S. Frey, Gerd Gigerenzer, Ernst Hafen, Michael Hagner, Yvonne Hofstetter, Jeroen van den Hoven, Roberto V. Zicari, and Andrej Zwitter in Scientific American: “….In summary, it can be said that we are now at a crossroads (see Fig. 2). Big data, artificial intelligence, cybernetics and behavioral economics are shaping our society—for better or worse. If such widespread technologies are not compatible with our society’s core values, sooner or later they will cause extensive damage. They could lead to an automated society with totalitarian features. In the worst case, a centralized artificial intelligence would control what we know, what we think and how we act. We are at a historic moment where we have to decide on the right path—a path that allows us all to benefit from the digital revolution. Therefore, we urge adherence to the following fundamental principles:

1. to increasingly decentralize the function of information systems;

2. to support informational self-determination and participation;

3. to improve transparency in order to achieve greater trust;

4. to reduce the distortion and pollution of information;

5. to enable user-controlled information filters;

6. to support social and economic diversity;

7. to improve interoperability and collaborative opportunities;

8. to create digital assistants and coordination tools;

9. to support collective intelligence, and

10. to promote responsible behavior of citizens in the digital world through digital literacy and enlightenment.

Following this digital agenda we would all benefit from the fruits of the digital revolution: the economy, government and citizens alike. What are we waiting for?

A strategy for the digital age

Big data and artificial intelligence are undoubtedly important innovations. They have an enormous potential to catalyze economic value and social progress, from personalized healthcare to sustainable cities. It is totally unacceptable, however, to use these technologies to incapacitate the citizen. Big nudging and citizen scores abuse centrally collected personal data for behavioral control in ways that are totalitarian in nature. This is not only incompatible with human rights and democratic principles, but also inappropriate to manage modern, innovative societies. In order to solve the genuine problems of the world, far better approaches in the fields of information and risk management are required. The research area of responsible innovation and the initiative “Data for Humanity” (see “Big Data for the benefit of society and humanity”) provide guidance as to how big data and artificial intelligence should be used for the benefit of society….(More)”

In Beta: Is policymaking stuck in the 19th century?


Global Partners Digital: “Today we’re launching a new series of podcasts – titled In beta – with the aim of critically examining the big questions facing human rights in the digital environment.

The series will be hosted by GPD’s executive director, Charles Bradley, who will interview a different guest – or guests – for each episode.

But before we go into details, a little more on the concept. We’ve created In beta because we felt that there weren’t enough forums for genuine debate and discussion within the digital rights community. We felt that we needed a space where we could host interesting conversations with interesting people in our field, outside of the conventions of traditional policy discourse, which can sometimes work to confine people in silos and discourage more open, experimental thinking.

The series is called In beta because these conversations will be speculative, not definitive. The questions we examine won’t be easy – or even possible – to answer. They may sometimes be provocative. They may themselves raise new questions, and perhaps lay the groundwork for future work.

In the first episode, we talk to the co-founder of GovLab, Stefaan Verhulst, asking – ‘Is policymaking stuck in the 19th century?’…(More)”

It takes more than social media to make a social movement


Hayley Tsukayama in the Washington Post: “President Trump may have used the power of social media to make his way into the White House, but now social media networks are showing that muscle can work for his opposition, too. Last week, more than 1 million marchers went to Washington and cities around the country — sparked by a Facebook post from one woman with no history of activism. This weekend, the Internet exploded again in discussion about Trump’s travel suspension order, and many used social media to get together and protest the decision.

Twitter said that more than 25 million tweets were sent about the order — as compared with 12 million about Trump’s inauguration. Facebook said that its users generated 151 million “likes, posts, comments and shares” related to the ban, less than the 208 million interactions generated about the inauguration. The companies didn’t reveal how many of those were aimed at organizing, but the social media calls to get people to protest are a testament to the power of these platforms to move people.

The real question, however, is whether this burgeoning new movement can avoid the fate of so many others kick-started by the power of social networks — only to find that it’s much harder to make political change than to make a popular hashtag….

Zeynep Tufekci, an associate professor at the University of North Carolina at Chapel Hill who has written a forthcoming book on the power and fragility of movements borne of social media, found in her research that the very ability for these movements to scale quickly is, in part, why they also can fall apart so quickly compared with traditional grass-roots campaigns….

Now, organizers can bypass the time it takes to build up the infrastructure for a massive march and all the publicity that comes with it. But that also means their high-profile movements skip some crucial organizing steps.

“Digitally networked movements look like the old movements. But by the time the civil rights movement had such a large march, they’d been working on [the issues] for 10 years — if not more,” Tufekci said. The months or even years spent discussing logistics, leafleting and building a coalition, she said, were crucial to the success of the civil rights movement. Other successful efforts, such as the Human Rights Campaign’s push to end the “don’t ask, don’t tell” policy against allowing gay people to serve openly in the military, were also rooted in organizational structures that had been developing and refining their demands for years to present a unified front. Movements organized over social networks often have more trouble jelling, she said, particularly if different factions air their differences on Facebook and Twitter, drawing attention to fractures in a movement….(More).”

The Signal Code


The Signal Code: “Humanitarian action adheres to the core humanitarian principles of impartiality, neutrality, independence, and humanity, as well as respect for international humanitarian and human rights law. These foundational principles are enshrined within core humanitarian doctrine, particularly the Red Cross/NGO Code of Conduct and the Humanitarian Charter. Together, these principles establish a duty of care for populations affected by the actions of humanitarian actors and impose adherence to a standard of reasonable care for those engaged in humanitarian action.

Engagement in HIAs, including the use of data and ICTs, must be consistent with these foundational principles and respect the human rights of crisis-affected people to be considered “humanitarian.” In addition to offering potential benefits to those affected by crisis, HIAs, including the use of ICTs, can cause harm to the safety, wellbeing, and the realization of the human rights of crisis-affected people. Absent a clear understanding of which rights apply to this context, the utilization of new technologies, and in particular experimental applications of these technologies, may be more likely to harm communities and violate the fundamental human rights of individuals.

The Signal Code is based on the application of the UDHR, the Nuremberg Code, the Geneva Convention, and other instruments of customary international law related to HIAs and the use of ICTs by crisis-affected populations and by humanitarians on their behalf. The fundamental human rights undergirding this Code are the rights to life, liberty, and security; the protection of privacy; freedom of expression; and the right to share in scientific advancement and its benefits as expressed in Articles 3, 12, 19, and 27 of the UDHR.

The Signal Code asserts that all people have fundamental rights to access, transmit, and benefit from information as a basic humanitarian need; to be protected from harms that may result from the provision of information during crisis; to have a reasonable expectation of privacy and data security; to have agency over how their data is collected and used; and to seek redress and rectification when data pertaining to them causes harm or is inaccurate.

These rights are found to apply specifically to the access, collection, generation, processing, use, treatment, and transmission of information, including data, during humanitarian crises. These rights are also found herein to be interrelated and interdependent. To realize any of these rights individually requires realization of all of these rights in concert.

These rights are found to apply to all phases of the data lifecycle—before, during, and after the collection, processing, transmission, storage, or release of data. These rights are also found to be elastic, meaning that they apply to new technologies and scenarios that have not yet been identified or encountered by current practice and theory.

Data is, formally, a collection of symbols which function as a representation of information or knowledge. The term raw data is often used with two different meanings: the first is uncleaned data, that is, data that has been collected in an uncontrolled environment; the second is unprocessed data, that is, collected data that has not been processed in such a way as to make it suitable for decision making. Colloquially, and in the humanitarian context, data is usually thought of solely in the machine-readable or digital sense. For the purposes of the Signal Code, we use the term data to encompass information both in its analog and digital representations. Where it is necessary to address data solely in its digital representation, we refer to it as digital data.

No right herein may be used to abridge any other right. Nothing in this code may be interpreted as giving any state, group, or person the right to engage in any activity or perform any act that destroys the rights described herein.

The five human rights that exist specific to information and HIAs during humanitarian crises are the following:

The Right to Information
The Right to Protection
The Right to Data Security and Privacy
The Right to Data Agency
The Right to Redress and Rectification…(More)”

Technology tools in human rights


Engine Room: “Over the past few years, we have been witnessing a wave of new technology tools for human rights documentation. With the arrival of these tools, human rights defenders (HRDs) are facing new possibilities, new challenges, and new expectations of human rights documentation initiatives.

Produced with support from the Oak Foundation, this report is a first attempt to detail available technologies built for human rights documentation, understand the various perspectives on the challenges human rights documentation initiatives face when adopting new tools and practices, and analyse what is working and what is not for initiatives seeking to integrate new tools in their work….

Primary takeaways:

  • Traditional methods still apply: The environment in which HRDs work has not fundamentally changed because of technology and data.
  • Unreliability and unknown risks are huge barriers to engagement with technology: In the high-pressure situations HRDs face, the methodologies used need to be concrete and reliable.
  • Priorities of HRDs centre around their particular issue: Digital technologies often come as an afterthought, rather than integrated into established strategies for communication or campaigning.
  • The lifespan of technology tools is a big barrier to long-term use: Tools that are not sustained and maintained discourage engagement and can cause fatigue among users who have to change their practices often.
  • Past failed attempts at using tools makes future attempts more difficult: After having invested time and energy into changing a workflow or process only for it not to work, people are often reluctant to do the same again.
  • HRDs understand their context best: Tool recommendations coming from external parties sometimes do more harm than good.
  • There is a lack of technical capacity within HRD initiatives: As a result, when tools are introduced, groups become reliant on external parties for technical troubleshooting and support.

(Download the report)

 

21st Century Enlightenment Revisited


Matthew Taylor at the RSA: “The French historian Tzvetan Todorov describes the three essential ideas of the Enlightenment as ‘autonomy’, ‘universalism’ and ‘humanism’. The ideal of autonomy speaks to every individual’s right to self-determination. Universalism asserts that all human beings equally deserve basic rights and dignity (although, of course, in the 18th and 19th centuries most thinkers restricted this ambition to educated white men). The idea of humanism is that it is up to the people – not Gods or monarchs – through the use of rational inquiry to determine the path to greater human fulfilment….

21st Century Enlightenment 

Take autonomy; too often today we think of freedom either as a shrill demand to be able to turn our backs on wider society or in the narrow possessive terms of consumerism. Yet brain and behavioural science have confirmed the intuition of philosophers through the ages that genuine autonomy is something we only attain when we become aware of our human frailties and understand our truly social nature. Of course, freedom from oppression is the baseline, but true autonomy is not a right to be granted but a goal to be pursued through self-awareness and engagement in society.

What of universalism, or social justice as we now tend to think of it? In most parts of the world and certainly in the West there have been incredible advances in equal rights. Discrimination and injustice still exist, but through struggle and reform huge strides have been made in widening the Enlightenment brotherhood of rich white men to women, people of different ethnicity, homosexuals and people with disabilities. Indeed the progress in legal equality over recent decades stands in contrast to the stubborn persistence, and even worsening, of social inequality, particularly based on class.

But the rationalist universalism of human rights needs an emotional corollary. People may be careful not to use the wrong words, but they still harbour resentment and suspicion towards other groups. …

Finally, humanism, or the call of progress. The utilitarian philosophy that arose from the Enlightenment spoke to the idea that, free from religious or autocratic dogma, the best routes to human fulfilment could be identified and should be pursued. The great motors of human progress – markets, science and technology, the modern state – shifted into gear and started to accelerate. Aspects of all these phenomena, indeed of Enlightenment ideas themselves, could be found at earlier stages of human history – what was different was the way they fed off each other and became dominant. Yet, in the process, the idea that these forces could deliver progress often became elided with the assumption that their development was the same as human progress.

Today this danger of letting the engines of progress determine the direction of the human journey feels particularly acute in relation to markets and technology. There is, for example, more discussion of how humans should best adapt to AI and robots than about how technological inquiry might be aligned with human fulfilment. The hollowing out of democratic institutions has diminished the space for public debate about what progress should comprise at just the time when the pace and scale of change makes those debates particularly vital.

A twenty-first-century enlightenment reinstates true autonomy over narrow ideas of freedom; it asserts a universalism based not just on legal status but on empathy and social connection, and reminds us that humanism should lie at the heart of progress.

Think like a system, act like an entrepreneur

There is one new strand I want to add to the 2010 account. In the face of many defeats, we must care as much about how we achieve change as about the goals we pursue. At the RSA we talk about ‘thinking like a system and acting like an entrepreneur’, a method which seeks to avoid the narrowness and path dependency of so many unsuccessful models of change. To alter the course our society is now on, we need to understand more fully the high barriers to change, but then to act more creatively and adaptively when we spot opportunities to take a different path….(More)”

New Data Portal to analyze governance in Africa


New UN resolution on the right to privacy in the digital age: crucial and timely


Deborah Brown at the Internet Policy Review: “The rapid pace of technological development enables individuals all over the world to use new information and communications technologies (ICTs) to improve their lives. At the same time, technology is enhancing the capacity of governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy. In this context, the adoption on 21 November by the UN General Assembly’s Third Committee of a new resolution on the right to privacy in the digital age is both timely and crucial for protecting the right to privacy in light of new challenges.

As with previous UN resolutions on this topic, the resolution adopted on 21 November 2016 recognises the importance of respecting international commitments in relation to the right to privacy. It underscores that any legitimate concerns states may have with regard to their security can and should be addressed in a manner consistent with obligations under international human rights law.

Recognising that more and more personal data is being collected, processed, and shared, this year’s resolution expresses concern about the sale or multiple re-sales of personal data, which often happens without the individual’s free, explicit and informed consent. It calls for the strengthening of prevention of and protection against such violations, and calls on states to develop preventative measures, sanctions, and remedies.

This year, the resolution more explicitly acknowledges the role of the private sector. It calls on states to put in place (or maintain) effective sanctions and remedies to prevent the private sector from committing violations and abuses of the right to privacy. This is in line with states’ obligations under the UN Guiding Principles on Business and Human Rights, which require states to protect against abuses by businesses within their territories or jurisdictions. The resolution specifically calls on states to refrain from requiring companies to take steps that interfere with the right to privacy in an arbitrary or unlawful way. With respect to companies, it recalls the responsibility of the private sector to respect human rights, and specifically calls on them to inform users about company policies that may impact their right to privacy….(More)”

The ethical impact of data science


Theme issue of Phil. Trans. R. Soc. A compiled and edited by Mariarosaria Taddeo and Luciano Floridi: “This theme issue has the founding ambition of landscaping data ethics as a new branch of ethics that studies and evaluates moral problems related to data (including generation, recording, curation, processing, dissemination, sharing and use), algorithms (including artificial intelligence, artificial agents, machine learning and robots) and corresponding practices (including responsible innovation, programming, hacking and professional codes), in order to formulate and support morally good solutions (e.g. right conducts or right values). Data ethics builds on the foundation provided by computer and information ethics but, at the same time, it refines the approach endorsed so far in this research field, by shifting the level of abstraction of ethical enquiries, from being information-centric to being data-centric. This shift brings into focus the different moral dimensions of all kinds of data, even data that never translate directly into information but can be used to support actions or generate behaviours, for example. It highlights the need for ethical analyses to concentrate on the content and nature of computational operations—the interactions among hardware, software and data—rather than on the variety of digital technologies that enable them. And it emphasizes the complexity of the ethical challenges posed by data science. Because of such complexity, data ethics should be developed from the start as a macroethics, that is, as an overall framework that avoids narrow, ad hoc approaches and addresses the ethical impact and implications of data science and its applications within a consistent, holistic and inclusive framework. Only as a macroethics will data ethics provide solutions that can maximize the value of data science for our societies, for all of us and for our environments….(More)”

Table of Contents:

  • The dynamics of big data and human rights: the case of scientific research; Effy Vayena, John Tasioulas
  • Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue; Sebastian Porsdam Mann, Julian Savulescu, Barbara J. Sahakian
  • Faultless responsibility: on the nature and allocation of moral responsibility for distributed moral actions; Luciano Floridi
  • Compelling truth: legal protection of the infosphere against big data spills; Burkhard Schafer
  • Locating ethics in data science: responsibility and accountability in global and distributed knowledge production systems; Sabina Leonelli
  • Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy; Deirdre K. Mulligan, Colin Koopman, Nick Doty
  • Beyond privacy and exposure: ethical issues within citizen-facing analytics; Peter Grindrod
  • The ethics of smart cities and urban science; Rob Kitchin
  • The ethics of big data as a public good: which public? Whose good?; Linnet Taylor
  • Data philanthropy and the design of the infraethics for information societies; Mariarosaria Taddeo
  • The opportunities and ethics of big data: practical priorities for a national Council of Data Ethics; Olivia Varley-Winter, Hetan Shah
  • Data science ethics in government; Cat Drew
  • The ethics of data and of data science: an economist’s perspective; Jonathan Cave
  • What’s the good of a science platform?; John Gallacher

 

Data Ethics – The New Competitive Advantage


Book by Gry Hasselbalch and Pernille Tranberg: “…describes over 50 cases of mainly private companies working with data ethics to varying degrees

Respect for privacy and the right to control one’s own data are becoming key parameters to gain a competitive edge in today’s business world. Companies, organisations and authorities which view data ethics as a social responsibility, giving it the same importance as environmental awareness and respect for human rights, are tomorrow’s winners. Digital trust is paramount to digital growth and prosperity.
This book combines broad trend analyses with case studies to examine companies which use data ethics to varying degrees. The authors make the case that citizens and consumers are no longer just concerned about a lack of control over their data, but have also begun to act. In addition, they describe alternative business models, advances in technology and a new European data protection regulation, all of which combine to foster a growing market for data-ethical products and services….(More)”