New UN resolution on the right to privacy in the digital age: crucial and timely


Deborah Brown at the Internet Policy Review: “The rapid pace of technological development enables individuals all over the world to use new information and communications technologies (ICTs) to improve their lives. At the same time, technology is enhancing the capacity of governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy. In this context, the adoption on 21 November by the UN General Assembly’s Third Committee of a new resolution on the right to privacy in the digital age is both timely and crucial for protecting the right to privacy in light of new challenges.

As with previous UN resolutions on this topic, the resolution adopted on 21 November 2016 recognises the importance of respecting international commitments in relation to the right to privacy. It underscores that any legitimate concerns states may have with regard to their security can and should be addressed in a manner consistent with obligations under international human rights law.

Recognising that more and more personal data is being collected, processed, and shared, this year’s resolution expresses concern about the sale or multiple re-sales of personal data, which often happens without the individual’s free, explicit and informed consent. It calls for the strengthening of prevention of and protection against such violations, and calls on states to develop preventative measures, sanctions, and remedies.

This year, the resolution more explicitly acknowledges the role of the private sector. It calls on states to put in place (or maintain) effective sanctions and remedies to prevent the private sector from committing violations and abuses of the right to privacy. This is in line with states’ obligations under the UN Guiding Principles on Business and Human Rights, which require states to protect against abuses by businesses within their territories or jurisdictions. The resolution specifically calls on states to refrain from requiring companies to take steps that interfere with the right to privacy in an arbitrary or unlawful way. With respect to companies, it recalls the responsibility of the private sector to respect human rights, and specifically calls on them to inform users about company policies that may impact their right to privacy….(More)”

New Institute Pushes the Boundaries of Big Data


Press Release: “Each year thousands of genomes are sequenced, millions of neuronal activity traces are recorded, and light from hundreds of millions of galaxies is captured by our newest telescopes, all creating datasets of staggering size. These complex datasets are then stored for analysis.

Ongoing analysis of these information streams has illuminated a problem, however: Scientists’ standard methodologies are inadequate to the task of analyzing massive quantities of data. The development of new methods and software to learn from data and to model — at sufficient resolution — the complex processes they reflect is now a pressing concern in the scientific community.

To address these challenges, the Simons Foundation has launched a substantial new internal research group called the Flatiron Institute (FI). The FI is the first multidisciplinary institute focused entirely on computation. It is also the first center of its kind to be wholly supported by private philanthropy, providing a permanent home for up to 250 scientists and collaborating expert programmers all working together to create, deploy and support new state-of-the-art computational methods. Few existing institutions support the combination of scientists and programmers, instead leaving programming to relatively impermanent graduate students and postdoctoral fellows, and none have done so at the scale of the Flatiron Institute or with such a broad scope, at a single location.

The institute will hold conferences and meetings and serve as a focal point for computational science around the world….(More)”.

Digital Kenya: An Entrepreneurial Revolution in the Making


(Open Access) book edited by Bitange Ndemo and Tim Weiss: “Presenting rigorous and original research, this volume offers key insights into the historical, cultural, social, economic and political forces at play in the creation of world-class ICT innovations in Kenya. Following the arrival of fiber-optic cables in 2009, Digital Kenya examines why the initial entrepreneurial spirit and digital revolution has begun to falter despite support from motivated entrepreneurs, international investors, policy experts and others. Written by engaged scholars and professionals in the field, the book offers 15 eye-opening chapters and 14 one-on-one conversations with entrepreneurs and investors to ask why establishing ICT start-ups on a continental and global scale remains a challenge on the “Silicon Savannah”. The authors present evidence-based recommendations to help Kenya to continue producing globally impactful ICT innovations that improve the lives of those still waiting on the side-lines, and to inspire other nations to do the same….(More)”

Talent Gap Is a Main Roadblock as Agencies Eye Emerging Tech


Theo Douglas in GovTech: “U.S. public service agencies are closely eyeing emerging technologies, chiefly advanced analytics and predictive modeling, according to a new report from Accenture, but like their counterparts globally they must address talent and complexity issues before adoption rates will rise.

The report, Emerging Technologies in Public Service, compiled a nine-nation survey of IT officials across all levels of government in policing and justice, health and social services, revenue, border services, pension/Social Security and administration, and was released earlier this week.

It revealed a deep interest in emerging tech from the public sector, finding 70 percent of agencies are evaluating their potential — but a much lower adoption level, with just 25 percent going beyond piloting to implementation….

The revenue and tax industries have been early adopters of advanced analytics and predictive modeling, he said, while biometrics and video analytics are resonating with police agencies.

In Australia, the tax office found using voiceprint technology could save 75,000 work hours annually.

Closer to home, Utah Chief Technology Officer Dave Fletcher told Accenture that consolidating data centers into a virtualized infrastructure improved speed and flexibility, so some processes that once took weeks or months can now happen in minutes or hours.

Nationally, 70 percent of agencies have either piloted or implemented an advanced analytics or predictive modeling program. Biometrics and identity analytics were the next most popular technologies, with 29 percent piloting or implementing, followed by machine learning at 22 percent.

Those numbers contrast globally with Australia, where 68 percent of government agencies have charged into piloting and implementing biometric and identity analytics programs; and Germany and Singapore, where 27 percent and 57 percent of agencies respectively have piloted or adopted video analytic programs.

Overall, 78 percent of respondents said they were either underway or had implemented some machine-learning technologies.

The benefits of embracing emerging tech that were identified ranged from finding better ways of working through automation to innovating and developing new services and reducing costs.

Agencies told Accenture their No. 1 objective was increasing customer satisfaction. But 89 percent said they’d expect a return on implementing intelligent technology within two years. Four-fifths, or 80 percent, agreed intelligent tech would improve employees’ job satisfaction….(More)”

Open Data Workspace for Analyzing Hate Crime Trends


Press Release: “The Anti-Defamation League (ADL) and data.world today announced the launch of a public, open data workspace to help understand and combat the rise of hate crimes. The new workspace offers instant access to ADL data alongside relevant data from the FBI and other authoritative sources, and provides citizens, journalists and lawmakers with tools to more effectively analyze, visualize and discuss hate crimes across the United States.

The new workspace was unveiled at ADL’s inaugural “Never Is Now” Summit on Anti-Semitism, a daylong event bringing together nearly 1,000 people in New York City to hear from an array of experts on developing innovative new ways to combat anti-Semitism and bigotry….

Hate Crime Reporting Gaps


The color scale depicts total reported hate crime incidents per 100,000 people in each state. States with darker shading have more reported incidents of hate crimes while states with lighter shading have fewer reported incidents. The green circles proportionally represent cities that either Did Not Report hate crime data or affirmatively reported 0 hate crimes for the year 2015. Note the lightly shaded states in which many cities either Do Not Report or affirmatively report 0 hate crimes….(More)”
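The normalisation the map uses, incidents per 100,000 residents, is a simple rate calculation. As a quick illustration (the incident and population figures below are hypothetical, not actual ADL or FBI data):

```python
def incidents_per_100k(incidents, population):
    """Rate of reported hate crime incidents per 100,000 residents."""
    return incidents / population * 100_000

# e.g. a state reporting 414 incidents with 39.1 million residents
rate = incidents_per_100k(414, 39_100_000)
print(round(rate, 2))  # about 1.06 incidents per 100,000 people
```

Normalising by population is what makes states of very different sizes comparable on a single color scale, though, as the caption notes, the rate only reflects *reported* incidents.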

Crowdjury


Crowdjury is an online platform that crowdsources judicial proceedings: filing of complaints, evaluation of evidence, trial and jury verdict.

The CrowdJury platform
Fast, affordable, transparent justice

Crowdjury algorithms are optimized to reach a true verdict for each case, quickly and at minimal cost.

Do well by doing good

Jurors and Researchers from the crowd are rewarded in Bitcoin. You help do good, you earn money….

Want to know more? Read the White Paper…”

The internet is crowdsourcing ways to drain the fake news swamp


From CNET: “Fighting the scourge of fake news online is one of the unexpected new crusades emerging from the fallout of Donald Trump’s upset presidential election win last week. Not surprisingly, the internet has no shortage of ideas for how to get its own house in order.

Eli Pariser, author of the seminal book “The Filter Bubble,” which presaged some of the consequences of online platforms that tend to sequester users into non-overlapping ideological silos, is leading an inspired brainstorming effort via this open Google Doc.

Pariser put out the public call to collaborate via Twitter on Thursday, and within 24 hours 21 pages’ worth of bullet-pointed suggestions had already piled up in the doc….

Suggestions ranged from the common call for news aggregators and social media platforms to hire more human editors, to launching more media literacy programs or creating “credibility scores” for shared content and/or users who share or report fake news.

Many of the suggestions are aimed at Facebook, which has taken a heavy heaping of criticism since the election and a recent report that found the top fake election news stories saw more engagement on Facebook than the top real election stories….

In addition to the crowdsourced brainstorming approach, plenty of others are chiming in with possible solutions. Author, blogger and journalism professor Jeff Jarvis teamed up with entrepreneur and investor John Borthwick of Betaworks to lay out 15 concrete ideas for addressing fake news on Medium…. The Trust Project at Santa Clara University is working to develop solutions to attack fake news that include systems for author verification and citations….(More)”

Who Is Doing Computational Social Science?


Trends in Big Data Research, a Sage Whitepaper: “Information of all kinds is now being produced, collected, and analyzed at unprecedented speed, breadth, depth, and scale. The capacity to collect and analyze massive data sets has already transformed fields such as biology, astronomy, and physics, but the social sciences have been comparatively slower to adapt, and the path forward is less certain. For many, the big data revolution promises to ask, and answer, fundamental questions about individuals and collectives, but large data sets alone will not solve major social or scientific problems. New paradigms being developed by the emerging field of “computational social science” will be needed not only for research methodology, but also for study design and interpretation, cross-disciplinary collaboration, data curation and dissemination, visualization, replication, and research ethics (Lazer et al., 2009). SAGE Publishing conducted a survey with social scientists around the world to learn more about researchers engaged in big data research and the challenges they face, as well as the barriers to entry for those looking to engage in this kind of research in the future. We were also interested in the challenges of teaching computational social science methods to students. The survey was fully completed by 9412 respondents, indicating strong interest in this topic among our social science contacts. Of respondents, 33 percent had been involved in big data research of some kind and, of those who have not yet engaged in big data research, 49 percent (3057 respondents) said that they are either “definitely planning on doing so in the future” or “might do so in the future.”…(More)”
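The quoted survey figures hang together arithmetically. A rough consistency check (the 33 percent figure is rounded in the whitepaper, so the reconstructed counts are approximate):

```python
# Back-of-the-envelope check of the survey figures quoted above.
total_respondents = 9412
engaged = round(total_respondents * 0.33)      # ~3,106 had done big data research
not_yet_engaged = total_respondents - engaged  # ~6,306 had not
planning = 3057                                # "definitely" or "might" in the future
share_planning = planning / not_yet_engaged * 100
print(f"{share_planning:.1f}%")  # ~48.5%, matching the reported "49 percent" once rounding is allowed for
```

The small gap between the computed share and the reported 49 percent is what you would expect from the rounded 33 percent input.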

The ethical impact of data science


Theme issue of Phil. Trans. R. Soc. A compiled and edited by Mariarosaria Taddeo and Luciano Floridi: “This theme issue has the founding ambition of landscaping data ethics as a new branch of ethics that studies and evaluates moral problems related to data (including generation, recording, curation, processing, dissemination, sharing and use), algorithms (including artificial intelligence, artificial agents, machine learning and robots) and corresponding practices (including responsible innovation, programming, hacking and professional codes), in order to formulate and support morally good solutions (e.g. right conducts or right values). Data ethics builds on the foundation provided by computer and information ethics but, at the same time, it refines the approach endorsed so far in this research field, by shifting the level of abstraction of ethical enquiries, from being information-centric to being data-centric. This shift brings into focus the different moral dimensions of all kinds of data, even data that never translate directly into information but can be used to support actions or generate behaviours, for example. It highlights the need for ethical analyses to concentrate on the content and nature of computational operations—the interactions among hardware, software and data—rather than on the variety of digital technologies that enable them. And it emphasizes the complexity of the ethical challenges posed by data science. Because of such complexity, data ethics should be developed from the start as a macroethics, that is, as an overall framework that avoids narrow, ad hoc approaches and addresses the ethical impact and implications of data science and its applications within a consistent, holistic and inclusive framework. Only as a macroethics will data ethics provide solutions that can maximize the value of data science for our societies, for all of us and for our environments….(More)”

Table of Contents:

  • The dynamics of big data and human rights: the case of scientific research; Effy Vayena, John Tasioulas
  • Facilitating the ethical use of health data for the benefit of society: electronic health records, consent and the duty of easy rescue; Sebastian Porsdam Mann, Julian Savulescu, Barbara J. Sahakian
  • Faultless responsibility: on the nature and allocation of moral responsibility for distributed moral actions; Luciano Floridi
  • Compelling truth: legal protection of the infosphere against big data spills; Burkhard Schafer
  • Locating ethics in data science: responsibility and accountability in global and distributed knowledge production systems; Sabina Leonelli
  • Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy; Deirdre K. Mulligan, Colin Koopman, Nick Doty
  • Beyond privacy and exposure: ethical issues within citizen-facing analytics; Peter Grindrod
  • The ethics of smart cities and urban science; Rob Kitchin
  • The ethics of big data as a public good: which public? Whose good?; Linnet Taylor
  • Data philanthropy and the design of the infraethics for information societies; Mariarosaria Taddeo
  • The opportunities and ethics of big data: practical priorities for a national Council of Data Ethics; Olivia Varley-Winter, Hetan Shah
  • Data science ethics in government; Cat Drew
  • The ethics of data and of data science: an economist’s perspective; Jonathan Cave
  • What’s the good of a science platform?; John Gallacher


From Tech-Driven to Human-Centred: Opengov has a Bright Future Ahead


Essay by Martin Tisné: “The anti-corruption and transparency field ten years ago was in pre-iPhone mode. Few if any of us spoke of the impact or relevance of technology to what would become known as the open government movement. When the wave of smart phone and other technology hit from the late 2000s onwards, it hit hard, and scaled fast. The ability of technology to create ‘impact at scale’ became the obvious truism of our sector, so much so that pointing out the failures of techno-utopianism became a favorite pastime for pundits and academics. The technological developments of the next ten years will be more human-centered — less ‘build it and they will come’ — and more aware of the unintended consequences of technology (e.g. the fairness of artificial intelligence decision-making), whilst still being deeply steeped in the technology itself.

By 2010, two major open data initiatives had launched and were already seen as successful in the US and UK, one of President Obama’s first memorandums was on openness and transparency, and an international research project had tracked 63 different instances of uses of technology for transparency around the world (from Reclamos in Chile, to I Paid a Bribe in India, via Maji Matone in Tanzania). Open data projects numbered over 200 world-wide within barely a year of data.gov.uk launching and to everyone’s surprise topped the list of Open Government Partnership commitments a few years hence.

The technology genie won’t go back into the bottle: the field will continue to grow alongside technological developments. But it would take a bold or foolish pundit to guess which of blockchain or other developments will have radically changed the field by 2025.

What is clearer is that the sector is more questioning of technology, more human-centered both in the design of those technologies and in seeking to understand and pre-empt their impact….

We’ve moved from cyber-utopianism less than ten years ago to born-digital organisations taking a much more critical look at the deployment of technology. The evangelical phase of the open data movement is coming to an end. The movement no longer needs to preach the virtues of unfettered openness to get a foot in the door. It seeks to frame the debate as to whether, when and how data might legitimately be shared or closed, and what impacts those releases may have on privacy, surveillance and discrimination. An open government movement that is more human-centered and aware of the unintended consequences of technology has a bright and impactful future ahead….(More)”