The solution to US politics’ Facebook problem is Facebook


Parag Khanna in Quartz: “In just one short decade, Facebook has evolved from a fast-growing platform for sharing classmates’ memories and pet photos into a company blamed for Donald Trump’s election victory, promoting hate speech, and accelerating ISIS recruitment. Clearly, Facebook has outgrown its original mission.

It should come as no surprise, then, that Facebook CEO Mark Zuckerberg has in the past few months issued a long manifesto explaining the company’s broader aim to foster global connectivity, given a commencement speech at Harvard focused on the need for people to feel a meaningful “sense of purpose,” and, most recently, changed the company’s mission to “give people the power to build community and bring the world closer together.”

In truth, Facebook has been doing this all along. In just three years, between 2011 and 2014, the average number of international “friends” Facebook members have (whether from rich or poor countries) doubled, and in many cases tripled. There is no denying that without Facebook, people would have much less exposure to people they would otherwise never meet, and therefore fewer opportunities to gain wider perspectives (irrespective of whether those perspectives confirm or contradict one’s own). Then there are charities and NGOs, from UNICEF to Human Rights Watch, that raise millions of dollars on Facebook and on other online platforms such as Avaaz and Change.org.

Facebook has just crossed two billion monthly users, meaning more people express their views on it each month than will vote in all elections in the world this year. That makes Facebook the largest player in a wide array of social media tools that are the epicenter–and the lightning rod–for our conversation about technology and politics. Ironically, though, while so many of these innovations come out of the US, the American approach to using digital technology for better governance is at best pathetic… Sloppy analysis, a cynical Kommentariat and an un-innovative government have led America down the path of ignoring most of the positive ways digital governance can unfold. Fortunately, there are plenty of lessons from around the world for those who care to look and learn.

Citizen engagement is an obvious start. But this should be more than just live-streamed town halls and Q&As in the run-up to elections. European governments such as the UK’s use Facebook pages to continuously gather policy proposals on public spending priorities. In Estonia, electronic voting is the norm. In the world’s oldest direct democracy, Switzerland, citizen petitions and initiatives are being digitized for even more transparent and inclusive deliberation. In Australia, the Flux movement is allowing all citizens to cast digital ballots on specific policy issues and submit them straight to parliament. Meanwhile, America has the Koch Brothers and the NRA…

Even governments that are less respected in the West because their regimes do not resemble our own do a better job of harnessing social media. Sheikh Mohammed, ruler of Dubai, uses Facebook to crowdsource suggestions for infrastructure projects and other ideas from a population that is a whopping 90 percent foreign.

Singapore may be the most sophisticated government in this domain. Though the incumbent People’s Action Party (PAP) wins every parliamentary election hands-down, more important is the fact that it surveys the public ad nauseam on issues of savings and healthcare, transit routes, immigration policy and just about everything else. Singapore is not Switzerland, but it might be the world’s most responsive government.

This is how governments that appear illegitimate according to a narrow reading of Western political theory boast far higher public satisfaction than almost all Western governments today. If you don’t understand this, you probably spend too much time in a filter bubble….

The US should aspire to be a place where democracy and data reinforce rather than contradict each other….(More)

What is One Team Government?


Kit Collingwood-Richardson at Medium: “On 29th June, 186 people came together in London to talk about how we could work across disciplines to make government more effective…. Below are our current ideas on what we want it to be. We’d love your help shaping them up.

So what is One Team Government?

At its heart, it’s a community (join it here and see the bottom of this post), united and guided by a set of principles. Together, we are working to create a movement of reform through practical action.

The community is made up of people who are passionate about public sector reform (we deliberately want this to be wider than just government), with the emphasis on improving the services we offer to citizens and how we work. We believe the public sector can be brilliant, and we’re committed to making it so.

You don’t have to work for government to be in the community, nor be a public servant in the wider sense, nor indeed be in the UK; we need diverse perspectives, with people of all sectors, areas and interests helping. We think we’re unstoppable if we work together.

Our initial thinking (see below for how to help us iterate on this) is that we want the One Team Government movement to be guided by seven principles:

1. Work in the open and positively

We’re a community; everything we do will be documented and shared. Where conversations happen that can’t be shared, the wider learning still will be. This is a reform cooperative, where we choose to be generous with knowledge. Ideas are infectious; we’ll share ours early and often….

2. Take practical action

Although talking is vital, we will be defined more by the things we do than the things we say. We will create change by taking small, measured steps every day — everything from making a new contact in a different area or discipline, to sharing something we’ve written, to giving our time to contribute to others’ work — and encouraging others to do the same. We won’t create huge plans, but do things that make a real difference today, no matter how big or small. We will document what they are.

3. Experiment and iterate

We don’t think there’s one way to ‘do’ reform. We will experiment with design, and put user-focused service design thinking into everything we do, learning from and with each other. We will test, iterate and reflect. We will be humble in our approach, focusing on asking the right questions to get to the best answers.

We will embrace small failures as opportunities to learn. We won’t get everything right, and we won’t try to. We will listen, learn and improve together.

4. Be diverse and inclusive

Our approach to inclusiveness and diversity is driven by a simple desire to better represent the citizens we serve. We’ll put effort into making that so, by balancing our events, making sure our teams are reflective of society at large and by making sure we have a range of citizen and team voices in the room with us….

5. Care deeply about citizens

We work for users and other citizens affected by our work; everything we do will be guided by our impact on them. We will talk to them, early and often; we will use the best research methods to understand them better. We will be distinguished by our empathy — for users and for each other. The policy that we develop will be tested with real people as early as possible, and refined with their needs in mind.

6. Work across borders

We believe that diverse views make our outcomes and services better. We will be characterised by our work to break down boundaries between groups. …

7. Embrace technology

We are passionate about public sector reform for the internet age. We will be a technology-enabled community, using online tools to collaborate, network and share. We will put the best of digital thinking into policy and service design, using technology to make us quicker, smarter, better and more data-driven. We will help to shape a public sector we can be proud to work in, in the 21st century….(More)”.

Four lessons NHS Trusts can learn from the Royal Free case


Blog by Elizabeth Denham, Information Commissioner in the UK: “Today my office has announced that the Royal Free London NHS Foundation Trust did not comply with the Data Protection Act when it turned over the sensitive medical data of around 1.6 million patients to Google DeepMind, a private sector firm, as part of a clinical safety initiative. As a result of our investigation, the Trust has been asked to sign an undertaking committing it to changes to ensure it is acting in accordance with the law, and we’ll be working with them to make sure that happens.

But what about the rest of the sector? As organisations increasingly look to unlock the huge potential that creative uses of data can have for patient care, what are the lessons to be learned from this case?

It’s not a choice between privacy or innovation

It’s welcome that the trial looks to have been positive. The Trust has reported successful outcomes. Some may reflect that data protection rights are a small price to pay for this.

But what stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable. The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights….

Don’t dive in too quickly

Privacy impact assessments are a key data protection tool of our era, as evolving law and best practice around the world demonstrate. They play an increasingly prominent role in data protection, and they’re a crucial part of digital innovation. ….

New cloud processing technologies mean you can, not that you always should

Changes in technology mean that vast data sets can be made more readily available and can be processed faster, using more powerful data processing technologies. That’s a positive thing, but just because evolving technologies can allow you to do more doesn’t mean these tools should always be fully utilised, particularly during a trial initiative….

Know the law, and follow it

No-one suggests that red tape should get in the way of progress. But when you’re setting out to test the clinical safety of a new service, remember that the rules are there for a reason….(More)”

The role of Open Data in driving sustainable mobility in nine smart cities


Paper by Piyush Yadav et al: “In today’s era of globalization, sustainable mobility is considered a key factor in the economic growth of any country. With the emergence of open data initiatives, there is tremendous potential to improve mobility. This paper presents findings of a detailed analysis of mobility open data initiatives in nine smart cities – Amsterdam, Barcelona, Chicago, Dublin, Helsinki, London, Manchester, New York and San Francisco. The paper discusses the study of various sustainability indicators in the mobility domain and their convergence with present open datasets. Specifically, it throws light on open data ecosystems in terms of their production and consumption. It gives a comprehensive view of the nature of mobility open data with respect to formats, interactivity, and availability. The paper details the open datasets in terms of their alignment with different mobility indicators, publishing platforms, applications and APIs available. The paper discusses how these open datasets have shown signs of fostering organic innovation and sustainable growth in smart cities, with impact on mobility trends. The results of the work can be used to inform the design of data-driven sustainable mobility in smart cities to maximize the utilization of available open data resources….(More)”.

Research data infrastructures in the UK


The Open Research Data Task Force: “This report is intended to inform the work of the Open Research Data Task Force, which has been established with the aim of building on the principles set out in the Open Research Data Concordat (published in July 2016) to co-ordinate creation of a roadmap to develop the infrastructure for open research data across the UK. As an initial contribution to that work, the report provides an outline of the policy and service infrastructure in the UK as it stands in the first half of 2017, including some comparisons with other countries; and it points to some key areas and issues which require attention. It does not seek to identify possible courses of action, nor even to suggest priorities the Task Force might consider in creating its final report to be published in 2018. That will be the focus of work for the Task Force over the next few months.

Why is this important?

The digital revolution continues to bring fundamental changes to all aspects of research: how it is conducted, the findings that are produced, and how they are interrogated and transmitted not only within the research community but more widely. We are as yet still in the early stages of a transformation in which progress is patchy across the research community, but which has already posed significant challenges for research funders and institutions, as well as for researchers themselves. Research data is at the heart of those challenges: not simply the datasets that provide the core of the evidence analysed in scholarly publications, but all the data created and collected throughout the research process. Such data represents a potentially valuable resource for people and organisations in the commercial, public and voluntary sectors, as well as for researchers. Access to such data, and more general moves towards open science, are also critically important in ensuring that research is reproducible, and thus in sustaining public confidence in the work of the research community. But effective use of research data depends on an infrastructure – of hardware, software and services, but also of policies, organisations and individuals operating at various levels – that is as yet far from fully formed.

The exponential increases in volumes of data being generated by researchers create in themselves new demands for storage and computing power. But since the data is characterised more by heterogeneity than by uniformity, development of the infrastructure to manage it involves a complex set of requirements in preparing, collecting, selecting, analysing, processing, storing and preserving that data throughout its life cycle.

Over the past decade and more, there have been many initiatives on the part of research institutions, funders, and members of the research community at local, national and international levels to address some of these issues. Diversity is a key feature of the landscape, in terms of institutional types and locations, funding regimes, and nature and scope of partnerships, as well as differences between disciplines and subject areas. Hence decision-makers at various levels have fostered via their policies and strategies many community-organised developments, as well as their own initiatives and services. Significant progress has been achieved as a result, through the enthusiasm and commitment of key organisations and individuals. The less positive features have been a relative lack of harmonisation or consolidation, and there is an increasing awareness of patchiness in provision, with gaps, overlaps and inconsistencies. This is not surprising, since policies, strategies and services relating to research data necessarily affect all aspects of support for the diverse processes of research itself. Developing new policies and infrastructure for research data implies significant re-thinking of structures and regimes for supporting, fostering and promoting research itself. That in turn implies taking full account of widely-varying characteristics and needs of research of different kinds, while also keeping in clear view the benefits to be gained from better management of research data, and from greater openness in making data accessible for others to re-use for a wide range of different purposes….(More)”.

Volunteers teach AI to spot slavery sites from satellite images


This data will then be used to train machine learning algorithms to automatically recognise brick kilns in satellite imagery. If computers can pinpoint the location of such possible slavery sites, then the coordinates could be passed to local charities to investigate, says Kevin Bales, the project leader at the University of Nottingham, UK.

South Asian brick kilns are notorious as modern-day slavery sites. There are an estimated 5 million people working in brick kilns in South Asia, and of those nearly 70 per cent are thought to be working there under duress – often to pay off financial debts.

However, no one is quite sure how many of these kilns there are in the so-called “Brick Belt”, a region that stretches across parts of Pakistan, India and Nepal. Some estimates put the figure at 20,000, but it may be as high as 50,000.

Bales is hoping that his machine learning approach will produce a more accurate figure and help organisations on the ground know where to direct their anti-slavery efforts.

It’s great to have a tool for identifying possible forced labour sites, says Sasha Jesperson at St Mary’s University in London. But it is just a start – to really find out how many people are being enslaved in the brick kiln industry, investigators still need to visit every site and work out exactly what’s going on there, she says….

So far, volunteers have identified over 4000 potential slavery sites across 400 satellite images taken via Google Earth. Once these have been checked several times by volunteers, Bales plans to use these images to teach the machine learning algorithm what kilns look like, so that it can learn to recognise them in images automatically….(More)”.
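The label-then-train loop described above — volunteers mark candidate kiln tiles, and those labels become training data for a classifier — can be sketched in miniature. The snippet below is an illustrative toy, not the project’s actual pipeline: the 8×8 arrays stand in for volunteer-labelled satellite tiles (synthetic data, invented here), and a simple nearest-centroid rule stands in for the far more capable image classifier a real system would train.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for volunteer-labelled 8x8 satellite tiles:
# "kiln" tiles are, on average, brighter than "background" tiles.
kiln_tiles = rng.normal(loc=0.7, scale=0.1, size=(50, 8, 8))
background_tiles = rng.normal(loc=0.3, scale=0.1, size=(50, 8, 8))

def train_centroids(positives, negatives):
    """Learn one mean-pixel 'centroid' per class from labelled tiles."""
    return {
        "kiln": positives.reshape(len(positives), -1).mean(axis=0),
        "background": negatives.reshape(len(negatives), -1).mean(axis=0),
    }

def classify(tile, centroids):
    """Assign a tile to the class with the nearest centroid (Euclidean)."""
    flat = tile.reshape(-1)
    return min(centroids, key=lambda label: np.linalg.norm(flat - centroids[label]))

centroids = train_centroids(kiln_tiles, background_tiles)
probe = rng.normal(loc=0.7, scale=0.1, size=(8, 8))  # resembles a kiln tile
print(classify(probe, centroids))  # prints "kiln"
```

The design point the sketch preserves is that the model is only as good as the volunteer labels, which is why the article notes each candidate site is checked several times before it is used for training.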

The Politics of Listening: Possibilities and Challenges for Democratic Life


Book by Leah Bassel: “…explores listening as a social and political practice, in contrast to the more common focus on voice and speaking.  The author draws on cases from Canada, France and the United Kingdom, exploring: minority women and debates over culture and religion; riots and young men in France and England; citizen journalism and the creative use of different media; and solidarity between migrant justice and indigenous activists. Analysis across these diverse settings considers whether and how a politics of listening, which demands that the roles of speakers and listeners change, can be undertaken in adversarial and tense political moments. The Politics of Listening argues that such a practice has the potential to create new ways of being and acting together, as political equals who are heard on their own terms….(More)”

Detecting riots with Twitter


Cardiff University News: “An analysis of data taken from the London riots in 2011 showed that computer systems could automatically scan through Twitter and detect serious incidents, such as shops being broken into and cars being set alight, before they were reported to the Metropolitan Police Service.

The computer system could also discern information about where the riots were rumoured to take place and where groups of youths were gathering. The new research, published in the peer-reviewed journal ACM Transactions on Internet Technology, showed that on average the computer systems could pick up on disruptive events several minutes before officials, and in some cases more than an hour earlier.

“Antagonistic narratives and cyber hate”

The researchers believe that their work could enable police officers to better manage and prepare for both large and small scale disruptive events.

Co-author of the study Dr Pete Burnap, from Cardiff University’s School of Computer Science and Informatics, said: “We have previously used machine-learning and natural language processing on Twitter data to better understand online deviance, such as the spread of antagonistic narratives and cyber hate…”

“We will never replace traditional policing resource on the ground but we have demonstrated that this research could augment existing intelligence gathering and draw on new technologies to support more established policing methods.”

Scientists are continually looking to the swathes of data produced from Twitter, Facebook and YouTube to help them to detect events in real-time.

Estimates put social media membership at approximately 2.5 billion non-unique users, and the data produced by these users have been used to predict elections, movie revenues and even the epicentre of earthquakes.

In their study the research team analysed 1.6m tweets relating to the 2011 riots in England, which began as an isolated incident in Tottenham on August 6 but quickly spread across London and to other cities in England, giving rise to looting, destruction of property and levels of violence not seen in England for more than 30 years.

Machine-learning algorithms

The researchers used a series of machine-learning algorithms to analyse each of the tweets from the dataset, taking into account a number of key features such as the time they were posted, the location where they were posted and the content of the tweet itself.

Results showed that the machine-learning algorithms were quicker than police sources in all but two of the disruptive events reported…(More)”.
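The core intuition — score each tweet on features such as timing, location and content, then flag minutes where disruption-related posts cluster — can be illustrated with a toy burst detector. This is a hedged sketch, not the paper’s actual machine-learning method: the keyword list and threshold are invented for illustration, and a real system would learn its features rather than hard-code them.

```python
from collections import Counter
from datetime import datetime

# Hypothetical keyword list and threshold -- illustrative values only.
DISRUPTION_TERMS = {"fire", "looting", "riot", "smashed"}
BURST_THRESHOLD = 3  # flag a minute once this many matching tweets arrive

def minute_key(ts):
    """Bucket a timestamp to the minute it falls in."""
    return ts.replace(second=0, microsecond=0)

def detect_bursts(tweets, threshold=BURST_THRESHOLD):
    """Return the minutes in which disruption-related tweets hit the threshold."""
    counts = Counter(
        minute_key(ts)
        for ts, text in tweets
        if DISRUPTION_TERMS & set(text.lower().split())
    )
    return sorted(m for m, n in counts.items() if n >= threshold)

tweets = [
    (datetime(2011, 8, 8, 21, 4, 10), "shop windows smashed on the high street"),
    (datetime(2011, 8, 8, 21, 4, 30), "car on fire near the station"),
    (datetime(2011, 8, 8, 21, 4, 50), "looting reported outside the mall"),
    (datetime(2011, 8, 8, 21, 7, 5), "lovely evening for a walk"),
]
print(detect_bursts(tweets))  # one flagged minute: 21:04
```

Because flagging happens as soon as a minute’s count crosses the threshold, a stream-processing version of this idea can surface an incident minutes before a formal report is filed — which is the gap the Cardiff study measured.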

Index: Collective Intelligence


By Hannah Pierce and Audrie Pirkl

The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on collective intelligence and was originally published in 2017.

The Collective Intelligence Universe

  • Amount of money that Reykjavik’s Better Neighbourhoods program has provided each year to crowdsourced citizen projects since 2012: €2 million (Citizens Foundation)
  • Number of U.S. government challenges that people are currently participating in to submit their community solutions: 778 (Challenge.gov).
  • Percent of U.S. arts organizations that used social media to crowdsource ideas in 2013, from programming decisions to seminar scheduling details: 52% (Pew Research)
  • Number of Wikipedia members who have contributed to a page in the last 30 days: over 120,000 (Wikipedia Page Statistics)
  • Number of languages that the multinational crowdsourced Letters for Black Lives has been translated into: 23 (Letters for Black Lives)
  • Number of comments in a Reddit thread that established a more comprehensive timeline of the theater shooting in Aurora than the media: 1,272 (Reddit)
  • Number of physicians that are members of SERMO, a platform to crowdsource medical research: 800,000 (SERMO)
  • Number of citizen scientist projects registered on SciStarter: over 1,500 (Collective Intelligence 2017 Plenary Talk: Darlene Cavalier)
  • Entrants to NASA’s 2009 TopCoder Challenge: over 1,800 (NASA)

Infrastructure

  • Number of submissions for Block Holm (a digital platform that allows citizens to build “Minecraft” ideas on vacant city lots) within the first six months: over 10,000 (OpenLearn)
  • Number of people engaged by The Participatory Budgeting Project in the U.S.: over 300,000 (Participatory Budgeting Project)
  • Amount of money allocated to community projects through this initiative: $238,000,000

Health

  • Percentage of Internet-using adults with chronic health conditions that have gone online within the US to connect with others suffering from similar conditions: 23% (Pew Research)
  • Number of posts to Patient Opinion, a UK based platform for patients to provide anonymous feedback to healthcare providers: over 120,000 (Nesta)
    • Percent of NHS health trusts utilizing the posts to improve services in 2015: 90%
    • Stories posted per month: nearly 1,000 (The Guardian)
  • Number of tumors reported to the English National Cancer Registration each year: over 300,000 (Gov.UK)
  • Number of users of an open source artificial pancreas system: 310 (Collective Intelligence 2017 Plenary Talk: Dana Lewis)

Government

  • Number of submissions from 40 countries to the 2017 Open (Government) Contracting Innovation Challenge: 88 (The Open Data Institute)
  • Public-service complaints received each day via Indonesian digital platform Lapor!: over 500 (McKinsey & Company)
  • Number of registered users of Unicef Uganda’s weekly, SMS poll U-Report: 356,468 (U-Report)
  • Number of reports regarding government corruption in India submitted to IPaidaBribe since 2011: over 140,000 (IPaidaBribe)

Business

  • Reviews posted since Yelp’s creation in 2009: 121 million reviews (Statista)
  • Percent of Americans in 2016 who trust online customer reviews as much as personal recommendations: 84% (BrightLocal)
  • Number of companies and their subsidiaries mapped through the OpenCorporates platform: 60 million (Omidyar Network)

Crisis Response

Public Safety

  • Number of sexual harassment reports submitted from 50 cities in India and Nepal to SafeCity, a crowdsourcing site and mobile app: over 4,000 (SafeCity)
  • Number of people that used Facebook’s Safety Check, a feature that is being used in a new disaster mapping project, in the first 24 hours after the terror attacks in Paris: 4.1 million (Facebook)

Public Data Is More Important Than Ever–And Now It’s Easier To Find


Meg Miller at Co.Design: “Public data, in theory, is meant to be accessible to everyone. But in practice, even finding it can be near impossible, to say nothing of figuring out what to do with it once you do. Government data websites are often clunky and outdated, and some data is still trapped on physical media–like CDs or individual hard drives.

Tens of thousands of these CDs and hard drives, full of data on topics from Arkansas amusement parks to fire incident reporting, have arrived at the doorstep of the New York-based start-up Enigma over the past four years. The company has obtained thousands upon thousands more datasets by way of Freedom of Information Act (FOIA) requests. Enigma specializes in open data: gathering it, curating it, and analyzing it for insights into a client’s industry, for example, or for public service initiatives.

Enigma also shares its 100,000 datasets with the world through an online platform called Public—the broadest collection of public data that is open and searchable by everyone. Public has been around since Enigma launched in 2013, but today the company is introducing a redesigned version of the site that’s fresher and more user-friendly, with easier navigation and additional features that allow users to drill further down into the data.

But while the first iteration of Public was mostly concerned with making Enigma’s enormous trove of data—which it was already gathering and reformatting for client work—accessible to the public, the new site focuses more on linking that data in new ways. For journalists, researchers, and data scientists, the tool will offer more sophisticated ways of making sense of the data that they have access to through Enigma….

…the new homepage also curates featured datasets and collections to encourage discoverability. For example, an Enigma-curated collection of U.S. sanctions data from the U.S. Treasury Department’s Office of Foreign Assets Control (OFAC) shows the restrictions on entities and individuals with whom American companies can and can’t do business, in an effort to achieve specific national security or foreign policy objectives. A new round of sanctions against Russia has been in the news lately, as an effort by President Trump to loosen restrictions on blacklisted businesses and individuals in Russia was overruled by the Senate last week. Enigma’s curated data selection on U.S. sanctions could help journalists contextualize recent events with data that shows changes in sanctions lists over time by presidential administration, for instance–or they could compare the U.S. sanctions list to the European Union’s….(More).