People-led innovation project to help tackle policy challenges


Natalie Leal at Global Government Forum: “A new initiative by two US think tanks aims to help public bodies explore innovative ways of consulting and engaging with communities, finding new answers to public policy challenges. 

The People-Led Innovation project was launched on Tuesday by GovLab and the Bertelsmann Foundation. Noting that citizens’ knowledge, insights and ideas often hold the key to the problems faced by governments, GovLab co-founder Stefaan Verhulst said the new tools will help officials consider “the most effective ways to engage the right people for the right task at the right time.”

Verhulst explained that the initiative, ‘People-Led Innovation: Toward a Methodology for Solving Urban Problems in the 21st Century’, is “built on the idea that, as governments increasingly experiment with new means for drawing on the public’s knowledge and skills to address common challenges, one-size-fits-all citizen engagement efforts are often too broad and unwieldy to surface useful insights.”

A fresh methodology

The new site aims to provide leaders with a toolkit and “a set of steps that enable them to tap into their potentially most important – but underutilized – asset: people.” While the project’s main audience is US city governments, the skills and methodology are transferable and the researchers have drawn on case studies from around the world.

The methodology breaks the process down into four distinct stages: defining the problem; curating possible solutions using people and data; experimenting and testing what works in practice; and reviewing and ‘expanding’ – incorporating feedback and transferring lessons learned to a wider audience. At each stage, leaders are encouraged to identify stakeholders to consult or co-create with. 

At the heart of the initiative is the idea that everyone – from local residents, small businesses and community bodies through to government agencies, corporate giants and international organisations – can contribute valuable ideas and help solve complex problems....

“People’s expertise comes in a range of flavours – from interests and experiences to skills and credentialed knowledge – yet all are equally valuable to engage when solving problems,” say the creators in a report on the website. 

Four types of engagement methods are suggested as ways to best “tap into the diverse expertise distributed among people outside of government”. These are: commenting, for example a discussion platform to gather views, experiences and opinions; co-creating, e.g. a sector-specific hackathon to leverage datasets; reviewing, including online or offline engagements allowing people to vote on specific proposals or ideas; and reporting, e.g. a crowdsourcing platform for citizens to record incidents of problematic issues such as potholes or graffiti….(More)”.

AI is sending people to jail—and getting it wrong


Karen Hao at MIT Technology Review: “Using historical data to train risk assessment tools could mean that machines are copying the mistakes of the past. …

AI might not seem to have a huge personal impact if your most frequent brush with machine-learning algorithms is through Facebook’s news feed or Google’s search rankings. But at the Data for Black Lives conference last weekend, technologists, legal experts, and community activists snapped things into perspective with a discussion of America’s criminal justice system. There, an algorithm can determine the trajectory of your life.

The US imprisons more people than any other country in the world. At the end of 2016, nearly 2.2 million adults were being held in prisons or jails, and an additional 4.5 million were in other correctional facilities. Put another way, 1 in 38 adult Americans was under some form of correctional supervision. The nightmarishness of this situation is one of the few issues that unite politicians on both sides of the aisle.

Under immense pressure to reduce prison numbers without risking a rise in crime, courtrooms across the US have turned to automated tools in attempts to shuffle defendants through the legal system as efficiently and safely as possible. This is where the AI part of our story begins….(More)”.
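The mechanism Hao describes is easy to see in miniature. Below is a deliberately crude sketch – fabricated data and a toy base-rate “model”, not COMPAS or any real risk-assessment product – showing how a tool trained on historical records inherits whatever bias produced those records:

```python
import random

random.seed(0)

# Fabricated history: (neighborhood, was_rearrested). Suppose neighborhood "A"
# was policed far more heavily, so identical behavior was recorded as rearrest
# far more often there.
history = [("A", random.random() < 0.6) for _ in range(500)] + \
          [("B", random.random() < 0.3) for _ in range(500)]

def fit_base_rates(records):
    """The crudest possible risk model: each group's historical rearrest rate.
    More sophisticated learners pick up the same skewed signal the same way."""
    rates = {}
    for group in {g for g, _ in records}:
        outcomes = [hit for g, hit in records if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

risk = fit_base_rates(history)
print(risk)  # "A" scores roughly twice as "risky": the old policing pattern, replayed
```

Nothing in the model knows about policing intensity; it simply treats the skewed record as ground truth – which is exactly the point of Hao’s warning.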

Looking after and using data for public benefit


Heather Savory at the Office for National Statistics (UK): “Official Statistics are for the benefit of society and the economy and help Britain to make better decisions. They allow the formulation of better public policy and the effective measurement of those policies. They inform the direction of economic and commercial activities. They provide valuable information for analysts, researchers, public and voluntary bodies. They enable the public to hold organisations that spend public money to account, thus informing democratic debate.

The ability to harness the power of data is critical in enabling official statistics to support the most important decisions facing the country.

Under the new powers in the Digital Economy Act, ONS can now gain access to new and different sources of data, including ‘administrative’ data from government departments and commercial data. Alongside the availability of these new data sources, ONS is experiencing strong demand for ad hoc insights in addition to our traditional statistics.

We need to deliver more, faster, finer-grained insights into the economy and society. We need to deliver high quality, trustworthy information, on a faster timescale, to help decision-making. We will increasingly develop innovative data analysis methods, for example using images to gain insight, as in the work we’ve recently announced on Urban Forests….

I should explain here that our data is not held in one big linked database; we’re architecting our Data Access Platform so that data can be linked in different ways for different purposes. This is designed to preserve data confidentiality, so only the necessary subset of data is accessible by authorised people, for a certain purpose. To avoid compromising their effectiveness, we do not make public the specific details of the security measures we have in place, but our recently tightened security regime, which is independently assured by trusted external bodies, includes:

  • physical measures to restrict who can access places where data is stored;
  • protective measures for all data-related IT services;
  • measures to restrict who can access systems and data held by ONS;
  • controls to guard against staff or contractors misusing their legitimate access to data, including vetting to an appropriate level for the sensitivity of data to which they might have access.
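What does linking data “in different ways for different purposes” look like in practice? The sketch below is purely illustrative – invented names and a simplified grant table, not ONS’s actual Data Access Platform – but it captures the idea that an analyst sees only the fields their approved purpose requires:

```python
class AccessDenied(Exception):
    pass

# Hypothetical grants: (analyst, approved purpose) -> fields they may read.
GRANTS = {
    ("analyst_a", "regional-earnings-study"): {"region", "earnings_band"},
    ("analyst_b", "commuting-patterns"): {"region", "workplace_region"},
}

def query(dataset, analyst, purpose, fields):
    """Release only the requested fields, and only if this analyst holds a
    grant covering every one of them for the stated purpose."""
    allowed = GRANTS.get((analyst, purpose), set())
    missing = set(fields) - allowed
    if missing:
        raise AccessDenied(f"{analyst} may not read {missing} for {purpose!r}")
    return [{f: row[f] for f in fields} for row in dataset]

records = [
    {"region": "North East", "earnings_band": "B", "workplace_region": "North East"},
    {"region": "London", "earnings_band": "D", "workplace_region": "London"},
]

# Permitted: the stated purpose covers both requested fields.
print(query(records, "analyst_a", "regional-earnings-study", ["region", "earnings_band"]))
# Refused: same analyst, but the field lies outside the approved purpose.
# query(records, "analyst_a", "regional-earnings-study", ["workplace_region"])
```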

One of the things I love about working in the public sector is that our work can be shared openly.

We live in a rapidly changing and developing digital world and we will continue to monitor and assess the data standards and security measures in place to ensure they remain strong and effective. So, as well as sharing this work openly to reassure all our data suppliers that we’re taking good care of their data, we’re also seeking feedback on our revised data policies.

The same data can provide different insights when viewed through different lenses or in different combinations. The more data is shared – with the appropriate safeguards of course – the more it has to give.

If you work with data, you’ll know that collaborating with others in this space is key and that we need to be able to share data more easily when it makes sense to do so. So, the second reason for sharing this work openly is that, if you’re in the technical space, we’d value your feedback on our approach; and, if you’re in the data space and would like to adopt the same approach, we’d love to support you with that – so that we can all share data more easily in the future….(More)”

ONS’s revised policies on the use, management and security of data can be found here.

Inside the world’s ‘what works’ teams


Jen Gold at What Works Blog: “There’s a small but growing band of government teams around the world dedicated to making experiments happen. The Cabinet Office’s What Works Team, set up in 2013, was the first of its kind. But you’ll now find them in Canada, the US, Finland, Australia, Colombia, and the UAE.

All of these teams work across government to champion the testing and evaluation of new approaches to public service delivery. This blog takes a look at the many ways in which we’re striving to make experimentation the norm in our governments.

Unsurprisingly we’re all operating in very different contexts. Some teams were set up in response to central requirements for greater experimentation. Take Canada, for instance. In 2016 the Treasury Board directed departments and agencies to devote a fixed proportion of programme funds to “experimenting with new approaches” (building on Prime Minister Trudeau’s earlier instruction to Ministers). An Innovation and Experimentation Team was then set up in the Treasury Board to provide some central support.

Finland’s Experimentation Office, based in the Prime Minister’s Office, is in a similar position. The team supports the delivery of Prime Minister Juha Sipilä’s 2016 national action plan that calls for “a culture of experimentation” in public services and a series of flagship policy experiments.

Others, like the US Office of Evaluation Sciences (OES) and the Behavioural Economics Team of the Australian Government (BETA), grew out of political interest in using behavioural science experiments in public policy. But these teams now run experiments in a much broader set of areas.

What unites us is a focus on helping public servants generate and use new evidence in policy decisions and service delivery….(More)”.

Listening to the people who think we are wrong


Larry Kramer at the Hewlett Foundation: “Among the most corrosive developments of recent years—one that predates the election of Donald Trump—has been a breakdown in our ability to debate and reason with others with whom we disagree. The term du jour, “tribalism,” replaced the earlier “polarization” precisely to capture the added ingredient of animosity that has made even conversation across partisan divides difficult. Mistrust and hostility have been grafted onto disagreement about ideas.

Political scientists differ about how widespread the phenomenon is—some see it shared broadly across American society, while others believe it is confined to activist elites. I lean toward the latter view, though the disease seems to be spreading awfully fast. The difference hardly matters, because activists drive and shape public debates. And, either way, the resulting take-no-prisoners politics threatens the future of democratic government, which presupposes disagreement and depends on willingness to work through and across differences from a sense of shared community....

Learning to listen with empathy matters for a number of reasons. An advocate needs to see an opponent’s argument in its strongest light, not only to counter the position effectively, but also to fully understand his or her own position—its weaknesses as well as its strengths—and so be properly prepared to defend it. Nor is this the only reason, because adversarial advocacy is only part of what lawyers do. Most legal work involves bargaining among conflicting interests and finding ways to settle disputes. Good lawyers know how to negotiate and cooperate; they know (in the phrase made famous by Roger Fisher and William Ury) how to “get to yes”—something made vastly easier if one fully and fairly comprehends both sides of an issue. There is a reason lawyers have historically constituted such a disproportionate share of our legislators and executives, and it’s not because they know how to argue. It is because they know how to find common ground.

Not that compromising is always the right thing to do. Without doubt, there are matters of principle too important to relinquish, and instances in which an adversary is too inflexible or too extreme to accommodate. In today’s public discourse, moreover, outright fabrication has become, if not quite acceptable, increasingly common. But one cannot know if or when these are the case unless and until one has examined the other side’s position honestly and confronted the weaknesses in one’s own position fearlessly…

Three techniques in particular pervade the practice of dismissing an opposing argument without condescending to meet it:

  • First, there is the “straw man” method—a tried and true practice that involves taking the weakest or most extreme or least plausible argument in favor of a position and acting as if it were the only argument for that position; a variation of this method takes the most extreme and unattractive advocates for a position and treats them as typical.
  • Second is the practice of attributing bad motives to one’s opponents. Those employing this approach assume that people who take a contrary position know in their hearts that they are wrong and make the arguments they do for some inappropriate reason, such as racism or self-interest, that makes it easy to ignore what they have to say.
  • Third, a relatively new entrant, is what might be called the identity excuse: “We don’t need to listen to them because they are [blank].” Then fill in the blank with whatever identity you think warrants dismissal: a white male, a Black Lives Matter supporter, a Trump voter, a Democrat, the oil industry, a union, someone who received money for their work, and so on….(More)”.

New mathematical model can help save endangered species


Blogpost by Majken Brahe Ellegaard Christensen: “What does the blue whale have in common with the Bengal tiger and the green turtle? They share the risk of extinction and are classified as endangered species. There are multiple reasons for species to die out, and climate change is among the main ones.

The risk of extinction varies from species to species depending on how individuals in its populations reproduce and how long each animal survives. Understanding the dynamics of survival and reproduction can support management actions to improve a species’ chances of survival.

Mathematical and statistical models have become powerful tools to help explain these dynamics. However, the quality of the information we use to construct such models is crucial to improve our chances of accurately predicting the fate of populations in nature.

Fernando Colchero’s research focuses on mathematically recreating population dynamics through a better understanding of a species’ demography. He works on constructing and exploring stochastic population models that predict how a certain population (for example an endangered species) will change over time.

These models include mathematical factors to describe how the species’ environment, survival rates and reproduction determine the population’s size and growth. For practical reasons, some assumptions are necessary.

Two commonly accepted assumptions are that survival and reproduction remain constant with age, and that high survival goes hand in hand with high reproduction across all age groups within a species. Colchero challenged these assumptions by accounting for age-specific survival and reproduction, and for trade-offs between survival and reproduction – that is, conditions that favor survival will sometimes be unfavorable for reproduction, and vice versa.

For his work, Colchero used statistics, mathematical derivations, and computer simulations with data from wild populations of 24 species of vertebrates. The outcome was a significantly improved model that makes more accurate predictions of a species’ population growth.
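To give a flavour of what such a simulation involves, here is a minimal sketch of an age-structured stochastic projection with a survival–reproduction trade-off. The vital rates and the strength of the trade-off are invented for illustration; they are not the values or the model from Colchero’s study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented age-specific vital rates for a hypothetical long-lived vertebrate.
max_age = 10
survival = np.linspace(0.90, 0.55, max_age)  # survival probability declines with age
fecundity = np.array([0.0, 0.0, 0.6, 0.9, 1.0, 0.9, 0.7, 0.5, 0.3, 0.1])

def project(n0, years=50):
    """Project age-class counts forward with demographic and environmental noise."""
    n = np.asarray(n0, dtype=np.int64)
    totals = [int(n.sum())]
    for _ in range(years):
        env = rng.normal(1.0, 0.1)                       # shared good/bad-year effect
        s = np.clip(survival * env, 0.0, 1.0)            # good years raise survival...
        f = np.clip(fecundity * (2.0 - env), 0.0, None)  # ...and depress reproduction
        births = rng.poisson((f * n).sum())              # newborns enter age class 0
        survivors = rng.binomial(n, s)                   # who reaches the next class
        n = np.concatenate(([births], survivors[:-1]))   # everyone ages one step
        totals.append(int(n.sum()))
    return totals

# Start with 20 individuals per age class and track total population size.
print(project([20] * max_age))
```

Running many such trajectories and counting how often the total hits zero yields an extinction-risk estimate – which is what makes models of this kind useful for management.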

Despite the technical nature of Fernando’s work, models of this type can have very practical implications, as they provide well-founded explanations of the underlying causes of extinction. These explanations can inform management actions and may help prevent the extinction of endangered species….(More)”

Your old tweets give away more location data than you think


Issie Lapowsky at Wired: “An international group of researchers has developed an algorithmic tool that uses Twitter to automatically predict exactly where you live in a matter of minutes, with more than 90 percent accuracy. It can also predict where you work, where you pray, and other information you might rather keep private, like, say, whether you’ve frequented a certain strip club or gone to rehab.

The tool, called LPAuditor (short for Location Privacy Auditor), exploits what the researchers call an “invasive policy” Twitter deployed after it introduced the ability to tag tweets with a location in 2009. For years, users who chose to geotag tweets with any location, even something as geographically broad as “New York City,” also automatically gave their precise GPS coordinates. Users wouldn’t see the coordinates displayed on Twitter. Nor would their followers. But the GPS information would still be included in the tweet’s metadata and accessible through Twitter’s API.

Twitter didn’t change this policy across its apps until April of 2015. Now, users must opt in to share their precise location—and, according to a Twitter spokesperson, a very small percentage of people do. But the GPS data people shared before the update remains available through the API to this day.

The researchers developed LPAuditor to analyze those geotagged tweets and infer detailed information about people’s most sensitive locations. They outline this process in a new, peer-reviewed paper that will be presented at the Network and Distributed System Security Symposium next month. By analyzing clusters of coordinates, as well as timestamps on the tweets, LPAuditor was able to suss out where tens of thousands of people lived, worked, and spent their private time…(More)”.
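The core inference is surprisingly simple to sketch. The toy version below – with assumed field names, and nothing like the robustness of the real LPAuditor – snaps a user’s historical coordinates to roughly 100-meter grid cells, then labels the cell that dominates late-night tweeting as “home” and the one that dominates weekday office hours as “work”:

```python
from collections import Counter
from datetime import datetime

def snap(lat, lon, cell=0.001):
    """Quantize coordinates to a grid cell roughly 100 m on a side."""
    return (round(lat / cell), round(lon / cell))

def infer_key_locations(tweets):
    """tweets: iterable of dicts with 'lat', 'lon' and 'time' (datetime) keys."""
    night, office = Counter(), Counter()
    for t in tweets:
        cell = snap(t["lat"], t["lon"])
        hour = t["time"].hour
        if hour >= 22 or hour < 6:                        # late-night tweets: home
            night[cell] += 1
        elif 9 <= hour < 17 and t["time"].weekday() < 5:  # weekday office hours: work
            office[cell] += 1
    home = night.most_common(1)[0][0] if night else None
    work = office.most_common(1)[0][0] if office else None
    return home, work

# Three fabricated geotagged tweets are enough to see the mechanism.
tweets = [
    {"lat": 40.7128, "lon": -74.0060, "time": datetime(2014, 3, 1, 23, 15)},
    {"lat": 40.7128, "lon": -74.0061, "time": datetime(2014, 3, 2, 1, 40)},
    {"lat": 40.7484, "lon": -73.9857, "time": datetime(2014, 3, 3, 10, 5)},
]
print(infer_key_locations(tweets))  # one "home" cell, one "work" cell
```

Given years of historical metadata per user rather than three points, it is easy to see how such clusters become precise enough to identify homes and workplaces.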

Gradually, Then Suddenly


Blogpost by Tim O’Reilly: “There’s a passage in Ernest Hemingway’s novel The Sun Also Rises in which a character named Mike is asked how he went bankrupt. “Two ways,” he answers. “Gradually, then suddenly.”

Technological change happens in much the same way. Small changes accumulate, and suddenly the world is a different place. Throughout my career at O’Reilly Media, we’ve tracked and fostered a lot of “gradually, then suddenly” movements: the World Wide Web, open source software, big data, cloud computing, sensors and ubiquitous computing, and now the pervasive effects of AI and algorithmic systems on society and the economy.

What are some of the things that are in the middle of their “gradually, then suddenly” transition right now? The list is long; here are a few of the areas that are on my mind.

1) AI and algorithms are everywhere

The most important trend for readers of this newsletter to focus on is the development of new kinds of partnership between human and machine. We take for granted that algorithmic systems do much of the work at online sites like Google, Facebook, Amazon, and Twitter, but we haven’t fully grasped the implications. These systems are hybrids of human and machine. Uber, Lyft, and Amazon Robotics brought this pattern to the physical world, reframing the corporation as a vast, buzzing network of humans both guiding and guided by machines. In these systems, the algorithms decide who gets what and why; they’re changing the fundamentals of market coordination in ways that gradually, then suddenly, will become apparent.

2) The rest of the world is leapfrogging the US

The volume of mobile payments in China is $13 trillion versus the US’s $50 billion, and in China credit cards never took hold. Already Zipline’s on-demand drones are delivering 20% of all blood supplies in Rwanda and will be coming soon to other countries (including the US). In each case, the lack of existing infrastructure turned out to be an advantage in adopting a radically new model. Expect to see this pattern recur, as incumbents and old thinking hold back the adoption of new models.

9) The crisis of faith in government

Ever since Jennifer Pahlka and I began working on the Gov 2.0 Summit back in 2008, we’ve been concerned that if we can’t get government up to speed on 21st century technology, a critical pillar of the good society will crumble. When we started that effort, we were focused primarily on government innovation; over time, through Jen’s work at Code for America and the United States Digital Service, that shifted to a focus on making sure that government services actually work for those who need them most. Michael Lewis’s latest book, The Fifth Risk, highlights just how bad things might get if we continue to neglect and undermine the machinery of government. It’s not just the political fracturing of our country that should concern us; it’s the fact that government plays a critical role in infrastructure, in innovation, and in the safety net. That role has gradually been eroded, and the cracks that are appearing in the foundation of our society are coming at the worst possible time….(More)”.

Paying Users for Their Data Would Exacerbate Digital Inequality


Blog post by Eline Chivot: “Writing ever more complicated and intrusive rules about data processing and data use has become the new fad in policymaking. Many are lending an ear to tempting yet ill-advised proposals to treat personal data as a traditional finite resource. The latest example can be found in an article, A Blueprint for a Better Digital Society, by Glen Weyl, an economist at Microsoft Research, and Jaron Lanier, a computer scientist and writer. Not content with Internet users being able to access many online services like Bing and Twitter for free, they want online users to be paid in cash for the data they provide. To say that this proposal is flawed is an understatement. It’s flawed for three main reasons: 1) consumers would lose significant shared value in exchange for minimal cash compensation; 2) higher-income individuals would benefit at the expense of the poor; and 3) transaction costs would increase substantially, further reducing value for consumers and limiting opportunities for businesses to innovate with the data.

Weyl and Lanier’s argument is motivated by the belief that because Internet users are getting so many valuable services—like search, email, maps, and social networking—for free, they must be paying with their data. Therefore, they argue, if users are paying with their data, they should get something in return. Never mind that they do get something in return: valuable digital services that they do not pay for monetarily. But Weyl and Lanier say this is not enough, and consumers should get more.

While this idea may sound good on paper, in practice, it would be a disaster.

…Weyl and Lanier’s self-declared objective is to ensure digital dignity, but in practice this proposal would disrupt the equal treatment users receive from digital services today by valuing users based on their net worth. In this techno-socialist nirvana, to paraphrase Orwell, some pigs would be more equal than others. The French Data Protection Authority, CNIL, has itself raised concerns about treating data as a commodity, warning that doing so would jeopardize society’s humanist values and fundamental rights, which are, in essence, priceless.

To ensure “a better digital society,” companies should continue to be allowed to decide the best Internet business models based on what consumers demand. Data is neither cash nor a commodity, and pursuing policies based on this misconception will damage the digital economy and make the lives of digital consumers considerably worse….(More)”.

Innovations in satellite measurements for development


Ran Goldblatt, Trevor Monroe, Sarah Elizabeth Antos, Marco Hernandez at the World Bank Data Blog: “The desire of human beings to “think spatially” to understand how people and objects are organized in space has not changed much since Eratosthenes—the Greek astronomer best known as the “father of Geography”—first used the term “Geographika” around 250 BC. Centuries later, our understanding of economic geography is being propelled forward by new data and new capabilities to rapidly process, analyze and convert these vast data flows into meaningful and near real-time information.

The increasing availability of satellite data has transformed how we use remote sensing analytics to understand, monitor and achieve the 2030 Sustainable Development Goals. As satellite data becomes ever more accessible and frequent, it is now possible not only to better understand how the Earth is changing, but also to utilize these insights to improve decision making, guide policy, deliver services, and promote better-informed governance. Satellites capture many of the physical, economic and social characteristics of Earth, providing a unique asset for developing countries, where reliable socio-economic and demographic data is often not consistently available. Analysis of satellite data was once relegated to researchers with access to costly data or to “super computers”. Today, the increased availability of “free” satellite data, combined with powerful cloud computing and open source analytical tools, has democratized data innovation, enabling local governments and agencies to use satellite data to improve sector diagnostics, development indicators, program monitoring and service delivery.

Drivers of innovation in satellite measurements

  • Big (geo)data – satellites are improving every day, creating new opportunities for impact in development. They capture millions of images of Earth in different spatial, spectral and temporal resolutions, generating data of ever increasing volume, variety and velocity.
  • Open Source – open source annotated datasets, the World Bank’s Open Data, and other publicly available resources make it possible to process and document the data (e.g. Cumulus, Label Maker) and to perform machine learning analysis using common programming languages such as R or Python.
  • Crowd – crowdsourcing platforms like MTurk, Figure Eight and Tomnod are used to collect and enhance inputs (reference data) to train machines to automatically identify specific objects and land cover on Earth.
  • High Quality Ground Truth – robust algorithms that analyze the entire planet require diverse training data, and traditional development microdata can be used in machine learning training, validation and calibration, for example to map urbanization processes.
  • Cloud – cloud computing and data storage capabilities within platforms like AWS, Azure and Google Earth Engine provide scalable solutions for storage, management and parallel processing of large volumes of data.

…As petabytes of geodata are collected, novel methods are being developed to convert these data into meaningful information about the nature and pace of change on Earth – for example, the formation of urban landscapes and human settlements, the creation of transportation networks that connect cities, or the conversion of natural forests into productive agricultural land. New possibilities emerge for harnessing this data for a better understanding of our changing world….(More)”.
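As a concrete, if simplified, taste of such a conversion, the sketch below computes NDVI – the standard vegetation index used to track cropland expansion and forest loss – from two reflectance bands. The arrays are randomly generated stand-ins; a real pipeline would read the bands from imagery via tools like the cloud platforms named above:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in reflectance bands; real values would come from satellite imagery.
red = rng.uniform(0.05, 0.40, size=(4, 4))  # red band
nir = rng.uniform(0.20, 0.70, size=(4, 4))  # near-infrared band

# NDVI = (NIR - Red) / (NIR + Red), ranging -1..1; vegetation reflects strongly
# in near-infrared, so higher values indicate denser green cover.
ndvi = (nir - red) / (nir + red)

# A crude land-cover split. Thresholds like this one are exactly what the
# crowdsourced ground-truth data described above is used to calibrate.
vegetated = ndvi > 0.3
print(ndvi.round(2))
print(f"{vegetated.mean():.0%} of pixels classified as vegetated")
```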