Shedding light on government, one dataset at a time


Bill Below of the OECD Directorate for Public Governance and Territorial Development at OECD Insights: “…As part of its Open Government Data (OGD) work, the OECD has created OURdata, an index that assesses governments’ efforts to implement OGD in three critical areas: Openness, Usefulness and Re-usability. The results are promising. Those countries that began the process in earnest some five years ago today rank very high on the scale. According to this Index, which closely follows the principles of the G8 Open Data Charter, Korea is leading the implementation of OGD initiatives with France a close second.


Those who have started the process but who are lagging (such as Poland) can draw on the experience of other OECD countries, and benefit from a clear roadmap to guide them.

Indeed, bringing one’s own country’s weaknesses out into the light is the first, and sometimes most courageous, step towards achieving the benefits of OGD. Poland has just completed its Open Government Data country review with the OECD, revealing some sizable challenges ahead in transforming the internal culture of its institutions. For the moment, a supply-side rather than people-driven approach to data release is prevalent. Also, OGD in Poland is not widely understood to be a source of value creation and growth….(More)”
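As a rough illustration of how a multi-pillar index of this kind can be aggregated, here is a minimal sketch. The pillar scores, equal weights, and country labels are invented for the example; the excerpt does not describe the OECD’s actual scoring methodology.

```python
# Minimal sketch of aggregating a three-pillar composite index in the spirit
# of OURdata (Openness, Usefulness, Re-usability). All numbers below are
# hypothetical placeholders, not real OURdata scores or OECD methodology.

PILLARS = ("openness", "usefulness", "reusability")

def composite_score(pillar_scores):
    """Equal-weight average of the three pillar scores (assumed 0-1 scale)."""
    return sum(pillar_scores[p] for p in PILLARS) / len(PILLARS)

# Illustrative, made-up country scores.
countries = {
    "Country A": {"openness": 0.95, "usefulness": 0.90, "reusability": 0.92},
    "Country B": {"openness": 0.60, "usefulness": 0.45, "reusability": 0.50},
}

ranking = sorted(countries, key=lambda c: composite_score(countries[c]), reverse=True)
for name in ranking:
    print(f"{name}: {composite_score(countries[name]):.2f}")
```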

Data Reinvents Libraries for the 21st Century


In GovTech: “Libraries can evoke tired assumptions. It could be a stack of battered books and yesteryear movies; that odd odor of wilted pages and circa-1970s decor; or it could be a bout of stereotypes, like obsolete encyclopedias and ruler-snapping librarians.

Whatever the case, the truth is that today libraries are proving they’re more than mausoleums of old knowledge. They’re in a state of progressive reform, rethinking services and restructuring with data. It’s a national trend as libraries modernize, strategize and recast themselves as digital platforms. They’ve taken on the role of data curator for information coming in and citizen-generated data going out….

Nate Hill is among this band of progressives. A data zealot who believes in data’s capacity to drive innovation, the former deputy director of Tennessee’s Chattanooga Public Library led a charge to transform the library into a data-centric community hub. The library boasts an open data portal that it manages for the city, a civic hacker lab, a makerspace for community projects, and expanded access to in-person and online tutorials for coding and other digital skill sets….

The draw in data sharing and creating, Hill said, comes from the realization that today’s data channels are no longer one-way systems.

“I push people to the idea that now it’s about being a producer rather than just a consumer,” Hill said, “because really that whole idea of a read-write Web comes from the notion that you and I, for example, are just as capable of editing Wikipedia articles on the fly and changing information as anybody else.”

For libraries, Hill sees this as an opportunity and asks what institution could better pioneer the new frontier of information exchange. He posits that, since libraries were the original public content curators, adding open data to them is only natural. In fact, he says it’s a logical next step when considering that traditional media like books, research journals and other sources infuse data points with rich context — something most city and state open data portals typically don’t do.

“The dream here is to treat the library as a different kind of community infrastructure,” Hill said. “You can conceivably be feeding live data about a city into an open data portal, and at the same time, turning the library into a real live information source — rather than something just static.”

In Chattanooga, an effort is in the works to do just that. The library seeks to integrate open data into its catalog searches. Visitors researching Chattanooga’s waterfront could do a quick search and pull up local books, articles and mapping documents, but also a collection of the latest datasets on water pollution and land use, for example.
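As a sketch of what such a federated search might look like under the hood, the snippet below fans a single query out to a catalog API and an open data portal API and merges the results. Both endpoints and response shapes are hypothetical placeholders, not Chattanooga’s actual systems.

```python
# Hypothetical sketch of a federated library search that surfaces catalog
# holdings and open datasets side by side. The endpoint URLs and JSON shapes
# are invented placeholders, not real library or city APIs.
import requests

CATALOG_API = "https://catalog.example-library.org/search"   # hypothetical
PORTAL_API = "https://data.example-city.gov/api/search"      # hypothetical

def federated_search(query):
    results = []
    # Books, articles, and mapping documents from the traditional catalog.
    catalog = requests.get(CATALOG_API, params={"q": query}, timeout=10).json()
    results += [{"source": "catalog", "title": item["title"]}
                for item in catalog.get("items", [])]
    # Live datasets published through the city's open data portal.
    portal = requests.get(PORTAL_API, params={"q": query}, timeout=10).json()
    results += [{"source": "open data", "title": ds["name"]}
                for ds in portal.get("datasets", [])]
    return results

# e.g. federated_search("waterfront") could return local histories alongside
# current water-pollution and land-use datasets.
```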

Eyeing the library data movement at scale, Hill said he could easily envision a network of public libraries that act as local data hubs, retrieving and funneling data into larger state and national data portals….(More).

Lawsuits In The Public Interest Now Have Their Own Crowdfunding Site


Jessica Leber at Fast CoExist: “…Crowdjustice …aims to serve as an alternative for those who can no longer get help launching a lawsuit. It’s a crowdfunding platform where anyone can donate to fund cases that have been vetted by the site.

Operating in the U.K. for now, Crowdjustice will be reserved for civil cases that involve a community interest (rather than a dispute between two people), such as a disability discrimination or human rights case or a fight to save nature from development. Litigants must already have a lawyer who has accepted the case in order to be listed on the platform, which is for now invitation-only.

Niche crowdfunding sites aren’t new, but they’ve had varying success. Crowdjustice founder Julia Salasky believes a site dedicated to legal campaigns is useful because litigation varies from country to country. Another reason is that on Crowdjustice, the funds raised go directly into a trust account held by the lawyer for the case—not the individual. “There’s more oversight of what’s happening,” she says. Campaigns will only be funded if the full goal is met, and Crowdjustice takes a 5% cut.
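For concreteness, the funding rules described above reduce to a simple settlement calculation, sketched below under the assumption that the 5% cut is the only deduction; details the article doesn’t cover, such as payment-processing charges, are ignored.

```python
# Sketch of the all-or-nothing settlement described above: if the goal is met,
# Crowdjustice keeps 5% and the rest goes to the lawyer's trust account;
# otherwise nothing is collected. Payment-processing fees are not modeled.

PLATFORM_FEE = 0.05

def settle_campaign(goal, pledged):
    """Amount paid into the lawyer's trust account (0.0 if the goal is unmet)."""
    if pledged < goal:
        return 0.0  # all-or-nothing: pledges are not collected
    return pledged * (1 - PLATFORM_FEE)

print(settle_campaign(goal=7800, pledged=7800))  # 7410.0
print(settle_campaign(goal=7800, pledged=5000))  # 0.0
```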

The first case on the site right now is that of Colombian petroleum engineer and whistleblower Gilberto Torres, who wants to hold the oil company BP responsible for his 2002 kidnapping. Though his lawyer will only get a fee if the case wins, the $7,800 crowdfunding campaign is to cover various other costs and court fees….(More).

Putting Open at the Heart of the Digital Age


Presentation by Rufus Pollock: “….To repeat then: technology is NOT teleology. The medium is NOT the message – and it’s the message that matters.

The printing press made possible an “open” bible but it was Tyndale who made it open – and it was the openness that mattered.

Digital technology gives us unprecedented potential for creativity, for sharing, for freedom. But these are possible, not inevitable. Technology alone does not make the choice for us.

Remember that we’ve been here before: the printing press was revolutionary but we still ended up with a print media that was often dominated by the few and the powerful.

Think of radio. If you read how people talked about it in the 1910s and 1920s, it sounds like the way we talk about the Internet today. Radio was going to revolutionize human communications and society. It was going to enable a peer-to-peer world where everyone could broadcast, it was going to allow new forms of democracy and politics, etc. What happened? We got a one-way medium, controlled by the state and a few huge corporations.

Look around you today.

The Internet’s costless transmission can just as easily create information empires and information robber barons as digital democracy and information equality – and it is creating them.

We already know that this technology offers unprecedented opportunities for surveillance, for monitoring, for tracking. It can just as easily exploit us as empower us.

We need to put openness at the heart of this information age, and at the heart of the Net, if we are really to realize its possibilities for freedom, empowerment, and connection.

The fight, then, is for the soul of this information age, and we have a choice.

A choice of open versus closed.

Of collaboration versus control.

Of empowerment versus exploitation.

It’s a long road ahead – longer perhaps than our lifetimes. But we can walk it together.

In this 21st century knowledge revolution, William Tyndale isn’t one person. It’s all of us, making small and big choices: from getting governments and private companies to release their data, to building open databases and infrastructures together; from choosing apps on your phone that are built on openness, to using social networks that give you control of your data rather than taking it from you.

Let’s choose openness, let’s choose freedom, let’s choose the infinite possibilities of this digital age by putting openness at its heart….(More)”- See also PowerPoint Presentation

Algorithmic Citizenship


Citizen-Ex: “Algorithmic Citizenship is a new form of citizenship, one where your citizenship, and therefore both your allegiances and your rights, are constantly being questioned, calculated, and rewritten.

Most people are assigned a citizenship at birth, in one of two ways. You may receive your citizenship from the place you’re born, which is called jus soli, or the right of soil: if you’re born in a place, you’re a citizen of that place. This is true in much of North and South America, for example – but not in much of the rest of the world. Or you may get your citizenship from your parents’ citizenship, which is called jus sanguinis, or the right of blood. Everybody is supposed to have a citizenship, although millions of stateless people do not, as a result of war, migration or the collapse of existing states. Many people also change citizenship over the course of their life, through various legal mechanisms. Some countries allow you to hold more than one citizenship at once, and some do not.

Having a citizenship means that you have a place in the world, an allegiance to a state. That state is supposed to guarantee you certain rights, like freedom from arrest, imprisonment, torture, or surveillance – depending on which state you belong to. Hannah Arendt famously said that “citizenship is the right to have rights”. To tamper with one’s citizenship is to endanger one’s most fundamental rights. Without citizenship, we have no rights at all.

Algorithmic Citizenship is a form of citizenship which is not assigned at birth, or through complex legal documents, but through data. Like other computerised processes, it can happen at the speed of light, and it can happen over and over again, constantly revising and recalculating. It can split a single citizenship into an infinite number of sub-citizenships, and count and weight them over time to produce combinations of affiliations to different states.

Citizen Ex calculates your Algorithmic Citizenship based on where you go online. Every site you visit is counted as evidence of your affiliation to a particular place, and added to your constantly revised Algorithmic Citizenship. Because the internet is everywhere, you can go anywhere – but because the internet is real, this also has consequences….(More)”
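A minimal sketch of that idea: map each visited site to a country, tally the visits, and normalize into a shifting distribution of affiliations. The domain-to-country table and browsing history below are invented, and Citizen Ex’s actual method (which traces the servers behind each request) is more involved.

```python
# Hypothetical sketch of an "algorithmic citizenship" calculation: each site
# visited counts as evidence of affiliation with a place, and the tallies are
# normalized into a constantly revised distribution. The domain-to-country
# mapping below is an invented placeholder, not Citizen Ex's actual data.
from collections import Counter

DOMAIN_COUNTRY = {
    "bbc.co.uk": "GB",
    "lemonde.fr": "FR",
    "nytimes.com": "US",
    "wikipedia.org": "US",
}

def algorithmic_citizenship(visits):
    """Return a normalized distribution of country affiliations."""
    tallies = Counter(DOMAIN_COUNTRY.get(d, "unknown") for d in visits)
    total = sum(tallies.values())
    return {country: count / total for country, count in tallies.items()}

history = ["bbc.co.uk", "nytimes.com", "wikipedia.org", "bbc.co.uk"]
print(algorithmic_citizenship(history))  # {'GB': 0.5, 'US': 0.5}
```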

A Research Roadmap for Human Computation


Emerging Technology from the arXiv: “The wisdom of the crowd has become so powerful and so accessible via the Internet that it has become a resource in its own right. Various services now tap into this rich supply of human cognition, such as Wikipedia, Duolingo, and Amazon’s Mechanical Turk.

So important is this resource that scientists have given it a name; they call it human computation. And a rapidly emerging and increasingly important question is how best to exploit it.

Today, we get an answer of sorts thanks to a group of computer scientists, crowdsourcing pioneers, and visionaries who have created a roadmap for research into human computation. The team, led by Pietro Michelucci at the Human Computation Institute, points out that human computation systems have been hugely successful at tackling complex problems, from identifying spiral galaxies to organizing disaster relief.

But their potential is even greater still, provided that human cognition can be efficiently harnessed on a global scale. Last year, they met to discuss these issues and have now published the results of their debate.

They begin by pointing out the extraordinary successes of human computation… then describe the kinds of projects they want to create. They call one idea Project Houston, after the crowdsourced effort on the ground that helped bring back the Apollo 13 astronauts after an on-board explosion on the way to the moon.

Their idea is that similar help can be brought to bear from around the world when individuals on earth find themselves in trouble. By this they mean individuals who might be considering suicide or suffering from depression, for example.

The plan is to use state-of-the-art speech analysis and natural language understanding to detect stress and offer help. This would come in the form of composite personalities made up from individuals with varying levels of expertise in the crowd, supported by artificial intelligence techniques. “Project Houston could provide a consistently kind and patient personality even if the ‘crowd’ changes completely over time,” they say.

Another idea is to build on the way that crowdsourcing helps people learn. One example of this is Duolingo, an app that offers free language lessons while simultaneously acting as a document translation service. “Why stop with language learning and translation?” they ask.

A similar approach could help people learn new skills as they work online, a process that should allow them to take on more complex roles. One example is in the field of radiology, where an important job is to recognize tumors on x-ray images. This is a task that machine vision algorithms do not yet perform reliably…

Yet another idea would be to crowdsource information that helps the poorest families in America find social welfare programs. These programs are often difficult to navigate and represent a disproportionate hardship for the people who are most likely to benefit from them: those who are homeless, who have disabilities, who are on low income, and so on.

The idea is that the crowd should take on some of this burden, freeing up this group for other tasks, like finding work, managing health problems, and so on.

These are worthy goals but they raise some significant questions. Chief among these is the nature of the ethical, legal, and social implications of human computation. How can this work be designed to allow meaningful and dignified human participation? How can the outcomes be designed so that the most vulnerable people can benefit from it? And what is the optimal division of labor between machines and humans to produce a specific result?

Ref: arxiv.org/abs/1505.07096: A U.S. Research Roadmap for Human Computation”

Nudges Do Not Undermine Human Agency


Cass R. Sunstein in the Journal of Consumer Policy: “Some people believe that nudges undermine human agency, but with appropriate nudges, neither agency nor consumer freedom is at risk. On the contrary, nudges can promote both goals. In some contexts, they are indispensable. There is no opposition between education on the one hand and nudges on the other. Many nudges are educative. Even when they are not, they can complement, and not displace, consumer education….(More)”.

Field experimenting in economics: Lessons learned for public policy


Robert Metcalfe at OUP Blog: “Do neighbourhoods matter to outcomes? Which classroom interventions improve educational attainment? How should we raise money to provide important and valued public goods? Do energy prices affect energy demand? How can we motivate people to become healthier, greener, and more cooperative? These are some of the most challenging questions policy-makers face. Academics have been trying to understand and uncover these important relationships for decades.

Many of the empirical tools available to economists to answer these questions do not allow causal relationships to be detected. Field experiments represent a relatively new methodological approach capable of measuring the causal links between variables. By overlaying carefully designed experimental treatments on real people performing tasks common to their daily lives, economists are able to answer interesting and policy-relevant questions that were previously intractable. Manipulation of market environments allows these economists to uncover the hidden motivations behind economic behaviour more generally. A central tenet of field experiments in the policy world is that governments should understand the actual behavioural responses of their citizens to changes in policies or interventions.
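To make that logic concrete, here is a stylized simulation assuming a hypothetical energy-use intervention: because units are randomly assigned to treatment or control, the difference in mean outcomes estimates the causal effect. The data and effect size are invented for illustration.

```python
# Stylized sketch of why field experiments identify causal effects: random
# assignment makes treatment and control groups comparable, so the difference
# in mean outcomes estimates the average treatment effect (ATE). The simulated
# "energy use" outcomes and the -5 effect are invented for illustration.
import random
from statistics import mean

random.seed(42)

households = range(1000)
treated = set(random.sample(list(households), k=500))  # random assignment

def observed_outcome(h):
    baseline = random.gauss(100, 15)     # daily energy use, arbitrary units
    effect = -5 if h in treated else 0   # assumed true treatment effect
    return baseline + effect

outcomes = {h: observed_outcome(h) for h in households}
ate = (mean(v for h, v in outcomes.items() if h in treated)
       - mean(v for h, v in outcomes.items() if h not in treated))
print(f"Estimated ATE: {ate:.2f} (true effect: -5)")
```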

Field experiments represent a departure from laboratory experiments. Traditionally, laboratory experiments create experimental settings with tight control over the decision environment of undergraduate students. While these studies also allow researchers to make causal statements, policy-makers are often concerned that subjects in these experiments may behave differently in settings where they know they are being observed, or when they are permitted to sort out of the market.

For example, you might expect a college student to contribute more to charity when she is scrutinized in a professor’s lab than when she can avoid the ask altogether. Field experiments allow researchers to make these causal statements in a setting that is more generalizable to the behaviour policy-makers are directly interested in.

Traditionally, policy-makers have gathered relevant information and data by using focus groups, qualitative evidence, or observational data, without a way to identify causal mechanisms. It is quite easy to elicit people’s intentions about how they would behave in response to a new policy or intervention, but there is increasing evidence that people’s intentions are a poor guide to predicting their behaviour.

However, we are starting to see a small change in how governments seek to answer pertinent questions. For instance, the UK tax office (Her Majesty’s Revenue and Customs) now uses field experiments across some of its services to make more effective use of scarce taxpayer money. In the US, there are movements toward gathering more evidence from field experiments.

In the corporate world, experimenting is not new. Many of the current large online companies—such as Amazon, Facebook, Google, and Microsoft—are constantly using field experiments matched with big data to improve their products and deliver better services to their customers. More and more companies will use field experiments over time to help them better set prices, tailor advertising, provide a better customer journey to increase welfare, and employ more productive workers…(More).

See also Field Experiments in the Developed World: An Introduction (Oxford Review of Economic Policy)

Why Technology Hasn’t Delivered More Democracy


Collection of POVs aggregated by Thomas Carothers at Foreign Policy: “New technologies offer important tools for empowerment — yet democracy is stagnating. What’s up?…

The current moment confronts us with a paradox. The first fifteen years of this century have been a time of astonishing advances in communications and information technology, including digitalization, mass-accessible video platforms, smartphones, social media, billions of people gaining internet access, and much else. These revolutionary changes all imply a profound empowerment of individuals through exponentially greater access to information, tremendous ease of communication and data-sharing, and formidable tools for networking. Yet despite these changes, democracy — a political system based on the idea of the empowerment of individuals — has in these same years become stagnant in the world. The number of democracies today is basically no greater than it was at the start of the century. Many democracies, both long-established ones and newer ones, are experiencing serious institutional debilities and weak public confidence.

How can we reconcile these two contrasting global realities — the unprecedented advance of technologies that facilitate individual empowerment and the overall lack of advance of democracy worldwide? To help answer this question, I asked six experts on political change, all from very different professional and national perspectives. Here are their responses, followed by a few brief observations of my own.

1. Place a Long Bet on the Local By Martin Tisné

2. Autocrats Know How to Use Tech, Too By Larry Diamond

3. Limits on Technology Persist By Senem Aydin Düzgit

4. The Harder Task By Rakesh Rajani

5. Don’t Forget Institutions By Diane de Gramont

6. Mixed Lessons from Iran By Golnaz Esfandiari

7. Yes, It’s Complicated by Thomas Carothers…(More)”

Signal: Understanding What Matters in a World of Noise


Book by Stephen Few: “In this age of so-called Big Data, organizations are scrambling to implement new software and hardware to increase the amount of data they collect and store. However, in doing so they are unwittingly making it harder to find the needles of useful information in the rapidly growing mounds of hay. If you don’t know how to differentiate signals from noise, adding more noise only makes things worse. When we rely on data for making decisions, how do we tell what qualifies as a signal and what is merely noise? In and of itself, data is neither. Assuming that data is accurate, it is merely a collection of facts. When a fact is true and useful, only then is it a signal. When it’s not, it’s noise. It’s that simple. In Signal, Stephen Few provides the straightforward, practical instruction in everyday signal detection that has been lacking until now. Using data visualization methods, he teaches how to apply statistics to gain a comprehensive understanding of one’s data and adapts the techniques of Statistical Process Control in new ways to detect not just changes in the metrics but also changes in the patterns that characterize data…(More)”
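As one concrete example of the kind of signal detection the book teaches, here is a minimal sketch of a basic 3-sigma control-chart rule from Statistical Process Control, with invented metric values. Production control charts typically estimate sigma from moving ranges, and Few’s own adaptations go further than this simple rule.

```python
# Minimal sketch of a basic Statistical Process Control rule: flag any
# observation outside three standard deviations of the mean as a potential
# signal. The daily metric values below are invented for illustration, and
# this simple rule is only a starting point for the adaptations Few describes.
from statistics import mean, stdev

daily_orders = [102, 98, 101, 97, 103, 99, 100, 96, 104, 141, 98, 102]

center = mean(daily_orders)
sigma = stdev(daily_orders)
upper, lower = center + 3 * sigma, center - 3 * sigma

for day, value in enumerate(daily_orders, start=1):
    if not (lower <= value <= upper):
        print(f"Day {day}: {value} falls outside [{lower:.1f}, {upper:.1f}] -- signal")
```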