Seven design principles for using blockchain for social impact


Stefaan Verhulst at Apolitical: “2018 will probably be remembered as the year the blockchain hype went bust. Yet even as cryptocurrencies continue to sink in value and popular interest, the potential of using blockchain technologies to achieve social ends remains important to consider but poorly understood.

In 2019, businesses will continue to explore blockchain for sectors as disparate as finance, agriculture, logistics and healthcare. Policymakers and social innovators should also leverage 2019 to become more sophisticated about blockchain’s real promise, limitations and current practice.

In a recent report I prepared with Andrew Young, with the support of the Rockefeller Foundation, we looked at the potential risks and challenges of using blockchain for social change — or “Blockchan.ge.” A number of implementations and platforms are already demonstrating potential social impact.

The technology is now being used to address issues as varied as homelessness in New York City, the Rohingya crisis in Myanmar and government corruption around the world.

In an illustration of the breadth of current experimentation, Stanford’s Center for Social Innovation recently analysed and mapped nearly 200 organisations and projects trying to create positive social change using blockchain. Likewise, the GovLab is developing a mapping of blockchange implementations across regions and topic areas; it currently contains 60 entries.

All these examples provide impressive — and hopeful — proof of concept. Yet despite the very clear potential of blockchain, there has been little systematic analysis. For what types of social impact is it best suited? Under what conditions is it most likely to lead to real social change? What challenges does blockchain face, what risks does it pose and how should these be confronted and mitigated?

These are just some of the questions our report, which builds its analysis on 10 case studies assembled through original research, seeks to address.

While the report is focused on identity management, it contains a number of lessons and insights that are applicable more generally to the subject of blockchange.

In particular, it contains seven design principles that can guide individuals or organisations considering the use of blockchain for social impact. We call these the Genesis principles, and they are outlined at the end of this article…(More)”.

Implementing Public Policy: Is it possible to escape the ‘Public Policy Futility’ trap?


Blogpost by Matt Andrews:


“Polls suggest that governments across the world face high levels of citizen dissatisfaction, and low levels of citizen trust. The 2017 Edelman Trust Barometer found, for instance, that only 43% of those surveyed trust Canada’s government. Only 15% of those surveyed trust government in South Africa, and levels are low in other countries too—including Brazil (at 24%), South Korea (28%), the United Kingdom (36%), Australia, Japan, and Malaysia (37%), Germany (38%), Russia (45%), and the United States (47%). Similar surveys find trust in government averaging only 40-45% across member countries of the Organization for Economic Cooperation and Development (OECD), and suggest that as few as 31% and 32% of Nigerians and Liberians trust government.

There are many reasons why trust in government is deficient in so many countries, and these reasons differ from place to place. One common factor across many contexts, however, is a lack of confidence that governments can or will address key policy challenges faced by citizens.

Studies show that this confidence deficiency stems from citizens’ observations of, or experiences with, past public policy failures, which promote jaundiced views of their public officials’ capabilities to deliver. Put simply, citizens lose faith in government when they observe government failing to deliver on policy promises, or to ‘get things done’. Incidentally, studies show that public officials also often lose faith in their own capabilities (and those of their organizations) when they observe, experience or participate in repeated policy implementation failures. Put simply, again, these public officials lose confidence in themselves when they repeatedly fail to ‘get things done’.

I call this the ‘public policy futility’ trap—where past public policy failure leads to a lack of confidence in the potential of future policy success, which feeds actual public policy failure, which generates more questions of confidence, in a vicious self-fulfilling prophecy. I believe that many governments—and public policy practitioners working within governments—are caught in this trap, and just don’t believe that they can muster the kind of public policy responses needed by their citizens.

Along with my colleagues at the Building State Capability (BSC) program, I believe that many policy communities are caught in this trap, to some degree or another. Policymakers in these communities keep coming up with ideas, and political leaders keep making policy promises, but no one really believes the ideas will solve the problems that need solving or produce the outcomes and impacts that citizens need. Policy promises under such circumstances center on doing what policymakers are confident they can actually implement: producing research and position papers and plans, or allocating inputs toward the problem (in a budget, for instance), or sponsoring visible activities (holding meetings or engaging high profile ‘experts’ for advice), or producing technical outputs (like new organizations, or laws). But they hold back from promising real solutions to real problems, as they know they cannot really implement them (given past political opposition, perhaps, or the experience of seemingly intractable coordination challenges, or cultural pushback, and more)….(More)”.

Sludge and Ordeals


Paper by Cass R. Sunstein: “In 2015, the United States government imposed 9.78 billion hours of paperwork burdens on the American people. Many of these hours are best categorized as “sludge,” reducing access to important licenses, programs, and benefits. Because of the sheer costs of sludge, rational people are effectively denied life-changing goods and services; the problem is compounded by the existence of behavioral biases, including inertia, present bias, and unrealistic optimism. In principle, a serious deregulatory effort should be undertaken to reduce sludge, through automatic enrollment, greatly simplified forms, and reminders. At the same time, sludge can promote legitimate goals.

First, it can protect program integrity, which means that policymakers might have to make difficult tradeoffs between (1) granting benefits to people who are not entitled to them and (2) denying benefits to people who are entitled to them. Second, it can overcome impulsivity, recklessness, and self-control problems. Third, it can prevent intrusions on privacy. Fourth, it can serve as a rationing device, ensuring that benefits go to people who most need them. In most cases, these defenses of sludge turn out to be more attractive in principle than in practice.

For sludge, a form of cost-benefit analysis is essential, and it will often argue in favor of a neglected form of deregulation: sludge reduction. For both public and private institutions, “Sludge Audits” should become routine. Various suggestions are offered for new action by the Office of Information and Regulatory Affairs, which oversees the Paperwork Reduction Act; for courts; and for Congress…(More)”.

The global race is on to build ‘City Brains’


Prediction by Geoff Mulgan, Eva Grobbink and Vincent Straub: “The USSR’s launch of the Sputnik 1 satellite in 1957 was a major psychological blow to the United States. The US had believed it was technologically far ahead of its rival, but was confronted with proof that the USSR was pulling ahead in some fields. After a bout of soul-searching the country responded with extraordinary vigour, massively increasing investment in space technologies and promising to put a man on the Moon by the end of the 1960s.

In 2019, China’s success in smart cities could prompt a similar “Sputnik Moment” for the rest of the world. It may not be as dramatic as that of 1957. But unlike beeping satellites and Moon landings, it could be coming to a town near you….

The concept of a “smart city” has been around for several decades, often associated with hype, grandiose failures, and an overemphasis on hardware rather than people (Nesta has previously written on how we can rethink smart cities and ensure digital innovation realises the potential of technology and people). But various technologies are now coming of age which bring the vision of a smart city closer to fruition. China is at the forefront, investing heavily in sensors and infrastructure, and its ET City Brain project shows just how far the country’s thinking has progressed.

First launched in September 2016, ET City Brain is a collaboration between Chinese technology giant Alibaba and several cities. It was first trialled in Hangzhou, the hometown of Alibaba’s executive chairman, Jack Ma, but has since expanded to other Chinese cities. Earlier this year, Kuala Lumpur became the first city outside of China to import the ET City Brain model.

The ET City Brain system gathers large amounts of data (including logs, videos, and data streams) from sensors. These are then processed by algorithms in supercomputers and fed back into control centres around the city for administrators to act on—in some cases, automation means the system works without any human intervention at all.

So far, the project has been used to monitor congestion in Hangzhou, improve the response of emergency services in Guangzhou, and detect traffic accidents in Suzhou. In Hangzhou, Alibaba was given control of 104 traffic light junctions in the city’s Xiaoshan district and tasked with managing traffic flows. By combining mass video surveillance with live data from public transportation systems, ET City Brain was able to autonomously change traffic lights so that emergency vehicles could travel to accident scenes without interruption. As a result, arrival times for ambulances improved by 49 percent….(More)”.

Cybersecurity of the Person


Paper by Jeff Kosseff: “U.S. cybersecurity law is largely an outgrowth of the early-aughts concerns over identity theft and financial fraud. Cybersecurity laws focus on protecting identifiers such as driver’s licenses and social security numbers, and financial data such as credit card numbers. Federal and state laws require companies to protect this data and notify individuals when it is breached, and impose civil and criminal liability on hackers who steal or damage this data. In this paper, I argue that our current cybersecurity laws are too narrowly focused on financial harms. While such concerns remain valid, they are only one part of the cybersecurity challenge that our nation faces.

Too often overlooked by the cybersecurity profession are the harms to individuals, such as revenge pornography and online harassment. Our legal system typically addresses these harms through retrospective criminal prosecution and civil litigation, both of which face significant limits. Accounting for such harms in our conception of cybersecurity will help to better align our laws with these threats and reduce the likelihood of the harms occurring….(More)”.

Bad Landlord? These Coders Are Here to Help


Luis Ferré-Sadurní in the New York Times: “When Dan Kass moved to New York City in 2013 after graduating from college in Boston, his introduction to the city was one that many New Yorkers are all too familiar with: a bad landlord….

Examples include an app called Heatseek, created by students at a coding academy, that allows tenants to record and report the temperature in their homes to ensure that landlords don’t skimp on the heat. There’s also the Displacement Alert Project, built by a coalition of affordable housing groups, that maps out buildings and neighborhoods at risk of displacement.
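The check at the heart of an app like Heatseek is easy to picture in code. The sketch below is hypothetical, assuming the commonly cited New York City heat-season rules (October 1 to May 31: at least 68°F indoors during the day when it drops below 55°F outside, and at least 62°F indoors at night); the function names and thresholds are illustrative, not Heatseek’s actual implementation.

```python
from datetime import datetime

def in_heat_season(ts: datetime) -> bool:
    """NYC heat season runs October 1 through May 31 (assumed here)."""
    return ts.month >= 10 or ts.month <= 5

def is_reportable(ts: datetime, indoor_f: float, outdoor_f: float) -> bool:
    """Flag a sensor reading that appears to violate the heat rules.

    Day (6am-10pm): below 55F outside requires at least 68F inside.
    Night (10pm-6am): at least 62F inside, regardless of outdoor temp.
    """
    if not in_heat_season(ts):
        return False
    if 6 <= ts.hour < 22:  # daytime window
        return outdoor_f < 55 and indoor_f < 68
    return indoor_f < 62   # nighttime minimum

# A 2am indoor reading of 58F in January falls below the 62F minimum.
print(is_reportable(datetime(2019, 1, 15, 2, 0), indoor_f=58, outdoor_f=30))  # True
```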

Now, many of these civic coders are trying to band together and formalize a community.

For more than a year, Mr. Kass and other housing-data wonks have met each month at a shared work space in Brooklyn to exchange ideas about projects and talk about data sets over beer and snacks. Some come from prominent housing advocacy groups; others work unrelated day jobs. They informally call themselves the Housing Data Coalition.

“The real estate industry has many more programmers, many more developers, many more technical tools at their disposal,” said Ziggy Mintz, 30, a computer programmer who is part of the coalition. “It never quite seems fair that the tenant side of the equation doesn’t have the same tools.”

“Our collaboration is a counteracting force to that,” said Lucy Block, a research and policy associate at the Association for Neighborhood & Housing Development, the group behind the Displacement Alert Project. “We are trying to build the capacity to fight the displacement of low-income people in the city.”

This week, Mr. Kass and his team at JustFix.nyc, a nonprofit technology start-up, launched a new database for tenants that was built off ideas raised during those monthly meetings.

The tool, called Who Owns What, allows tenants to punch in an address and look up other buildings associated with the landlord or management company. It might sound inconsequential, but the tool goes a long way in piercing the veil of secrecy that shrouds the portfolios of landlords….(More)”.
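The linking idea is straightforward to sketch. The toy example below, with synthetic records and invented field names, connects buildings through a shared registered business address, one signal such a tool can draw from New York’s public HPD registration data; it illustrates the approach, not JustFix.nyc’s actual code.

```python
import pandas as pd

# Synthetic HPD-style registration records (field names are assumptions
# made for illustration).
registrations = pd.DataFrame([
    {"building": "123 Main St", "agent": "Acme Mgmt LLC", "biz_addr": "1 Broadway"},
    {"building": "45 Oak Ave",  "agent": "Acme Mgmt LLC", "biz_addr": "1 Broadway"},
    {"building": "9 Elm St",    "agent": "J. Doe",        "biz_addr": "77 Pine St"},
])

def portfolio_for(address: str) -> list:
    """All buildings whose registration shares this building's business address."""
    match = registrations.loc[registrations["building"] == address]
    if match.empty:
        return []
    biz = match.iloc[0]["biz_addr"]
    return registrations.loc[registrations["biz_addr"] == biz, "building"].tolist()

print(portfolio_for("123 Main St"))  # ['123 Main St', '45 Oak Ave']
```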

To Reduce Privacy Risks, the Census Plans to Report Less Accurate Data


Mark Hansen at the New York Times: “When the Census Bureau gathered data in 2010, it made two promises. The form would be “quick and easy,” it said. And “your answers are protected by law.”

But mathematical breakthroughs, easy access to more powerful computing, and widespread availability of large and varied public data sets have made the bureau reconsider whether the protection it offers Americans is strong enough. To preserve confidentiality, the bureau’s directors have determined they need to adopt a “formal privacy” approach, one that adds uncertainty to census data before it is published and achieves privacy assurances that are provable mathematically.

The census has always added some uncertainty to its data, but a key innovation of this new framework, known as “differential privacy,” is a numerical value describing how much privacy loss a person will experience. It determines the amount of randomness — “noise” — that needs to be added to a data set before it is released, and sets up a balancing act between accuracy and privacy. Too much noise would mean the data would not be accurate enough to be useful — in redistricting, in enforcing the Voting Rights Act or in conducting academic research. But too little, and someone’s personal data could be revealed.
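For a single count, the standard way to realize this balancing act is the Laplace mechanism, which calibrates the noise to the privacy-loss parameter (usually written epsilon). The sketch below is a textbook illustration only, not the bureau’s production system, which must handle an entire interlocking set of census tables.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def noisy_count(true_count: int, epsilon: float) -> float:
    """Release a count under epsilon-differential privacy (Laplace mechanism).

    Adding or removing one person changes a count by at most 1, so noise
    drawn from Laplace(scale = 1/epsilon) suffices. Smaller epsilon means
    more noise: stronger privacy, lower accuracy.
    """
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

block_population = 212
print(noisy_count(block_population, epsilon=1.0))  # typically close to 212
print(noisy_count(block_population, epsilon=0.1))  # may be off by tens
```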

On Thursday, the bureau will announce the trade-off it has chosen for data publications from the 2018 End-to-End Census Test it conducted in Rhode Island, the only dress rehearsal before the actual census in 2020. The bureau has decided to enforce stronger privacy protections than companies like Apple or Google had when they each first took up differential privacy….

In presentation materials for Thursday’s announcement, special attention is paid to lessening any problems with redistricting: the potential complications of using noisy counts of voting-age people to draw district lines. (By contrast, in 2000 and 2010 the swapping mechanism produced exact counts of potential voters down to the block level.)

The Census Bureau has been an early adopter of differential privacy. Still, instituting the framework on such a large scale is not an easy task, and even some of the big technology firms have had difficulties. For example, shortly after Apple’s announcement in 2016 that it would use differential privacy for data collected from its macOS and iOS operating systems, it was revealed that the actual privacy loss of their systems was much higher than advertised.

Some scholars question the bureau’s abandonment of techniques like swapping in favor of differential privacy. Steven Ruggles, Regents Professor of history and population studies at the University of Minnesota, has relied on census data for decades. Through the Integrated Public Use Microdata Series, he and his team have regularized census data dating to 1850, providing consistency between questionnaires as the forms have changed, and enabling researchers to analyze data across years.

“All of the sudden, Title 13 gets equated with differential privacy — it’s not,” he said, adding that if you make a guess about someone’s identity from looking at census data, you are probably wrong. “That has been regarded in the past as protection of privacy. They want to make it so that you can’t even guess.”

“There is a trade-off between usability and risk,” he added. “I am concerned they may go far too far on privileging an absolutist standard of risk.”

In a working paper published Friday, he said that with the number of private services offering personal data, a prospective hacker would have little incentive to turn to public data such as the census “in an attempt to uncover uncertain, imprecise and outdated information about a particular individual.”…(More)”.

New methods help identify what drives sensitive or socially unacceptable behaviors


Mary Guiden at Phys.org: “Conservation scientists and statisticians at Colorado State University have teamed up to solve a key problem for the study of sensitive behaviors like poaching, harassment, bribery, and drug use.

Sensitive behaviors—defined as socially unacceptable or not compliant with rules and regulations—are notoriously hard to study, researchers say, because people often do not want to answer direct questions about them.

To overcome this challenge, scientists have developed indirect questioning approaches that protect responders’ identities. However, these methods also make it difficult to predict which sectors of a population are more likely to participate in sensitive behaviors, and which factors, such as knowledge of laws, education, or income, influence the probability that an individual will engage in a sensitive behavior.

Assistant Professor Jennifer Solomon and Associate Professor Michael Gavin of the Department of Human Dimensions of Natural Resources at CSU, and Abu Conteh from MacEwan University in Alberta, Canada, have teamed up with Professor Jay Breidt and doctoral student Meng Cao in the CSU Department of Statistics to develop a new method to solve the problem.

The study, “Understanding the drivers of sensitive behavior using Poisson regression from quantitative randomized response technique data,” was published recently in PLOS One.

Conteh, who, as a doctoral student, worked with Gavin in New Zealand, used a specific technique, known as quantitative randomized response, to elicit confidential answers to questions on behaviors related to non-compliance with natural resource regulations from a protected area in Sierra Leone.

In this technique, the researcher conducting interviews has a large container of pingpong balls, some with numbers and some without. The interviewer asks the respondent to pick a ball at random, without revealing it to the interviewer. If the ball has a number, the respondent tells the interviewer the number. If the ball does not have a number, the respondent reveals how many times he or she illegally hunted animals in a given time period….
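The logic of recovering population-level estimates from these masked answers can be sketched in a few lines. The simulation below uses made-up parameters (not those of the Sierra Leone study): because the researcher knows both the share of numbered balls and the distribution of numbers on them, the mean of the true, unobserved counts can be backed out from the mean of the mixed responses. The published paper goes further, fitting a Poisson regression to such data, but this inversion step is the core idea.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

p_numbered = 0.3                          # share of balls carrying a number
ball_numbers = np.array([0, 1, 2, 3, 4])  # distribution known to the researcher
true_counts = rng.poisson(lam=1.5, size=2000)  # unobserved true behavior

# Each respondent draws a ball in private: a numbered ball forces them to
# report its number; a blank ball means they report their true count.
forced = rng.random(true_counts.size) < p_numbered
responses = np.where(
    forced, rng.choice(ball_numbers, size=true_counts.size), true_counts
)

# E[response] = p * mean(balls) + (1 - p) * mean(true), so invert:
est = (responses.mean() - p_numbered * ball_numbers.mean()) / (1 - p_numbered)
print(f"true mean: {true_counts.mean():.2f}, estimated: {est:.2f}")
```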

Armed with the new computer program, the scientists found that people from rural communities with less access to jobs in urban centers were more likely to hunt in the reserve. People in communities with a greater proportion of people displaced by Sierra Leone’s 10-year civil war were also more likely to hunt illegally.

The researchers said that collaborating across disciplines was and is key to addressing complex problems like this one. It is commonplace for people to be noncompliant with rules and regulations, and equally important for social scientists to analyze these behaviors….(More)”

The Innovation System of the Public Service of Canada


OECD: Today, the OECD Observatory of Public Sector Innovation (OPSI) is pleased to announce the release of The Innovation System of the Public Service of Canada, the first of the OECD’s reviews of a national public sector innovation system….Some of the key findings and observations from the report include:

  • The Government of Canada starts with a strong base, having a long demonstrated history of innovation. The civil service also has a longstanding awareness and appreciation of the need for innovation.
  • There has been an ongoing recognition that the Public Service of Canada needs to continue to adapt and be responsive. Respective Clerks (the Heads of the Public Service) have repeatedly identified the need to go further.
  • Much of the ‘low-hanging fruit’ (i.e. activities to support public sector innovation such as awards, efforts to remove hurdles, introduction of new tools) has already been picked, but this alone is unlikely to lead to long-term sustainability.
  • The innovation system is still relatively fragmented, in that most actors are experiencing the same system in different ways. New approaches are needed.
  • The Canadian Public Service has made some significant steps towards a more systemic approach to public sector innovation. However, it is likely that without continuous efforts and direction the innovation system will not be able to consistently and reliably contribute to the delivery of the best outcomes for citizens.

Given that much is still being learnt about public sector innovation, the report avoids a prescriptive approach as to what should be done. It identifies potential areas of intervention, but recognises that the context will continue to evolve, and that the specific actions taken should be matched to the ambitions and intent of the Public Service of Canada.

An innovation system is made up of many parts and contributed to by many actors. The effectiveness of the innovation system – i.e. its ability to consistently and reliably develop and deliver innovative solutions that contribute to achieving the goals and priorities of the government – will depend on collective effort, involving action from different actors at the individual, organisational, and system levels.

While a range of options are put forward, the aim of this review, and the guidance included within it, is to help provide a reflection of the system so that all actors can see themselves within it. This can provide a contribution to the ongoing discussion and deliberation about what the collective aim for innovation is within the Public Service of Canada, and how everyone can play a part, and be supported in that….(More)”.

Library of Congress Launches Crowdsourcing Platform


Matt Enis at the Library Journal: “The Library of Congress (LC) last month launched crowd.loc.gov, a new crowdsourcing platform that will improve discovery and access to the Library’s digital collections with the help of volunteer transcription and tagging. The project kicked off with the “Letters to Lincoln Challenge,” a campaign encouraging volunteers to transcribe 10,000 digitized versions of documents written by or to Abraham Lincoln, which will make these materials full-text searchable for the first time….

The new project is the earliest example of LC’s new Digital Strategy, which complements the library’s new 2019–23 strategic plan. Announced in October, the strategic plan, “Enriching the User Experience,” outlines four high-level goals—expanding access, enhancing services, optimizing resources, and measuring results—while the digital strategy outlines how LC plans to accomplish these goals with its digital resources, described as “throwing open the treasure chest, connecting, and investing in our future”…

LC aims to use crowdsourcing to enrich the user experience in two key ways, Zwaard said.

“First, it helps with the legibility of our collections,” she explained. “The Library of Congress is home to so many historic treasures, but the handwriting can be hard to read…. For example, we have this amazing letter from Abraham Lincoln to his first fiancée. It’s really quite lovely, but at a glance, if you’re not familiar with historic handwriting, it’s hard to read.”…

Second, crowdsourcing “invites people into the collections,” she added. “The library is very optimized around answering specific research questions. One of the things we’re thinking about is how to serve users who don’t have a specific research question—who just want to see all of the cool stuff. We have so much cool stuff! But it can be hard for people to find purchase when they are just browsing and don’t have anything specific in mind. One of the ways we can [showcase interesting content] is by offering them a window into the collections by asking for their help.”…

To facilitate ongoing engagement with these varied projects, LC has set up an online forum on History Hub, a site hosted by the National Archives, to encourage crowd.loc.gov participants to ask questions, discuss projects, and meet other volunteers. …

Crowd.loc.gov is not LC’s first crowdsourcing project. Followers of the library’s official Flickr account have added tens of thousands of descriptive tags to digitized historical photos since the account debuted in 2007. And last year, the debut of labs.loc.gov—which aims to encourage creative use of LC’s digital collections—included the Beyond Words crowdsourcing project developed by LC software developer Tong Wang….(More)”