Opportunities and risks in emerging technologies


White Paper Series at the Web Foundation: “To achieve our vision of digital equality, we need to understand how new technologies are shaping society; where they present opportunities to make people’s lives better, and indeed where they threaten to create harm. To this end, we have commissioned a series of white papers examining three key digital trends: artificial intelligence, algorithms and control of personal data. The papers focus on low and middle-income countries, which are all too often overlooked in debates around the impacts of emerging technologies.

The series addresses each of these three digital issues, looking at how they are impacting people’s lives and identifying steps that governments, companies and civil society organisations can take to limit the harms, and maximise benefits, for citizens.

Download the white papers

We will use these white papers to refine our thinking and set our work agenda on digital equality in the years ahead. We are sharing them openly with the hope they benefit others working towards our goals and to amplify the limited research currently available on digital issues in low and middle-income countries. We intend the papers to foster discussion about the steps we can take together to ensure emerging digital technologies are used in ways that benefit people’s lives, whether they are in Los Angeles or Lagos….(More)”.

Why We Should Care About Bad Data


Blog by Stefaan G. Verhulst: “At a time of open and big data, data-led and evidence-based policy making has great potential to improve problem solving but will have limited, if not harmful, effects if the underlying components are riddled with bad data.

Why should we care about bad data? What do we mean by bad data? And what are the determining factors contributing to bad data that, if understood and addressed, could prevent or tackle bad data? These questions were the subject of my short presentation during a recent webinar on Bad Data: The Hobgoblin of Effective Government, hosted by the American Society for Public Administration and moderated by Richard Greene (Partner, Barrett and Greene Inc.). Other panelists included Ben Ward (Manager, Information Technology Audits Unit, California State Auditor’s Office) and Katherine Barrett (Partner, Barrett and Greene Inc.). The webinar was a follow-up to the excellent Special Issue of Governing on Bad Data written by Richard and Katherine….(More)”

Let the People Know the Facts: Can Government Information Removed from the Internet Be Reclaimed?


Paper by Susan Nevelow Mart: “…examines the legal bases of the public’s right to access government information, reviews the types of information that have recently been removed from the Internet, and analyzes the rationales given for the removals. She suggests that the concerted use of the Freedom of Information Act by public interest groups and their constituents is a possible method of returning the information to the Internet….(More)”.

How Can Blockchain Technology Help Government Drive Economic Activity?


Thomas Hardjono and Pete Teigen provide “A Blueprint Discussion on Identity“: Data breaches, identity theft, and trust erosion are all identity-related issues that citizens and government organizations face with increased frequency and magnitude. The rise of blockchain technology, and related distributed ledger technology, is generating significant interest in how a blockchain infrastructure can enable better identity management across a variety of industries.  Historically, governments have taken the primary role in issuing certain types of identities (e.g. social security numbers, driver’s licenses, and passports) based on strong authentication proofing of individuals using government-vetted documentation – a process often referred to as on-boarding. This identity proofing and on-boarding process presents a challenge to government because it is still heavily paper-based, making it cumbersome, time-consuming and dependent on siloed, decades-old, inefficient systems.

Another aspect of the identity challenge is the risk of compromising an individual’s digital identifiers and government-issued credentials through identity theft. With so many vital services (e.g. banking, health services, transport, residency) dependent on trusted, government-vetted credentials, any compromise of that identity can result in a significant negative impact to the individual and be difficult to repair. Compounding the problem, many instances of identity theft go undetected and are discovered only after the damage is done.

Increasing the efficiency of the identity vetting process while also enhancing transparency would help mitigate those identity challenges.  Blockchain technology promises to do just that. Through the use of multiple computer systems (nodes) that are interconnected in a peer-to-peer (P2P) network, a shared common view of the information in the network ensures synchronicity of agreed data. A trusted ledger then exists in a distributed manner across the network that inherently is accountable to all network participants, thereby providing transparency and trustworthiness.

Using that trusted distributed ledger, identity-related data vetted by one Government entity, together with that data’s location (producing a link in the chain), can be shared with other members of the network as needed — allowing members to instantaneously accept an identity without duplicating the identity vetting process.  The more sophisticated blockchain systems build this “record-link-fetch” pattern into their basic building blocks.  Additional features allow downstream processes to use that identity assertion as automated input to “smart contracts”, discussed below.
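To make the “record-link-fetch” idea concrete, here is a minimal, purely illustrative sketch in Python. It is a toy hash-linked ledger, not any real blockchain platform: the `Ledger` class, the agency names, and the attestation fields are all invented for illustration. One agency records a vetted identity attestation (the digest serves as the “link”), another fetches and accepts it without re-vetting, and any member can verify the chain has not been tampered with.

```python
import hashlib
import json

def _hash(record):
    """Deterministic SHA-256 digest of a record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class Ledger:
    """A toy append-only, hash-linked ledger shared by all network members."""

    def __init__(self):
        self.blocks = []

    def record(self, payload):
        """Append a block linked to its predecessor; return its digest (the 'link')."""
        prev = self.blocks[-1]["digest"] if self.blocks else "0" * 64
        block = {"prev": prev, "payload": payload}
        block["digest"] = _hash({"prev": prev, "payload": payload})
        self.blocks.append(block)
        return block["digest"]

    def fetch(self, digest):
        """Look up a previously recorded attestation by its digest."""
        for block in self.blocks:
            if block["digest"] == digest:
                return block["payload"]
        return None

    def verify(self):
        """Any member can check that no block has been altered."""
        prev = "0" * 64
        for block in self.blocks:
            expected = _hash({"prev": block["prev"], "payload": block["payload"]})
            if block["prev"] != prev or block["digest"] != expected:
                return False
            prev = block["digest"]
        return True

# Agency A vets an identity once and records the attestation.
ledger = Ledger()
link = ledger.record({"issuer": "agency-a", "subject_id": "citizen-123", "vetted": True})

# Agency B fetches the attestation via the link and accepts it
# without repeating the vetting process.
attestation = ledger.fetch(link)
assert attestation["vetted"] and ledger.verify()
```

Because every block’s digest covers its predecessor’s digest, altering any recorded attestation invalidates the whole chain downstream, which is the transparency and accountability property the excerpt describes. Real permissioned systems add consensus, access control, and cryptographic signatures on top of this basic structure.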

Thus, the combination of Government vetting of individual data with the embedded transparency and accountability capabilities of blockchain systems allows relying parties (e.g. businesses, online merchants, individuals) to obtain higher degrees of assurance regarding the identity of the parties with whom they are conducting transactions…..

Identity and membership management solutions already exist and can be applied to private (permissioned) blockchain systems. Features within these solutions should be evaluated for their suitability for blockchain systems.  Specifically, these four steps can enable government to start using blockchain to address identity challenges:

  1. Evaluate existing identity and membership management solutions in order to identify features that apply to permissioned blockchain systems in the short term.
  2. Experiment with integrating these existing solutions with open source blockchain implementations.
  3. Create a roadmap (with a 2-3 year horizon) for identity and membership management for smart contracts within permissioned blockchains.
  4. Develop a long term plan (a 5 year horizon) for addressing identity and membership management for permissionless (public) blockchain systems. Here again, use open source blockchain implementations as the basis to understand the challenges in the identity space for permissionless blockchains….(More)”.

Open Participatory Security: Unifying Technology, Citizens, and the State


Book by Jesse Paul Lehrke: “Our modern security systems have recently come under a lot of criticism: as too bureaucratic and unadaptable, too secretive and untrustworthy, and too obsessed with information technology rather than human needs. Yet listing failures is easy; security is never perfect. The question is why current approaches fail and whether there are viable alternatives. The root of their shortcomings is in the interaction of the very pillars of our security system in the contemporary context. While our enemies have adopted the technologies of the Information Age, changing how they organize and fight, these same technologies have only created more vulnerabilities for states. Governments have been generally unwilling to maximize their use of these technologies because it would require the wider release of information and the opening of organizational structures to include society in security making. Yet countering diffuse modern threats striking deep into our states and across our economies requires mobilizing the diffuse skills and variation of modern society. Open approaches for mobilizing participation and coproduction have the capabilities needed to improve contemporary security policy making, problem solving, and provision. Moreover, open participatory security can be effective not only for technical security, but also for restoring trust among the citizens and rebuilding the legitimacy of the state….(More)”

Deadly Data Gaps: How Lack of Information Harms Refugee Policy Making


Interview with Galen Englund by Charlotte Alfred: “The U.N. Refugee Agency recently released its annual estimate of the world’s displaced population: 65.6 million. This figure is primarily based on data provided by governments, each using their own definitions and data collection methods.

This leaves ample space for inconsistencies and data gaps. South Africa, for example, reported 463,900 asylum seekers in 2014, 1.1 million in 2015 and then just 218,300 last year. But the number of people had not fluctuated that wildly. What did change was how asylum seekers are counted.

National estimates can also obscure entire groups of people, like internally displaced groups that governments don’t want to acknowledge, notes Galen Englund, who analyzes humanitarian data at the ONE Campaign advocacy organization.

Over the past year, Englund has been digging into the data on refugees and displaced populations for the ONE Campaign. It was an uphill battle. He collected figures from 67 reports that used 356 differently worded metrics in order to identify the needs of displaced populations. “Frequently information is not there, or it’s siloed within organizations, or there’s too much bureaucratic red tape around it, or it just hasn’t been collected yet,” he said.

His research resulted in a displacement tracking platform called Movement, which compiles various U.N. data, and a briefing paper outlining displacement data gaps that concludes: “The information architecture of humanitarian aid is not fit for purpose.” We spoke to Englund about his findings….

Galen Englund: There’s several layers of massive data gaps that all coincide with each other. Probably the most troubling for me is not being able to understand at a granular level where refugees and displaced people are inside of countries, and the transition between when someone leaves their home and becomes displaced, and when they actually cross international borders and become refugees or asylum seekers. That’s an incredibly difficult transition to track, and one that there’s inadequate data on right now….(More)”.

Our digital journey: moving to electronic questionnaires


Jason Bradbury at the Office for National Statistics (UK): “Earlier this year we shared news about the Retail Sales Inquiry (RSI) – the monthly national survey of shops and shopping – moving to digital data collection. ONS is transforming the way it collects data, improving the speed and quality of the information while reducing the burden on respondents. The past six months has seen a significant expansion of our digital survey availability. In January, 5,000 retailers were invited to sign up for an account giving them the option to send us their data for one of our business surveys digitally.

Electronic questionnaires

The take-up of the electronic questionnaire (eQ) was incredible, with over 80% of respondents choosing to supply their information for the RSI online. Over the last six months, we have continued to see the appetite for online completion grow. Each month, an average of 300 new businesses opt to return their Retail Sales data digitally, with many eager to move to digital methods for the other surveys they are required to complete….

Moving data collection from the phone and paper to online has been a huge success, delivering improved quality and an ‘easy to access’ online experience. Thinking about the impact this change could have had on our core function as a statistical body, I am delighted to share that we have not witnessed any statistical issues, and all of our outputs have been compiled and produced as normal.

Put simply, the easier it is for someone to complete our surveys, the more likely they are to take the time to provide more detailed, accurate data. It is worth noting that once a business has an account with ONS, it often sends data back to us quicker. The earlier and more detailed responses allow us more time to quality assure (QA) the information and reduce the need to re-contact the businesses.

Our digital journey

The digital world is a fast-paced and ever-changing environment. We have found it challenging to match this pace in both our team’s skill base and our digital service. We are in the process of up-skilling our teams and updating our data collection service and infrastructure. This will enable us to improve our data collection service and move even more surveys online….(More)”

Smart or dumb? The real impact of India’s proposal to build 100 smart cities


 in The Conversation: “In 2014, the new Indian government declared its intention to achieve 100 smart cities.

In promoting this objective, it gave the example of a large development in the island city of Mumbai, Bhendi Bazaar. There, 3-5 storey housing would be replaced with towers of 40 to 60 storeys to increase density. This has come to be known as “vertical with a vengeance”.

We have obtained details of the proposed project from the developer and the municipal authorities. Using an extended urban metabolism model, which measures the impacts of the built environment, we have assessed its overall impact. We determined how the flows of materials and energy will change as a result of the redevelopment.

Our research shows that the proposal is neither smart nor sustainable.

Measuring impacts

The Indian government clearly defined what it meant by “smart”. Over half of the 11 objectives were environmental, covering the main components of the metabolism of a city. These include adequate water and sanitation, assured electricity, efficient transport, reduced air pollution and resource depletion, and sustainability.

We collected data from various primary and secondary sources. This included physical surveys during site visits, local government agencies, non-governmental organisations, the construction industry and research.

We then made three-dimensional models of the existing and proposed developments to establish morphological changes, including building heights, street widths, parking provision, roof areas, open space, landscaping and other aspects of built form.

Demographic changes (population density, total population) were based on census data, the developer’s calculations and an assessment of available space. Such information about the magnitude of the development and the associated population changes allowed us to analyse the additional resources required as well as the environmental impact….

Case studies such as Bhendi Bazaar provide an example of plans for increased density and urban regeneration. However, they do not offer an answer to the challenge of limited infrastructure to support the resource requirements of such developments.

The results of our research indicate significant adverse impacts on the environment. They show that the metabolism increases at a greater rate than the population grows. On this basis, this proposed development for Mumbai, or the other 99 cities, should not be called smart or sustainable.
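The core finding above is arithmetical: if resource flows grow faster than population, per-capita metabolism rises even as density improves. The figures below are invented purely for illustration (the study’s actual data are not reproduced here); this is a minimal sketch of the kind of comparison an extended urban metabolism model supports.

```python
# Hypothetical site profiles: population plus daily resource flows.
# All numbers are illustrative only, not taken from the Bhendi Bazaar study.
existing = {"population": 20_000, "water_m3_per_day": 3_000, "energy_mwh_per_day": 60}
proposed = {"population": 60_000, "water_m3_per_day": 12_000, "energy_mwh_per_day": 280}

def per_capita(site):
    """Resource flows divided by population, leaving population itself out."""
    return {k: v / site["population"] for k, v in site.items() if k != "population"}

# Population triples, but water demand quadruples and energy more than quadruples,
# so per-capita metabolism increases despite the higher density.
population_growth = proposed["population"] / existing["population"]
water_growth = proposed["water_m3_per_day"] / existing["water_m3_per_day"]

print(f"population x{population_growth:.1f}, water x{water_growth:.1f}")
print("per capita before:", per_capita(existing))
print("per capita after: ", per_capita(proposed))
```

Under these invented inputs, every per-capita flow rises after redevelopment, which is exactly the pattern the authors use to argue the proposal is neither smart nor sustainable.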

With policies that aim to prevent urban sprawl, cities will inevitably grow vertically. But with high-rise housing comes dependence on centralised flows of energy, water supplies and waste disposal. Dependency in turn leads to vulnerability and insecurity….(More)”.

The hidden costs of open data


Sara Friedman at GCN: “As more local governments open their data for public use, the emphasis is often on “free” — using open source tools to freely share already-created government datasets, often with pro bono help from outside groups. But according to a new report, there are unforeseen costs when it comes to pushing government datasets out on public-facing platforms — especially when geospatial data is involved.

The research, led by University of Waterloo professor Peter A. Johnson and McGill University professor Renee Sieber, was conducted as part of a Geothink.ca partnership research grant and explores the direct and indirect costs of open data.

Costs related to data collection, publishing, data sharing, maintenance and updates are increasingly driving governments to third-party providers to help with hosting, standardization and analytical tools for data inspection, the researchers found. GIS implementation also has associated costs to train staff, develop standards, create valuations for geospatial data, connect data to various user communities and get feedback on challenges.

Due to these direct costs, some governments are more likely to avoid opening datasets that need complex assessment or anonymization techniques for GIS concerns. Johnson and Sieber identified four areas where the benefits of open geospatial data can generate unexpected costs.

First, open data can create a “smoke and mirrors” situation where insufficient resources are put toward deploying open data for government use. Users then experience “transaction costs” when it comes to working in specialist data formats that need additional skills, training and software to use.

Second, the level of investment and quality of open data can lead to “material benefits and social privilege” for communities that devote resources to providing more comprehensive platforms.

While there are some open source data platforms, the majority of solutions are proprietary and charged on a pro-rata basis, which can present a challenge for cities with larger, poor populations compared to smaller, wealthier cities. Issues also arise when governments try to combine their data sets, leading to increased costs to reconcile problems.

The third problem revolves around the private sector pushing for the release of data sets that can benefit their business objectives. Companies could push for the release of high-value sets, such as real-time transit data, to help with their product development goals. This can divert attention from low-value sets, such as those detailing municipal services or installations, that could have a bigger impact on residents “from a civil society perspective.”

If communities decide to release the low-value sets first, Johnson and Sieber think the focus can then be shifted to high-value sets that can help recoup the costs of developing the platforms.

Lastly, the report finds inadvertent consequences could result from tying open data resources to private-sector companies. Public-private open data partnerships could lead to infrastructure problems that prevent data from being widely shared, and help private companies in developing their bids for public services….

Johnson and Sieber encourage communities to ask the following questions before investing in open data:

  1. Who are the intended constituents for this open data?
  2. What is the purpose behind the structure for providing this data set?
  3. Does this data enable the intended users to meet their goals?
  4. How are privacy concerns addressed?
  5. Who sets the priorities for release and updates?…(More)”

Read the full report here.

Design Thinking for the Greater Good


New Book by Jeanne Liedtka, Randy Salzman, and Daisy Azer:  “Facing especially wicked problems, social sector organizations are searching for powerful new methods to understand and address them. Design Thinking for the Greater Good goes in depth on both the how of using new tools and the why. As a way to reframe problems, ideate solutions, and iterate toward better answers, design thinking is already well established in the commercial world. Through ten stories of struggles and successes in fields such as health care, education, agriculture, transportation, social services, and security, the authors show how collaborative creativity can shake up even the most entrenched bureaucracies—and provide a practical roadmap for readers to implement these tools.

The design thinkers Jeanne Liedtka, Randy Salzman, and Daisy Azer explore how major agencies like the Department of Health and Human Services and the Transportation Security Administration in the United States, as well as organizations in Canada, Australia, and the United Kingdom, have instituted principles of design thinking. In each case, these groups have used the tools of design thinking to reduce risk, manage change, use resources more effectively, bridge the communication gap between parties, and manage the competing demands of diverse stakeholders. Along the way, they have improved the quality of their products and enhanced the experiences of those they serve. These strategies are accessible to analytical and creative types alike, and their benefits extend throughout an organization. This book will help today’s leaders and thinkers implement these practices in their own pursuit of creative solutions that are both innovative and achievable….(More)”.