Opportunities and risks in emerging technologies


White Paper Series at the Web Foundation: “To achieve our vision of digital equality, we need to understand how new technologies are shaping society; where they present opportunities to make people’s lives better, and indeed where they threaten to create harm. To this end, we have commissioned a series of white papers examining three key digital trends: artificial intelligence, algorithms and control of personal data. The papers focus on low and middle-income countries, which are all too often overlooked in debates around the impacts of emerging technologies.

The series addresses each of these three digital issues, looking at how they are impacting people’s lives and identifying steps that governments, companies and civil society organisations can take to limit the harms, and maximise benefits, for citizens.

We will use these white papers to refine our thinking and set our work agenda on digital equality in the years ahead. We are sharing them openly with the hope they benefit others working towards our goals and to amplify the limited research currently available on digital issues in low and middle-income countries. We intend the papers to foster discussion about the steps we can take together to ensure emerging digital technologies are used in ways that benefit people’s lives, whether they are in Los Angeles or Lagos….(More)”.

The hidden costs of open data


Sara Friedman at GCN: “As more local governments open their data for public use, the emphasis is often on “free” — using open source tools to freely share already-created government datasets, often with pro bono help from outside groups. But according to a new report, there are unforeseen costs when it comes to pushing government datasets out on public-facing platforms — especially when geospatial data is involved.

The research, led by University of Waterloo professor Peter A. Johnson and McGill University professor Renee Sieber, was based on work as part of a Geothink.ca partnership research grant and an exploration of the direct and indirect costs of open data.

Costs related to data collection, publishing, data sharing, maintenance and updates are increasingly driving governments to third-party providers to help with hosting, standardization and analytical tools for data inspection, the researchers found. GIS implementation also has associated costs to train staff, develop standards, create valuations for geospatial data, connect data to various user communities and get feedback on challenges.

Due to these direct costs, some governments are more likely to avoid opening datasets that require complex assessment or anonymization techniques to address GIS concerns. Johnson and Sieber identified four areas where the benefits of open geospatial data can generate unexpected costs.

First, open data can create a “smoke and mirrors” situation where insufficient resources are put toward deploying open data for government use. Users then experience “transaction costs” when it comes to working in specialist data formats that need additional skills, training and software to use.
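
To make those transaction costs concrete: opening a shapefile, a common specialist GIS format, already requires dedicated tooling before a non-specialist can use the data at all. Below is a minimal, hypothetical sketch using the open source geopandas library (file names are invented for illustration):

```python
# Hypothetical sketch: converting a specialist GIS format (a shapefile) into
# formats non-GIS users can work with. geopandas pulls in Fiona/GDAL --
# exactly the kind of extra software and skills the report describes.
import geopandas as gpd

parcels = gpd.read_file("city_parcels.shp")   # invented open dataset
parcels = parcels.to_crs(epsg=4326)           # re-project to WGS84 for web-mapping tools

# Export to formats that spreadsheet and web users can open directly
parcels.to_file("city_parcels.geojson", driver="GeoJSON")
parcels.drop(columns="geometry").to_csv("city_parcels_attributes.csv", index=False)
```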

Second, the level of investment and quality of open data can lead to “material benefits and social privilege” for communities that devote resources to providing more comprehensive platforms.

While there are some open source data platforms, the majority of solutions are proprietary and charged on a pro-rata basis, which can present a challenge for cities with larger, poorer populations compared to smaller, wealthier cities. Issues also arise when governments try to combine their data sets, leading to increased costs to reconcile problems.

The third problem revolves around the private sector pushing for the release of data sets that can benefit their business objectives. Companies could push for the release of high-value sets, such as real-time transit data, to help with their product development goals. This can divert attention from low-value sets, such as those detailing municipal services or installations, that could have a bigger impact on residents “from a civil society perspective.”

If communities decide to release the low-value sets first, Johnson and Sieber think the focus can then be shifted to high-value sets that can help recoup the costs of developing the platforms.
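
For a sense of what a high-value set such as the real-time transit data mentioned above involves, the de facto standard is GTFS-realtime, a protocol buffer feed that needs its own bindings to parse. A rough sketch using the open source gtfs-realtime-bindings package follows; the feed URL is a placeholder, not a real agency endpoint:

```python
# Rough sketch: reading vehicle positions from a GTFS-realtime feed.
# Requires the gtfs-realtime-bindings and requests packages; the URL below
# is a placeholder for whatever endpoint a transit agency publishes.
import requests
from google.transit import gtfs_realtime_pb2

resp = requests.get("https://transit.example.org/gtfs-rt/vehicle-positions")
feed = gtfs_realtime_pb2.FeedMessage()
feed.ParseFromString(resp.content)

for entity in feed.entity:
    if entity.HasField("vehicle"):
        v = entity.vehicle
        print(v.trip.route_id, v.position.latitude, v.position.longitude)
```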

Lastly, the report finds inadvertent consequences could result from tying open data resources to private-sector companies. Public-private open data partnerships could lead to infrastructure problems that prevent data from being widely shared, and help private companies in developing their bids for public services….

Johnson and Sieber encourage communities to ask the following questions before investing in open data:

  1. Who are the intended constituents for this open data?
  2. What is the purpose behind the structure for providing this data set?
  3. Does this data enable the intended users to meet their goals?
  4. How are privacy concerns addressed?
  5. Who sets the priorities for release and updates?…(More)”

Read the full report here.

Nudges in a post-truth world


Neil Levy at the Journal of Medical Ethics: “Nudges—policy proposals informed by work in behavioural economics and psychology that are designed to lead to better decision-making or better behaviour—are controversial. Critics allege that they bypass our deliberative capacities, thereby undermining autonomy and responsible agency. In this paper, I identify a kind of nudge I call a nudge to reason, which makes us more responsive to genuine evidence. I argue that at least some nudges to reason do not bypass our deliberative capacities. Instead, use of these nudges should be seen as appeals to mechanisms partially constitutive of these capacities, and therefore as benign (so far as autonomy and responsible agency are concerned). I sketch some concrete proposals for nudges to reason which are especially important given the apparent widespread resistance to evidence seen in recent political events….(More)”.

Citizen sensing, air pollution and fracking: From ‘caring about your air’ to speculative practices of evidencing harm


At the Sociological Review: “Hydraulic fracturing, or fracking, is an emerging and growing industry that is having considerable effects on environments and health. Yet fracking often lacks environmental regulations that might be understood as governmental forms of care. In some locations in the US, citizens have taken up environmental monitoring as a way to address this perceived absence of care, and to evidence harm in order to argue for new infrastructures of care. This article documents the practices of residents engaged in monitoring air pollution near fracking sites in the US, as well as the participatory and practice-based research undertaken by the Citizen Sense research project to develop monitoring kits for residents to use and test over a period of seven months. Citizen sensing practices for monitoring air pollution can constitute ways of expressing care about environments, communities and individual and public health. Yet practices for documenting and evidencing harm through the ongoing collection of air pollution data are also speculative attempts to make relevant these unrecognised and overlooked considerations of the need for care. Working with the concept of speculation, this article advances alternative notions of evidence, care and policy that attend to citizens’ experiences of living in the gas fields. How do citizen sensing practices work towards alternative ways of evidencing harm? In what ways does monitoring with environmental sensors facilitate this process? And what new speculative practices emerge to challenge the uses of environmental sensors, as well as to expand the types of data gathered, along with their political impact?…(More)”.
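
By way of illustration only (this is not the Citizen Sense toolkit itself), evidencing harm from continuous monitoring often comes down to simple aggregations of raw readings, such as counting the days on which the 24-hour mean concentration exceeds a guideline value. A sketch in pandas with synthetic PM2.5 readings and a hypothetical threshold:

```python
# Illustrative sketch only: counting guideline exceedances in a stream of
# particulate-matter readings. The data are synthetic and the threshold is
# a placeholder, not a specific regulatory standard.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=0)
hours = pd.date_range("2016-01-01", periods=24 * 90, freq="h")
pm25 = pd.Series(rng.gamma(shape=2.0, scale=8.0, size=len(hours)), index=hours)

GUIDELINE = 25.0  # hypothetical 24-hour guideline, micrograms per cubic metre

daily_mean = pm25.resample("D").mean()          # 24-hour mean per monitored day
exceedances = daily_mean[daily_mean > GUIDELINE]

print(f"{len(exceedances)} of {len(daily_mean)} monitored days exceeded the guideline")
```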

Global innovations in measurement and evaluation


Report by Andrew Weston, Anne Kazimirski, Anoushka Kenley, Rosie McLeod and Ruth Gripper: “Measurement and evaluation is core to good impact practice. It helps us understand what works, how it works and how we can achieve more. Good measurement and evaluation involves reflective, creative, and proportionate approaches. It makes the most of existing theoretical frameworks as well as new digital solutions, and focuses on learning and improving. We researched the latest changes in theory and practice based on both new and older, renascent ideas. We spoke to leading evaluation experts from around the world, to ask what’s exciting them, what people are talking about and what is most likely to make a long-lasting contribution to evaluation. And we found that new thinking, techniques, and technology are influencing and improving practice.

Technology is enabling us to gather different types of data on bigger scales, helping us gain insights or spot patterns we could not see before. Advances in systems to capture, manage and share sensitive data are helping organisations that want to work collaboratively, while moves towards open data are providing better access to data that can be linked together to generate even greater insight. Traditional models of evaluating a project once it has finished are being overtaken by methods that feed more dynamically into service design. We are learning from the private sector, where real-time feedback shapes business decisions on an ongoing basis, asking ‘is it working?’ instead of ‘did it work?’.
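
As a small illustration of the data linkage point, much of that extra insight comes from joining previously separate datasets on a shared key. A hedged sketch in pandas, with invented file and column names:

```python
# Hypothetical sketch of data linkage: joining programme outcomes to open
# area statistics on a shared area code, then asking a question neither
# dataset could answer on its own.
import pandas as pd

outcomes = pd.read_csv("programme_outcomes.csv")      # area_code, participants, completed
area_stats = pd.read_csv("open_area_statistics.csv")  # area_code, median_income

linked = outcomes.merge(area_stats, on="area_code", how="left")
linked["completion_rate"] = linked["completed"] / linked["participants"]

# Do completion rates vary with local income levels?
by_income = linked.groupby(pd.qcut(linked["median_income"], 4), observed=True)
print(by_income["completion_rate"].mean())
```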

And approaches that focus on assessing not just if something works but how and why, for whom, and under what conditions are also generating more insight into the effectiveness of programmes. Technology may be driving many of the innovations we highlight here, but some of the most exciting developments are happening because of changes in the ideologies and cultures that inform our approach to solving big problems. This is resulting in an increased focus on listening to and involving users, and on achieving change at a systemic level—with technology simply facilitating these changes.

Some of the pressures that compel measurement and evaluation activity remain misguided. For example, there can be too big a focus on obtaining a cost-benefit ratio—regardless of the quality of the data it is based on—and not enough encouragement from funders for charities to learn from their evaluation activity. Even the positive developments have their pitfalls: new technologies pose new data protection risks, ethical hazards, and the possibility of exclusion if participation requires high levels of technical ability. It is important that, as the field develops and capabilities increase, we remain focused on achieving best practice.

This report highlights the developments that we think have the greatest potential to improve evaluation and programme design, and the careful collection and use of data. We want to celebrate what is possible, and encourage wider application of these ideas.

Choosing the innovations

In deciding which trends to include in this report, we considered how different approaches contributed to better evaluation by:

  • overcoming previous barriers to good evaluation practice, e.g. through new technologies or skills;
  • providing more meaningful or robust data;
  • using data to support decision-making, learning and improving practice;
  • increasing equality between users, service deliverers and funders; and
  • offering new contexts for collaboration that improve the utility of data.

… Eight key trends emerged from our research that we thought to be most exciting, relevant and likely to make a long-lasting contribution. Some of these are driven by cutting-edge technology; others reflect growing application of ideas that push practice beyond ‘traditional’ models of evaluation. User-centric and shared approaches are leading to better informed measurement and evaluation design. Theory-based evaluation and impact management embolden us to ask better research questions and obtain more useful answers. Data linkage, the availability of big data, and the possibilities arising from remote sensing are increasing the number of questions we can answer. And data visualisation opens up doors to better understanding and communication of this data. Here we present each of these eight innovations and showcase examples of how organisations are using them to better understand and improve their work….(More)”

Digital transformation’s people problem


Jen Kelchner at Opensource.com: …Arguably, the greatest chasm we see in our organizational work today is the actual transformation before, during, or after the implementation of a digital technology—because technology invariably crosses through and impacts people, processes, and culture. What are we transforming from? What are we transforming into? These are “people issues” as much as they are “technology issues,” but we too rarely acknowledge this.

Operating our organizations on open principles promises to spark new ways of thinking that can help us address this gap. Over the course of this three-part series, we’ll take a look at how the foundational principles of open play a major role in addressing the “people part” of digital transformation—and closing that gap before and during implementations.

The impact of digital transformation

The meaning of the term “digital transformation” has changed considerably in the last decade. For example, if you look at where organizations were in 2007, you’d watch them grapple with the first iPhone. The focus then was more on search engines, data mining, and methods of virtual collaboration.

A decade later in 2017, however, we’re investing in artificial intelligence, machine learning, and the Internet of Things. Our technologies have matured—but our organizational and cultural structures have not kept pace with them.

Value Co-creation In The Organizations of the Future, a recent research report from Aalto University, states that digital transformation has created opportunities to revolutionize and change existing business models, socioeconomic structures, legal and policy measures, organizational patterns, and cultural barriers. But we can only realize this potential if we address both the technological and the organizational aspects of digital transformation.

Four critical areas of digital transformation

Let’s examine four crucial elements involved in any digital transformation effort:

  • change management
  • the needs of the ecosystem
  • processes
  • silos

Any organization must address these four elements ideally in advance of, or at least in conjunction with, the implementation of a new technology if it is going to realize success and sustainability….(More)”.

Innovation@DFID: Crowdsourcing New Ideas at the UK’s Department for International Development


Paper by Anke Schwittay and Paul Braund: “Over the last decade, traditional development institutions have joined market-based actors in embracing inclusive innovation to ensure the sector’s relevance and impacts. In 2014, the UK Department for International Development’s (DFID) Innovation Hub launched Amplify as its own flagship initiative. The programme, which is managed by IDEO, a Silicon Valley-based design consultancy, aims to crowdsource new ideas to various development challenges from a broad and diverse group of actors, including poor people themselves. By examining the direction, diversity and distribution of Amplify’s work, we argue that while development innovation can generate more inclusive practices, its transformative potential is constrained by broader developmental logics and policy regimes….(More)”

We have unrealistic expectations of a tech-driven future utopia


Bob O’Donnell at Recode: “No one likes to think about limits, especially in the tech industry, where the idea of putting constraints on almost anything is perceived as anathema.

In fact, the entire tech industry is arguably built on the concept of bursting through limitations and enabling things that weren’t possible before. New technology developments have clearly created incredible new capabilities and opportunities, and have generally helped improve the world around us.

But there does come a point — and I think we’ve arrived there — where it’s worth stepping back to both think about and talk about the potential value of, yes, technology limits … on several different levels.

On a technical level, we’ve reached a point where advances in computing applications like AI, or medical applications like gene splicing, are raising even more ethical questions than practical ones on issues such as how they work and for what applications they might be used. Not surprisingly, there aren’t any clear or easy answers to these questions, and it’s going to take a lot more time and thought to create frameworks or guidelines for both the appropriate and inappropriate uses of these potentially life-changing technologies.

Does this mean these kinds of technological advances should be stopped? Of course not. But having more discourse on the types of technologies that get created and released certainly needs to happen.

Even on a practical level, the need for limiting people’s expectations about what a technology can or cannot do is becoming increasingly important. With science-fiction-like advances becoming daily occurrences, it’s easy to fall into the trap of thinking that there are no limits to what a given technology can do. As a result, people are increasingly willing to believe and accept almost any kind of statement or prediction about the future of many increasingly well-known technologies, from autonomous driving to VR to AI and machine learning. I hate to say it, but it’s the fake news of tech.

Just as we’ve seen the fallout from fake news on all sides of the political perspective, so, too, are we starting to see that unbridled and unlimited expectations for certain new technologies are starting to have negative implications of their own. Essentially, we’re starting to build unrealistic expectations for a tech-driven nirvana that doesn’t clearly jibe with the realities of the modern world, particularly in the time frames that are often discussed….(More)”.

Crowdsourcing Expertise to Increase Congressional Capacity


Austin Seaborn at the Beeck Center: “Members of Congress have close connections with their districts, and information arising from local organizations such as professional groups, academia and industry, as well as from constituents with relevant expertise (like retirees, veterans or students), is highly valuable to them. Today, congressional staff capacity is at a historic low, while at the same time, constituents in districts are often well equipped to address the underlying policy questions that Congress seeks to solve….

In meetings we have had with House and Senate staffers, they repeatedly express both the difficulty of managing their substantial area-specific workloads and their interest in finding ways to substantively engage constituents to find good nuggets of information to help them in their roles as policymakers. At the same time, constituents are demanding more transparency and dialogue from their elected representatives. In many cases, our project brings these two together. It allows Members to tap the expertise in their districts while at the same time creating an avenue for constituents to contribute their knowledge and area expertise to the legislative process. It’s a win for constituents and a win for Members of Congress and their staffs.

It is important to note that the United States lags behind other democracies in experimenting with more inclusive methods during the policymaking process. The UK Parliament, for example, has experimented with a variety of new digital tools to engage with constituents. These methods range from Twitter hashtags, which are now quite common given the rise in social media use by governments and elected officials, to web forums hosted on a variety of platforms. Since June 2015, it has also held digital debates, in which questions from the general public are crowdsourced and later integrated into a parliamentary debate by the Member of Parliament leading the debate. Estonia, South Africa, Taiwan and France also…notable examples.

One promising new development we hope to explore more thoroughly is the U.S. Library of Congress’s recently announced legislative data App Challenge. This competition is distinct from the many hackathons that have been held on behalf of Congress in the past, in that this challenge seeks new methods not only to innovate, but also to integrate and legislate. In his announcement, the Library’s Chief Information Officer, Bernard A. Barton, Jr., stated, “An informed citizenry is better able to participate in our democracy, and this is a very real opportunity to contribute to a better understanding of the work being done in Washington.  It may even provide insights for the people doing the work around the clock, both on the Hill, and in state and district offices.  Your innovation and integration may ultimately benefit the way our elected officials legislate for our future.” We believe these sorts of new methods will play a crucial role in the future of engaging citizens in their democracies….(More)”.

Modernizing government’s approach to transportation and land use data: Challenges and opportunities


Adie Tomer and Ranjitha Shivaram at Brookings: “In the fields of transportation and land use planning, the public sector has long taken the leading role in the collection, analysis, and dissemination of data. Often, public data sets drawn from traveler diaries, surveys, and supply-side transportation maps were the only way to understand how people move around in the built environment – how they get to work, how they drop kids off at school, where they choose to work out or relax, and so on.

But, change is afoot: today, there are not only new data providers, but also new types of data. Cellphones, GPS trackers, and other navigation devices offer real-time demand-side data. For instance, mobile phone data can point to where distracted driving is a problem and help implement measures to deter such behavior. Insurance data and geo-located police data can guide traffic safety improvements, especially in accident-prone zones. Geotagged photo data can illustrate the use of popular public spaces by locals and tourists alike, enabling greater return on investment from public spaces. Data from exercise apps like Fitbit and Runkeeper can help identify recreational hot spots that attract people and those that don’t.
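
To make the demand-side point concrete, raw location traces only become useful for planning once they are reduced to measures such as trip length. A small illustrative sketch of that step, applying the haversine great-circle formula to synthetic GPS fixes rather than any particular provider's data:

```python
# Illustrative sketch: estimating travelled distance from a sequence of GPS
# fixes using the haversine great-circle formula. Coordinates are synthetic.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Synthetic trace of one trip (latitude, longitude)
trace = [
    (38.9072, -77.0369),
    (38.9158, -77.0282),
    (38.9220, -77.0220),
    (38.9290, -77.0120),
]

# Sum the distances between consecutive fixes
trip_km = sum(haversine_km(*a, *b) for a, b in zip(trace, trace[1:]))
print(f"Estimated trip length: {trip_km:.1f} km")
```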

However, integrating all this data into how we actually plan and build communities—including the transportation systems that move all of us and our goods—will not be easy. There are several core challenges. Limited staff capacity and restricted budgets in public agencies can slow adoption. Governmental procurement policies are stuck in an analog era. Privacy concerns introduce risk and uncertainty. Private data could be simply unavailable to public consumers. And even if governments could acquire all of the new data and analytics that interest them, their planning and investment models must be updated to fully utilize these new resources.

Using a mix of primary research and expert interviews, this report catalogs emerging data sets related to transportation and land use, and assesses the ease by which they can be integrated into how public agencies manage the built environment. It finds that there is reason for the hype; we have the ability to know more about how humans move around today than at any time in history. But, despite all the obvious opportunities, not addressing core challenges will limit public agencies’ ability to put all that data to use for the collective good….(More)”