Stefaan Verhulst
Speech by Professor Gary Banks: “One of the challenges in talking about EBPM (evidence-based policy making)…
For example, some have interpreted the term EBPM so literally as to insist that the word “based” be replaced by “influenced”, arguing that policy decisions are rarely based on evidence alone. That of course is true, but few using the term (myself included) would have thought otherwise. And I am sure no-one in an audience such as this, especially in our nation’s capital, believes policy decisions could derive solely from evidence — or even rational analysis!
If you’ll pardon a quotation from my earlier address: “Values, interests, personalities, timing, circumstance and happenstance – in short, democracy – determine what actually happens” (EBPM: What is it? How do we get it?). Indeed, it is precisely because of such multiple influences that “evidence” has a potentially significant role to play.
So, adopting the position from Alice in Wonderland, I am inclined to stick with the term EBPM, which I choose to mean an approach to policy-making that makes systematic provision for evidence and analysis. Far from the deterministic straw man depicted in certain academic articles, it is an approach that seeks to achieve policy decisions that are better informed in a substantive sense, accepting that they will nevertheless ultimately be – and in a democracy need to be – political in nature.
A second and more significant area of debate concerns the meaning and value of “evidence” itself. There are a number of strands involved.
Evidentiary elitism?
One relates to methodology, and can be likened to the differences between the thresholds for a finding of guilt under civil and criminal law (“balance of probabilities” versus “beyond reasonable doubt”).
Some analysts have argued that, to be useful for policy, evidence must involve rigorous unbiased research techniques, the “gold standard” for which are “randomized controlled trials”. The “randomistas”, to use the term which headlines Andrew Leigh’s new book (Leigh, 2018), claim that only such a methodology is able to truly tell us “what works”.
However, adopting this exacting standard from the medical research world would leave policy makers with an excellent tool of limited application. Its forte is testing a specific policy or program relative to business as usual, akin to drug tests involving a placebo for a control group. And there are some inspiring examples of insights gained. But for many areas of public policy the technique is not practicable. Even where it is, it requires that a case has to some extent already been made. And while it can identify the extent to which a particular program “works”, it is less useful for understanding why, or whether something else might work even better.
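To make the distinction concrete, here is a minimal, hypothetical sketch in Python (not from Banks’s speech) of the logic of a randomized trial: a program is rolled out to a random treatment group and outcomes are compared against business as usual. All numbers are invented.

```python
import random
import statistics

random.seed(42)

# Hypothetical outcome model: a noisy baseline plus a true program effect
# of 2.0 for treated individuals. In a real trial the effect is unknown.
def outcome(treated):
    return random.gauss(10.0, 3.0) + (2.0 if treated else 0.0)

# Randomly assign 1,000 people to the program or to "business as usual".
assignments = [random.random() < 0.5 for _ in range(1000)]
treated = [outcome(True) for a in assignments if a]
control = [outcome(False) for a in assignments if not a]

# Because assignment is random, the difference in group means is an
# unbiased estimate of the program's average effect.
ate = statistics.mean(treated) - statistics.mean(control)
print(f"Estimated effect: {ate:.2f} (true effect: 2.00)")
```

Note what the estimate delivers and what it does not: it tells you whether this program beat the status quo, but not why, nor whether an untested alternative would have done better.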
That is not to say that any evidence will do. Setting the quality bar too low is the bigger problem in practice, and the notion of a hierarchy of methodologies is helpful. However, no such analytical tools are self-sufficient for policy-making purposes and in my view are best thought of as components of a…
Book by Shoshana Zuboff: “The challenges to humanity posed by the digital future, the first detailed examination of the unprecedented form of power called “surveillance capitalism,” and the quest by powerful corporations to predict and control our behavior.
Shoshana Zuboff’s interdisciplinary breadth and depth enable her to come to grips with the social, political, business, and technological meaning of the changes taking place in our time. We are at a critical juncture in the confrontation between the vast power of giant high-tech companies and government, the hidden economic logic of surveillance capitalism, and the propaganda of machine supremacy that threaten to shape and control human life. Will the brazen new methods of social engineering and behavior modification threaten individual autonomy and democratic rights and introduce extreme new forms of social inequality? Or will the promise of the digital age be one of individual empowerment and democratization?
The Age of Surveillance Capitalism is neither a hand-wringing narrative of danger and decline nor a digital fairy tale. Rather, it offers a deeply reasoned and evocative examination of the contests over the next chapter of capitalism that will decide the meaning of information civilization in the twenty-first century. The stark issue at hand is whether we will be the masters of information and machines or its slaves. …(More)”.
Book (New 3rd Edition) by Matthew A. Russell and Mikhail Klassen: “Mine the rich data tucked away in popular social websites such as Twitter, Facebook, LinkedIn, and Instagram. With the third edition of this popular guide, data scientists, analysts, and programmers will learn how to glean insights from social media—including who’s connecting with whom, what they’re talking about, and where they’re located—using Python code examples, Jupyter notebooks, or Docker containers.
In part one, each standalone chapter focuses on one aspect of the social landscape, including each of the major social sites, as well as web pages, blogs and feeds, mailboxes, GitHub, and a newly added chapter covering Instagram. Part two provides a cookbook with two dozen bite-size recipes for solving particular issues with Twitter….(More)”.
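To give a flavor of the kind of analysis the book teaches, here is a minimal, self-contained Python sketch. The tweet data is inline and invented; the book’s own examples fetch live data through the social platforms’ APIs.

```python
from collections import Counter

# Invented tweet objects, loosely shaped like entries returned by
# Twitter's search API; real code would fetch these over the network.
tweets = [
    {"user": "alice", "text": "Excited about #opendata and #python!"},
    {"user": "bob", "text": "Great #python meetup with @alice"},
    {"user": "carol", "text": "Reading about #opendata policy with @alice"},
]

# "What are they talking about?" -- tally hashtags across the corpus.
hashtags = Counter(
    word.strip("!?.,").lower()
    for tweet in tweets
    for word in tweet["text"].split()
    if word.startswith("#")
)

# "Who's connecting with whom?" -- tally who mentions whom.
mentions = Counter(
    (tweet["user"], word.strip("!?.,").lower().lstrip("@"))
    for tweet in tweets
    for word in tweet["text"].split()
    if word.startswith("@")
)

print(hashtags.most_common())  # [('#opendata', 2), ('#python', 2)]
print(mentions.most_common())  # [(('bob', 'alice'), 1), (('carol', 'alice'), 1)]
```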
Mike Orcutt at Technology Review: “China’s crackdown on blockchain technology has taken another step: the country’s internet censorship agency has just approved new regulations aimed at blockchain companies.
Hand over the data: The Cyberspace Administration of China (CAC) will require any “entities or nodes” that provide “blockchain information services” to collect users’ real names and national ID or telephone numbers, and allow government officials to access that data.
It will ban companies from using blockchain technology to “produce, duplicate, publish, or disseminate” any content that Chinese law prohibits. Last year, internet users evaded censors by recording the content of two banned articles on the Ethereum blockchain. The rules, first proposed in October, will go into effect next month.
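The evasion technique mentioned above is technically simple: arbitrary text can be hex-encoded into an Ethereum transaction’s data field, after which it is replicated on every full node and is effectively impossible to delete. A hedged sketch of the encoding step using the web3.py library (v6 API assumed; the broadcast step is left as commented placeholders):

```python
from web3 import Web3  # assumes web3.py v6

text = "Full text of a censored article."

# Hex-encode the text; this becomes the transaction's data payload.
payload = Web3.to_hex(text=text)
print(payload)  # 0x46756c6c2074657874...

# Broadcasting (placeholders only, not real endpoints or addresses):
# attach the payload to a zero-value transaction and send it. Once mined,
# the text lives permanently in the chain's history.
# w3 = Web3(Web3.HTTPProvider("https://mainnet.example/rpc"))
# tx = {"to": recipient, "value": 0, "data": payload, ...}
# w3.eth.send_transaction(tx)

# Anyone can later recover the text by decoding the data field.
print(Web3.to_text(hexstr=payload))
```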
Defeating the purpose? For more than a year, China has been cracking down on cryptocurrency trading and its surrounding industry while also singing the praises of blockchain technology…
Issie Lapowsky at Wired: “An international group of researchers has developed an algorithmic tool that uses Twitter to automatically predict exactly where you live in a matter of minutes, with more than 90 percent accuracy. It can also predict where you work, where you pray, and other information you might rather keep private, like, say, whether you’ve frequented a certain strip club or gone to rehab.
The tool, called LPAuditor (short for Location Privacy Auditor), exploits what the researchers call an “invasive policy” Twitter deployed after it introduced the ability to tag tweets with a location in 2009. For years, users who chose to geotag tweets with any location, even something as geographically broad as “New York City,” also automatically gave their precise GPS coordinates. Users wouldn’t see the coordinates displayed on Twitter. Nor would their followers. But the GPS information would still be included in the tweet’s metadata and accessible through Twitter’s API.
Twitter didn’t change this policy across its apps until April of 2015. Now, users must opt in to share their precise location—and, according to a Twitter spokesperson, a very small percentage of people do. But the GPS data people shared before the update remains available through the API to this day.
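Concretely, the mismatch lives inside the tweet object itself: the coarse place a user picked and the precise GPS fix are separate fields. A hypothetical payload (field names follow Twitter’s classic tweet object; all values invented):

```python
import json

# Invented tweet as the API returned it before the 2015 change: the user
# tagged only "New York City", but exact GPS coordinates ride along in
# the metadata.
tweet = json.loads("""
{
  "text": "Gorgeous day!",
  "place": {"full_name": "New York City", "place_type": "city"},
  "coordinates": {"type": "Point", "coordinates": [-73.9857, 40.7484]}
}
""")

# What followers saw: the coarse place name.
print(tweet["place"]["full_name"])  # New York City

# What the API exposed: a GPS point (longitude first, per GeoJSON).
lon, lat = tweet["coordinates"]["coordinates"]
print(lat, lon)  # 40.7484 -73.9857
```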
The researchers developed LPAuditor to analyze those geotagged tweets and infer detailed information about people’s most sensitive locations. They outline this process in a new, peer-reviewed paper that will be presented at the Network and Distributed System Security Symposium next month. By analyzing clusters of coordinates, as well as timestamps on the tweets, LPAuditor was able to suss out where tens of thousands of people lived, worked, and spent their private time…(More)”.
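The analysis itself is straightforward to sketch. Below is a simplified, hypothetical reconstruction in Python, not the authors’ code: cluster a user’s historical GPS points, then use tweet timestamps to label the cluster visited overnight as a likely home and the one visited during working hours as a likely workplace. It assumes scikit-learn and NumPy; coordinates and hours are invented.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Invented per-user data: (latitude, longitude, hour-of-day) triples taken
# from the GPS metadata of a user's geotagged tweets.
points = np.array([
    (40.7310, -73.9970, 23), (40.7311, -73.9971, 1), (40.7309, -73.9969, 7),
    (40.7484, -73.9857, 10), (40.7485, -73.9856, 14), (40.7483, -73.9858, 16),
])
coords, hours = points[:, :2], points[:, 2]

# Group nearby coordinates; eps is in degrees (~100 m at this latitude).
labels = DBSCAN(eps=0.001, min_samples=2).fit_predict(coords)

for cluster in set(labels) - {-1}:  # -1 marks noise points
    mask = labels == cluster
    # Crude timestamp heuristic: mostly late-night visits suggest home,
    # mostly business-hour visits suggest work.
    night = sum(1 for h in hours[mask] if h >= 21 or h <= 8)
    kind = "likely home" if night > mask.sum() / 2 else "likely work"
    print(kind, coords[mask].mean(axis=0).round(4))
```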
Paper by Jäske, Maija and Ertiö, …
Eric Topol in Nature: “The use of artificial intelligence, and the deep-learning subtype in particular…
Technological change happens in much the same way. Small changes accumulate, and suddenly the world is a different place. Throughout my career at O’Reilly Media, we’ve tracked and fostered a lot of “gradually, then suddenly” movements: the World Wide Web, open source software, big data, cloud computing, sensors and ubiquitous computing, and now the pervasive effects of AI and algorithmic systems on society and the economy.
What are some of the things that are in the middle of their “gradually, then suddenly” transition right now? The list is long; here are a few of the areas that are on my mind.
1) AI and algorithms are everywhere
The most important trend for readers of this newsletter to focus on is the development of new kinds of partnership between human and machine. We take for granted that algorithmic systems do much of the work at online sites like Google, Facebook, Amazon, and Twitter, but we haven’t fully grasped the implications. These systems are hybrids of human and machine. Uber, Lyft, and Amazon Robotics brought this pattern to the physical world, reframing the corporation as a vast, buzzing network of humans both guiding and guided by machines. In these systems, the algorithms decide who gets what and why; they’re changing the fundamentals of market coordination in ways that gradually, then suddenly, will become apparent.
2) The rest of the world is leapfrogging the US
The volume of mobile payments in China is $13 trillion versus the US’s $50 billion, in a country where credit cards never took hold. Already Zipline’s on-demand drones are delivering 20% of all blood supplies in Rwanda and will be coming soon to other countries (including the US). In each case, the lack of existing infrastructure turned out to be an advantage in adopting a radically new model. Expect to see this pattern recur, as incumbents and old thinking hold back the adoption of new models.
9) The crisis of faith in government
Ever since Jennifer Pahlka and I began working on the Gov 2.0 Summit back in 2008, we’ve been concerned that if we can’t get…
Blog post by Eline Chivot: “Writing ever more complicated and intrusive rules about data processing and data use…
Weyl and Lanier’s argument is motivated by the belief that because Internet users are getting so many valuable services—like search, email, maps, and social networking—for free, they must be paying with their data. Therefore, they argue, if users are paying with their data, they should get something in return. Never mind that they do get something in return: valuable digital services that they do not pay for monetarily. But Weyl and Lanier say this is not enough, and consumers should get more.
While this idea may sound good on paper, in practice, it would be a disaster.
…Weyl and Lanier’s self-declared objective is to ensure digital dignity, but in practice this proposal would disrupt the equal treatment users receive from digital services today by valuing users based on their net worth. In this techno-socialist nirvana, to paraphrase Orwell, some pigs would be more equal than others. The French Data Protection Authority, CNIL, itself raised concerns about treating data as a commodity, warning that doing so would jeopardize society’s humanist values and fundamental rights which are, in essence, priceless.
To ensure “a better digital society,” companies should continue to be allowed to decide the best Internet business models based on what consumers demand. Data is neither cash nor a commodity, and pursuing policies based on this misconception will damage the digital economy and make the lives of digital consumers considerably worse….(More)”.
Ran Goldblatt, Trevor Monroe, Sarah Elizabeth Antos, Marco Hernandez at the World Bank Data Blog: “The desire of human beings to “think spatially” to understand how people and objects are organized in space has not changed much since Eratosthenes—the Greek astronomer…
The increasing availability of satellite data has transformed how we use remote sensing analytics to understand, monitor and achieve the 2030 Sustainable Development Goals. As satellite data becomes ever more accessible and frequent, it is now possible not only to better understand how the Earth is changing, but also to utilize these insights to improve decision making, guide policy, deliver services, and promote better-informed governance. Satellites capture many of the physical, economic and social characteristics of Earth, providing a unique asset for developing countries, where reliable socio-economic and demographic data is often not consistently available. Analysis of satellite data was once relegated to researchers with access to costly data or to “super computers”. Today, the increased availability of “free” satellite data, combined with powerful cloud computing and open source analytical tools have democratized data innovation, enabling local governments and agencies to use satellite data to improve sector diagnostics, development indicators, program monitoring and service delivery.
Drivers of innovation in satellite measurements
- Big (geo)data – Satellites are improving every day, creating new opportunities for impact in global development. They capture millions of images of Earth in different spatial, spectral and temporal resolutions, generating data in ever increasing volume, variety and velocity.
- Open Source – Open source annotated datasets, the World Bank’s Open Data, and other publicly available resources make it possible to process and document the data (e.g. Cumulus, Label Maker) and to perform machine learning analysis using common programming languages such as R or Python.
- Crowd – Crowdsourcing platforms like MTurk, Figure Eight and Tomnod are used to collect and enhance inputs (reference data) that train machines to automatically identify specific objects and land cover on Earth.
- High Quality Ground Truth – Robust algorithms that analyze the entire planet require diverse training data; traditional development microdata can serve in machine learning training, validation and calibration, for example, to map urbanization processes.
- Cloud – Cloud computing and data storage capabilities within platforms like AWS, Azure and Google Earth Engine provide scalable solutions for storage, management and parallel processing of large volumes of data (a minimal example follows this list).
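As one concrete illustration of how accessible this stack has become, here is a hedged sketch using the Google Earth Engine Python API. It assumes an authenticated Earth Engine account; the dataset ID is a real catalog entry, but treat the region and parameters as illustrative.

```python
import ee

ee.Initialize()  # assumes prior `earthengine authenticate`

# Nighttime lights are a common satellite proxy for economic activity and
# urbanization: VIIRS monthly composites, average radiance band.
lights = (
    ee.ImageCollection("NOAA/VIIRS/DNB/MONTHLY_V1/VCMCFG")
    .filterDate("2018-01-01", "2019-01-01")
    .select("avg_rad")
    .mean()
)

# Rough bounding box around Kigali, Rwanda (illustrative coordinates).
region = ee.Geometry.Rectangle([29.9, -2.1, 30.3, -1.8])

# Mean radiance over the region; the computation runs on Google's
# servers, so no imagery is downloaded locally.
stats = lights.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=500)
print(stats.getInfo())
```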
…As petabytes of geo data are collected, novel methods are being developed to convert these data into meaningful information about the nature and pace of change on Earth, for example, the formation of urban landscapes and human settlements, the creation of transportation networks that connect cities or the conversion of natural forests into productive agricultural land. New possibilities emerge for harnessing this data for a better understanding…