Drones to deliver medicines to 12m people in Ghana


Neil Munshi in the Financial Times: “The world’s largest drone delivery network, ferrying 150 different medicines and vaccines, as well as blood, to 2,000 clinics in remote parts of Ghana, is set to be announced on Wednesday.

The network represents a big expansion for the Silicon Valley start-up Zipline, which began delivering blood in Rwanda in 2016 using pilotless, preprogrammed aircraft. The move, along with a new agreement in Rwanda signed in December, takes the company beyond simple blood distribution to more complicated vaccine and plasma deliveries.

“What this is going to show is that you can reach every GPS co-ordinate, you can serve everybody,” said Keller Rinaudo, Zipline chief executive. “Every human in that region or country [can be] within a 15-25 minute delivery of any essential medical product — it’s a different way of thinking about universal coverage.”

Zipline will deliver vaccines for yellow fever, polio, diphtheria and tetanus, which are provided by the World Health Organisation’s Expanded Programme on Immunisation. The WHO will also use the company’s system for future mass immunisation programmes in Ghana.

Later this year, Zipline plans to start operations in the US, in North Carolina, and in south-east Asia. The company said it will be able to serve 100m people within a year, up from the 22m that its projects in Ghana and Rwanda will cover.

In Ghana, Zipline said health workers will receive deliveries via a parachute drop within about 30 minutes of placing their orders by text message….(More)”.

Open data promotes citizen engagement at the local level


Afua Bruce at the Hill: “The city of Los Angeles recently released three free apps for its citizens: one to report broken street lighting, one to make 311 requests and one to get early alerts about earthquakes. Though it may seem like the city is just following a trend to modernize, the apps are part of a much larger effort to spread awareness of the more than 1,100 datasets that the city has made public for citizens to view, analyze and share. In other words, the city has officially embraced the open data movement.

In the past few years, communities across the country have realized the power of data once only available to government. Often, the conversation about data focuses on criminal justice, because the demand for this data is being met by high-profile projects like Kamala Harris’ Open Justice Initiative, which makes California criminal justice data available to the citizenry, and the Open Data Policing Project, which provides a publicly searchable database of stop, search and use-of-force data. But the possibilities for data go far beyond justice, with uses in a variety of spaces, such as efforts to preserve local wildlife, track potholes and understand community health trends….(More)”.

How Recommendation Algorithms Run the World


Article by Zeynep Tufekci: “What should you watch? What should you read? What’s news? What’s trending? Wherever you go online, companies have come up with very particular, imperfect ways of answering these questions. Everywhere you look, recommendation engines offer striking examples of how values and judgments become embedded in algorithms and how algorithms can be gamed by strategic actors.

Consider a common, seemingly straightforward method of making suggestions: a recommendation based on what people “like you” have read, watched, or shopped for. What exactly is a person like me? Which dimension of me? Is it someone of the same age, gender, race, or location? Do they share my interests? My eye color? My height? Or is their resemblance to me determined by a whole mess of “big data” (aka surveillance) crunched by a machine-learning algorithm?

Deep down, behind every “people like you” recommendation is a computational method for distilling stereotypes through data. Even when these methods work, they can help entrench the stereotypes they’re mobilizing. They might easily recommend books about coding to boys and books about fashion to girls, simply by tracking the next most likely click. Of course, that creates a feedback cycle: If you keep being shown coding books, you’re probably more likely to eventually check one out.
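
To make the mechanics concrete, here is a minimal, hypothetical sketch of a “people like you” recommender of the kind described above: users are reduced to click histories, the closest neighbours are found by overlap, and whatever those neighbours clicked becomes the recommendation. All names and data are invented for illustration.

```python
# Sketch of a "people like you" recommender: find the users whose click
# histories most resemble yours, then recommend what they clicked and you
# haven't. All data below is invented.

from collections import Counter

clicks = {  # user -> set of items they clicked (hypothetical data)
    "ana":  {"coding_book", "robot_kit", "sci_fi_novel"},
    "ben":  {"coding_book", "robot_kit", "chess_set"},
    "cara": {"fashion_mag", "sewing_kit", "sci_fi_novel"},
}

def jaccard(a, b):
    """Similarity of two users = overlap of their click histories."""
    return len(a & b) / len(a | b)

def recommend(user, k=2, n=3):
    """Recommend items clicked by the user's k nearest 'people like you'."""
    history = clicks[user]
    neighbours = sorted(
        (u for u in clicks if u != user),
        key=lambda u: jaccard(history, clicks[u]),
        reverse=True,
    )[:k]
    counts = Counter(item for u in neighbours for item in clicks[u] - history)
    return [item for item, _ in counts.most_common(n)]

print(recommend("ana"))  # items that "people like you" clicked but you haven't yet
```

Note how the feedback cycle in the excerpt emerges naturally: whatever the nearest neighbours already clicked is exactly what gets shown next.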

Another common method for generating recommendations is to extrapolate from patterns in how people consume things. People who watched this then watched that; shoppers who purchased this item also added that one to their shopping cart. Amazon uses this method a lot, and I admit, it’s often quite useful. Buy an electric toothbrush? How nice that the correct replacement head appears in your recommendations. Congratulations on your new vacuum cleaner: Here are some bags that fit your machine.
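
A rough sketch of this second, co-occurrence style of recommendation, in the spirit of “shoppers who purchased this item also added that one”; again the baskets and item names are invented.

```python
# Count how often pairs of items appear in the same basket, then recommend
# the items that most often co-occur with a given purchase.

from collections import defaultdict
from itertools import combinations

baskets = [  # hypothetical order histories
    ["electric_toothbrush", "replacement_heads", "floss"],
    ["electric_toothbrush", "replacement_heads"],
    ["vacuum_cleaner", "vacuum_bags"],
    ["vacuum_cleaner", "vacuum_bags", "floss"],
]

co_counts = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for a, b in combinations(set(basket), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def also_bought(item, n=2):
    """Items most frequently purchased together with `item`."""
    ranked = sorted(co_counts[item].items(), key=lambda kv: -kv[1])
    return [other for other, _ in ranked[:n]]

print(also_bought("electric_toothbrush"))  # ['replacement_heads', 'floss']
```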

But these recommendations can also be revealing in ways that are creepy. …

One final method for generating recommendations is to identify what’s “trending” and push that to a broader user base. But this, too, involves making a lot of judgments….(More)”.

Five myths about whistleblowers


Dana Gold in the Washington Post: “When a whistleblower revealed the Trump administration’s decision to overturn 25 security clearance denials, it was the latest in a long and storied history of insiders exposing significant abuses of public trust. Whistles were blown on U.S. involvement in Vietnam, the Watergate coverup, Enron’s financial fraud, the National Security Agency’s mass surveillance of domestic electronic communications and, during the Trump administration, the corruption of former Environmental Protection Agency chief Scott Pruitt, Cambridge Analytica’s theft of Facebook users’ data to develop targeted political ads, and harm to children posed by the “zero tolerance” immigration policy. Despite the essential role whistleblowers play in illuminating the truth and protecting the public interest, several myths persist about them, some pernicious.

MYTH NO. 1 Whistleblowers are employees who report problems externally….

MYTH NO. 2 Whistleblowers are either disloyal or heroes….

MYTH NO. 3 ‘Leaker’ is another term for ‘whistleblower.’…

MYTH NO. 4 Remaining anonymous is the best strategy for whistleblowing….

MYTH NO. 5 Julian Assange is a whistleblower….(More)”.

Illuminating Big Data will leave governments in the dark


Robin Wigglesworth in the Financial Times: “Imagine a world where interminable waits for backward-looking, frequently-revised economic data seem as archaically quaint as floppy disks, beepers and a civil internet. This fantasy realm may be closer than you think.

The Bureau of Economic Analysis will soon publish its preliminary estimate for US economic growth in the first three months of the year, finally catching up on its regular schedule after a government shutdown paralysed the agency. But other data are still delayed, and the final official result for US gross domestic product won’t be available until July. Along the way there are likely to be many tweaks.

Collecting timely and accurate data is a Herculean task, especially for an economy as vast and varied as the US’s. But last week’s World Bank-International Monetary Fund annual spring meetings offered some clues on a brighter, more digital future for economic data.

The IMF hosted a series of seminars and discussions exploring how the hot new world of Big Data could be harnessed to produce more timely economic figures — and improve economic forecasts. Jiaxiong Yao, an IMF official in its African department, explained how it could use satellites to measure the intensity of night-time lights, and derive a real-time gauge of economic health.

“If a country gets brighter over time, it is growing. If it is getting darker then it probably needs an IMF programme,” he noted. Further sessions explored how the IMF could use machine learning — a popular field of artificial intelligence — to improve its influential but often faulty economic forecasts; and real-time shipping data to map global trade flows.
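
As an illustration of the night-lights idea (not the IMF’s actual methodology), the sketch below averages a country’s satellite radiance values for two periods and reads the percentage change as a crude, real-time growth proxy; the numbers are placeholders.

```python
# Brighter night-time lights year over year are read as a rough sign of
# economic growth; dimmer lights as a sign of contraction. Data is invented.

def mean_brightness(pixels):
    """Average radiance over a grid of night-time-light pixels."""
    return sum(pixels) / len(pixels)

def luminosity_growth(pixels_last_year, pixels_this_year):
    """Percentage change in average brightness between two periods."""
    last = mean_brightness(pixels_last_year)
    now = mean_brightness(pixels_this_year)
    return 100 * (now - last) / last

last_year = [0.8, 1.1, 0.4, 2.3, 0.0, 1.7]   # hypothetical radiance grid
this_year = [0.9, 1.3, 0.5, 2.6, 0.1, 1.8]

change = luminosity_growth(last_year, this_year)
print(f"Night-lights proxy: {change:+.1f}%")  # brighter, so likely growing
```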

Sophisticated hedge funds have been mining some of these new “alternative” data sets for some time, but statistical agencies, central banks and multinational organisations such as the IMF and the World Bank are also starting to embrace the potential.

The amount of digital data around the world is already unimaginably vast. As more of our social and economic activity migrates online, the quantity and quality is going to increase exponentially. The potential is mind-boggling. Setting aside the obvious and thorny privacy issues, it is likely to lead to a revolution in the world of economic statistics. …

Yet the biggest issues are not the weaknesses of these new data sets — all statistics have inherent flaws — but their nature and location.

Firstly, this future depends on lax regulatory and personal attitudes towards personal data continuing, and there are signs of a (healthy) backlash brewing.

Secondly, almost all of this alternative data is being generated and stored in the private sector, not by government bodies such as the Bureau of Economic Analysis, Eurostat or the UK’s Office for National Statistics.

Public bodies are generally too poorly funded to buy or clean all this data themselves, meaning hedge funds will benefit from better economic data than the broader public. We might, in fact, need legislation mandating that statistical agencies receive free access to any aggregated private sector data sets that might be useful to their work.

That would ensure that our economic officials and policymakers don’t fly blind in an increasingly illuminated world….(More)”.

There Are Better Ways to Do Democracy


Article by Peter Coy: “The Brexit disaster has stained the reputation of direct democracy. The United Kingdom’s trauma began in 2016, when then-Prime Minister David Cameron miscalculated that he could strengthen Britain’s attachment to the European Union by calling a referendum on it. The Leave campaign made unkeepable promises about Brexit’s benefits. Voters spent little time studying the facts because there was a vanishingly small chance that any given vote would make the difference by breaking a tie. Leave won—and Google searches for “What is the EU” spiked after the polls closed.

Brexit is only one manifestation of a global problem. Citizens want elected officials to be as responsive as Uber drivers, but they don’t always take their own responsibilities seriously. This problem isn’t new. America’s Founding Fathers worried that democracy would devolve into mob rule; the word “democracy” appears nowhere in the Declaration of Independence or the Constitution.

While fears about democratic dysfunction are understandable, there are ways to make voters into real participants in the democratic process without giving in to mobocracy. Instead of referendums, which often become lightning rods for extremism, political scientists say it’s better to make voters think like jurors, whose decisions affect the lives and fortunes of others.

Guided deliberation, also known as deliberative democracy, is one way to achieve that. Ireland used it in 2016 and 2017 to help decide whether to repeal a constitutional amendment that banned abortion in most cases. A 99-person Citizens’ Assembly was selected to mirror the Irish population. It met over five weekends to evaluate input from lawyers and obstetricians, pro-life and pro-choice groups, and more than 13,000 written submissions from the public, guided by a chairperson from the Irish supreme court. Together they concluded that the legislature should have the power to allow abortion under a broader set of conditions, a recommendation that voters approved in a 2018 referendum; abortion in Ireland became legal in January 2019.

Done right, deliberative democracy brings out the best in citizens. “My experience shows that some of the most polarising issues can be tackled in this manner,” Louise Caldwell, an Irish assembly member, wrote in a column for the Guardian in January. India’s village assemblies, which involve all the adults in local decision-making, are a form of deliberative democracy on a grand scale. A March article in the journal Science says that “evidence from places such as Colombia, Belgium, Northern Ireland, and Bosnia shows that properly structured deliberation can promote recognition, understanding, and learning.” Even French President Emmanuel Macron has used it, convening a three-month “great debate” to solicit the public’s views on some of the issues raised by the sometimes-violent Yellow Vest movement. On April 8, Prime Minister Edouard Philippe presented one key finding: The French have “zero tolerance” for new taxes…(More)”.

Tracking Phones, Google Is a Dragnet for the Police


Jennifer Valentino-DeVries at the New York Times: “….The warrants, which draw on an enormous Google database employees call Sensorvault, turn the business of tracking cellphone users’ locations into a digital dragnet for law enforcement. In an era of ubiquitous data gathering by tech companies, it is just the latest example of how personal information — where you go, who your friends are, what you read, eat and watch, and when you do it — is being used for purposes many people never expected. As privacy concerns have mounted among consumers, policymakers and regulators, tech companies have come under intensifying scrutiny over their data collection practices.

The Arizona case demonstrates the promise and perils of the new investigative technique, whose use has risen sharply in the past six months, according to Google employees familiar with the requests. It can help solve crimes. But it can also snare innocent people.

Technology companies have for years responded to court orders for specific users’ information. The new warrants go further, suggesting possible suspects and witnesses in the absence of other clues. Often, Google employees said, the company responds to a single warrant with location information on dozens or hundreds of devices.
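
As a purely illustrative sketch (not Google’s Sensorvault or its actual query interface), a geofence-style request can be thought of as filtering a store of location records down to the devices seen inside a bounding box during a time window; the records and field names below are invented.

```python
# Filter anonymised location records to the devices that were inside a
# bounding box during a time window. All records are hypothetical.

from datetime import datetime

records = [  # (device_id, timestamp, latitude, longitude)
    ("device_001", datetime(2019, 3, 1, 21, 5), 33.4485, -112.0740),
    ("device_002", datetime(2019, 3, 1, 21, 20), 33.4490, -112.0731),
    ("device_003", datetime(2019, 3, 2, 9, 0), 33.5000, -112.2000),
]

def geofence_query(rows, lat_min, lat_max, lon_min, lon_max, start, end):
    """Return device ids seen inside the box between start and end."""
    return sorted({
        device
        for device, ts, lat, lon in rows
        if start <= ts <= end
        and lat_min <= lat <= lat_max
        and lon_min <= lon <= lon_max
    })

hits = geofence_query(
    records,
    lat_min=33.44, lat_max=33.45,
    lon_min=-112.08, lon_max=-112.07,
    start=datetime(2019, 3, 1, 20, 0),
    end=datetime(2019, 3, 1, 22, 0),
)
print(hits)  # ['device_001', 'device_002']: candidates, not confirmed suspects
```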

Law enforcement officials described the method as exciting, but cautioned that it was just one tool….

The technique illustrates a phenomenon privacy advocates have long referred to as the “if you build it, they will come” principle — anytime a technology company creates a system that could be used in surveillance, law enforcement inevitably comes knocking. Sensorvault, according to Google employees, includes detailed location records involving at least hundreds of millions of devices worldwide and dating back nearly a decade….(More)”.

The Privacy Project


The New York Times: “Companies and governments are gaining new powers to follow people across the internet and around the world, and even to peer into their genomes. The benefits of such advances have been apparent for years; the costs — in anonymity, even autonomy — are now becoming clearer. The boundaries of privacy are in dispute, and its future is in doubt. Citizens, politicians and business leaders are asking if societies are making the wisest tradeoffs. The Times is embarking on this months-long project to explore the technology and where it’s taking us, and to convene debate about how it can best help realize human potential….(More)”

Does Privacy Matter?

What Do They Know, and How Do They Know It?

What Should Be Done About This?

What Can I Do?

(View all Privacy articles…)

Filling a gap: the clandestine gang fixing Rome illegally


Giorgio Ghiglione in The Guardian: “It is 6am on a Sunday and the streets of the Ostiense neighbourhood in southern Rome are empty. The metro has just opened and nearby cafes still await their first customers.

Seven men and women are working hard, their faces obscured by scarves and hoodies as they unload bags of cement and sand from a car near the Basilica of St Paul Outside the Walls.

They are not criminals. Members of the secret Gap organisation, they hide their identities because what they are doing – fixing a broken pavement without official permission – is technically illegal.

City maintenance – or the lack of it – has long been a hot-button issue in Italy’s capital. There are an estimated 10,000 potholes in the city – a source of frustration for the many Romans who travel by scooter. Garbage collection has also become a major problem since the city’s landfill was closed in 2013, with periodic “waste crises” where trash piles up in the streets. Cases of exploding buses and the collapse of a metro escalator made international headlines.

The seven clandestine pavement-fixers are part of a network of about 20 activists quietly doing the work that the city authorities have failed to do. Gap stands for Gruppi Artigiani Pronto Intervento (“groups of artisan emergency services”), but is also a tribute to the partisans of Gruppi di Azione Patriottica, who fought the fascists during the second world war.

“We chose this name because many of our parents or grandparents were partisans and we liked the idea of honouring their memory,” says one of the activists, a fiftysomething architect who goes by the pseudonym Renato. While the modern-day Gap aren’t risking their lives, their modus operandi is inspired by resistance saboteurs: they identify a target, strike and disappear unseen into the city streets.

Gap have been busy over the past few months. In December they repaired the 1940s-built fountain of the Principe di Piemonte primary school. In January they painted a pedestrian crossing on a dangerous major road. Their latest work, the pavement fixing in Ostiense, involved filling a deep hole that regularly flooded when it rained….(More)”.

The Automated Administrative State


Paper by Danielle Citron and Ryan Calo: “The administrative state has undergone radical change in recent decades. In the twentieth century, agencies in the United States generally relied on computers to assist human decision-makers. In the twenty-first century, computers are making agency decisions themselves. Automated systems are increasingly taking human beings out of the loop. Computers terminate Medicaid to cancer patients and deny food stamps to individuals. They identify parents believed to owe child support and initiate collection proceedings against them. Computers purge voters from the rolls and deem small businesses ineligible for federal contracts [1].

Automated systems built in the early 2000s eroded procedural safeguards at the heart of the administrative state. When government makes important decisions that affect our lives, liberty, and property, it owes us “due process”— understood as notice of, and a chance to object to, those decisions. Automated systems, however, frustrate these guarantees. Some systems like the “no-fly” list were designed and deployed in secret; others lacked record-keeping audit trails, making review of the law and facts supporting a system’s decisions impossible. Because programmers working at private contractors lacked training in the law, they distorted policy when translating it into code [2].

Some of us in the academy sounded the alarm as early as the 1990s, offering an array of mechanisms to ensure the accountability and transparency of the automated administrative state [3]. Yet the same pathologies continue to plague government decision-making systems today. In some cases, these pathologies have deepened and extended. Agencies lean upon algorithms that turn our personal data into predictions, professing to reflect who we are and what we will do. The algorithms themselves increasingly rely upon techniques, such as deep learning, that are even less amenable to scrutiny than purely statistical models. Ideals of what the administrative law theorist Jerry Mashaw has called “bureaucratic justice” in the form of efficiency with a “human face” feel impossibly distant [4].

The trend toward more prevalent and less transparent automation in agency decision-making is deeply concerning. For a start, we have yet to address in any meaningful way the widening gap between the commitments of due process and the actual practices of contemporary agencies [5]. Nonetheless, agencies rush to automate (surely due to the influence and illusive promises of companies seeking lucrative contracts), trusting algorithms to tell us if criminals should receive probation, if public school teachers should be fired, or if severely disabled individuals should receive less than the maximum of state-funded nursing care [6]. Child welfare agencies conduct intrusive home inspections because some system, which no party to the interaction understands, has rated a poor mother as having a propensity for violence. The challenge of preserving due process in light of algorithmic decision-making is an area of renewed and active attention within academia, civil society, and even the courts [7].

Second, and routinely overlooked, we are applying the new affordances of artificial intelligence in precisely the wrong contexts…(More)”.