Using location data responsibly in cities and local government


Article by Ben Hawes: “City and local governments increasingly recognise the power of location data to help them deliver more and better services, exactly where and when they are needed. The use of this data is going to grow, with more pressure to manage resources and with emerging challenges, including the need to respond to extreme weather events and other climate impacts.

But using location data to target and manage local services comes with risks to the equitable delivery of services, privacy and accountability. To make the best use of these growing data resources, city leaders and their teams need to understand those risks and address them, and to be able to explain their uses of data to citizens.

The Locus Charter, launched earlier this year, is a set of common principles to promote responsible practice when using location data. The Charter could be very valuable to local governments, to help them navigate the challenges and realise the rewards offered by data about the places they manage….

Compared to private companies, local public bodies already have special responsibilities to ensure transparency and fairness. New sources of data can help, but can also generate new questions. Local governments have generally been able to improve services as they learned more about the people they served. Now they must manage the risks of knowing too much about people, and acting intrusively. They can also risk distorting service provision because their data about people in places is uneven or unrepresentative.

Many city and local governments fully recognise that data-driven delivery comes with risks, and are developing specific local data ethics frameworks to guide their work. Some of these, like Kansas City’s, are specifically aimed at managing data privacy. Others cover broader uses of data, like Greater Manchester’s Declaration for Intelligent and Responsible Data Practice. A separate initiative, DTPR, is an open-source communication standard that helps people understand how data is being used in public places.

London is engaging citizens on an Emerging Technology Charter to explore new and ethically charged questions around data. The GovLab supports an AI Localism repository of actions taken by local decision-makers to address the use of AI within a city or community. The EU SHERPA programme (Shaping the Ethical Dimensions of Smart Information Systems) includes a smart cities strand and has published a case study on the Ethics of Using Smart City AI and Big Data.

Smart city applications make it possible to collect data in many ways and for many purposes, but the technologies themselves cannot answer questions about what is appropriate. In The Smart Enough City: Putting Technology in its Place to Reclaim Our Urban Future (2019), author Ben Green describes examples of cities that have failed, and others that have succeeded, in judging which smart applications should be used.

Attention to what constitutes ethical practice with location data can give additional help to leaders making that kind of judgement….(More)”

Licensure as Data Governance


Essay by Frank Pasquale: “…A licensure regime for data and the AI it powers would enable citizens to democratically shape data’s scope and proper use, rather than resigning ourselves to being increasingly influenced and shaped by forces beyond our control. To ground the case for more ex ante regulation, Part I describes the expanding scope of data collection, analysis, and use, and the threats that this scope poses to data subjects. Part II critiques consent-based models of data protection, while Part III examines the substantive foundation of licensure models. Part IV addresses a key challenge to my approach: the free expression concerns raised by the licensure of large-scale personal data collection, analysis, and use. Part V concludes with reflections on the opportunities created by data licensure frameworks and potential limitations upon them….(More)”.

Global citizens’ assembly to be chosen for UN climate talks


Article by Fiona Harvey: “One hundred people from around the world are to take part in a citizens’ assembly to discuss the climate crisis over the next month, before presenting their findings at the UN Cop26 climate summit.

The Global Citizens’ Assembly will be representative of the world’s population, and will invite people chosen by lottery to take part in online discussions that will culminate in November, during the fortnight-long climate talks that open in Glasgow on 31 October.
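The article does not describe the selection mechanics in detail. As a purely illustrative sketch, a lottery that aims for demographic representativeness could stratify the pool of registrants (here by world region) and draw seats in proportion to population shares; the strata, figures and names below are hypothetical, not the Global Assembly’s actual method.

```python
import random

# Hypothetical population shares by stratum; a real civic lottery would use
# many more attributes (age, gender, education, attitudes) and real figures.
population_shares = {"Africa": 0.18, "Asia": 0.59, "Europe": 0.10,
                     "Latin America": 0.08, "North America": 0.05}

def stratified_lottery(registrants, shares, seats=100, seed=42):
    """registrants: dict mapping stratum -> list of candidate IDs."""
    rng = random.Random(seed)
    selected = []
    for stratum, share in shares.items():
        quota = round(seats * share)                         # seats owed to this stratum
        pool = registrants.get(stratum, [])
        selected += rng.sample(pool, min(quota, len(pool)))  # random draw within stratum
    return selected

# Fake registrant pools keyed by region, for demonstration only.
registrants = {r: [f"{r}-{i}" for i in range(1000)] for r in population_shares}
members = stratified_lottery(registrants, population_shares)
print(len(members))  # 100
```

Real assembly lotteries typically also weight the draw to correct for uneven sign-up rates across groups, which this sketch ignores.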

Funded with nearly $1m from sources including the Scottish government and the European Climate Foundation, the assembly is supported by the UN and the UK and is run by a coalition of more than 100 organisations…

A team of international scientists and other experts will explain details of the climate crisis and potential solutions, and members of the assembly will discuss how these might work in practice, seeking to answer the question: “How can humanity address the climate and ecological crisis in a fair and effective way?”. The key messages from their discussions will be presented at Cop26 and a report will be published in March.

Alok Sharma, the UK cabinet minister who will act as president of the Cop26 summit, said: “The Global Assembly is a fantastic initiative and was selected for representation in the green zone [of the Cop26 presentation hall] because we recognise just how important its work is and also because we are committed to bringing the voice of global citizens into the heart of Cop26. It creates that vital link between local conversation and a global conference.”…(More)”.

Pandora Papers & Data Journalism: how investigative journalists use tech


Article at Moonshot: “The Pandora Papers’ 11.9 million records arrived from 14 different offshore services firms; the 2.94-terabyte data trove exposes the offshore secrets of wealthy elites from more than 200 countries and territories.

It contains data for 330 politicians and public officials, from more than 90 countries and territories, including 35 current and former country leaders, as well as celebrities, fraudsters, drug dealers, royal family members and leaders of religious groups around the world.

It involved more than 600 journalists from 150 media outlets in 117 countries.

It took ICIJ more than a year to structure, research and analyze the data, which will be incorporated into the Offshore Leaks database. The task involved three main elements: journalists, technology and time….(More)”

Volunteers Sped Up Alzheimer’s Research


Article by SciStarter: “Across the United States, 5.7 million people are living with Alzheimer’s disease, the seventh leading cause of death in America. But there is still no treatment or cure. Alzheimer’s hits close to home for many of us who have seen loved ones suffer and who feel hopeless in the face of this disease. With Stall Catchers, an online citizen science project, joining the fight against Alzheimer’s is as easy as playing an online computer game…

Scientists at Cornell University found a link between “stalled” blood vessels in the brain and the symptoms of Alzheimer’s. These stalled vessels limit blood flow to the brain by up to 30 percent. In experiments with laboratory mice, when the blood cells causing the stalls were removed, the mice performed better on memory tests.

The researchers are working to develop Alzheimer’s treatments that remove the stalls in mice, in the hope that they can apply these methods to humans. But analyzing the brain images to find the stalled capillaries is hard and time-consuming. It could take a trained laboratory technician six to 12 months to analyze a single week’s worth of collected data.

So, Cornell researchers created Stall Catchers to make finding the stalled blood vessels into a game that anyone can play. The game relies on the power of the crowd, waiting for multiple confirmed answers before determining whether a vessel is stalled or flowing…
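The excerpt does not spell out how those confirmed answers are combined. A minimal sketch of that kind of crowd consensus, assuming a simple vote threshold (the function name, minimum vote count and agreement level are illustrative, not Stall Catchers’ actual rule), could look like this:

```python
from collections import Counter

def crowd_label(votes, min_votes=5, agreement=0.8):
    """Aggregate volunteer answers for one vessel.

    votes: list of "stalled" / "flowing" answers from different players.
    Returns a consensus label once enough votes agree, otherwise None,
    meaning the vessel should be shown to more players.
    """
    if len(votes) < min_votes:
        return None                       # not enough answers yet
    counts = Counter(votes)
    label, top = counts.most_common(1)[0]
    if top / len(votes) >= agreement:     # e.g. at least 80% of players agree
        return label
    return None                           # ambiguous: keep collecting votes

print(crowd_label(["stalled", "stalled", "flowing", "stalled", "stalled"]))  # stalled
print(crowd_label(["stalled", "flowing", "stalled"]))                        # None (too few votes)
```

Projects like this often also weight each volunteer’s vote by their accuracy on images with known answers, so reliable players need fewer agreeing answers; the fixed thresholds above are the simplest version of the idea.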

Since its inception in 2016, the project has grown steadily, addressing various datasets and uncovering new insights about Alzheimer’s disease. Citizen scientists who play the game identify blood vessels as “flowing” or “stalled,” earning points for their classifications.

One way Stall Catchers makes this research fun is by allowing volunteers to form teams and engage in friendly competition…(More)”.

Putting data at the heart of policymaking will accelerate London’s recovery


Mel Hobson at Computer Weekly: “…London’s mayor, Sadiq Khan, knows how important this is. His re-election manifesto committed to rebuilding the London Datastore, currently home to over 700 freely available datasets, as the central register linking data across our city. That in turn will help analysts, researchers and policymakers understand our city and develop new ideas and solutions.

To help take the next step and create a data ecosystem that can improve millions of Londoners’ lives, businesses across our capital are committing their expertise and insights.

At London First, we have launched the London Data Charter, expertly put together by Pinsent Masons, which sets out the guiding principles for the private and public sector data collaborations that are key to creating this ecosystem. These include a focus on protecting the privacy and security of data, promoting trust, and sharing learnings with others, in order to create scalable solutions to meet the capital’s challenges….(More)”.

World Wide Weird: Rise of the Cognitive Ecosystem


Braden R. Allenby at Issues: “Social media, artificial intelligence, the Internet of Things, and the data economy are coming together in a way that transcends how humans understand and control our world.

In the beginning of the movie 2001: A Space Odyssey, an ape, after hugging a strange monolith, picks up a bone and randomly begins playing with it … and then, as Richard Strauss’s Also sprach Zarathustra rings in the background, the ape realizes that the bone it is holding is, in fact, a weapon. The ape, the bone, and the landscape remain exactly the same, yet something fundamental has changed: an ape casually holding a bone is a very different system than an ape consciously wielding a weapon. The warrior ape is an emergent cognitive phenomenon, neither required nor deterministically produced by the constituent parts: a bone, and an ape, in a savannah environment.

Cognition as an emergent property of techno-human systems is not a new phenomenon. Indeed, it might be said that the ability of humans and their institutions to couple to their technologies to create such techno-human systems is the source of civilization itself. Since humans began producing artifacts, and especially since we began creating artifacts designed to capture, preserve, and transmit information—from illuminated manuscripts and Chinese oracle bones to books and computers—humans have integrated with their technologies to produce emergent cognitive results.

And these combinations have transformed the world. Think of the German peasants, newly literate, who were handed populist tracts produced on then-newfangled printing presses in 1530: the Reformation happened. Thanks to the printers, information and strategies flowed between the thinkers and the readers faster, uniting people across time and space. Eventually, the result was another fundamental shift in the cognitive structure: the Enlightenment happened.


In the 1980s Edwin Hutchins found another cognitive structure when he observed a pre-GPS crew navigating on a naval vessel: technology in the form of devices, charts, and books were combined with several individuals with specialized skills and training to produce knowledge of the ship’s position (the “fix”). No single entity, human or technological, contained the entire process; rather, as Hutchins observed: “An interlocking set of partial procedures can produce the overall observed pattern without there being a representation of that overall pattern anywhere in the system.” The fix arises as an emergent cognitive product that is nowhere found in the constituent pieces, be they technology or human; indeed, Hutchins speaks of “the computational ecology of navigation tools.”

Fast forward to today. It should be no surprise that at some point techno-human cognitive systems such as social media, artificial intelligence (AI), the Internet of Things (IoT), 5G, cameras, computers, and sensors should begin to form their own ecology—significantly different in character from human cognition….(More)”

The Downside to State and Local Privacy Regulations


GovTech: “To fight back against cyber threats, state and local governments have started to implement tighter privacy regulations. But is this trend a good thing? Or do stricter rules present more challenges than they do solutions?

According to Daniel Castro, vice president of the Information Technology and Innovation Foundation, one of the main issues with stricter privacy regulations is having no centralized rules for states to follow.

“Probably the biggest problem is states setting up a set of contradictory overlapping rules across the country,” Castro said. “This creates a serious cost on organizations and businesses. They can abide by 50 state privacy laws, but there could be different regulations across local jurisdictions.”

One example of a hurdle for organizations and businesses is local jurisdictions creating specific rules for facial recognition and biometric technology.

“Let’s say a company starts selling a smart doorbell service; because of these rules, this service might not be able to be legally sold in one jurisdiction,” Castro said.

Another concern relates to the distinction between government data collection and commercial data collection, said Washington state Chief Privacy Officer Katy Ruckle. Sometimes there is a notion that one law can apply to everything, but different data types involve different types of rights for individuals.

“An example I like to use is somebody that’s been committed to a mental health institution for mental health needs,” Ruckle said. “Their data collection is very different from somebody buying a vacuum cleaner off Amazon.”

On the topic of governments collecting data, Castro emphasized the importance of knowing how data will be utilized in order to set appropriate privacy regulations….(More)”

Greece used AI to curb COVID: what other nations can learn


Editorial at Nature: “A few months into the COVID-19 pandemic, operations researcher Kimon Drakopoulos e-mailed both the Greek prime minister and the head of the country’s COVID-19 scientific task force to ask if they needed any extra advice.

Drakopoulos works in data science at the University of Southern California in Los Angeles, and is originally from Greece. To his surprise, he received a reply from Prime Minister Kyriakos Mitsotakis within hours. The European Union was asking member states, many of which had implemented widespread lockdowns in March, to allow non-essential travel to recommence from July 2020, and the Greek government needed help in deciding when and how to reopen borders.

Greece, like many other countries, lacked the capacity to test all travellers, particularly those not displaying symptoms. One option was to test a sample of visitors, but Greece opted to trial an approach rooted in artificial intelligence (AI).

Between August and November 2020 — with input from Drakopoulos and his colleagues — the authorities launched a system that uses a machine-learning algorithm to determine which travellers entering the country should be tested for COVID-19. The authors found machine learning to be more effective at identifying asymptomatic people than was random testing or testing based on a traveller’s country of origin. According to the researchers’ analysis, during the peak tourist season, the system detected two to four times more infected travellers than did random testing.

The machine-learning system, which is among the first of its kind, is called Eva and is described in Nature this week (H. Bastani et al. Nature https://doi.org/10.1038/s41586-021-04014-z; 2021). It’s an example of how data analysis can contribute to effective COVID-19 policies. But it also presents challenges, from ensuring that individuals’ privacy is protected to the need to independently verify its accuracy. Moreover, Eva is a reminder of why proposals for a pandemic treaty (see Nature 594, 8; 2021) must consider rules and protocols on the proper use of AI and big data. These need to be drawn up in advance so that such analyses can be used quickly and safely in an emergency.

In many countries, travellers are chosen for COVID-19 testing at random or according to risk categories. For example, a person coming from a region with a high rate of infections might be prioritized for testing over someone travelling from a region with a lower rate.

By contrast, Eva collected not only travel history, but also demographic data such as age and sex from the passenger information forms required for entry to Greece. It then matched those characteristics with data from previously tested passengers and used the results to estimate an individual’s risk of infection. COVID-19 tests were targeted to travellers calculated to be at highest risk. The algorithm also issued tests to allow it to fill data gaps, ensuring that it remained up to date as the situation unfolded.
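Stripped to its essentials, that loop is: estimate a prevalence for each passenger “type” from past test results, spend most of the test budget on the riskiest arrivals, and reserve a slice of tests for exploration so estimates for rarely tested types stay current. A minimal sketch of that logic is below; the passenger types, prior counts and fixed 20% exploration share are assumptions for illustration, not the published Eva algorithm, which handles the same trade-off with a more principled, uncertainty-aware (bandit-style) approach.

```python
import random
from collections import defaultdict

# Empirical prevalence per passenger "type" (e.g. origin country + age band),
# updated from past test results. Counts start from a weak prior.
results = defaultdict(lambda: [1, 2])   # [positives, tests] per type

def risk(ptype):
    pos, n = results[ptype]
    return pos / n                      # estimated prevalence for this type

def allocate_tests(arrivals, budget, explore_frac=0.2, seed=0):
    """arrivals: list of (passenger_id, ptype). Returns ids selected for testing."""
    rng = random.Random(seed)
    n_explore = int(budget * explore_frac)
    # Exploit: test passengers whose type currently looks riskiest.
    ranked = sorted(arrivals, key=lambda a: risk(a[1]), reverse=True)
    chosen = ranked[: budget - n_explore]
    # Explore: spend the rest at random so estimates for other types stay fresh.
    remaining = [a for a in arrivals if a not in chosen]
    chosen += rng.sample(remaining, min(n_explore, len(remaining)))
    return [pid for pid, _ in chosen]

def record_result(ptype, positive):
    """Feed a completed test back into the prevalence estimates."""
    results[ptype][0] += int(positive)
    results[ptype][1] += 1
```

A fixed exploration fraction is the crudest way to keep the data fresh; as the editorial notes, the real system issued exploratory tests specifically to fill its data gaps as the situation unfolded.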

During the pandemic, there has been no shortage of ideas on how to deploy big data and AI to improve public health or assess the pandemic’s economic impact. However, relatively few of these ideas have made it into practice. This is partly because companies and governments that hold relevant data — such as mobile-phone records or details of financial transactions — need agreed systems to be in place before they can share the data with researchers. It’s also not clear how consent can be obtained to use such personal data, or how to ensure that these data are stored safely and securely…(More)”.

The Rise of the Pandemic Dashboard


Article by Marie Patino: “…All of these dashboards were launched very early in the pandemic,” said Damir Ivankovic, a PhD student at the University of Amsterdam. “Some of them were developed literally overnight, or over three sleepless nights in certain countries.” With Ph.D. researcher Erica Barbazza, Ivankovic has been leading a set of studies about Covid-19 dashboards with a network of researchers. For an upcoming paper that’s still unpublished, the pair have talked to more than 30 government dashboard teams across Europe and Asia to better understand their dynamics and the political decisions at stake in their creation. 

The dashboard craze can be traced back to Jan. 22, 2020, when graduate student Ensheng Dong and Lauren Gardner, co-director of Johns Hopkins University’s Center for Systems Science and Engineering, launched the JHU interactive Covid dashboard. It would quickly achieve international fame, and screenshots of it started popping up in newspapers and on TV. The dashboard now racks up billions of daily hits. Soon after, ESRI, the mapping software company whose tools the dashboard was built with, spun off a variety of Covid resources and example dashboards that are easy to customize and publish for those with a license. ESRI has provided about 5,000 organizations with a free license since the beginning of Covid.

That has generated unprecedented traffic: the most-viewed public dashboards made using ESRI are all Covid-related, according to the company. The Johns Hopkins dashboard is number one. It made its data feed available for free, and multiple other dashboards built by governments and even news outlets, including Bloomberg, now rely on Johns Hopkins to update their numbers.
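For dashboards that build on that feed, the basic pattern is simply to pull the public CSSE time-series CSVs from GitHub and re-aggregate them. A minimal sketch with pandas is below; the URL and column layout (four metadata columns followed by one column per date) are assumed from the repository’s published format and should be verified before use.

```python
import pandas as pd

# Public JHU CSSE time series of confirmed cases (one row per province/country,
# one column per date). URL and layout assumed; check against the repository.
URL = ("https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/"
       "csse_covid_19_data/csse_covid_19_time_series/"
       "time_series_covid19_confirmed_global.csv")

df = pd.read_csv(URL)

# Collapse provinces into country totals and keep only the date columns.
date_cols = df.columns[4:]                  # first four columns are metadata
by_country = df.groupby("Country/Region")[date_cols].sum()

# Daily new cases for one country, ready to chart on a downstream dashboard.
new_cases = by_country.loc["Greece"].diff().clip(lower=0)
print(new_cases.tail())
```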

Public Health England’s dashboard is designed and hand-coded from scratch. But because of the pandemic’s urgency, many government agencies that lacked expertise in data analysis and visualization turned to off-the-shelf business analytics software to build their dashboards. These include ESRI’s products as well as Tableau and Microsoft Power BI.

The pros? They provide ready-to-use templates and modules, require no programming knowledge, are fast and easy to publish, and give users a technical lifeline. The cons? They allow little custom design, can look clunky and cluttered, provide little room to explain the data and are rarely mobile-friendly. Also, many don’t provide multi-language support or accessibility features, and some don’t let users access the raw data that powers the tool.

Dashboards everywhere
A compilation of government dashboards….(More)”.