Global citizens’ assembly to be chosen for UN climate talks


Article by Fiona Harvey: “One hundred people from around the world are to take part in a citizens’ assembly to discuss the climate crisis over the next month, before presenting their findings at the UN Cop26 climate summit.

The Global Citizens’ Assembly will be representative of the world’s population, and will invite people chosen by lottery to take part in online discussions that will culminate in November, during the fortnight-long climate talks that open in Glasgow on 31 October.

Funded with nearly $1m from sources including the Scottish government and the European Climate Foundation, the assembly is supported by the UN and UK and run by a coalition of more than 100 organisations…

A team of international scientists and other experts will explain details of the climate crisis and potential solutions, and members of the assembly will discuss how these might work in practice, seeking to answer the question: “How can humanity address the climate and ecological crisis in a fair and effective way?” The key messages from their discussions will be presented at Cop26 and a report will be published in March.

Alok Sharma, the UK cabinet minister who will act as president of the Cop26 summit, said: “The Global Assembly is a fantastic initiative and was selected for representation in the green zone [of the Cop26 presentation hall] because we recognise just how important its work is and also because we are committed to bringing the voice of global citizens into the heart of Cop26. It creates that vital link between local conversation and a global conference.”…(More)”.

Pandora Papers & Data Journalism: how investigative journalists use tech


Article at Moonshot: “The Pandora Papers’ 11.9 million records arrived from 14 different offshore services firms; the 2.94-terabyte data trove exposes the offshore secrets of wealthy elites from more than 200 countries and territories.

It contains data for 330 politicians and public officials, from more than 90 countries and territories, including 35 current and former country leaders, as well as celebrities, fraudsters, drug dealers, royal family members and leaders of religious groups around the world.

It involved more than 600 journalists from 150 media outlets in 117 countries.

It took ICIJ more than a year to structure, research and analyze the data, which will be incorporated into the Offshore Leaks database: The task involved three main elements: journalists, technology and time….(More)”
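As a rough illustration of the “technology” leg of that triad: leaked records of this kind are typically reduced to linked tables of entities, officers and relationships before reporters can query them at scale. The sketch below is hypothetical (the file and column names are assumptions, not ICIJ’s actual schema), but it shows how such a join might look in pandas.

```python
import pandas as pd

# Hypothetical filenames and columns; ICIJ's real exports differ in detail.
officers = pd.read_csv("nodes-officers.csv")    # people and companies acting as officers
entities = pd.read_csv("nodes-entities.csv")    # offshore companies, trusts, foundations
edges = pd.read_csv("relationships.csv")        # links between the two sets of nodes

# Keep only "officer of" relationships, then attach names from both sides.
officer_of = edges[edges["rel_type"].str.contains("officer", case=False, na=False)]
linked = (
    officer_of
    .merge(officers, left_on="start_id", right_on="node_id")
    .merge(entities, left_on="end_id", right_on="node_id",
           suffixes=("_officer", "_entity"))
)

print(linked[["name_officer", "name_entity"]].head())
```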

Volunteers Sped Up Alzheimer’s Research


Article by SciStarter: “Across the United States, 5.7 million people are living with Alzheimer’s disease, the seventh leading cause of death in America. But there is still no treatment or cure. Alzheimer’s hits close to home for many of us who have seen loved ones suffer and who feel hopeless in the face of this disease. With Stall Catchers, an online citizen science project, joining the fight against Alzheimer’s is as easy as playing an online computer game…

Scientists at Cornell University found a link between “stalled” blood vessels in the brain and the symptoms of Alzheimer’s. These stalled vessels limit blood flow to the brain by up to 30 percent. In experiments with laboratory mice, when the blood cells causing the stalls were removed, the mice performed better on memory tests.

The researchers are working to develop Alzheimer’s treatments that remove the stalls in mice in the hope they can apply these methods to humans. But analyzing the brain images to find the stalled capillaries is hard and time consuming. It could take a trained laboratory technician six to 12 months to analyze each week’s worth of data collection.

So, Cornell researchers created Stall Catchers to make finding the stalled blood vessels into a game that anyone can play. The game relies on the power of the crowd — multiple confirmed answers — before determining whether a vessel is stalled or flowing…
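The article does not spell out the exact aggregation rule, but a minimal sketch of the idea (the vote threshold and agreement level below are illustrative assumptions, not Stall Catchers’ actual parameters) might look like this:

```python
from collections import Counter

def classify_vessel(votes, min_votes=5, agreement=0.8):
    """Aggregate volunteer votes ('stalled' or 'flowing') for one vessel.

    Returns the majority label once enough votes agree, otherwise None
    so the vessel stays in the queue for more players.
    """
    if len(votes) < min_votes:
        return None  # not enough answers yet
    counts = Counter(votes)
    label, top = counts.most_common(1)[0]
    if top / len(votes) >= agreement:
        return label  # confirmed by the crowd
    return None  # disagreement: keep collecting answers


# Example: five players agree the vessel is stalled
print(classify_vessel(["stalled", "stalled", "stalled", "stalled", "stalled"]))
```

Requiring agreement from several players is what lets a crowd of untrained volunteers stand in for a trained technician on each vessel.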

Since its inception in 2016, the project has grown steadily, analyzing various datasets and uncovering new insights about Alzheimer’s disease. Citizen scientists who play the game identify blood vessels as “flowing” or “stalled,” earning points for their classifications.

One way Stall Catchers makes this research fun is by allowing volunteers to form teams and engage in friendly competition…(More)”.

Putting data at the heart of policymaking will accelerate London’s recovery


Mel Hobson at Computer Weekly: “…London’s mayor, Sadiq Khan, knows how important this is. His re-election manifesto committed to rebuilding the London Datastore, currently home to over 700 freely available datasets, as the central register linking data across our city. That in turn will help analysts, researchers and policymakers understand our city and develop new ideas and solutions.

To help take the next step and create a data ecosystem that can improve millions of Londoners’ lives, businesses across our capital are committing their expertise and insights.

At London First, we have launched the London Data Charter, expertly put together by Pinsent Masons, setting out the guiding principles for private and public sector data collaborations, which are key to creating this ecosystem. These include a focus on protecting privacy and security of data, promoting trust and sharing learnings with others – creating scalable solutions to meet the capital’s challenges….(More)”.

World Wide Weird: Rise of the Cognitive Ecosystem


Braden R. Allenby at Issues: “Social media, artificial intelligence, the Internet of Things, and the data economy are coming together in a way that transcends how humans understand and control our world.

In the beginning of the movie 2001: A Space Odyssey, an ape, after hugging a strange monolith, picks up a bone and randomly begins playing with it … and then, as Richard Strauss’s Also sprach Zarathustra rings in the background, the ape realizes that the bone it is holding is, in fact, a weapon. The ape, the bone, and the landscape remain exactly the same, yet something fundamental has changed: an ape casually holding a bone is a very different system than an ape consciously wielding a weapon. The warrior ape is an emergent cognitive phenomenon, neither required nor deterministically produced by the constituent parts: a bone, and an ape, in a savannah environment.

Cognition as an emergent property of techno-human systems is not a new phenomenon. Indeed, it might be said that the ability of humans and their institutions to couple to their technologies to create such techno-human systems is the source of civilization itself. Since humans began producing artifacts, and especially since we began creating artifacts designed to capture, preserve, and transmit information—from illuminated manuscripts and Chinese oracle bones to books and computers—humans have integrated with their technologies to produce emergent cognitive results.

And these combinations have transformed the world. Think of the German peasants, newly literate, who were handed populist tracts produced on then-newfangled printing presses in 1530: the Reformation happened. Thanks to the printers, information and strategies flowed between the thinkers and the readers faster, uniting people across time and space. Eventually, the result was another fundamental shift in the cognitive structure: the Enlightenment happened.

In the 1980s, Edwin Hutchins found another cognitive structure when he observed a pre-GPS crew navigating on a naval vessel: technology in the form of devices, charts, and books was combined with several individuals with specialized skills and training to produce knowledge of the ship’s position (the “fix”). No single entity, human or technological, contained the entire process; rather, as Hutchins observed: “An interlocking set of partial procedures can produce the overall observed pattern without there being a representation of that overall pattern anywhere in the system.” The fix arises as an emergent cognitive product that is nowhere found in the constituent pieces, be they technology or human; indeed, Hutchins speaks of “the computational ecology of navigation tools.”

Fast forward to today. It should be no surprise that at some point techno-human cognitive systems such as social media, artificial intelligence (AI), the Internet of Things (IoT), 5G, cameras, computers, and sensors should begin to form their own ecology—significantly different in character from human cognition….(More)”

The Downside to State and Local Privacy Regulations


GovTech: “To fight back against cyber threats, state and local governments have started to implement tighter privacy regulations. But is this trend a good thing? Or do stricter rules present more challenges than they do solutions?

According to Daniel Castro, vice president of the Information Technology and Innovation Foundation, one of the main issues with stricter privacy regulations is having no centralized rules for states to follow.

“Probably the biggest problem is states setting up a set of contradictory overlapping rules across the country,” Castro said. “This creates a serious cost on organizations and businesses. They can abide by 50 state privacy laws, but there could be different regulations across local jurisdictions.”

One example of a hurdle for organizations and businesses is local jurisdictions creating specific rules for facial recognition and biometric technology.

“Let’s say a company starts selling a smart doorbell service; because of these rules, this service might not be able to be legally sold in one jurisdiction,” Castro said.

Another concern relates to the distinction between government data collection and commercial data collection, said Washington state Chief Privacy Officer Katy Ruckle. Sometimes there is a notion that one law can apply to everything, but different data types involve different types of rights for individuals.

“An example I like to use is somebody that’s been committed to a mental health institution for mental health needs,” Ruckle said. “Their data collection is very different from somebody buying a vacuum cleaner off Amazon.”

On the topic of governments collecting data, Castro emphasized the importance of knowing how data will be utilized in order to set appropriate privacy regulations….(More)”

Greece used AI to curb COVID: what other nations can learn


Editorial at Nature: “A few months into the COVID-19 pandemic, operations researcher Kimon Drakopoulos e-mailed both the Greek prime minister and the head of the country’s COVID-19 scientific task force to ask if they needed any extra advice.

Drakopoulos works in data science at the University of Southern California in Los Angeles, and is originally from Greece. To his surprise, he received a reply from Prime Minister Kyriakos Mitsotakis within hours. The European Union was asking member states, many of which had implemented widespread lockdowns in March, to allow non-essential travel to recommence from July 2020, and the Greek government needed help in deciding when and how to reopen borders.

Greece, like many other countries, lacked the capacity to test all travellers, particularly those not displaying symptoms. One option was to test a sample of visitors, but Greece opted to trial an approach rooted in artificial intelligence (AI).

Between August and November 2020 — with input from Drakopoulos and his colleagues — the authorities launched a system that uses a machine-learning algorithm to determine which travellers entering the country should be tested for COVID-19. The researchers found machine learning to be more effective at identifying asymptomatic people than random testing or testing based on a traveller’s country of origin. According to their analysis, during the peak tourist season, the system detected two to four times more infected travellers than did random testing.

The machine-learning system, which is among the first of its kind, is called Eva and is described in Nature this week (H. Bastani et al. Nature https://doi.org/10.1038/s41586-021-04014-z; 2021). It’s an example of how data analysis can contribute to effective COVID-19 policies. But it also presents challenges, from ensuring that individuals’ privacy is protected to the need to independently verify its accuracy. Moreover, Eva is a reminder of why proposals for a pandemic treaty (see Nature 594, 8; 2021) must consider rules and protocols on the proper use of AI and big data. These need to be drawn up in advance so that such analyses can be used quickly and safely in an emergency.

In many countries, travellers are chosen for COVID-19 testing at random or according to risk categories. For example, a person coming from a region with a high rate of infections might be prioritized for testing over someone travelling from a region with a lower rate.

By contrast, Eva collected not only travel history, but also demographic data such as age and sex from the passenger information forms required for entry to Greece. It then matched those characteristics with data from previously tested passengers and used the results to estimate an individual’s risk of infection. COVID-19 tests were targeted to travellers calculated to be at highest risk. The algorithm also issued tests to allow it to fill data gaps, ensuring that it remained up to date as the situation unfolded.
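A toy sketch of those two ingredients (risk estimates built from past test results, plus extra tests where data are thin) is below. The grouping of travellers into coarse “types” and the UCB-style scoring rule are illustrative assumptions, not the published Eva algorithm.

```python
import math
from collections import defaultdict

tests = defaultdict(int)      # tests performed per traveller type
positives = defaultdict(int)  # positives found per traveller type

def score(traveller_type, total_tests):
    """Estimated infection risk plus an exploration bonus for rarely tested types."""
    n = tests[traveller_type]
    if n == 0:
        return float("inf")   # untested types get sampled first
    risk = positives[traveller_type] / n
    exploration = math.sqrt(2 * math.log(max(total_tests, 2)) / n)
    return risk + exploration

def allocate(arrivals, budget):
    """Choose which arriving travellers to test, given a daily test budget."""
    total = sum(tests.values())
    ranked = sorted(arrivals, key=lambda t: score(t, total), reverse=True)
    return ranked[:budget]

# Example day: three tests available for five arrivals, each described only by
# a coarse type (origin country, age band, sex) taken from the entry form.
arrivals = ["GR-20s-F", "DE-60s-M", "DE-60s-M", "IT-30s-M", "GR-20s-F"]
print(allocate(arrivals, budget=3))
```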

During the pandemic, there has been no shortage of ideas on how to deploy big data and AI to improve public health or assess the pandemic’s economic impact. However, relatively few of these ideas have made it into practice. This is partly because companies and governments that hold relevant data — such as mobile-phone records or details of financial transactions — need agreed systems to be in place before they can share the data with researchers. It’s also not clear how consent can be obtained to use such personal data, or how to ensure that these data are stored safely and securely…(More)”.

The Rise of the Pandemic Dashboard


Article by Marie Patino: “…All of these dashboards were launched very early in the pandemic,” said Damir Ivankovic, a PhD student at the University of Amsterdam. “Some of them were developed literally overnight, or over three sleepless nights in certain countries.” With Ph.D. researcher Erica Barbazza, Ivankovic has been leading a set of studies about Covid-19 dashboards with a network of researchers. For an upcoming paper that’s still unpublished, the pair have talked to more than 30 government dashboard teams across Europe and Asia to better understand their dynamics and the political decisions at stake in their creation. 

The dashboard craze can be traced back to Jan. 22, 2020, when graduate student Ensheng Dong, and Lauren Gardner, co-director of Johns Hopkins University’s Center for Systems Science and Engineering, launched the JHU interactive Covid dashboard. It would quickly achieve international fame, and screenshots of it started popping up in newspapers and on TV. The dashboard now racks up billions of daily hits. Soon after, cartography software company ESRI, through which the tool was made, spun off a variety of Covid resources and example dashboards, easy to customize and publish for those with a license. ESRI has provided about 5,000 organizations with a free license since the beginning of Covid.

That’s generated unprecedented traffic: The most-viewed public dashboards made using ESRI are all Covid-related, according to the company. The Johns Hopkins dash is number one. It made its data feed available for free, and now multiple other dashboards built by government and even news outlets, including Bloomberg, rely on Johns Hopkins to update their numbers. 
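That open feed is one reason re-use was so easy. As a minimal sketch (assuming the widely used CSSEGISandData/COVID-19 GitHub repository and its standard column names), a downstream dashboard could pull the global time series with a few lines of pandas:

```python
import pandas as pd

# Public CSV feed of cumulative confirmed cases published by the JHU CSSE team.
URL = ("https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/"
       "csse_covid_19_data/csse_covid_19_time_series/"
       "time_series_covid19_confirmed_global.csv")

cases = pd.read_csv(URL)

# Each row is a country/province; the date columns hold cumulative confirmed cases.
totals = cases.drop(columns=["Province/State", "Lat", "Long"]) \
              .groupby("Country/Region").sum()

# Ten countries with the most confirmed cases as of the latest date in the feed.
print(totals.iloc[:, -1].sort_values(ascending=False).head(10))
```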

Public Health England’s dashboard is designed and hand-coded from scratch. But because of the pandemic’s urgency, many government agencies that lacked expertise in data analysis and visualization turned to off-the-shelf business analytics software to build their dashboards. These include ESRI, as well as Tableau and Microsoft Power BI.

The pros? They provide ready-to-use templates and modules, don’t necessitate programming knowledge, are fast and easy to publish and provide users with a technical lifeline. The cons? They offer little design flexibility, can look clunky and cluttered, provide little wiggle room in terms of explaining the data and are rarely mobile-friendly. Also, many don’t provide multi-language support or accessibility features, and some don’t enable users to access the raw data that powers the tool. 

Dashboards everywhere
A compilation of government dashboards….(More)”.

Goldman Sachs will soon launch its own version of LinkedIn


Sarah Butcher at EFC: “Sometime soon, it will happen. After two years of construction, Goldman Sachs is expected to launch its own version of LinkedIn – first at Goldman, and then into the world at large. 

Known as Louisa, the platform was conceived by Rohan Doctor, a former head of bank solutions sales at Goldman Sachs in Hong Kong. Doctor submitted his idea for a kind of “internal LinkedIn network” to Accelerate, Goldman Sachs’ internal incubator program, in 2019. He’s been building it from New York ever since. It’s thought to be ready soon.

Neither Doctor nor Goldman Sachs would comment for this article, but based on statements Doctor has made on his LinkedIn profile and recent job advertisements for members of his team, Louisa is a “collective intelligence platform” that will enable Goldman staff to connect with each other and to share information in a more meaningful and intuitive way. In doing so, it’s hoped that Goldman will be able to improve knowledge transfer within the firm and that Goldman people will be able to serve clients better as a result.

Goldman has built Louisa around artificial intelligence. When an employee asks Louisa a question, the platform uses natural language processing (NLP) techniques like named entity recognition, language modelling and query parsing to understand the kind of information that’s being sought. Data from user interactions is then used to build user preference feedback loops and user representation models that can target content to particular users and suggest topics. Network analysis is used to identify how users are engaging with each other, to suggest other users or groups of users to engage with, and to look at how Louisa’s features are being used by particular user clusters…(More)”.
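None of Goldman’s code is public, but the query-understanding step described above resembles a standard named-entity-recognition pass. Below is a generic, hedged sketch using the open-source spaCy library (chosen purely for illustration; it is an assumption, not Louisa’s actual stack):

```python
import spacy

# Requires: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# An employee question of the kind such a platform might receive.
query = "Who covered the Adidas convertible bond issuance in EMEA last quarter?"
doc = nlp(query)

# Named entities (organisations, places, dates) can become search facets,
# while the remaining content words can feed a ranking model over documents.
facets = {ent.label_: ent.text for ent in doc.ents}
keywords = [t.lemma_ for t in doc if t.is_alpha and not t.is_stop]

print(facets)    # e.g. {'ORG': 'Adidas', 'LOC': 'EMEA', 'DATE': 'last quarter'}
print(keywords)
```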

Expertise, ‘Publics’ and the Construction of Government Policy


Introduction to Special Issue of Discover Society about the role of expertise and professional knowledge in democracy by John Holmwood: “In the UK, the vexed nature of the issue was, perhaps, best illustrated by (then Justice Secretary) Michael Gove’s comment during the Brexit campaign that he thought, “the people of this country have had enough of experts.” The comment is oft cited, and derided, especially in the context of the Covid-19 pandemic, where the public has, or so it is argued, found a new respect for a science that can guide public policy and deliver solutions.

Yet, Michael Gove’s point was more nuanced than is usually credited. It wasn’t scientific advice that he claimed people were fed up with, but “experts with organisations with acronyms saying that they know what is best and getting it consistently wrong.” In other words, his complaint was about specific organised advocacy groups and their intervention in public debate and reporting in the media.

Michael Gove’s extended comment was disingenuous. After all, the Brexit campaign, no less than the Remain campaign, drew upon arguments from think tanks and lobby groups. Moreover, since the referendum, the Government has consistently mobilised the claimed expert opinion of organisations in justification of their policies. Indeed, as Layla Aitlhadj and John Holmwood in this special issue argue, they have deliberately ‘managed’ civil society groups and supposedly independent reviews, such as that currently underway into the Prevent counter extremism policy.

In fact, there is nothing straightforward about the relationship between expertise and democracy as Stephen Turner (2003) has observed. The development of liberal democracy involves the rise of professional and expert knowledge which underpins the everyday governance of public institutions. At the same time, wider publics are asked to trust that knowledge even where it impinges directly upon their preferences; they are not in a position to evaluate it, except through the mediation of other experts. Elected politicians and governments, in turn, are dependent on expert knowledge to guide their policy choices, which are duly constrained by what is possible on the basis of technical judgements….(More)”