Stefaan Verhulst
The Economist: “The global downturn of 2020 is probably the most quantified on record. Economists, firms and statisticians seeking to gauge the depth of the collapse in economic activity and the pace of the recovery have seized upon a new dashboard of previously obscure indicators. Investors eagerly await the release of mobility statistics from tech companies such as Apple or Google, or restaurant-booking data from OpenTable, in a manner once reserved for official inflation and unemployment estimates. Central bankers pepper their speeches with novel barometers of consumer spending. Investment-bank analysts and journalists tout hot new measures of economic activity in the way that hipsters discuss the latest bands. Those who prefer to wait for official measures are regarded as being like fans of U2, a sanctimonious Irish rock group: stuck behind the curve as the rest of the world has moved on.
The main attraction of real-time data to policymakers and investors alike is timeliness. Whereas official, so-called hard data, such as inflation, employment or output measures, tend to be released with a lag of several weeks, or even months, real-time data, as the name suggests, can offer a window on today’s economic conditions. The depth of the downturns induced by COVID-19 has put a premium on swift intelligence. The case for hard data has always been their quality, but this has suffered greatly during the pandemic. Compilers of official labour-market figures have struggled to account for furlough schemes and the like, and have plastered their releases with warnings about unusually high levels of uncertainty. Filling in statisticians’ forms has probably fallen to the bottom of firms’ to-do lists, reducing the accuracy of official output measures….
The value of real-time measures will be tested once the swings in economic activity approach a more normal magnitude. Mobility figures for March and April did predict the scale of the collapse in GDP, but that could have been estimated just as easily by stepping outside and looking around (at least in the places where that sort of thing was allowed during lockdown). Forecasters in rich countries are more used to quibbling over whether economies will grow at an annual rate of 2% or 3% than whether output will shrink by 20% or 30% in a quarter. Real-time measures have disappointed before. Immediately after Britain’s vote to leave the European Union in 2016, for instance, the indicators then watched by economists pointed to a sharp slowdown. It never came.
Real-time data, when used with care, have been a helpful supplement to official measures so far this year. With any luck the best of the new indicators will help official statisticians improve the quality and timeliness of their own figures. But, much like U2, the official measures have been around for a long time thanks to their tried and tested formula—and they are likely to stick around for a long time to come….(More)”.
Blog by Alexandra Shaw, Andrew J. Zahuranec, Andrew Young, Stefaan G. Verhulst, Jennifer Requejo, Liliana Carvajal: “Adolescence is a unique stage of life. The brain undergoes rapid development; individuals face new experiences, relationships, and environments. These events can be exciting, but they can also be a source of instability and hardship. Half of all mental health conditions manifest by early adolescence, and between 10 and 20 percent of all children and adolescents report mental health conditions. Despite the increased risks and concerns for adolescents’ well-being, there remain significant gaps in the availability of data at the country level for policymakers to address these issues.
In June, The GovLab partnered with colleagues at UNICEF’s Health and HIV team in the Division of Data, Analysis, Planning & Monitoring and the Data for Children Collaborative — a collaboration between UNICEF, the Scottish Government, and the University of Edinburgh — to design and apply a new methodology of participatory mapping and prioritization of key topics and issues associated with adolescent mental health that could be addressed through enhanced data collaboration….
The event led to three main takeaways. First, the topic mapping allows participants to deliberate and prioritize the various aspects of adolescent mental health in a more holistic manner. Unlike the “blind men and the elephant” parable, a topic map allows the participants to see and discuss the interrelated parts of the topic, including those which they might be less familiar with.
Second, the workshops demonstrated the importance of tapping into distributed expertise via participatory processes. While the topic map provided a starting point, the inclusion of various experts allowed the findings of the document to be reviewed in a rapid, legitimate fashion. The diverse inputs helped ensure the individual aspects could be prioritized without a perspective being ignored.
Lastly, the approach showed the importance of data initiatives being driven and validated by those individuals representing the demand. By soliciting the input of those who would actually use the data, the methodology ensured data initiatives focused on the aspects thought to be most relevant and of greatest importance….(More)”
Paper by Eric Windholz: “Emergencies require governments to govern differently. In Australia, the changes wrought by the COVID-19 pandemic have been profound. The role of lawmaker has been assumed by the executive exercising broad emergency powers. Parliaments, and the debate and scrutiny they provide, have been marginalised. The COVID-19 response also has seen the medical-scientific expert metamorphose from decision-making input into decision-maker. Extensive legislative and executive decision-making authority has been delegated to them – directly in some jurisdictions; indirectly in others. Severe restrictions on an individual’s freedom of movement, association and to earn a livelihood have been declared by them, or on their advice. Employing the analytical lens of regulatory legitimacy, this article examines and seeks to understand this shift from parliamentary sovereignty to autocratic technocracy. How has it occurred? Why has it occurred? What have been the consequences and risks of vesting significant legislative and executive power in the hands of medical-scientific experts; what might be its implications? The article concludes by distilling insights to inform the future design and deployment of public health emergency powers….(More)”.
Centre for Data Ethics and Innovation: “Data sharing is fundamental to effective government and the running of public services. But it is not an end in itself. Data needs to be shared to drive improvements in service delivery and benefit citizens. For this to happen sustainably and effectively, public trust in the way data is shared and used is vital. Without such trust, the government and wider public sector risks losing society’s consent, setting back innovation as well as the smooth running of public services. Maximising the benefits of data driven technology therefore requires a solid foundation of societal approval.
AI and data driven technology offer extraordinary potential to improve decision making and service delivery in the public sector – from improved diagnostics to more efficient infrastructure and personalised public services. This makes effective use of data more important than it has ever been, and requires a step-change in the way data is shared and used. Yet sharing more data also poses risks and challenges to current governance arrangements.
The only way to build trust sustainably is to operate in a trustworthy way. Without adequate safeguards the collection and use of personal data risks changing power relationships between the citizen and the state. Insights derived from big data and the matching of different data sets can also undermine individual privacy or personal autonomy. Trade-offs are required which reflect democratic values, wider public acceptability and a shared vision of a data driven society. CDEI has a key role to play in exploring this challenge and setting out how it can be addressed. This report identifies barriers to data sharing, but focuses on building and sustaining the public trust which is vital if society is to maximise the benefits of data driven technology.
There are many areas where the sharing of anonymised and identifiable personal data by the public sector already improves services, prevents harm, and benefits the public. Over the last 20 years, different governments have adopted various measures to increase data sharing, including creating new legal sharing gateways. However, despite efforts to increase the amount of data sharing across the government, and significant successes in areas like open data, data sharing continues to be challenging and resource-intensive. This report identifies a range of technical, legal and cultural barriers that can inhibit data sharing.
Barriers to data sharing in the public sector
Technical barriers include limited adoption of common data standards and inconsistent security requirements across the public sector. Such inconsistency can prevent data sharing, or increase the cost and time for organisations to finalise data sharing agreements.
While there are often pre-existing legal gateways for data sharing, underpinned by data protection legislation, there is still a large amount of legal confusion on the part of public sector bodies wishing to share data, which can cause them to start from scratch when determining legality and to commit significant resources to legal advice. It is not unusual for the development of data sharing agreements to delay the projects for which the data is intended. While the legal scrutiny of data sharing arrangements is an important part of governance, improving the efficiency of these processes – without sacrificing their rigour – would allow data to be shared more quickly and at less expense.
Even when data sharing is legal, the permissive nature of many legal gateways means significant cultural and organisational barriers remain. Individual departments and agencies decide whether or not to share the data they hold and may be overly risk averse. Data sharing may not be prioritised by a department if it would require them to bear costs to deliver benefits that accrue elsewhere (i.e. to those gaining access to the data). Departments sharing data may need to invest significant resources to do so, as well as to consider potential reputational or legal risks. This may hold up progress towards finding common agreement on data sharing. In the absence of incentives, even relatively small obstacles may mean data sharing is not deemed worthwhile by those who hold the data – despite the fact that other parts of the public sector might benefit significantly….(More)”.
Philip Oltermann at the Guardian: “German scientists are planning to equip 4,000 pop music fans with tracking gadgets and bottles of fluorescent disinfectant to get a clearer picture of how Covid-19 could be prevented from spreading at large indoor concerts.
As cultural mass gatherings across the world remain on hold for the foreseeable future, researchers in eastern Germany are recruiting volunteers for a “coronavirus experiment” with the singer-songwriter Tim Bendzko, to be held at an indoor stadium in the city of Leipzig on 22 August.
Participants, aged between 18 and 50, will wear matchstick-sized “contact tracer” devices on chains around their necks that transmit a signal at five-second intervals and collect data on each person’s movements and proximity to other members of the audience.
Inside the venue, they will also be asked to disinfect their hands with a fluorescent hand-sanitiser – designed not just to add a layer of protection but also to allow scientists to scour the venue with UV lights after the concerts to identify surfaces where transmission of the virus through smear infection is most likely to take place.
Vapours from a fog machine will help visualise the possible spread of coronavirus via aerosols, which the scientists will try to predict via computer-generated models in advance of the event.
The €990,000 cost of the Restart-19 project will be shared between the federal states of Saxony and Saxony-Anhalt. The project’s organisers say the aim is to “identify a framework” for how larger cultural and sports events could be held “without posing a danger for the population” after 30 September….
To stop the Leipzig experiment from becoming the source of a new outbreak, signed-up volunteers will be sent a DIY test kit and have a swab at a doctor’s practice or laboratory 48 hours before the concert starts. Those who cannot show proof of a negative test at the door will be denied entry….(More)”.
Gov.UK: “The document provides guidance and advice to help policy officials follow open government principles when carrying out their work…
The Playbook has been developed in response to the growing demand from policymakers, communications and digital professionals to integrate the principles of open government into their roles. The content of the Playbook was drafted using existing resources (see the ‘further reading’ section) and was reviewed in consultation with open government experts from Involve and the Open Government Partnership….(More)”.
Paper by Albert Meijer & Marcel Thaens: “The positive features of innovation are well known but the dark side of public innovation has received less attention. To fill this gap, this article develops a theoretical understanding of the dark side of public innovation. We explore a diversity of perverse effects on the basis of a literature review and an expert consultation. We indicate that these perverse effects can be categorized on two dimensions: low public value and low public control. We confront this exploratory analysis with the literature and conclude that the perverse effects are not coincidental but emerge from key properties of innovation processes such as creating niches for innovation and accepting uncertainty about public value outcomes. To limit perverse effects, we call for the dynamic assessment of public innovation. The challenge for innovators is to acknowledge the dark side and take measures to prevent perverse effects without killing the innovativeness of organizations…(More)”.
Kathy Peach at The Conversation: “It should have been artificial intelligence’s moment in the sun. With billions of dollars of investment in recent years, AI has been touted as a solution to every conceivable problem. So when the COVID-19 pandemic arrived, a multitude of AI models were immediately put to work.
Some hunted for new compounds that could be used to develop a vaccine, or attempted to improve diagnosis. Some tracked the evolution of the disease, or generated predictions for patient outcomes. Some modelled the number of cases expected given different policy choices, or tracked similarities and differences between regions.
The results, to date, have been largely disappointing. Very few of these projects have had any operational impact – hardly living up to the hype or the billions in investment. At the same time, the pandemic highlighted the fragility of many AI models. From entertainment recommendation systems to fraud detection and inventory management – the crisis has seen AI systems go awry as they struggled to adapt to sudden collective shifts in behaviour.
The unlikely hero
The unlikely hero emerging from the ashes of this pandemic is instead the crowd. Crowds of scientists around the world sharing data and insights faster than ever before. Crowds of local makers manufacturing PPE for hospitals failed by supply chains. Crowds of ordinary people organising through mutual aid groups to look after each other.
COVID-19 has reminded us of just how quickly humans can adapt existing knowledge, skills and behaviours to entirely new situations – something that highly specialised AI systems just can’t do. At least not yet….
In one of the experiments, researchers from the Istituto di Scienze e Tecnologie della Cognizione in Rome studied the use of an AI system designed to reduce social biases in collective decision-making. The AI, which held back information from the group members on what others thought early on, encouraged participants to spend more time evaluating the options by themselves.
The system succeeded in reducing the tendency of people to “follow the herd”, where groups fail to hear diverse or minority views or to challenge assumptions – all criticisms that have been levelled at the British government’s scientific advisory committees throughout the pandemic…(More)”
Paper by Kaustav Bhattacharjee, Min Chen, and Aritra Dasgupta: “Preservation of data privacy and protection of sensitive information from potential adversaries constitute a key socio‐technical challenge in the modern era of ubiquitous digital transformation. Addressing this challenge needs analysis of multiple factors: algorithmic choices for balancing privacy and loss of utility, potential attack scenarios that can be undertaken by adversaries, implications for data owners, data subjects, and data sharing policies, and access control mechanisms that need to be built into interactive data interfaces.
Visualization has a key role to play as part of the solution space, both as a medium of privacy‐aware information communication and also as a tool for understanding the link between privacy parameters and data sharing policies. The field of privacy‐preserving data visualization has witnessed progress along many of these dimensions. In this state‐of‐the‐art report, our goal is to provide a systematic analysis of the approaches, methods, and techniques used for handling data privacy in visualization. We also reflect on the road‐map ahead by analyzing the gaps and research opportunities for solving some of the pressing socio‐technical challenges involving data privacy with the help of visualization….(More)”.
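The paper's "privacy parameters" balancing privacy against loss of utility are easiest to see with a concrete formalism. One standard choice (our illustration; the paper itself is not limited to it) is differential privacy's Laplace mechanism, where a count is released with noise of scale 1/ε: smaller ε means stronger privacy but a noisier, less useful number. A minimal Python sketch, with the count of 120 and ε = 1.0 invented for the example:

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw one sample from a zero-mean Laplace distribution via inverse CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, rng):
    """Release a count with Laplace noise calibrated to sensitivity 1."""
    return true_count + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(0)
releases = [private_count(120, epsilon=1.0, rng=rng) for _ in range(5000)]
avg = sum(releases) / len(releases)
# The noise is zero-mean, so the average of many releases stays near 120,
# while any single release may be off by a few units (scale = 1/epsilon).
```

A visualization built on such releases would chart the noisy values, which is one way the "link between privacy parameters and data sharing policies" becomes visible: raising ε sharpens the chart and weakens the privacy guarantee.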
Paper by Christopher Loynes, Jamal Ouenniche & Johannes De Smedt: “This paper provides the humanitarian community with an automated tool that can detect a disaster using tweets posted on Twitter, alongside a portal to identify local and regional Non-Governmental Organisations (NGOs) that are best-positioned to provide support to people adversely affected by a disaster. The proposed disaster detection tool uses a linear Support Vector Classifier (SVC) to detect man-made and natural disasters, and a density-based spatial clustering of applications with noise (DBSCAN) algorithm to accurately estimate a disaster’s geographic location. This paper provides two original contributions. The first is combining the automated disaster detection tool with the prototype portal for NGO identification. This unique combination could help reduce the time taken to raise awareness of the disaster detected, improve the coordination of aid, increase the amount of aid delivered as a percentage of initial donations and improve aid effectiveness. The second contribution is a general framework that categorises the different approaches that can be adopted for disaster detection. Furthermore, this paper uses responses obtained from an on-the-ground survey with NGOs in the disaster-hit region of Uttar Pradesh, India, to provide actionable insights into how the portal can be developed further…(More)”.
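The paper pairs a linear SVC (for classifying tweets as disaster-related) with DBSCAN (for estimating where the disaster is from the tweets' coordinates). As an illustration of the clustering step only, here is a minimal pure-Python DBSCAN over (latitude, longitude) pairs. It is a sketch, not the authors' implementation: it uses Euclidean distance in degrees for simplicity where a production system would use haversine distance, and the coordinates and thresholds below are invented for the example.

```python
import math

def dbscan(points, eps, min_samples):
    """Minimal DBSCAN: label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)

    def neighbours(i):
        # All points within eps of point i (including i itself).
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_samples:
            labels[i] = -1          # provisionally noise
            continue
        cluster += 1                # i is a core point: start a new cluster
        labels[i] = cluster
        queue = list(seeds)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_samples:
                queue.extend(nb)     # j is also core: keep expanding
    return labels

# Geotagged "disaster" tweets: two dense areas plus one stray report.
coords = [(26.85, 80.95), (26.86, 80.94), (26.85, 80.94),   # near Lucknow
          (25.32, 82.97), (25.33, 82.98), (25.32, 82.98),   # near Varanasi
          (28.00, 79.00)]                                    # isolated -> noise
labels = dbscan(coords, eps=0.05, min_samples=2)
```

Each dense group of tweets ends up in its own cluster, whose centroid gives the estimated disaster location, while the isolated tweet is labelled noise (-1) rather than distorting the estimate.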