Public Sector Tech: New tools for the new normal


Special issue by ZDNet exploring “how new technologies like AI, cloud, drones, and 5G are helping government agencies, public organizations, and private companies respond to the events of today and tomorrow…(More)”.

Quantified Storytelling: A Narrative Analysis of Metrics on Social Media


Book by Alex Georgakopoulou, Stefan Iversen and Carsten Stage: “This book interrogates the role of quantification in stories on social media: how do visible numbers (e.g. of views, shares, likes) and invisible algorithmic measurements shape the stories we post and engage with? The links of quantification with stories have not been explored sufficiently in storytelling research or in social media studies, despite the fact that platforms have been integrating sophisticated metrics into developing facilities for sharing stories, with a massive appeal to ordinary users, influencers and businesses alike.

With case-studies from Instagram, Reddit and Snapchat, the authors show how three types of metrics, namely content metrics, interface metrics and algorithmic metrics, affect the ways in which cancer patients share their experiences, the circulation of specific stories that mobilize counter-publics and the design of stories as facilities on platforms. The analyses document how numbers structure elements in stories, indicate and produce engagement and become resources for the tellers’ self-presentation….(More)”.

Improving data access democratizes and diversifies science


Research article by Abhishek Nagaraj, Esther Shears, and Mathijs de Vaan: “Data access is critical to empirical research, but past work on open access is largely restricted to the life sciences and has not directly analyzed the impact of data access restrictions. We analyze the impact of improved data access on the quantity, quality, and diversity of scientific research. We focus on the effects of a shift in the accessibility of satellite imagery data from Landsat, a NASA program that provides valuable remote-sensing data. Our results suggest that improved access to scientific data can lead to a large increase in the quantity and quality of scientific research. Further, better data access disproportionately enables the entry of scientists with fewer resources, and it promotes diversity of scientific research….(More)”

Research 4.0: research in the age of automation


Report by Rob Procter, Ben Glover, and Elliot Jones: “There is a growing consensus that we are at the start of a fourth industrial revolution, driven by developments in Artificial Intelligence, machine learning, robotics, the Internet of Things, 3-D printing, nanotechnology, biotechnology, 5G, new forms of energy storage and quantum computing. This report seeks to understand what impact AI is having on the UK’s research sector and what implications it has for its future, with a particular focus on academic research.

Building on our interim report, we find that AI is increasingly deployed in academic research in the UK in a broad range of disciplines. The combination of an explosion of new digital data sources with powerful new analytical tools represents a ‘double dividend’ for researchers. This is allowing researchers to investigate questions that would have been unanswerable just a decade ago. Whilst there has been considerable take-up of AI in academic research, the report highlights that steps could be taken to ensure even wider adoption of these new techniques and technologies, including wider training in the necessary skills for effective utilisation of AI, faster routes to culture change and greater multi-disciplinary collaboration.

This report recognises that the Covid-19 pandemic means universities are currently facing significant pressures, with considerable demands on their resources whilst simultaneously facing threats to income. But as we emerge from the current crisis, we urge policy makers and universities to consider the report’s recommendations and take steps to fortify the UK’s position as a place of world-leading research. Indeed, the current crisis has only reminded us of the critical importance of a highly functioning and flourishing research sector. The report recommends:

The current post-16 curriculum should be reviewed to ensure all pupils receive a grounding in the basic digital, quantitative and ethical skills necessary for the effective and appropriate utilisation of AI.

A UK-wide audit of research computing and data infrastructure provision should be conducted to consider how access might be levelled up.

UK Research and Innovation (UKRI) should consider incentivising institutions to utilise AI wherever it can offer benefits to the economy and society in their future spending on research and development.

Universities should take steps to ensure that it is easier for researchers to move between academia and industry, for example, by putting less emphasis on publications and recognising other outputs and measures of achievement when hiring for academic posts….(More)”.

Reimagining Help


Guide by Nesta: “Now more than ever, there is a need to help people live well in their homes and communities. The coronavirus pandemic has highlighted the importance of diversifying sources of help beyond the hospital, and of drawing on support from friends, neighbours, local organisations and charities to ensure people can live healthy lives. We must think more flexibly about what ‘help’ means, and how the right help can make a huge difference.

While medical care is fundamental to saving lives, people need more than a ‘fix’ to live well every day. If we are to support people to reach their goals, we must move away from ʻexpertsʼ holding the knowledge and power, and instead draw on people’s own knowledge, relationships, strengths and purpose to determine solutions that work best for them.

We believe there is an opportunity to ‘reimagine help’ by applying insights from the field of behaviour change research to a wide range of organisations and places – community facilities, local charities and businesses, employment and housing support, as well as health and care services, all of which play a role in supporting people to reach their goals in a way that feels right for them….

Nesta, Macmillan Cancer Support, the British Heart Foundation and the UCL Centre for Behaviour Change have worked together to develop a universal model of ‘Good Help’ underpinned by behavioural evidence, which can be understood and accessed by everyone. We analysed and simplified decades of behaviour change research and practice, and worked with a group of 30 practitioners and people with lived experience to iterate and cross-check the behavioural evidence against real life experiences. Dartington Service Design Lab helped to structure and format the evidence in a way that makes it easy for everyone to understand.

Collectively, we have produced a guide outlining eight characteristics of Good Help, which aims to support practitioners, system leaders (such as service managers, charity directors or commissioners) and any person working in a direct ‘helping’ organisation to:

  • Understand the behaviour change evidence that underpins Good Help
  • Develop new ideas or adapt offers of Good Help, which can be tested out in their own organisations or local communities….(More)”.

Digital Minilateralism: How governments cooperate on digital governance


A policy paper by Tanya Filer and Antonio Weiss: “New research from the Digital State Project argues for the critical function of small, agile, digitally enabled and focused networks of leaders to foster strong international cooperation on digital governance issues.

This type of cooperative working, described as ‘digital minilateralism’, has a role to play in shaping how individual governments learn, adopt and govern the use of new and emerging technologies, and how they create common or aligned policies. It is also important as cross-border digital infrastructure and services become increasingly common….

Key findings: 

  • Already beginning to prove effective, digital minilateralism has a role to play in shaping how individual governments learn, adopt and govern the use of new and emerging technologies, and how they create common or aligned policy.
  • National governments should recognise and reinforce the strategic value of digital minilaterals without stamping out, through over-bureaucratisation, the qualities of trust, open conversation, and ad-hocness in which their value lies.
  • As digital minilateral networks grow and mature, they will need to find mechanisms through which to retain (or adapt) their core principles while scaling across more boundaries.
  • To demonstrate their value to the global community, digital minilaterals must feed into formal multilateral conversations and arrangements. …(More)”.

Sortition, its advocates and its critics: An empirical analysis of citizens’ and MPs’ support for random selection as a democratic reform proposal


Paper by Vincent Jacquet et al: “This article explores the prospects of an increasingly debated democratic reform: assigning political offices by lot. While this idea is advocated by political theorists and politicians in favour of participatory and deliberative democracy, the article investigates the extent to which citizens and MPs actually endorse different variants of ‘sortition’. We test for differences among respondents’ social status, disaffection with elections and political ideology. Our findings suggest that MPs are largely opposed to sortitioning political offices when their decision-making power is more than consultative, although leftist MPs tend to be in favour of mixed assemblies (involving elected and sortitioned members). Among citizens, random selection seems to appeal above all to disaffected individuals with a lower social status. The article ends with a discussion of the political prospects of sortition being introduced as a democratic reform…(More).”

The Cruel New Era of Data-Driven Deportation


Article by Alvaro M. Bedoya: “For a long time, mass deportations were a small-data affair, driven by tips, one-off investigations, or animus-driven hunches. But beginning under George W. Bush, and expanding under Barack Obama, ICE leadership started to reap the benefits of Big Data. The centerpiece of that shift was the “Secure Communities” program, which gathered the fingerprints of arrestees at local and state jails across the nation and compared them with immigration records. That program quickly became a major driver for interior deportations. But ICE wanted more data. The agency had long tapped into driver address records through law enforcement networks. Eyeing the breadth of DMV databases, agents began to ask state officials to run face recognition searches on driver photos against the photos of undocumented people. In Utah, for example, ICE officers requested hundreds of face searches starting in late 2015. Many immigrants avoid contact with any government agency, even the DMV, but they can’t go without heat, electricity, or water; ICE aimed to find them, too. So, that same year, ICE paid for access to a private database that includes the addresses of customers from 80 national and regional electric, cable, gas, and telephone companies.

Amid this bonanza, at least, the Obama administration still acknowledged red lines. Some data were too invasive, some uses too immoral. Under Donald Trump, these limits fell away.

In 2017, breaking with prior practice, ICE started to use data from interviews with scared, detained kids and their relatives to find and arrest more than 500 sponsors who stepped forward to take in the children. At the same time, ICE announced a plan for a social media monitoring program that would use artificial intelligence to automatically flag 10,000 people per month for deportation investigations. (It was scuttled only when computer scientists helpfully indicated that the proposed system was impossible.) The next year, ICE secured access to 5 billion license plate scans from public parking lots and roadways, a hoard that tracks the drives of 60 percent of Americans—an initiative blocked by Department of Homeland Security leadership four years earlier. In August, the agency cut a deal with Clearview AI, whose technology identifies people by comparing their faces not to millions of driver photos, but to 3 billion images from social media and other sites. This is a new era of immigrant surveillance: ICE has transformed from an agency that tracks some people sometimes to an agency that can track anyone at any time….(More)”.

AI planners in Minecraft could help machines design better cities


Article by Will Douglas Heaven: “A dozen or so steep-roofed buildings cling to the edges of an open-pit mine. High above them, on top of an enormous rock arch, sits an inaccessible house. Elsewhere, a railway on stilts circles a group of multicolored tower blocks. Ornate pagodas decorate a large paved plaza. And a lone windmill turns on an island, surrounded by square pigs. This is Minecraft city-building, AI style.

Minecraft has long been a canvas for wild invention. Fans have used the hit block-building game to create replicas of everything from downtown Chicago and King’s Landing to working CPUs. In the decade since its first release, anything that can be built has been.

Since 2018, Minecraft has also been the setting for a creative challenge that stretches the abilities of machines. The annual Generative Design in Minecraft (GDMC) competition asks participants to build an artificial intelligence that can generate realistic towns or villages in previously unseen locations. The contest is just for fun, for now, but the techniques explored by the various AI competitors are precursors of ones that real-world city planners could use….(More)”.
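Entries to the contest differ widely, but many generative settlement builders start from the same basic move: read the terrain heightmap, find buildable ground, and place structures there. As a purely illustrative sketch of that idea (not any competitor's actual method; `flatness` and `place_houses` are invented names), a greedy placer over a 2-D heightmap might look like this:

```python
def flatness(heightmap, x, z, size=3):
    """Score how flat a size-by-size patch is (0 = perfectly flat)."""
    patch = [heightmap[x + dx][z + dz]
             for dx in range(size) for dz in range(size)]
    return max(patch) - min(patch)

def place_houses(heightmap, n_houses=5, size=3):
    """Greedily pick the flattest non-overlapping patches as house sites."""
    rows, cols = len(heightmap), len(heightmap[0])
    # Score every candidate patch, flattest first.
    candidates = sorted(
        (flatness(heightmap, x, z, size), x, z)
        for x in range(rows - size + 1)
        for z in range(cols - size + 1))
    chosen = []
    for score, x, z in candidates:
        # Accept a patch only if it is separated from every chosen
        # patch by at least `size` blocks on some axis.
        if all(abs(x - cx) >= size or abs(z - cz) >= size
               for _, cx, cz in chosen):
            chosen.append((score, x, z))
        if len(chosen) == n_houses:
            break
    return [(x, z) for _, x, z in chosen]
```

Real GDMC entries layer much more on top of a heuristic like this (path networks, building styles, adaptation to biomes), but terrain-aware placement is the common starting point.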

Smart Rural: The Open Data Gap


Paper by Johanna Walker et al: “The smart city paradigm has underpinned a great deal of the use and production of open data for the benefit of policymakers and citizens. This paper posits that this further enhances the existing urban-rural divide. It investigates the availability and use of rural open data along two parameters: pertaining to rural populations, and to key parts of the rural economy (agriculture, fisheries and forestry). It explores the relationship between key statistics of national/rural economies and rural open data, and the use and users of rural open data where it is available. It finds that although countries with more rural populations are not necessarily earlier in their Open Data Maturity journey, there is still a lack of institutionalisation of open data in rural areas; that there is an apparent gap between the importance of agriculture to a country’s GDP and the amount of agricultural data published openly; and lastly, that the smart city paradigm cannot simply be transferred to the rural setting. It suggests instead the adoption of the emerging ‘smart region’ paradigm as that most likely to support the specific data needs of rural areas….(More)”.