Report by the World Economic Forum: “When a new technology is introduced in healthcare, especially one based on AI, it invites meticulous scrutiny. The COVID-19 pandemic has accelerated the adoption of chatbots in healthcare applications and as a result, careful consideration is required to promote their responsible use. To address these governance challenges, the World Economic Forum has assembled a multistakeholder community, which has co-created Chatbots RESET, a framework for governing the responsible use of chatbots in healthcare. The framework outlined in this paper offers an actionable guide for stakeholders to promote the responsible use of chatbots in healthcare applications…(More)”.
Digital Politics in Canada: Promises and Realities
Book edited by Tamara A. Small and Harold J. Jansen: “Digital Politics in Canada addresses a significant gap in the scholarly literature on both media in Canada and Canadian political science. Using a comprehensive, multidisciplinary, historical, and focused analysis of Canadian digital politics, this book covers the full scope of actors in the Canadian political system, including traditional political institutions of the government, elected officials, political parties, and the mass media. At a time when issues of inclusion are central to political debate, this book features timely chapters on Indigenous people, women, and young people, and takes an in-depth look at key issues of online surveillance and internet voting. Ideal for a wide-ranging course on the impact of digital technology on the Canadian political system, this book encourages students to critically engage in discussions about the future of Canadian politics and democracy….(More)”.
Silicon Valley’s next goal is 3D maps of the world — made by us
Tim Bradshaw at the Financial Times: “When technology transformed the camera, the shift from film to digital sensors was just the beginning. As standalone cameras were absorbed into our phones, they gained software smarts, enabling them not only to capture light but also to understand the contents of a photo and even recognise people in it.
A similar transformation is now starting to happen to maps — and it too is powered by those advances in camera technology. In the next 20 years, our collective understanding of a “map” will be unrecognisable from the familiar grid of roads and places that has endured even as the A-Z street atlas has been supplanted by Google Maps.
Before long, countless objects and places will be captured and recreated in 3D digital models that we can view through our phones or even, at some stage, on headsets. This digital world might be populated by our avatars, turned into a playing field for new kinds of games or used to discover routes, buildings and services around us.
Nobody seems sure yet what the killer app for this “digital twin” of Planet Earth might be, but that hasn’t stopped Silicon Valley from racing to build it anyway. Facebook, Apple, Google and Microsoft, as well as the developers of Snapchat and Pokémon Go, are all hoping to bring this “mirrorworld” to life, as a precursor to the augmented-reality (AR) glasses that many in tech see as the next big thing.
To place virtual objects in our world, our devices need to know the textures and contours of their surroundings, which GPS cannot see. But instead of sending out cars with protruding cameras to scan the world, as Google did to build Street View over the past decade and a half, these maps will be plotted by hundreds of millions of users like you and me. The question is whether we even realise that we have been dragooned into Silicon Valley’s army of cartographers. They cannot do it without us.
This month, Google said it would ask Maps users to upload photos to Street View using their smartphones for the first time. Only handsets running its AR software can contribute. As Michael Abrash, chief scientist at Facebook’s Oculus headset unit, recently told Fast Company magazine: “Crowdsourcing has to be the primary way that this works. There is no other way to scale.”…(More)”.
Lawmakers are trying to create a database with free access to court records. Judges are fighting against it.
Ann Marimow in the Washington Post: “Leaders of the federal judiciary are working to block bipartisan legislation designed to create a national database of court records that would provide free access to case documents.
Backers of the bill, who are pressing for a House vote in the coming days, envision a streamlined, user-friendly system that would allow citizens to search for court documents and dockets without having to pay. Under the current system, users pay 10 cents per page to view the public records through the service known as PACER, an acronym for Public Access to Court Electronic Records.
“Everyone wants to have a system that is technologically first class and free,” said Rep. Hank Johnson (D-Ga.), a sponsor of the legislation with Rep. Douglas A. Collins (R-Ga.).
A modern system, he said, “is more efficient and brings more transparency into the equation and is easier on the pocketbooks of regular people.”…(More)”.
Why Predictive Algorithms are So Risky for Public Sector Bodies
Paper by Madeleine Waller and Paul Waller: “This paper collates multidisciplinary perspectives on the use of predictive analytics in government services. It moves away from the hyped narratives of “AI” or “digital”, and the broad usage of the notion of “ethics”, to focus on highlighting the possible risks of the use of prediction algorithms in public administration. Guidelines for AI use in public bodies are currently available; however, there is little evidence that these are being followed or that they are being written into new mandatory regulations. The use of algorithms is not just an issue of whether they are fair and safe to use, but of whether they comply with the law and whether they actually work.
Particularly in public services, there are many things to consider before implementing predictive analytics algorithms, as flawed use in this context can lead to harmful consequences for citizens, both individually and collectively, and for public sector workers. All stages of the implementation process of algorithms are discussed, from the specification of the problem and model design through to the context of their use and the outcomes.
Evidence is drawn from case studies of use in child welfare services, the US justice system and UK public examination grading in 2020. The paper argues that the risks and drawbacks of such technological approaches need to be more comprehensively understood, and testing done in the operational setting, before implementing them. The paper concludes that while algorithms may be useful in some contexts and help to solve problems, it seems those relating to predicting real life have a long way to go before they are safe and trusted for use. As “ethics” are located in time, place and social norms, the authors suggest that in the context of public administration, laws on human rights, statutory administrative functions, and data protection — all within the principles of the rule of law — provide the basis for appraising the use of algorithms, with maladministration being the primary concern rather than a breach of “ethics”….(More)”.
Data Readiness: Lessons from an Emergency
The DELVE Initiative: “Responding to the COVID-19 pandemic has required rapid decision-making in changing circumstances. Those decisions and their effects on the health and wealth of the nation can be better informed with data. Today, technologies that can acquire data are pervasive. Data is continually produced by devices like mobile phones, payment points and road traffic sensors. This creates opportunities for nowcasting of important metrics such as GDP, population movements and disease prevalence, which can be used to design policy interventions that are targeted to the needs of specific sectors or localities. The data collected as a by-product of daily activities is different to epidemiological or other population research data that might be used to drive the decisions of state. These new forms of data are happenstance, in that they are not originally collected with a particular research or policy question in mind but are created through the normal course of events in our digital lives, and our interactions with digital systems and services.
This happenstance data pertains to individual citizens and their daily activities. To be useful it needs to be anonymized, aggregated and statistically calibrated to provide meaningful metrics for robust decision making while managing concerns about individual privacy or business value. This process necessitates particular technical and domain expertise that is often found in academia, but it must be conducted in partnership with the industries, and public sector organisations, that collect or generate the data and government authorities that take action based on those insights. Such collaborations require governance mechanisms that can respond rapidly to emerging areas of need, a common language between partners about how data is used and how it is being protected, and careful stewardship to ensure appropriate balancing of data subjects’ rights and the benefit of using this data. This is the landscape of data readiness; the availability and quality of the UK nation’s data dictate our ability to respond in an agile manner to evolving events….(More)”.
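The aggregate-then-suppress step described above can be illustrated with a small sketch. The following Python snippet is not from the DELVE report; it is a minimal, hypothetical example (the area codes, activity labels, and suppression threshold of 10 are all assumptions) of turning individual-level “happenstance” records into area-level counts while withholding small cells to reduce re-identification risk.

```python
from collections import Counter

# Hypothetical individual-level "happenstance" records, e.g. anonymised
# mobility pings reduced to (area_code, activity) pairs. Values are invented.
records = [
    ("E09000001", "retail"), ("E09000001", "retail"), ("E09000001", "retail"),
    ("E09000001", "workplace"), ("E09000002", "transit"), ("E09000002", "transit"),
] * 5  # repeated so that some cells are large enough to publish

SUPPRESSION_THRESHOLD = 10  # assumed minimum cell size before release

def aggregate(records, threshold=SUPPRESSION_THRESHOLD):
    """Count records per (area, activity) cell; withhold cells below the threshold."""
    counts = Counter(records)
    return {cell: (n if n >= threshold else None) for cell, n in counts.items()}

for (area, activity), n in sorted(aggregate(records).items()):
    print(area, activity, "suppressed" if n is None else n)
```

Real pipelines add statistical calibration (for example, reweighting to known population totals) and formal privacy controls on top of this, but the basic pattern of aggregating and suppressing before release is the same.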
Fixing financial data to assess systemic risk
Report by Greg Feldberg: “The COVID-19 market disruption again highlighted the flaws in the data that the public and the authorities use to assess risks in the financial system. We don’t have the right data, we can’t analyze the data we do have, and there are all sorts of holes. Amidst extreme uncertainty in times like this, market participants need better data to manage their risks, just as policymakers need better data to calibrate their crisis interventions. This paper argues that the new administration should make it a priority to fix financial regulatory data, starting during the transition.
The incoming administration should, first, emphasize data when vetting candidates for top financial regulatory positions. Every agency head should recognize the problem and the roles they must play in the solution. They should recognize how the Evidence Act of 2018 and other recent legislation help define those roles. And every agency head should recognize the role of the Office of Financial Research (OFR) within the regulatory community. Only the OFR has the mandate and experience to provide the necessary leadership to address these problems.
The incoming administration should empower the OFR to do its job and coordinate a systemwide financial data strategy, working with the regulators. That strategy should set a path for identifying key data gaps that impede risk analysis; setting data standards; sharing data securely, among authorities and with the public; and embracing new technologies that make it possible to manage data far more efficiently and securely than ever before. These are ambitious goals, but the administration may be able to accomplish them with vision and leadership…(More)”.
Public Value Science
Barry Bozeman in Issues in Science and Technology: “Why should the United States government support science? That question was apparently settled 75 years ago by Vannevar Bush in Science, the Endless Frontier: “Since health, well-being, and security are proper concerns of Government, scientific progress is, and must be, of vital interest to Government. Without scientific progress the national health would deteriorate; without scientific progress we could not hope for improvement in our standard of living or for an increased number of jobs for our citizens; and without scientific progress we could not have maintained our liberties against tyranny.”
Having dispensed with the question of why, all that remained was for policy-makers to decide, how much? Even at the dawn of modern science policy, costs and funding needs were at the center of deliberations. Though rarely discussed anymore, Endless Frontier did give specific attention to the question of how much. The proposed amounts seem, by today’s standards, modest: “It is estimated that an adequate program for Federal support of basic research in the colleges, universities, and research institutes and for financing important applied research in the public interest, will cost about 10 million dollars at the outset and may rise to about 50 million dollars annually when fully underway at the end of perhaps 5 years.”
In today’s dollars, $50 million translates to about $535 million, or less than 2% of what the federal government actually spent for basic research in 2018. One way to look at the legacy of Endless Frontier is that by answering the why question so convincingly, it logically followed that the how much question could always be answered simply by “more.”
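As a rough back-of-envelope check on the comparison Bozeman draws (this is an illustration, not part of the original essay; the 2018 federal basic-research total of roughly $40 billion is an assumed round figure used only for the calculation):

```python
# Back-of-envelope check of the Endless Frontier comparison (illustrative only).
bush_steady_state_1945 = 50e6        # proposed annual figure, in 1945 dollars
bush_steady_state_today = 535e6      # the essay's inflation-adjusted equivalent
federal_basic_research_2018 = 40e9   # ASSUMED round figure for 2018 federal basic research

implied_inflation_factor = bush_steady_state_today / bush_steady_state_1945   # ~10.7x
share_of_2018_spending = bush_steady_state_today / federal_basic_research_2018

print(f"Implied inflation factor: {implied_inflation_factor:.1f}x")
print(f"Share of 2018 federal basic research: {share_of_2018_spending:.1%}")  # ~1.3%, i.e. "less than 2%"
```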
In practice, however, the why question continues to seem so self-evident because it fails to consider a third question, who? As in, who benefits from this massive federal investment in research, and who does not? The question of who was also seemingly answered by Endless Frontier, which not only offered full employment as a major goal for expanded research but also embraced “the sound democratic principle that there should be no favored classes or special privilege.”
But I argue that this principle has now been soundly falsified. In an economic environment characterized by growth but also by extreme inequality, science and technology not only reinforce inequality but also, in some instances, help widen the gap. Science and technology can be a regressive factor in the economy. Thus, it is time to rethink the economic equation justifying government support for science not just in terms of why and how much, but also in terms of who.
What logic supports my claim that under conditions of conspicuous inequality, science and technology research is often a regressive force? Simple: except in the case of the most basic of basic research (such as exploration of other galaxies), effects are never randomly distributed. Both the direct and indirect effects of science and technology tend to differentially affect citizens according to their socioeconomic power and purchasing power….(More)”.
Open Data Inventory 2020
Report by the Open Data Watch: “The 2020/21 Open Data Inventory (ODIN) is the fifth edition of the index compiled by Open Data Watch. ODIN 2020/21 provides an assessment of the coverage and openness of official statistics in 187 countries, an increase of 9 countries compared to ODIN 2018/19. The year 2020 was a challenging year for the world as countries grappled with the COVID-19 pandemic. Nonetheless, and despite the pandemic’s negative impact on the capacity of statistics producers, 2020 saw great progress in open data.
However, the news on data this year isn’t all good. Countries in every region still struggle to publish gender data and many of the same countries are unable to provide sex-disaggregated data on the COVID-19 pandemic. In addition, low-income countries continue to need more support with capacity building and financial resources to overcome the barriers to publishing open data.
ODIN is an evaluation of the coverage and openness of data provided on the websites maintained by national statistical offices (NSOs) and any official government website that is accessible from the NSO site. The overall ODIN score is an indicator of how complete and open an NSO’s data offerings are. It comprises both a coverage subscore and an openness subscore. Openness is measured against standards set by the Open Definition and Open Data Charter. ODIN 2020/21 includes 22 data categories, grouped under social, economic and financial, and environmental statistics. ODIN scores are represented on a range between 0 and 100, with 100 representing the best performance on open data… The full report will be released in February 2021….(More)”.
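To make the scoring structure described above more concrete, here is a minimal illustrative sketch of how a composite index built from coverage and openness subscores might be computed. The category names, values, and equal weighting are assumptions for illustration, not ODIN’s published methodology.

```python
from statistics import mean

# Hypothetical per-category assessments on a 0-100 scale (invented values).
categories = {
    "population":        {"coverage": 80, "openness": 60},
    "national_accounts": {"coverage": 70, "openness": 75},
    "energy_use":        {"coverage": 40, "openness": 55},
}

def composite_score(categories):
    """Average coverage and openness across categories, then combine them equally.

    ODIN's published methodology scores each of its 22 categories against
    several coverage and openness elements; this sketch only mirrors the
    subscore-plus-composite structure, not the actual weighting.
    """
    coverage = mean(c["coverage"] for c in categories.values())
    openness = mean(c["openness"] for c in categories.values())
    return {"coverage": coverage, "openness": openness, "overall": (coverage + openness) / 2}

print(composite_score(categories))
```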
Intentional and Unintentional Sludge
Essay by Crawford Hollingworth and Liz Barker: “…Both of these stories are illustrations of what many mums and gym-goers may have experienced across the United Kingdom and United States as they tried to cope with the pandemic. We, along with other behavioral scientists, would label both as sludge—when users face high levels of friction obstructing their efforts to achieve something that is in their best interest, or are misled or encouraged to take action that is not in their best interest.
We can think of what the English mum goes through as unintentional sludge—friction due to factors like rushed design, poor infrastructure, and inadequate oversight. The mother is trying to access a benefit that will help her, that she has a right to claim, and that the government genuinely wants her to access. Yet multiple barriers prevented her from accessing the voucher that would help feed her children. Millions of parents found themselves in this situation as schools closed in England earlier this year. All over the country, schools ended up paying for food parcels and gift vouchers out of their own budgets to help families who were going hungry.
What the New York gym-goer faces is different. It is intentional sludge—friction put in place knowingly to benefit an organization at the expense of the user. The gym doesn’t want him to cancel the membership, which would mean lost revenue. Even absent the pandemic, the membership would be considered unnecessarily difficult to cancel. The gym’s hope is that people forget, give up, or don’t bother canceling in person or over the phone, or that it takes them longer to do so. This translates into revenue for the gym, without any of the costs of providing a service. Stories like this have resulted in class-action lawsuits against companies that make it overly difficult or impossible to cancel gym memberships. One lawsuit alleged that a large gym company was stealing over $30 million per month from customers….(More)”.