Data Science and Official Statistics: Toward a New Data Culture


Essay by Stefan Schweinfest and Ronald Jansen: “In the digital age, data are generated continuously by many different devices and are being used by many different actors. National statistical offices (NSOs) should benefit from these opportunities to improve data for decision-making. What could be the expanding role for official statistics in this context and how does this relate to emerging disciplines like data science? This article explores some new ideas. In the avalanche of new data, society may need a data steward, and the NSO could take on that role, while paying close attention to the protection of privacy. Data science will become increasingly important for extracting meaningful information from large amounts of data. NSOs will need to hire data scientists and data engineers and will need to train their staff in these fast-developing fields. NSOs will also need to clearly communicate new and experimental data and foster a good understanding of statistics. Collaboration of official statistics with the private sector, academia, and civil society will be the new way of working and the fundamental principles of official statistics may have to apply to all those actors. This article envisions that we are gradually working toward such a new data culture…(More)”.

The State of Open Data 2021


Report by Digital Science: “Since 2016, we have monitored levels of data sharing and usage. Over the years, we have had 21,000 responses from researchers worldwide, providing unparalleled insight into their motivations, challenges, perceptions, and behaviours toward open data.

In our sixth survey, we asked about motivations as well as perceived discoverability and credibility of data that is shared openly. The State of Open Data is a critical piece of information that enables us to identify the barriers to open data from a researcher perspective, laying the foundation for future action. 

Key findings from this year’s survey

  • 73% support the idea of a national mandate for making research data openly available
  • 52% said funders should make the sharing of research data part of their requirements for awarding grants
  • 47% said they would be motivated to share their data if there was a journal or publisher requirement to do so
  • About a third of respondents indicated that they have reused their own or someone else’s openly accessible data more during the pandemic than before
  • There are growing concerns over misuse and lack of credit for open sharing…(More)”

Contracting and Contract Law in the Age of Artificial Intelligence



Book edited by Martin Ebers, Cristina Poncibò, and Mimi Zou: “This book provides original, diverse, and timely insights into the nature, scope, and implications of Artificial Intelligence (AI), especially machine learning and natural language processing, in relation to contracting practices and contract law. The chapters feature unique, critical, and in-depth analysis of a range of topical issues, including how the use of AI in contracting affects key principles of contract law (from formation to remedies), the implications for autonomy, consent, and information asymmetries in contracting, and how AI is shaping contracting practices and the laws relating to specific types of contracts and sectors.

The contributors represent an interdisciplinary team of lawyers, computer scientists, economists, political scientists, and linguists from academia, legal practice, policy, and the technology sector. The chapters not only engage with salient theories from different disciplines, but also examine current and potential real-world applications and implications of AI in contracting and explore feasible legal, policy, and technological responses to address the challenges presented by AI in this field.

The book covers major common and civil law jurisdictions, including the EU, Italy, Germany, UK, US, and China. It should be read by anyone interested in the complex and fast-evolving relationship between AI, contract law, and related areas of law such as business, commercial, consumer, competition, and data protection laws….(More)”.

A Fix-It Job for Government Tech


Shira Ovide at the New York Times: “U.S. government technology has a mostly deserved reputation for being expensive and awful.

Computer systems sometimes operate with Sputnik-era software. A Pentagon project to modernize military technology has little to show after five years. During the coronavirus pandemic, millions of Americans struggled to get government help like unemployment insurance, vaccine appointments and food stamps because of red tape, inflexible technology and other problems.

Whether you believe that the government should be more involved in Americans’ lives or less, taxpayers deserve good value for the technology we pay for. And we often don’t get it. It’s part of Robin Carnahan’s job to take on this problem.

A former secretary of state for Missouri and a government tech consultant, Carnahan had been one of my guides to how public sector technology could work better. Then in June, she was confirmed as the administrator of the General Services Administration, the agency that oversees government acquisitions, including of technology.

Carnahan said that she and other Biden administration officials wanted technology used for fighting wars or filing taxes to be as efficient as our favorite apps.

“Bad technology sinks good policy,” Carnahan told me. “We’re on a mission to make government tech more user-friendly and be smarter about how we buy it and use it.”

Carnahan highlighted three areas she wanted to address: First, change the process for government agencies to buy technology to recognize that tech requires constant updates. Second, simplify the technology for people using government services. And third, make it more appealing for people with tech expertise to work for the government, even temporarily.

All of that is easier said than done, of course. People in government have promised similar changes before, and it’s not a quick fix. Technology dysfunction is also often a symptom of poor policies.

But in Carnahan’s view, one way to build faith in government is to prove that it can be competent. And technology is an essential area to show that…(More)”.

How Courts Embraced Technology, Met the Pandemic Challenge, and Revolutionized Their Operations


Report by The Pew Charitable Trusts: “To begin to assess whether, and to what extent, the rapid improvements in court technology undertaken in 2020 and 2021 made the civil legal system easier to navigate, The Pew Charitable Trusts examined pandemic-related emergency orders issued by the supreme courts of all 50 states and Washington, D.C. The researchers supplemented that review with an analysis of court approaches to virtual hearings, e-filing, and digital notarization, with a focus on how these tools affected litigants in three of the most common types of civil cases: debt claims, evictions, and child support. The key findings of this research are:

  • Civil courts’ adoption of technology was unprecedented in pace and scale. Despite having almost no history of using remote civil court proceedings, beginning in March 2020 every state and D.C. initiated online hearings at record rates to resolve many types of cases. For example, the Texas court system, which had never held a civil hearing via video before the pandemic, conducted 1.1 million remote proceedings across its civil and criminal divisions between March 2020 and February 2021. Similarly, Michigan courts held more than 35,000 video hearings totaling nearly 200,000 hours between April 1 and June 1, 2020, compared with no such hearings during the same two months in 2019. Courts moved other routine functions online as well. Before the pandemic, 37 states and D.C. allowed people without lawyers to electronically file court documents in at least some civil cases. But since March 2020, 10 more states have created similar processes, making e-filing available to more litigants in more jurisdictions and types of cases. In addition, after 11 states and D.C. made pandemic-driven changes to their policies on electronic notarization (e-notarization), 42 states and D.C. either allowed it or had waived notarization requirements altogether as of fall 2020.
  • Courts leveraged technology not only to stay open, but also to improve participation rates and help users resolve disputes more efficiently. Arizona civil courts, for example, saw an 8% drop year-over-year in June 2020 in the rate of default, or automatic, judgment—which results when defendants fail to appear in court—indicating an increase in participation. Although national and other state data is limited, court officials across the country, including judges, administrators, and attorneys, report increases in civil court appearance rates.
  • The accelerated adoption of technology disproportionately benefited people and businesses with legal representation—and in some instances, made the civil legal system more difficult to navigate for those without...(More)”.

NativeDATA


About: “NativeDATA is a free online resource that offers practical guidance for Tribes and Native-serving organizations. For this resource, Native-serving organizations include Tribal and urban Indian organizations and Tribal Epidemiology Centers (TECs).

Tribal and urban Indian communities need correct health information (data), so that community leaders can:

  • Watch disease trends
  • Respond to health threats
  • Create useful health policies…

Throughout, this resource offers practical guidance for obtaining and sharing health data in ways that honor Tribal sovereignty, data sovereignty, and public health authority. Public health authority is the authority of a sovereign government to protect the health, safety, and welfare of its citizens. As sovereign nations, Tribes have the power to define how they will use this authority to protect and promote the health of their communities. The federal government recognizes Tribes and Tribal Epidemiology Centers (TECs) as public health authorities under federal law.

Inside you will find expert advice to help you…(More)”.

Evaluation Guidelines for Representative Deliberative Processes


OECD Report: “Evaluations of representative deliberative processes do not happen regularly, not least due to the lack of specific guidance for their evaluation. To respond to this need, together with an expert advisory group, the OECD has developed Evaluation Guidelines for Representative Deliberative Processes. They aim to encourage public authorities, organisers, and evaluators to conduct more comprehensive, objective, and comparable evaluations.

These evaluation guidelines establish minimum standards and criteria for the evaluation of representative deliberative processes as a foundation on which more comprehensive evaluations can be built by adding additional criteria according to specific contexts and needs.

The guidelines suggest that independent evaluations are the most comprehensive and reliable way of evaluating a deliberative process.

For smaller and shorter deliberative processes, evaluation in the form of self-reporting by members and/or organisers of a deliberative process can also contribute to the learning process…(More)”.

UK government publishes pioneering standard for algorithmic transparency


UK Government Press Release: “The UK government has today launched one of the world’s first national standards for algorithmic transparency.

This move delivers on commitments made in the National AI Strategy and National Data Strategy, and strengthens the UK’s position as a global leader in trustworthy AI.

In its landmark review into bias in algorithmic decision-making, the Centre for Data Ethics and Innovation (CDEI) recommended that the UK government should place a mandatory transparency obligation on public sector organisations using algorithms to support significant decisions affecting individuals….

The Cabinet Office’s Central Digital and Data Office (CDDO) has worked closely with the CDEI to design the standard. It also consulted experts from across civil society and academia, as well as the public.

The standard is organised into two tiers. The first includes a short description of the algorithmic tool, including how and why it is being used, while the second includes more detailed information about how the tool works, the dataset/s that have been used to train the model and the level of human oversight.

The standard will help teams be meaningfully transparent about the way in which algorithmic tools are being used to support decisions, especially in cases where they might have a legal or economic impact on individuals.
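The press release does not publish the standard’s template, but the two-tier structure it describes can be sketched as a simple record type. This is a hypothetical Python sketch: the field names are our assumptions for illustration, not the CDDO’s actual schema.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Tier1Summary:
    """Short, public-facing description of the algorithmic tool."""
    tool_name: str
    how_it_is_used: str   # what role the tool plays in the decision process
    why_it_is_used: str   # rationale for deploying it

@dataclass
class Tier2Detail:
    """More detailed technical and governance information."""
    how_it_works: str             # model type, logic, decision flow
    training_datasets: List[str]  # dataset/s used to train the model
    human_oversight: str          # level and form of human review

@dataclass
class AlgorithmicTransparencyRecord:
    tier1: Tier1Summary  # the short summary published for every tool in scope
    tier2: Tier2Detail   # the deeper detail for interested readers
```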

The standard will be piloted by several government departments and public sector bodies in the coming months. Following the piloting phase, CDDO will review the standard based on feedback gathered and seek formal endorsement from the Data Standards Authority in 2022…(More)”.

Data Powered Positive Deviance Handbook


Handbook by GIZ and UNDP: “Positive Deviance (PD) is based on the observation that in every community or organization, there are a few individuals who achieve significantly better outcomes than their peers, despite having similar challenges and resources. These individuals are referred to as positive deviants, and adopting their solutions is what is known as the PD approach.

The method described in this Handbook follows the same logic as the PD approach but uses pre-existing, non-traditional data sources instead of — or in conjunction with — traditional data sources. Non-traditional data in this context broadly refers to data that is digitally captured (e.g. mobile phone records and financial data), mediated (e.g. social media and online data), or observed (e.g. satellite imagery). The integration of such data to complement traditional data sources generally used in PD is what we refer to as Data Powered Positive Deviance (DPPD)…(More)”.
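The Handbook describes a multi-stage method, but its core analytical step, finding units that outperform what their circumstances would predict, can be illustrated with a minimal residual-analysis sketch. This is our simplification, assuming tabular data; the column names and threshold are illustrative, not prescribed by GIZ or UNDP.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

def find_positive_deviants(df: pd.DataFrame, outcome: str,
                           controls: list[str],
                           z_threshold: float = 2.0) -> pd.DataFrame:
    """Flag units whose outcome far exceeds what their resources predict."""
    X, y = df[controls].values, df[outcome].values
    model = LinearRegression().fit(X, y)   # expected outcome given resources
    residuals = y - model.predict(X)       # over-/under-performance
    z = (residuals - residuals.mean()) / residuals.std()
    return df[z > z_threshold]             # the positive deviants

# For example, villages whose satellite-derived crop yield far exceeds what
# rainfall and soil quality alone would predict (hypothetical columns):
# deviants = find_positive_deviants(villages, "yield", ["rainfall", "soil_quality"])
```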

Creating and governing social value from data


Paper by Diane Coyle and Stephanie Diepeveen: “Data is increasingly recognised as an important economic resource for innovation and growth, but its innate characteristics mean market-based valuations inadequately account for the impact of its use on social welfare. This paper extends the literature on the value of data by providing a framework that takes into account its non-rival nature and integrates its inherent positive and negative externalities. Positive externalities consist of the scope for combining different data sets or enabling innovative uses of existing data, while negative externalities include potential privacy loss. We propose a framework integrating these and explore the policy trade-offs shaping net social welfare through a case study of geospatial data and the transport sector in the UK, where insufficient recognition of the trade-offs has contributed to suboptimal policy outcomes. We conclude by proposing methods for empirical approaches to social data valuation, essential evidence for decisions regarding these policy trade-offs. This article therefore lays important groundwork for novel approaches to measuring the net social welfare contribution of data, thereby illuminating opportunities for greater and more equitable creation of value from data in our societies….(More)”
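Read as a simple accounting identity (our shorthand, not the authors’ formal notation), the framework values a data set as its market valuation plus the positive externalities from combination and reuse, minus negative externalities such as privacy loss:

```latex
V_{\text{social}} \;=\; V_{\text{market}}
\;+\; \underbrace{E^{+}}_{\text{combination, innovative reuse}}
\;-\; \underbrace{E^{-}}_{\text{privacy loss, other harms}}
```

Policy choices, such as how geospatial data is licensed, shift where the E⁺ and E⁻ terms fall, which is why market prices alone can understate or misstate the net social welfare at stake.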