Feedback Loops in Open Data Ecosystems


Paper by Daniel Rudmark and Magnus Andersson: “Public agencies are increasingly publishing open data to increase transparency and fuel data-driven innovation. For these organizations, maintaining sufficient data quality is key to continuous re-use but also heavily dependent on feedback loops being initiated between data publishers and users. This paper reports from a longitudinal engagement with Scandinavian transportation agencies, where such feedback loops have been successfully established. Based on these experiences, we propose four distinct types of data feedback loops in which both data publishers and re-users play critical roles…(More)”.

UNCTAD calls on countries to make digital data flow for the benefit of all


Press Release: “The world needs a new global governance approach to enable digital data to flow across borders as freely as necessary and possible, says UNCTAD’s Digital Economy Report 2021 released on 29 September.

The UN trade and development body says the new approach should help maximize development gains, ensure those gains are equitably distributed and minimize risks and harms.

It should also enable worldwide data sharing, develop global digital public goods, increase trust and reduce uncertainty in the digital economy.

The report says the new global system should also help avoid further fragmentation of the internet, address policy challenges emerging from the dominant positions of digital platforms and narrow existing inequalities.

“It is more important than ever to embark on a new path for digital and data governance,” says UN Secretary-General António Guterres in his preface to the report.

“The current fragmented data landscape risks us failing to capture value that could accrue from digital technologies and it may create more space for substantial harms related to privacy breaches, cyberattacks and other risks.”

UNCTAD Secretary-General Rebeca Grynspan said: “We urgently need a renewed focus on achieving global digital and data governance, developing global digital public goods, increasing trust and reducing uncertainty in the digital economy. The pandemic has shown the critical importance of sharing health data globally – the issue of digital governance can no longer be postponed.”

Pandemic underscores need for new governance

Digital data play an increasingly important role as an economic and strategic resource, a trend reinforced by the COVID-19 pandemic.

The pandemic has shown the importance of sharing health data globally to help countries cope with its consequences, and for research purposes in finding vaccines.

“The increased interconnection and interdependence challenges in the global data economy call for moving away from the silo approach towards a more holistic, coordinated global approach,” UNCTAD Deputy Secretary-General Isabelle Durant said.

“Moreover, new and innovative ways of global governance are urgently needed, as the old ways may not be well suited to respond to the new context,” she added.

New UN data-related body proposed

UNCTAD proposes the formation of a new United Nations coordinating body, with a focus on, and with the skills for, assessing and developing comprehensive global digital and data governance. Its work should be multilateral, multi-stakeholder and multidisciplinary.

It should also seek to remedy the current underrepresentation of developing countries in global and regional data governance initiatives.

The body should also function as a complement to and in coherence with national policies and provide sufficient policy space to ensure countries with different levels of digital readiness and capacities can benefit from the data-driven digital economy…(More)”.

Statement of Principles to support proactive disclosure of government-held information


Statement of principles by the Australian information commissioners and ombudsmen: “Information commissioners and ombudsmen across Australia oversight and promote citizens’ rights to access government-held information and have powers to review agency decisions under the applicable right to information (RTI) legislation. Beyond formal rights of access, the proactive disclosure of government-held information promotes open government and advances our system of representative democracy.

All Australian governments (Commonwealth, state, territory, and local) and public institutions are strongly encouraged to commit to being Open by Design by building a culture of transparency and by prioritising, promoting and resourcing proactive disclosure.

These Principles recognise that:

  1. information held by government and public institutions is a public resource and, to the greatest extent possible, should be published promptly and proactively at the lowest reasonable cost, without the need for a formal access request, and
  2. a culture of transparency within government is everyone’s responsibility requiring action by all public sector leaders and officers to encourage and support the proactive disclosure of information, and
  3. appropriate, prompt and proactive disclosure of government-held information:
  • informs community – proactive disclosure leads to a more informed community and raises awareness of government and public institutions’ strategic intentions and initiatives, driving innovation and improving standards. Transparent and coherent public communication can also address misinformation
  • increases participation and enhances decision-making – proactive disclosure increases citizen participation in government processes and promotes better informed decision-making through increased scrutiny, discussion, comment and review of government and public institutions’ decisions
  • builds trust and confidence – proactive disclosure enhances public sector accountability and integrity, builds public trust and confidence in decision-making by government and public institutions and strengthens principles of liberal democracy
  • improves service delivery – proactive disclosure improves service delivery by providing access to information faster and more easily than formal access regimes, providing the opportunity to decide when and how information is provided, and to contextualise and explain information
  • is required or permitted by law – proactive disclosure is mandated, permitted, or protected by law in all Australian states and territories and the Commonwealth
  • improves efficiency – proactive disclosure reduces the administrative burden on departments and agencies and the need for citizens to make a formal information access request.

Australian information commissioners and ombudsmen recommend that public sector agencies:

  1. Embed a proactive disclosure culture in all public sector agencies and public institutions by…(More)”.

Secondary use of health data in Europe


Report by Mark Boyd, Dr Milly Zimeta, Dr Jeni Tennison and Mahad Alassow: “Open and trusted health data systems can help Europe respond to the many urgent challenges facing its society and economy today. The global pandemic has already altered many of our societal and economic systems, and data has played a key role in enabling cross-border and cross-sector collaboration in public health responses.

Even before the pandemic, there was an urgent need to optimise healthcare systems and manage limited resources more effectively, to meet the needs of growing, and often ageing, populations. Now, there is a heightened need to develop early-diagnostic and health-surveillance systems, and more willingness to adopt digital healthcare solutions…

By reusing health data in different ways, we can increase the value of this data and help to enable these improvements. Clinical data, such as incidences of healthcare and clinical trials data, can be combined with data collected from other sources, such as sickness and insurance claims records, and from devices and wearable technologies. This data can then be anonymised and aggregated to generate new insights and optimise population health, improve patients’ health and experiences, create more efficient healthcare systems, and foster innovation.

This secondary use of health data can enable a wide range of benefits across the entire healthcare system. These include opportunities to optimise service, reduce health inequalities by better allocating resources, and enhance personalised healthcare –for example, by comparing treatments for people with similar characteristics. It can also help encourage innovation by extending research data to assess whether new therapies would work for a broader population….(More)”.

Government data management for the digital age


Essay by Axel Domeyer, Solveigh Hieronimus, Julia Klier, and Thomas Weber: “Digital society’s lifeblood is data—and governments have lots of data, representing a significant latent source of value for both the public and private sectors. If used effectively, and keeping in mind ever-increasing requirements with regard to data protection and data privacy, data can simplify delivery of public services, reduce fraud and human error, and catalyze massive operational efficiencies.

Despite these potential benefits, governments around the world remain largely unable to capture the opportunity. The key reason is that data are typically dispersed across a fragmented landscape of registers (datasets used by government entities for a specific purpose), which are often managed in organizational silos. Data are routinely stored in formats that are hard to process or in places where digital access is impossible. The consequence is that data are not available where needed, progress on digital government is inhibited, and citizens have little transparency on what data the government stores about them or how it is used.

Only a handful of countries have taken significant steps toward addressing these challenges. As other governments consider their options, the experiences of these countries may provide them with valuable guidance and also reveal five actions that can help governments unlock the value that is on their doorsteps.

As societies take steps to enhance data management, questions on topics such as data ownership, privacy concerns, and appropriate measures against security breaches will need to be answered by each government. The purpose of this article is to outline the positive benefits of modern data management and provide a perspective on how to get there…(More)”.

Little Rock Shows How Open Data Drives Resident Engagement


Blog by Ross Schwartz: “The 12th Street corridor is in the heart of Little Rock, stretching west from downtown across multiple neighborhoods. But for years the area had suffered from high crime rates and disinvestment, and is considered a food desert.

With the intention of improving public safety and supporting efforts to revitalize the area, the City built a new police station in 2014 on the street. And, in the years following, as city staff ramped up efforts to place data at the center of problem-solving, it began to hold two-day-long “Data Academy” trainings for city employees and residents on foundational data practices, including data analysis.

Responding to public safety concerns, a 2018 Data Academy training focused on 12th Street. A cross-department team dug into data sets to understand the challenges facing the area, looking at variables including crime, building code violations, and poverty. It turned out the neighborhood with the highest levels of crime and blight was actually blocks away from 12th Street itself, in Midtown. A predominantly African-American neighborhood just east of the University of Arkansas at Little Rock campus, Midtown has a mix of older longtime homeowners and younger renters.

“It was a real data-driven ‘a-ha’ moment — an example of what you can understand about a city if you have the right data sets and look in the right places,” says Melissa Bridges, Little Rock’s performance and innovation coordinator. With support from What Works Cities (WWC), for the last five years she’s led Little Rock’s efforts to build open data and performance measurement resources and infrastructure…

Newly aware of Midtown’s challenges, city officials decided to engage residents in the neighborhood and adjacent areas. Data Academy members hosted a human-centered design workshop, during which residents were given the opportunity to self-prioritize their pressing concerns. Rather than lead the workshop, officials from various city departments quietly observed the discussion.

The main issue that emerged? Many parts of Midtown were poorly lit due to broken or blocked streetlights. Many residents didn’t feel safe and didn’t know how to alert the City to get lights fixed or vegetation cut back. A review of 311 request data showed that few streetlight problems in the area were ever reported to the City.

Aware of studies showing the correlation between dark streets and crime, the City designed a streetlight canvassing project in partnership with area neighborhood associations to engage and empower residents. Bridges and her team built canvassing route maps using Google Maps and Little Rock Citizen Connect, which collects 311 requests and other data sets. Then they gathered resident volunteers to walk or drive Midtown’s streets on a Friday night, using the City’s 311 mobile app to make a light service request and tag the location….(More)”.

New report confirms positive momentum for EU open science


Press release: “The Commission released the results and datasets of a study monitoring the open access mandate in Horizon 2020. With a steady increase over the years and an average open access rate of 83% for scientific publications, the European Commission is at the forefront of research and innovation funders, concluded the consortium formed by the analysis company PPMI (Lithuania), research and innovation centre Athena (Greece) and Maastricht University (the Netherlands).

The Commission sought advice on a process and reliable metrics through which to monitor all aspects of the open access requirements in Horizon 2020, and inform how to best do it for Horizon Europe – which has a more stringent and comprehensive set of rights and obligations for Open Science.

The key findings of the study indicate that the European Commission’s early leadership in Open Science policy has paid off. The Excellent Science pillar in Horizon 2020 has led the success story, with an open access rate of 86%. Among the leaders within this pillar are the European Research Council (ERC) and the Future and Emerging Technologies (FET) programme, with open access rates of over 88%.

Other interesting facts:

  • In terms of article processing charges (APCs), the study estimated the average cost in Horizon 2020 of publishing an open access article to be around EUR 2,200. APCs for articles published in ‘hybrid’ journals (a cost that will no longer be eligible under Horizon Europe) have a higher average cost of EUR 2,600.
  • Compliance in terms of depositing open access publications in a repository (even when publishing open access through a journal) is relatively high (81.9%), indicating that the current policy of depositing is well understood and implemented by researchers.
  • Regarding licences, 49% of Horizon 2020 publications were published using Creative Commons (CC) licences, which permit reuse (with various levels of restrictions), while 33% use publisher-specific licences that place restrictions on text and data mining (TDM).
  • Institutional repositories have responded in a satisfactory manner to the challenge of providing FAIR access to their publications, amending internal processes and metadata to incorporate necessary changes: 95% of deposited publications include in their metadata some type of persistent identifier (PID).
  • Datasets in repositories present a low compliance level: only approximately 39% of Horizon 2020 deposited datasets are findable (i.e., the metadata includes a PID and URL to the data file), and only around 32% of deposited datasets are accessible (i.e., the data file can be fetched using a URL link in the metadata). Horizon Europe will hopefully make it possible to achieve better results.
  • The study also identified gaps in the existing Horizon 2020 open access monitoring data, which pose further difficulties in assessing compliance. Self-reporting by beneficiaries also highlighted a number of issues…(More)”
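The findability and accessibility criteria the study applies to deposited datasets are mechanical enough to sketch in code. The following is a minimal, hypothetical illustration (the field names and the DOI are assumptions, not the consortium’s actual tooling or metadata schema):

```python
# Hypothetical sketch of the compliance checks described above: a deposited
# dataset is "findable" if its repository metadata carries a persistent
# identifier (PID) and a URL to the data file, and "accessible" if that URL
# can actually be fetched. Field names ("pid", "data_url") are assumptions.

from urllib.parse import urlparse


def is_findable(metadata: dict) -> bool:
    """Findable: metadata includes both a PID and a link to the data file."""
    return bool(metadata.get("pid")) and bool(metadata.get("data_url"))


def is_accessible(metadata: dict, fetch=None) -> bool:
    """Accessible: the data file can be fetched via the URL in the metadata.

    The `fetch` callable (e.g. an HTTP HEAD request returning True on
    success) is injected so the check can run without network access.
    """
    url = metadata.get("data_url")
    if not url or urlparse(url).scheme not in ("http", "https"):
        return False
    return bool(fetch(url)) if fetch else False


record = {
    "pid": "10.5281/zenodo.0000000",  # illustrative DOI, not a real dataset
    "data_url": "https://example.org/data.csv",
}
print(is_findable(record))                          # True
print(is_accessible(record, fetch=lambda u: True))  # True
```

Running checks like these over every deposited record, then dividing the counts by the total, would reproduce the kind of 39%/32% compliance figures the study reports.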

No revolution: COVID-19 boosted open access, but preprints are only a fraction of pandemic papers


Article by Jeffrey Brainard: “In January 2020, as COVID-19 spread insidiously, research funders and journal publishers recognized their old ways wouldn’t do. They needed to hit the gas pedal to meet the desperate need for information that could help slow the disease.

One major funder, the Wellcome Trust, issued a call for changing business as usual. Authors should put up COVID-19 manuscripts as preprints, it urged, because those are publicly posted shortly after they’re written, before being peer reviewed. Scientists should share their data widely. And publishers should make journal articles open access, or free to read immediately when published.

Dozens of the world’s leading funders, publishers, and scientific societies (including AAAS, publisher of Science) signed Wellcome’s statement. Critics of the tradition-bound world of scientific publishing saw a rare opportunity to tackle long-standing complaints—for example, that journals place many papers behind paywalls and take months to complete peer review. They hoped the pandemic could help birth a new publishing system.

But nearly 2 years later, hopes for a wholesale revolution are fading. Preprints by medical researchers surged, but they remain a small fraction of the literature on COVID-19. Much of that literature is available for free, but access to the underlying data is spotty. COVID-19 journal articles were reviewed faster than previous papers, but not dramatically so, and some ask whether that gain in speed came at the expense of quality. “The overall system demonstrated what could be possible,” says Judy Luther, president of Informed Strategies, a publishing consulting firm.

One thing is clear. The pandemic prompted an avalanche of new papers: more than 530,000, released either by journals or as preprints, according to the Dimensions bibliometric database. That fed the largest 1-year increase in all scholarly articles, and the largest annual total ever. That response is “bonkers,” says Vincent Larivière of the University of Montreal, who studies scholarly publishing. “Everyone had to have their COVID moment and write something.”…(More)”.

The Innovation Project: Can advanced data science methods be a game-changer for data sharing?


Report by JIPS (Joint Internal Displacement Profiling Service): “Much has changed in the humanitarian data landscape in the last decade and not primarily with the arrival of big data and artificial intelligence. Mostly, the changes are due to increased capacity and resources to collect more data quicker, leading to the professionalisation of information management as a domain of work. Larger amounts of data are becoming available in a more predictable way. We believe that as the field has progressed in filling critical data gaps, the problem is not the availability of data, but the curation and sharing of that data between actors as well as the use of that data to its full potential.

In 2018, JIPS embarked on an innovation journey to explore the potential of state-of-the-art technologies to incentivise data sharing and collaboration. This report covers the first phase of the innovation project and launches a series of articles in which we will share more about the innovation journey itself, discuss safe data sharing and collaboration, and look at the prototype we developed – made possible by the UNHCR Innovation Fund.

We argue that making data and insights safe and secure to share between stakeholders will allow for more efficient use of available data, reduce the resources needed to collect new data, strengthen collaboration and foster a culture of trust in the evidence-informed protection of people in displacement and crises.

The paper first defines the problem and outlines the processes through which data is currently shared among the humanitarian community. It explores questions such as: what are the existing data sharing methods and technologies? Which ones constitute a feasible option for humanitarian and development organisations? How can different actors share and collaborate on datasets without impairing confidentiality and exposing them to disclosure threats?…(More)”.

The “Onion Model”: A Layered Approach to Documenting How the Third Wave of Open Data Can Provide Societal Value


Blog post by Andrew Zahuranec, Andrew Young and Stefaan Verhulst: “There’s a lot that goes into data-driven decision-making. Behind the datasets, platforms, and analysts is a complex series of processes that inform what kinds of insight data can produce and what kinds of ends it can achieve. These individual processes can be hard to understand when viewed together but, by separating the stages out, we can not only track how data leads to decisions but also promote better and more impactful data management.

Earlier this year, The Open Data Policy Lab published the Third Wave of Open Data Toolkit to explore the elements of data re-use. At the center of this toolkit was an abstraction that we call the Open Data Framework. Divided into individual, onion-like layers, the framework shows all the processes that go into capitalizing on data in the third wave, starting with the creation of a dataset through data collaboration, creating insights, and using those insights to produce value.

This blog reiterates what’s included in each layer of this data “onion model” and demonstrates how organizations can create societal value by making their data available for re-use by other parties….(More)”.