How are Italian Companies Embracing Open Data?


Are companies embracing the use of open government data? How, why, and what data is being leveraged? To answer these questions, the GovLab launched the Open Data 500 project three years ago to map and assess — in a comparative manner, across sectors and countries — the private sector’s use of open data to develop new products and services, and create social value.

Today we are launching Open Data 200 Italy, in partnership with Fondazione Bruno Kessler, which seeks to showcase the breadth and depth of companies using open data in Italy.

OD200 Italy is the first and only platform to map the use of open data by companies in Italy. 

Our findings show there is a growing ecosystem around open data in Italy that goes beyond traditional open data advocates. …

The OD200 Italy project shows the diversity of data being used, underscoring the need to keep the supply of open data broad and sustained.

“The merits and use of open data for businesses are often praised but not supported by evidence. OD200 Italy is a great contribution to the evidence base of who, how and why corporations are leveraging open data,” said Stefaan Verhulst, Co-Founder of The GovLab and Chief Research and Development Officer. “Policy makers, practitioners and researchers can leverage the data generated by this initiative to improve the supply and use of open data, or to generate new insights. As such, OD200 Italy is a new open data set on open data.”…(More)”.

The Death of Public Knowledge? How Free Markets Destroy the General Intellect


Book edited by Aeron Davis: “...argues for the value and importance of shared, publicly accessible knowledge, and suggests that the erosion of its most visible forms, including public service broadcasting, education, and the network of public libraries, has worrying outcomes for democracy.

With contributions from both activists and academics, this collection of short, sharp essays focuses on different aspects of public knowledge, from libraries and education to news media and public policy. Together, the contributors record the stresses and strains placed upon public knowledge by funding cuts and austerity, the new digital economy, quantification and target-setting, neoliberal politics, and inequality. These pressures, the authors contend, not only hinder democracies, but also undermine markets, economies, and social institutions and spaces everywhere.

Covering areas of international public concern, these polemical, accessible texts include reflections on the fate of schools and education, the takeover of public institutions by private interests, and the corruption of news and information in the financial sector. They cover the compromised Greek media during recent EU negotiations, the role played by media and political elites in the Irish property bubble, the compromising of government policy by corporate interests in the United States and Korea, and the squeeze on public service media in the United Kingdom, New Zealand, and the United States.

Individually and collectively, these pieces spell out the importance of maintaining public, shared knowledge in all its forms, and offer a rallying cry for doing so, asserting the need for strong public, financial, and regulatory support….(More)”

Intragovernmental Collaborations: Pipedreams or the Future of the Public Sector?


Sarah Worthing at the Stanford Social Innovation Review: “Despite the need for concerted, joint efforts among public sector leaders, those working with or in government know all too well that such collaborations are rare. The motivation and ability to collaborate in government are usually lacking. So how did these leaders—some with competing agendas—manage to do it?

A new tool for collaboration

Policy labs are units embedded within the public sector—“owned” by one or several ministries—that anchor systematic public sector innovation efforts by facilitating creative approaches to policymaking. Since the inception of the first labs over a decade ago, many innovation experts and academics have touted labs as the leading edge of public policy innovation. They can generate novel, citizen-centric, effective policies and service provisions because they include a wide range of governmental and, in many cases, non-governmental actors in tackling complex public policy issues like social inequality, mass migration, and terrorism. MindLab in Denmark, for example, brought together government decision makers from across five ministries in December 2007 to co-create policy strategies on tackling climate change while also propelling new business growth. The collaboration resulted in a range of business strategies for climate change that were adopted during the 2009 UN COP15 Summit in Copenhagen. Under normal circumstances, these government leaders often push conflicting agendas, compete over resources, and are highly risk-averse in undertaking intragovernmental partnerships—all “poison pills” for the kind of collaboration successful public sector innovation needs. However, policy labs like MindLab, Policy Lab UK, and almost 100 similar cross-governmental units are finding ways to overcome these barriers and drive public sector innovation.

Five ways policy labs facilitate successful intragovernmental collaboration

To examine how labs do this, we conducted a multiple-case analysis of policy labs in the European Union and United States.

1. Reducing potential future conflict through experiential on-boarding processes. Policy labs conduct extensive screening and induction activities to provide policymakers with both knowledge of and faith in the policy lab’s approach to policymaking. …

2. Utilization of spatial cues to flatten hierarchical and departmental differences. Policy labs strategically use non-traditional spatial elements such as moveable whiteboards, tactile and colorful prototyping materials, and sitting cubes, along with the absence of expected elements such as conference tables and chairs, to indicate that unconventional norms—non-hierarchical and relational norms—govern lab spaces….

3. Reframing policy issues to focus on affected citizens. Policy labs highlight individual citizens’ stories to help reconstruct policymakers’ perceptions toward a more common and human-centered understanding of a policy issue…

4. Politically neutral, process-focused facilitation. Lab practitioners employ design methods that can help bring together divided policymakers and break scripted behavior patterns. Many policy labs use variations of design thinking and foresight methods, with a focus on iterative prototyping and testing, stressing the need for skilled but politically neutral facilitation to work through points of conflict and reach consensus on solutions. …

5. Mitigating risk through policy lab branding….(More)”.

Systems Approaches to Public Sector Challenges


New Report by the OECD: “Complexity is a core feature of most policy issues today and in this context traditional analytical tools and problem-solving methods no longer work. This report, produced by the OECD Observatory of Public Sector Innovation, explores how systems approaches can be used in the public sector to solve complex or “wicked” problems. Consisting of three parts, the report discusses the need for systems thinking in the public sector; identifies tactics that can be employed by government agencies to work towards systems change; and provides an in-depth examination of how systems approaches have been applied in practice. Four cases of applied systems approaches are presented and analysed: preventing domestic violence (Iceland), protecting children (the Netherlands), regulating the sharing economy (Canada) and designing a policy framework to conduct experiments in government (Finland). The report highlights the need for a new approach to policy making that accounts for complexity and allows for new responses and more systemic change that deliver greater value, effectiveness and public satisfaction….(More)”.

Massive Ebola data site planned to combat outbreaks


Amy Maxmen at Nature: “More than 11,000 people died when Ebola tore through West Africa between 2014 and 2016, and yet clinicians still lack data that would enable them to reliably identify the disease when a person first walks into a clinic. To fill that gap and others before the next outbreak hits, researchers are developing a platform to organize and share Ebola data that have so far been scattered beyond reach.

The information system is coordinated by the Infectious Diseases Data Observatory (IDDO), an international research network based at the University of Oxford, UK, and is expected to launch by the end of the year. …

During the outbreak, for example, a widespread rumour claimed that the disease was an experiment conducted by the West, which led some people to resist going to clinics and helped Ebola to spread.

Merson and her collaborators want to avoid the kind of data fragmentation that hindered efforts to stop the outbreak in Liberia, Guinea and Sierra Leone. As the Ebola crisis was escalating in October 2014, she visited treatment units in the three countries to advise on research. Merson found tremendous variation in practices, which complicated attempts to merge and analyse the information. For instance, some record books listed lethargy and hiccups as symptoms, whereas others recorded fatigue but not hiccups.
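To make the fragmentation problem concrete, here is a minimal sketch of the kind of symptom harmonization a shared platform would need to perform. The field names and synonym table are hypothetical illustrations, not the IDDO’s actual schema:

```python
# Hypothetical illustration: mapping heterogeneously recorded symptoms onto
# one shared vocabulary so records from different clinics can be merged.
# The synonym table and field names are invented, not the IDDO's schema.
CANONICAL = {
    "lethargy": "fatigue",  # treat lethargy and fatigue as one concept
    "fatigue": "fatigue",
    "hiccups": "hiccups",
}

def harmonize(record: dict) -> dict:
    """Map one clinic's symptom list onto the shared vocabulary."""
    symptoms = {CANONICAL[s] for s in record["symptoms"] if s in CANONICAL}
    return {"patient_id": record["patient_id"], "symptoms": sorted(symptoms)}

# Two clinics recording the same presentation differently:
print(harmonize({"patient_id": "A-001", "symptoms": ["lethargy", "hiccups"]}))
print(harmonize({"patient_id": "B-017", "symptoms": ["fatigue"]}))
# -> {'patient_id': 'A-001', 'symptoms': ['fatigue', 'hiccups']}
# -> {'patient_id': 'B-017', 'symptoms': ['fatigue']}
```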

“People were just collecting what they could,” she recalls. Non-governmental organizations “were keeping their data private; academics take a year to get it out; and West Africa had set up surveillance but they were siloed from the international systems”, she says. …

In July 2015, the IDDO received pilot funds from the UK charity the Wellcome Trust to pool anonymized data from the medical records of people who contracted Ebola — and those who survived it — as well as data from clinical trials and public health projects during outbreaks in West Africa, Uganda and the Democratic Republic of Congo. The hope is that a researcher could search for data to help in diagnosing, treating and understanding the disease. The platform would also provide a home for new data as they emerge. A draft research agenda lists questions that the information might answer, such as how long the virus can survive outside the human body, and what factors are associated with psychological issues in those who survive Ebola.

One sensitive issue is deciding who will control the data. …It’s vital that these discussions happen now, in a period of relative calm, says Jeremy Farrar, director of the Wellcome Trust in London. When the virus emerges again, clinicians, scientists, and regulatory boards will need fast access to data so as not to repeat mistakes made last time. “We need to sit down and make sure we have a data platform in place so that we can respond to a new case of Ebola in hours and days, and not in months and years,” he says. “A great danger is that the world will move on and forget the horror of Ebola in West Africa.”…(More)”

Gaming for Infrastructure


Nilmini Rubin & Jennifer Hara  at the Stanford Social Innovation Review: “…the American Society of Civil Engineers (ASCE) estimates that the United States needs $4.56 trillion to keep its deteriorating infrastructure current but only has funding to cover less than half of necessary infrastructure spending—leaving the at least country $2.0 trillion short through the next decade. Globally, the picture is bleak as well: World Economic Forum estimates that the infrastructure gap is $1 trillion each year.

What can be done? Some argue that public-private partnerships (PPPs or P3s) are the answer. We agree that they can play an important role—if done well. In a PPP, a private party provides a public asset or service for a government entity, bears significant risk, and is paid on performance. The upside for governments and their citizens is that the private sector can be incentivized to deliver projects on time, within budget, and with reduced construction risk. The private sector can benefit by earning a steady stream of income from a long-term investment from a secure client. From the Grand Parkway Project in Texas to the Queen Alia International Airport in Jordan, PPPs have succeeded domestically and internationally.

The problem is that PPPs can be very hard to design and implement. And since they can involve commitments of millions or even billions of dollars, a PPP failure can be awful. For example, the Berlin Airport is a PPP that is six years behind schedule, and its cost overruns total roughly $3.8 billion to date.

In our experience, it can be useful for would-be partners to practice engaging in a PPP before they dive into a live project. At our organization, Tetra Tech’s Institute for Public-Private Partnerships, for example, we use an online and multiplayer game—the P3 Game—to help make PPPs work.

The game is played with 12 to 16 people who are divided into two teams: a Consortium and a Contracting Authority. In each of four rounds, players mimic the activities they would engage in during the course of a real PPP, and as in real life, they are confronted with unexpected events: the Consortium fails to comply with a routine road inspection: how should the Contracting Authority team respond? The cost of materials skyrockets: how should the Consortium team manage when it has a fixed-price contract?

Players from government ministries, legislatures, construction companies, financial institutions, and other entities get to swap roles and experience a PPP from different vantage points. They think through challenges and solve problems together—practicing, failing, learning, and growing—within the confines of the game and with no real-world cost.
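As a rough illustration of the round structure described above, the toy sketch below simulates two teams drawing unexpected events over four rounds. The event cards and flow are invented for illustration; the real P3 Game’s rules and content are Tetra Tech’s own:

```python
# Toy sketch of a four-round, two-team PPP simulation game.
# Event cards and flow are invented; they do not reproduce the real P3 Game.
import random

EVENTS = [
    "Consortium misses a routine road inspection",
    "Materials cost spikes under a fixed-price contract",
    "Key subcontractor withdraws mid-construction",
    "New regulation tightens performance standards",
]
TEAMS = ["Consortium", "Contracting Authority"]

def play_game(rounds: int = 4, seed: int = 42) -> None:
    rng = random.Random(seed)
    for rnd in range(1, rounds + 1):
        event = rng.choice(EVENTS)     # the unexpected event this round
        responder = rng.choice(TEAMS)  # which team must respond first
        print(f"Round {rnd}: {event} -> {responder} responds")

play_game()
```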

More than 1,000 people have participated to date, including representatives of the US Army Corps of Engineers, the World Bank, and Johns Hopkins University, using a variety of scenarios. PPP team members who work on part of the Schiphol-Amsterdam-Almere Project, a $5.6-billion road project in the Netherlands, played the game using their actual contract document….(More)”.

Can AI tools replace feds?


Derek B. Johnson at FCW: “The Heritage Foundation…is calling for increased reliance on automation and the potential creation of a “contractor cloud” offering streamlined access to private sector labor as part of its broader strategy for reorganizing the federal government.

Seeking to take advantage of a united Republican government and a president who has vowed to reform the civil service, the foundation drafted a pair of reports this year attempting to identify strategies for consolidating, merging or eliminating various federal agencies, programs and functions. Among those strategies is a proposal for the Office of Management and Budget to issue a report “examining existing government tasks performed by generously paid government employees that could be automated.”

Citing research on the potential impacts of automation on the United Kingdom’s civil service, the foundation’s authors estimated that similar efforts across the U.S. government could yield $23.9 billion in reduced personnel costs and shrink the federal workforce by 288,000 positions….

The Heritage report also called on the federal government to consider a “contracting cloud.” The idea would essentially be for a government version of TaskRabbit, where agencies could select from a pool of pre-approved individual contractors from the private sector who could be brought in for specialized or seasonal work without going through established contracts. Greszler said the idea came from speaking with subcontractors who complained about having to kick over a certain percentage of their payments to prime contractors even as they did all the work.

Right now the foundation is only calling for the government to examine the idea’s potential and how it would interact with existing contracting vehicles like the GSA schedule. Greszler emphasized that any pool of workers would need to be properly vetted to ensure they met federal standards and practices.

“There has to be guidelines or some type of checks, so you’re not having people come off the street and getting access to secure government data,” she said….(More)

Ireland Opens E-Health Open Data Portal


Adi Gaskell at HuffPost: “…an open data portal has been launched by eHealth Ireland. The portal aims to bring together some 300 different open data sources into one place, making it easier to find data from across the Irish health sector.

The portal includes data from a range of sources, including statistics on hospital day and inpatient cases, waiting list statistics and information around key new digital initiatives.

Open data

The resource features datasets from both the Department of Health and HealthLink, so the team believes the data is of the highest quality and compliant with the Open Health Data Policy. This ensures that the approach taken with the release of data is consistent and in accordance with national and international guidelines.

“I am delighted to welcome the launch of the eHealth Ireland Open Data Portal today. The aim of Open Data is twofold: on the one hand, facilitating transparency of the public sector; on the other, providing a valuable resource that can drive innovation. The availability of Open Data can empower citizens and support clinicians, care providers, and researchers to make better decisions, spur new innovations, and identify efficiencies, while ensuring that personal data remains confidential,” Richard Corbridge, CIO at the Health Service Executive, says.

Data from both HealthLink and the National Treatment Purchase Fund (NTPF) will be uploaded to the portal each month, with new datasets due to be added on a regular basis….
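For readers who want to explore such a portal programmatically, here is a minimal sketch of a dataset search. It assumes the portal exposes a standard CKAN-style Action API, as many open data portals do; the base URL is a placeholder, and whether eHealth Ireland’s portal follows this convention is an assumption, not confirmed here:

```python
# Minimal sketch of searching a CKAN-style open data portal.
# ASSUMPTIONS: the portal runs the standard CKAN Action API, and the
# base URL below is a placeholder, not the portal's confirmed address.
import requests

BASE = "https://data.example-ehealth.ie"  # placeholder URL

def search_datasets(query: str, rows: int = 5) -> list:
    """Return dataset records whose metadata matches `query`."""
    resp = requests.get(
        f"{BASE}/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    payload = resp.json()
    if not payload.get("success"):
        raise RuntimeError("CKAN API reported failure")
    return payload["result"]["results"]

for ds in search_datasets("waiting list"):
    print(ds["name"], "-", ds.get("title", ""))
```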

The project follows a number of clearly defined Open Health Data Principles that are designed to support the health service in the provision of better patient care and in the support of new innovations in the sector, all whilst ensuring that patient data is secured and governed appropriately…(More)”.

Journal tries crowdsourcing peer reviews, sees excellent results


Chris Lee at ArsTechnica: “Peer review is supposed to act as a sanity check on science. A few learned scientists take a look at your work, and if it withstands their objective and entirely neutral scrutiny, a journal will happily publish your work. As those links indicate, however, there are some issues with peer review as it is currently practiced. Recently, Benjamin List, a researcher and journal editor in Germany, and his graduate assistant, Denis Höfler, have come up with a genius idea for improving matters: something called selected crowd-sourced peer review….

My central point: peer review is burdensome and sometimes barely functional. So how do we improve it? The main way is to experiment with different approaches to the reviewing process, which many journals have tried, albeit with limited success. Post-publication peer review, when scientists look over papers after they’ve been published, is also an option but depends on community engagement.

But if your paper is uninteresting, no one will comment on it after it is published. Pre-publication peer review is the only moment where we can be certain that someone will read the paper.

So, List (an editor for Synlett) and Höfler recruited 100 referees. For their trial, they set up a forum-style commenting system that allowed referees to comment anonymously on submitted papers and on each other’s comments. To provide a comparison, the papers that went through this process also went through the traditional peer review process. The authors and editors compared the comments and (subjectively) evaluated the pros and cons. The 100-person crowd of researchers was deemed the more effective of the two.

The editors found that it took a bit more time to read and collate all the comments into a reviewers’ report. But the process as a whole was still faster, which the authors loved. Typically, it took the crowd just a few days to complete their review, which compares very nicely to the usual four to six weeks of the traditional route (I’ve had papers languish for six months in peer review). And, perhaps most important, the responses were more substantive and useful compared to the typical two-to-four-person review.

So far, List has not published the trial results formally. Despite that, Synlett is moving to the new system for all its papers.

Why does crowdsourcing work?

Here we get back to something more editorial. I’d suggest that there is a physical analog to traditional peer review, called noise. Noise is not just a constant background that must be overcome. Noise is also generated by the very process that creates a signal. The difference is how the amplitude of noise grows compared to the amplitude of signal. For very low-amplitude signals, all you measure is noise, while for very high-intensity signals, the noise is vanishingly small compared to the signal, even though it’s huge compared to the noise of the low-amplitude signal.

Our esteemed peers, I would argue, are somewhat random in their response, but weighted toward objectivity. Using this inappropriate physics model, a review conducted by four reviewers can be expected (on average) to contain two responses that are, basically, noise. By contrast, a review by 100 reviewers may only have 10 responses that are noise. Overall, a substantial improvement. So, adding the responses of a large number of peers together should produce a better picture of a scientific paper’s strengths and weaknesses.
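The averaging intuition can be made concrete with a toy simulation (my illustration, not data from the trial): if each reviewer reports the paper’s true quality plus independent noise, the error of the averaged score shrinks roughly as one over the square root of the number of reviewers.

```python
# Toy simulation of the signal-to-noise argument: each reviewer's score is
# the paper's true quality plus independent Gaussian noise, so averaging N
# reviewers shrinks the expected error roughly as 1/sqrt(N).
# Numbers are illustrative, not data from the Synlett trial.
import random
import statistics

def crowd_score(true_quality: float, n_reviewers: int, noise_sd: float = 1.0) -> float:
    """Average score from n unbiased but noisy reviewers."""
    return statistics.mean(
        random.gauss(true_quality, noise_sd) for _ in range(n_reviewers)
    )

TRUE_QUALITY, TRIALS = 7.0, 10_000
for n in (4, 100):
    errors = [abs(crowd_score(TRUE_QUALITY, n) - TRUE_QUALITY) for _ in range(TRIALS)]
    print(f"{n:3d} reviewers: mean absolute error ~ {statistics.mean(errors):.3f}")
```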

Didn’t I just say that reviewers are overloaded? Doesn’t it seem that this will make the problem worse?

Well, no, as it turns out. When this approach was tested (with consent) on papers submitted to Synlett, it was discovered that review times went way down—from weeks to days. And authors reported getting more useful comments from their reviewers….(More)”.

Community Digital Storytelling for Collective Intelligence: towards a Storytelling Cycle of Trust


Sarah Copeland and Aldo de Moor in AI & SOCIETY: “Digital storytelling has become a popular method for curating community, organisational, and individual narratives. Since its beginnings over 20 years ago, projects have sprung up across the globe, where authentic voice is found in the narration of lived experiences. Contributing to a Collective Intelligence for the Common Good, the authors of this paper ask how shared stories can bring impetus to community groups to help identify what they seek to change, and how digital storytelling can be effectively implemented in community partnership projects to enable authentic voices to be carried to other stakeholders in society. The Community Digital Storytelling (CDST) method is introduced as a means for addressing community-of-place issues. There are five stages to this method: preparation, story telling, story digitisation, digital story sense-making, and digital story sharing. Additionally, a Storytelling Cycle of Trust framework is proposed. We identify four trust dimensions as being imperative foundations in implementing community digital media interventions for the common good: legitimacy, authenticity, synergy, and commons. This framework is concerned with increasing the impact that everyday stories can have on society; it is an engine driving prolonged storytelling. From this perspective, we consider the ability to scale up the scope and benefit of stories in civic contexts. To illustrate this framework, we use experiences from the CDST workshop in northern Britain and compare this with a social innovation project in the southern Netherlands….(More)”.