How are Italian Companies Embracing Open Data?


Are companies embracing the use of open government data? How, why and what data is being leveraged? To answer these questions, the GovLab started a project three years ago, Open Data 500, to map and assess — in a comparative manner, across sectors and countries — the private sector’s use of open data to develop new products and services, and create social value.

Today we are launching Open Data 200 Italy, in partnership with Fondazione Bruno Kessler, which seeks to showcase the breadth and depth of companies using open data in Italy.

OD200 Italy is the first and only platform to map the use of open data by companies in Italy. 

Our findings show there is a growing ecosystem around open data in Italy that goes beyond traditional open data advocates. …

The OD200 Italy project shows the diversity of data being used, underscoring the need to keep the supply of open data broad and sustained.

“The merits and use of open data for businesses are often praised but not supported by evidence. OD200 Italy is a great contribution to the evidence base of who, how and why corporations are leveraging open data,” said Stefaan Verhulst, Co-Founder of The GovLab and Chief Research and Development Officer. “Policy makers, practitioners and researchers can leverage the data generated by this initiative to improve the supply and use of open data, or to generate new insights. As such, OD200 Italy is a new open data set on open data.”…(More)”.

From Katrina To Harvey: How Disaster Relief Is Evolving With Technology


Cale Guthrie Weissman at Fast Company: “Open data may sound like a nerdy thing, but this weekend has proven it’s also a lifesaver in more ways than one.

As Hurricane Harvey pelted the southern coast of Texas, a local open-data resource helped provide accurate and up-to-date information to the state’s residents. Inside Harris County’s intricate bayou system–intended to both collect water and effectively drain it–gauges were installed to sense when water is overflowing. The sensors transmit the data to a website, which has become a vital go-to for Houston residents….
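To make the pattern concrete, here is a minimal sketch of how a client might consume such an open gauge feed and flag channels nearing overflow; the endpoint, field names, and threshold are hypothetical rather than Harris County’s actual interface.

```python
# A minimal sketch of a client consuming an open flood-gauge feed and
# flagging channels near overflow. The endpoint and field names below are
# hypothetical; Harris County's actual system may publish data differently.
import json
from urllib.request import urlopen

GAUGE_FEED = "https://example.org/fws/api/gauges"  # hypothetical endpoint

def gauges_near_overflow(threshold_pct: float = 90.0) -> list[dict]:
    """Return gauges whose water level exceeds threshold_pct of bank height."""
    with urlopen(GAUGE_FEED, timeout=10) as resp:
        readings = json.load(resp)
    alerts = []
    for gauge in readings:
        pct_full = 100.0 * gauge["water_level_ft"] / gauge["bank_height_ft"]
        if pct_full >= threshold_pct:
            alerts.append({"site": gauge["site_name"], "pct_full": round(pct_full, 1)})
    return alerts

if __name__ == "__main__":
    for alert in gauges_near_overflow():
        print(f"{alert['site']}: {alert['pct_full']}% of bank height")
```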

This open access to flood gauges is just one of the many ways new tech-driven projects have helped improve responses to disasters over the years. “There’s no question that technology has played a much more significant role,” says Lemaitre, “since even Hurricane Sandy.”

While Sandy was noted in 2012 for its ability to connect people with Twitter hashtags and other relatively nascent social apps like Instagram, the last few years have brought a paradigm shift in terms of how emergency relief organizations integrate technology into their responses….

Social media isn’t just for the residents. Local and national agencies–including FEMA–rely on this information and are using it to help create faster and more effective disaster responses. Following Hurricane Katrina, FEMA has worked over the last decade to revamp its culture and methods for reacting to these sorts of situations. “You’re seeing the federal government adapt pretty quickly,” says Lemaitre.

There are a few examples of this. For instance, FEMA now has an app to push necessary information about disaster preparedness. The agency also employs people to cull the open web for information that would help make its efforts better and more effective. These “social listeners” look at all the available Facebook, Snapchat, and other social media posts in aggregate. Crews are brought on during disasters to gather intelligence, and then report about areas that need relief efforts–getting “the right information to the right people,” says Lemaitre.

There’s also been a change in how this information is used. Often, when disasters are predicted, people send supplies to the affected areas as a way to try to help out. Yet they don’t know exactly where they should send them, and local organizations sometimes become inundated. This creates a huge logistical nightmare for relief organizations that are sitting on thousands of blankets and tarps in one place when they should be actively dispersing them across hundreds of miles.

“Before, you would just have a deluge of things dropped on top of a disaster that weren’t particularly helpful at times,” says Lemaitre. Now people are using sites like Facebook to ask where they should direct the supplies. For example, after a bad flood in Louisiana last year, a woman announced she had food and other necessities on Facebook and was able to direct the supplies to an area in need. This, says Lemaitre, is “the most effective way.”

Put together, Lemaitre has seen agencies evolve with technology to help create better systems for quicker disaster relief. This has also created a culture of learning, updating, and reacting in real time. Meanwhile, more data is becoming open, which is helping people and agencies alike. (The National Weather Service, which has long trumpeted its open data for all, has become a revered stalwart for such information, and has already proven indispensable in Houston.)
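The National Weather Service does expose this open data through a public API at api.weather.gov; the sketch below pulls active alert headlines for a state, with the exact query parameters and response fields treated as assumptions that may change.

```python
# A minimal sketch of pulling National Weather Service open data. The NWS
# publishes a public API at api.weather.gov; treat the exact response
# fields used here as assumptions to verify against the API docs.
import json
from urllib.request import Request, urlopen

def active_alerts(state: str = "TX") -> list[str]:
    """Return headlines of active NWS alerts for the given state code."""
    req = Request(
        f"https://api.weather.gov/alerts/active?area={state}",
        # The NWS asks callers to identify themselves via User-Agent.
        headers={"User-Agent": "open-data-demo (contact@example.org)"},
    )
    with urlopen(req, timeout=10) as resp:
        payload = json.load(resp)
    return [feature["properties"]["headline"] for feature in payload["features"]]

if __name__ == "__main__":
    for headline in active_alerts("TX")[:5]:
        print(headline)
```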

Most important, the pace of technology has caused organizations to change their own procedures. Twelve years ago, during Katrina, the protocol was to wait for an assessment before deploying any assistance. Now organizations like FEMA know that just doesn’t work. “You can’t afford to lose time,” says Lemaitre. “Deploy as much as you can and be fast about it–you can always scale back.”

It’s important to note that, even with rapid technological improvements, there’s no way to compare one disaster response to another–it’s simply not apples to apples. All the same, organizations are still learning about where they should be looking and how to react, connecting people to their local communities when they need them most….(More)”.

Open & Shut


Harsha Devulapalli: “Welcome to Open & Shut — a new blog dedicated to exploring the opportunities and challenges of working with open data in closed societies around the world. Although we’ll be exploring questions relevant to open data practitioners worldwide, we’re particularly interested in seeing how civil society groups and actors in the Global South are using open data to push for greater government transparency, and tackle daunting social and economic challenges facing their societies….Throughout this series we’ll be profiling and interviewing organisations working with open data worldwide, and providing do-it-yourself data tutorials that will be useful for beginners as well as data experts. …

What do we mean by the terms ‘open data’ and ‘closed societies’?

It’s important to be clear about what we’re dealing with, here. So let’s establish some key terms. When we talk about ‘open data’, we mean data that anyone can access, use and share freely. And when we say ‘closed societies’, we’re referring to states or regions in which the political and social environment is actively hostile to notions of openness and public scrutiny, and which hold principles of freedom of information in low esteem. In closed societies, data is either not published at all by the government, or is published only in inaccessible formats, is incomplete, is hard to find, or is simply not digitised at all.

Iran is one such state that we would characterise as a ‘closed society’. At Small Media, we’ve had to confront the challenges of poor data practice, secrecy, and government opaqueness while undertaking work to support freedom of information and freedom of expression in the country. Based on these experiences, we’ve been working to build Iran Open Data — a civil society-led open data portal for Iran, in an effort to make Iranian government data more accessible and easier for researchers, journalists, and civil society actors to work with.

Iran Open Data — an open data portal for Iran, created by Small Media

…Open & Shut will shine a light on the exciting new ways that different groups are using data to question dominant narratives, transform public opinion, and bring about tangible change in closed societies. At the same time, it’ll demonstrate the challenges faced by open data advocates in opening up this valuable data. We intend to get the community talking about the need to build cross-border alliances in order to empower the open data movement, and to exchange knowledge and best practices despite the different needs and circumstances we all face….(More)

Ireland Opens E-Health Open Data Portal


Adi Gaskell at HuffPost: “… an open data portal has been launched by eHealth Ireland.  The portal aims to bring together some 300 different open data sources into one place, making it easier to find data from across the Irish Health Sector.

The portal includes data from a range of sources, including statistics on hospital day cases and inpatient cases, waiting list statistics and information around key new digital initiatives.

Open data

The resource features datasets from both the Department of Health and HealthLink, so the team believe that the data is of the highest quality, and also compliant with the Open Health Data Policy.  This ensures that the approach taken with the release of data is consistent and in accordance with national and international guidelines.

“I am delighted to welcome the launch of the eHealth Ireland Open Data Portal today. The aim of Open Data is twofold; on the one hand facilitating transparency of the Public Sector and on the other providing a valuable resource that can drive innovation. The availability of Open Data can empower citizens and support clinicians, care providers, and researchers make better decisions, spur new innovations and identify efficiencies while ensuring that personal data remains confidential,” Richard Corbridge, CIO at the Health Service Executive says.

Data from both HealthLink and the National Treatment Purchase Fund (NTPF) will be uploaded to the portal each month, with new datasets due to be added on a regular basis….
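For developers, portals like this are typically scriptable. Below is a minimal sketch of listing the available datasets, assuming the portal exposes a CKAN-style action API; the base URL and the API flavor are assumptions to verify against the portal’s own documentation.

```python
# A minimal sketch of programmatically listing datasets from an open data
# portal, assuming it exposes a CKAN-style action API. The base URL and
# API flavor are assumptions; check the portal's docs for the real interface.
import json
from urllib.request import urlopen

PORTAL = "https://data.ehealthireland.ie"  # assumed base URL

def list_datasets(base_url: str) -> list[str]:
    """Return the names of all published datasets via CKAN's package_list."""
    with urlopen(f"{base_url}/api/3/action/package_list", timeout=10) as resp:
        payload = json.load(resp)
    return payload["result"]

if __name__ == "__main__":
    names = list_datasets(PORTAL)
    print(f"{len(names)} datasets available")
```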

The project follows a number of clearly defined Open Health Data Principles that are designed to support the health service in the provision of better patient care and in the support of new innovations in the sector, all whilst ensuring that patient data is secured and governed appropriately…(More)”.

Africa’s open data revolution hampered by challenges


Gilbert Nakweya at SciDevNet: “According to the inaugural Africa Data Revolution Report (ADRR), collaboration among data communities on the Sustainable Development Goals (SDGs) and Africa’s Agenda 2063 is minimal or non-existent.

…The report cites issues such as legal and policy frameworks, infrastructure, technology and interactions among key actors as challenges confronting the data ecosystems of the ten African countries studied: Cote d’Ivoire, Ethiopia, Kenya, Madagascar, Nigeria, Rwanda, Senegal, South Africa, Swaziland and Tanzania.

The ADRR was jointly published by the Economic Commission for Africa, United Nations Development Programme (UNDP), World Wide Web Foundation and Open Data for Development Network (OD4D).

“Open data is Africa’s biggest challenge,” says Nnenna Nwakanma, a senior policy manager at the US-headquartered World Wide Web Foundation, noting that an open data revolution is key to Africa achieving the SDGs.

Nwakanma tells SciDev.Net that the data revolution is built on access to information, to the web, and to content, citing open data’s benefits such as governments functioning more efficiently, businesses innovating more and citizens participating in governance and demanding accountability.

Serge Kapto, a policy specialist on data from the UNDP, says that frameworks such as the African charter on statistics and the strategy for harmonisation of statistics in Africa adopted by the continent have laid the groundwork for an African data revolution…

Kapto adds that Africa is well positioned to reap the benefits of the data revolution for sustainable development and leapfrog technology to serve national and regional development priorities.

But, he explains, much work remains to be done to fully take advantage of the opportunity afforded by the data revolution for achieving development plans….(More)”

Building Digital Government Strategies


Book by Rodrigo Sandoval-Almazan et al: “This book provides key strategic principles and best practices to guide the design and implementation of digital government strategies. It offers a series of recommendations and findings for thinking about IT applications in government as a platform for information, services and collaboration, along with strategies for avoiding identified pitfalls. Digital government research suggests that information technologies have the potential to generate immense public value and transform the relationships between governments, citizens, businesses and other stakeholders. However, developing innovative and high impact solutions for citizens hinges on the development of strategic institutional, organizational and technical capabilities.

Thus far, particular characteristics and problems of public sector organizations have promoted the development of poorly integrated, difficult-to-maintain applications. For example, governments maintain separate applications for open data, transparency, and public services, leading to duplication of effort and a waste of resources. The costs associated with maintaining such sets of poorly integrated systems may limit the resources available for future projects and innovation.

This book provides best practices and recommendations, based on extensive research in both Mexico and the United States, on how governments can develop a digital government strategy for creating public value, how to finance digital innovation in the public sector, how to build successful collaboration networks and foster citizen engagement, and how to correctly implement open government projects and open data. It will be of interest to researchers, practitioners, students, and public sector IT professionals who work in the design and implementation of technology-based projects and programs….(More)”.

Rise of the Government Chatbot


Zack Quaintance at Government Technology: “A robot uprising has begun, except instead of overthrowing mankind so as to usher in a bleak yet efficient age of cold judgment and colder steel, this uprising is one of friendly robots (so far).

Which is all an alarming way to say that many state, county and municipal governments across the country have begun to deploy relatively simple chatbots, aimed at helping users get more out of online public services such as a city’s website, pothole reporting and open data. These chatbots have been installed in recent months in a diverse range of places including Kansas City, Mo.; North Charleston, S.C.; and Los Angeles — and by many indications, there is an accompanying wave of civic tech companies that are offering this tech to the public sector.

They range from simple to complex in scope, and most of the jurisdictions currently using them say they are doing so on somewhat of a trial or experimental basis. That’s certainly the case in Kansas City, where the city now has a Facebook chatbot to help users get more out of its open data portal.

“The idea was never to create a final chatbot that was super intelligent and amazing,” said Eric Roche, Kansas City’s chief data officer. “The idea was let’s put together a good effort, and put it out there and see if people find it interesting. If they use it, get some lessons learned and then figure out — either in our city, or with developers, or with people like me in other cities, other chief data officers and such — and talk about the future of this platform.”

Roche developed Kansas City’s chatbot earlier this year by working after hours with Code for Kansas City, the local Code for America brigade — and he did so because, in the four-plus years the city’s open data program has been active, there have been regular concerns that the info available through it was hard to navigate, search and use for average citizens who aren’t data scientists and don’t work for the city (a common issue currently being addressed by many jurisdictions). The idea behind the Facebook chatbot is that Roche can program it with a host of answers to the most prevalent questions, enabling it to both help interested users and save him time for other work….
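The core of such a bot can be very small. Below is a minimal sketch of the keyword-matching pattern described above, with canned answers keyed to common question terms; the questions, answers, and matching logic are illustrative assumptions, not Kansas City’s actual code.

```python
# A minimal sketch of a keyword-matching FAQ chatbot. The keywords and
# canned answers here are illustrative assumptions, not Kansas City's
# actual implementation.
FAQ = {
    ("crime", "police", "safety"): "Crime data lives in the Public Safety category of the portal.",
    ("pothole", "street", "311"): "Pothole reports are in the 311 service-request dataset.",
    ("budget", "spending", "finance"): "City spending is published under the Finance datasets.",
}
FALLBACK = "Sorry, I don't know that one yet. Try browsing the portal directly."

def answer(question: str) -> str:
    """Return the canned answer whose keywords best overlap the question."""
    words = set(question.lower().split())
    best_reply, best_hits = FALLBACK, 0
    for keywords, reply in FAQ.items():
        hits = len(words & set(keywords))
        if hits > best_hits:
            best_reply, best_hits = reply, hits
    return best_reply

if __name__ == "__main__":
    print(answer("Where can I find crime data?"))
```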

In North Charleston, S.C., the city has adopted a text-based chatbot, which goes beyond common 311-style interfaces by allowing users to report potholes or any other lapses in city services they may notice. It also allows them to ask questions, which it subsequently answers by crawling city websites and replying with relevant links, said Ryan Johnson, the city’s public relations coordinator.

North Charleston has done this by partnering with a local tech startup that has deep roots in the area’s local government. The company is called Citibot …

With Citibot, residents can report a pothole at 2 a.m., or they can get info about street signs or trash pickup sent right to their phones.

There are also more complex chatbot technologies taking hold at both the civic and state levels, in Los Angeles and Mississippi, to be exact.

Mississippi’s chatbot is called Missi, and its capabilities are vast and nuanced. Residents can even use it for help submitting online payments. It’s accessible by clicking a small chat icon on the side of the website.

Back in May, Los Angeles rolled out Chip, or City Hall Internet Personality, on the Los Angeles Business Assistance Virtual Network. The chatbot operates as a 24/7 digital assistant for visitors to the site, helping them navigate it and better understand its services by answering their inquiries. It is capable of presenting info from anywhere on the site, and it can even go so far as helping users fill out forms or set up email alerts….(More)”

How data can heal our oceans


Nishan Degnarain and Steve Adler at WEF: “We have collected more data on our oceans in the past two years than in the history of the planet.

There has been a proliferation of remote and near sensors above, on, and beneath the oceans. New low-cost micro satellites ring the earth and can record what happens below daily. Thousands of tidal buoys follow currents transmitting ocean temperature, salinity, acidity and current speed every minute. Undersea autonomous drones photograph and map the continental shelf and seabed, explore deep sea volcanic vents, and can help discover mineral and rare earth deposits.

The volume, diversity and frequency of data are increasing as the cost of sensors falls, new low-cost satellites are launched, and an emerging drone sector begins to offer new insights into our oceans. In addition, new processing capabilities are enhancing the value we receive from such data on the biological, physical and chemical properties of our oceans.

Yet it is not enough.

We need much more data at higher frequency, quality, and variety to understand our oceans to the degree we already understand the land. Less than 5% of the oceans are comprehensively monitored. We need more data collection capacity to unlock the sustainable development potential of the oceans and protect critical ecosystems.

More data from satellites will help identify illegal fishing activity, track plastic pollution, and detect whales and prevent vessel collisions. More data will help speed the placement of offshore wind and tide farms, improve vessel telematics, develop smart aquaculture, protect urban coastal zones, and enhance coastal tourism.

Unlocking the ocean data market

But we’re not there yet.

This new wave of data innovation is constrained by inadequate data supply, demand, and governance. The supply of existing ocean data is locked by paper records, old formats, proprietary archives, inadequate infrastructure, and scarce ocean data skills and capacity.

The market for ocean observation is driven by science and science isn’t adequately funded.

To unlock future commercial potential, new financing mechanisms are needed to create market demand that will stimulate greater investments in new ocean data collection, innovation and capacity.

Efforts such as the Financial Stability Board’s Task Force on Climate-related Financial Disclosures have gone some way to raise awareness and create demand for such ocean-related climate risk data.

Much of the data produced is collected by nations, universities and research organizations, NGOs, and the private sector, but only a small percentage is open data and widely available.

Data creates more value when it is widely utilized and well governed. Organizing to improve data infrastructure, quality, integrity, and availability is a prerequisite for achieving new ocean data-driven business models and markets. New Ocean Data Governance models, standards, platforms, and skills are urgently needed to stimulate new market demand for innovation and sustainable development….(More)”.

Let the People Know the Facts: Can Government Information Removed from the Internet Be Reclaimed?


Paper by Susan Nevelow Mart: “…examines the legal bases of the public’s right to access government information, reviews the types of information that have recently been removed from the Internet, and analyzes the rationales given for the removals. She suggests that the concerted use of the Freedom of Information Act by public interest groups and their constituents is a possible method of returning the information to the Internet….(More)”.

The hidden costs of open data


Sara Friedman at GCN: “As more local governments open their data for public use, the emphasis is often on “free” — using open source tools to freely share already-created government datasets, often with pro bono help from outside groups. But according to a new report, there are unforeseen costs when it comes to pushing government datasets out through public-facing platforms — especially when geospatial data is involved.

The research, led by University of Waterloo professor Peter A. Johnson and McGill University professor Renee Sieber, was conducted as part of the Geothink.ca partnership research grant and explores the direct and indirect costs of open data.

Costs related to data collection, publishing, data sharing, maintenance and updates are increasingly driving governments to third-party providers to help with hosting, standardization and analytical tools for data inspection, the researchers found. GIS implementation also has associated costs to train staff, develop standards, create valuations for geospatial data, connect data to various user communities and get feedback on challenges.

Due to these direct costs, some governments are more likely to avoid opening datasets that require complex assessment or anonymization techniques to address geospatial privacy concerns. Johnson and Sieber identified four areas where the benefits of open geospatial data can generate unexpected costs.

First, open data can create a “smoke and mirrors” situation where insufficient resources are put toward deploying open data for government use. Users then experience “transaction costs” when it comes to working in specialist data formats that need additional skills, training and software to use (a concrete example appears after the fourth point below).

Second, the level of investment and quality of open data can lead to “material benefits and social privilege” for communities that devote resources to providing more comprehensive platforms.

While there are some open source data platforms, the majority of solutions are proprietary and charged on a pro-rata basis, which can present a challenge for larger, poorer cities compared to smaller, wealthier ones. Issues also arise when governments try to combine their datasets, leading to increased costs to reconcile problems.

The third problem revolves around the private sector pushing for the release of datasets that can benefit their business objectives. Companies could push for the release of high-value sets, such as real-time transit data, to help with their product development goals. This can divert attention from low-value sets, such as those detailing municipal services or installations, that could have a bigger impact on residents “from a civil society perspective.”

If communities decide to release the low-value sets first, Johnson and Sieber think the focus can then be shifted to high-value sets that can help recoup the costs of developing the platforms.

Lastly, the report finds inadvertent consequences could result from tying open data resources to private-sector companies. Public-private open data partnerships could lead to infrastructure problems that prevent data from being widely shared, and help private companies in developing their bids for public services….
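As a concrete instance of the “transaction costs” flagged in the first point above, the sketch below converts a shapefile, a specialist GIS format, into web-friendly GeoJSON; the file names are hypothetical and the conversion relies on the third-party geopandas package.

```python
# A minimal sketch of one such "transaction cost": converting a shapefile
# (a specialist GIS format) into web-friendly GeoJSON. File names are
# hypothetical; requires the third-party geopandas package.
import geopandas as gpd

def shapefile_to_geojson(src: str, dst: str) -> int:
    """Convert a shapefile to GeoJSON and return the feature count."""
    gdf = gpd.read_file(src)            # reads the .shp and its sidecar files
    gdf.to_file(dst, driver="GeoJSON")  # writes a single portable file
    return len(gdf)

if __name__ == "__main__":
    count = shapefile_to_geojson("parcels.shp", "parcels.geojson")
    print(f"Converted {count} features")
```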

Johnson and Sieber encourage communities to ask the following questions before investing in open data:

  1. Who are the intended constituents for this open data?
  2. What is the purpose behind the structure for providing this data set?
  3. Does this data enable the intended users to meet their goals?
  4. How are privacy concerns addressed?
  5. Who sets the priorities for release and updates?…(More)”
