EU negotiators agree on new rules for sharing of public sector data


European Commission Press Release: “Negotiators from the European Parliament, the Council of the EU and the Commission have reached an agreement on a revised directive that will facilitate the availability and re-use of public sector data.

Data is the fuel that drives the growth of many digital products and services. Making sure that high-quality, high-value data from publicly funded services is widely and freely available is a key factor in accelerating European innovation in highly competitive fields, such as artificial intelligence, that require access to vast amounts of high-quality data.

In full compliance with the EU General Data Protection Regulation, the new Directive on Open Data and Public Sector Information (PSI) updates the framework setting out the conditions under which public sector data – which can be anything from anonymised personal data on household energy use to general information about national education or literacy levels – should be made available for re-use, with a particular focus on the increasing amounts of high-value data now available.

Vice-President for the Digital Single Market Andrus Ansip said: “Data is increasingly the lifeblood of today’s economy and unlocking the potential of public open data can bring significant economic benefits. The total direct economic value of public sector information and data from public undertakings is expected to increase from €52 billion in 2018 to €194 billion by 2030. With these new rules in place, we will ensure that we can make the most of this growth.”

Commissioner for Digital Economy and Society Mariya Gabriel said: “Public sector information has already been paid for by the taxpayer. Making it more open for re-use benefits the European data economy by enabling new innovative products and services, for example based on artificial intelligence technologies. But beyond the economy, open data from the public sector is also important for our democracy and society because it increases transparency and supports a facts-based public debate.”

As part of the EU Open Data policy, rules are in place to encourage Member States to facilitate the re-use of data from the public sector with minimal or no legal, technical and financial constraints. But the digital world has changed dramatically since they were first introduced in 2003.

What do the new rules cover?

  • All public sector content that can be accessed under national access to documents rules is in principle freely available for re-use. Public sector bodies will not be able to charge more than the marginal cost for the re-use of their data, except in very limited cases. This will allow more SMEs and start-ups to enter new markets in providing data-based products and services.
  • A particular focus will be placed on high-value datasets such as statistics or geospatial data. These datasets have a high commercial potential, and can speed up the emergence of a wide variety of value-added information products and services.
  • Public service companies in the transport and utilities sectors generate valuable data. Whether their data has to be made available is governed by different national or European rules, but when their data is available for re-use, they will now be covered by the Open Data and Public Sector Information Directive. This means they will have to comply with the principles of the Directive and ensure the use of appropriate data formats and dissemination methods, while still being able to set reasonable charges to recover related costs.
  • Some public bodies strike complex data deals with private companies, which can potentially lead to public sector information being ‘locked in’. Safeguards will therefore be put in place to reinforce transparency and to limit the conclusion of agreements which could lead to exclusive re-use of public sector data by private partners.
  • More real-time data, available via Application Programming Interfaces (APIs), will allow companies, especially start-ups, to develop innovative products and services, e.g. mobility apps. Publicly-funded research data is also being brought into the scope of the directive: Member States will be required to develop policies for open access to publicly funded research data while harmonised rules on re-use will be applied to all publicly-funded research data which is made accessible via repositories….(More)”.
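The real-time data the Directive envisages is typically exposed through APIs as JSON. A minimal sketch of what consuming such a feed looks like, assuming a hypothetical transit open-data endpoint whose payload shape is invented here for illustration (a real feed would be fetched over HTTP rather than hard-coded):

```python
import json

# Hypothetical payload from an open-data API; the structure is invented for
# illustration. A real endpoint would be fetched with urllib.request or similar.
sample_response = """
{
  "vehicles": [
    {"id": "bus-12", "lat": 50.8466, "lon": 4.3528, "delay_s": 120},
    {"id": "bus-34", "lat": 50.8503, "lon": 4.3517, "delay_s": 0}
  ]
}
"""

def delayed_vehicles(payload: str, threshold_s: int = 60) -> list:
    """Return the ids of vehicles whose reported delay exceeds the threshold."""
    data = json.loads(payload)
    return [v["id"] for v in data["vehicles"] if v["delay_s"] > threshold_s]

print(delayed_vehicles(sample_response))  # expect: ['bus-12']
```

A mobility app of the kind the press release mentions would poll such an endpoint and re-run this filtering on every refresh, which is why machine-readable formats and stable APIs matter more than one-off data dumps.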

Does good governance foster trust in government? A panel data analysis


Paper by Jonathan Spiteri and Marie Briguglio: “This study examines the relationship between good governance and trust in government. It sets out to test which aspects of good governance, if any, foster strong trust in government. We construct a panel data set drawn from 29 European countries over the period 2004 to 2015. The data set includes measures of government trust, six different dimensions of good governance, as well as variables on GDP growth and income inequality.

We find freedom of expression and citizen involvement in the democratic process to be the good governance dimension that has the strongest relationship with government trust, across all specifications of our regression models. We also find that real GDP growth rates have a significant (albeit weaker) relationship with trust in government. Our results suggest that certain elements of good governance foster trust in government over and above that generated by economic success. We discuss the implications of these findings in light of declining levels of public trust in government around the world….(More)”.
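The panel structure of the data is what lets the authors separate governance effects from fixed country differences. The standard tool here is a within (fixed-effects) estimator: demean each variable by country, then regress. A minimal sketch on invented numbers (the data below are synthetic, constructed so that trust rises by 2 points per unit of governance on top of a country-specific baseline; this is not the paper's actual specification):

```python
from collections import defaultdict

# Synthetic panel of (country, governance score, trust score) observations.
# Invented numbers: trust = 2 * governance + a country-specific intercept.
panel = [
    ("AT", 1.0, 5.0), ("AT", 2.0, 7.0), ("AT", 3.0, 9.0),
    ("BE", 1.0, 4.0), ("BE", 2.0, 6.0), ("BE", 3.0, 8.0),
]

def within_estimator(rows):
    """Fixed-effects (within) slope: demean x and y per country, then OLS."""
    groups = defaultdict(list)
    for unit, x, y in rows:
        groups[unit].append((x, y))
    num = den = 0.0
    for obs in groups.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            num += (x - mx) * (y - my)
            den += (x - mx) ** 2
    return num / den

print(within_estimator(panel))  # expect: 2.0
```

Because the demeaning absorbs each country's intercept, the recovered slope reflects only within-country variation over time, which is the sense in which panel estimates control for stable national characteristics.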

Info We Trust: How to Inspire the World with Data


Book by R.J. Andrews: “How do we create new ways of looking at the world? Join award-winning data storyteller RJ Andrews as he pushes beyond the usual how-to, and takes you on an adventure into the rich art of informing.

Creating Info We Trust is a craft that puts the world into forms that are strong and true. It begins with maps, diagrams, and charts — but must push further than dry defaults to be truly effective. How do we attract attention? How can we offer audiences valuable experiences worth their time? How can we help people access complexity?

Dark and mysterious, but full of potential, data is the raw material from which new understanding can emerge. Become a hero of the information age as you learn how to dip into the chaos of data and emerge with new understanding that can entertain, improve, and inspire. Whether you call the craft data storytelling, data visualization, data journalism, dashboard design, or infographic creation — what matters is that you are courageously confronting the chaos of it all in order to improve how people see the world. Info We Trust is written for everyone who straddles the domains of data and people: data visualization professionals, analysts, and all who are enthusiastic for seeing the world in new ways.

This book draws from the entirety of human experience, quantitative and poetic. It teaches advanced techniques, such as visual metaphor and data transformations, in order to create more human presentations of data. It also shows how we can learn from print advertising, engineering, museum curation, and mythology archetypes. This human-centered approach works with machines to design information for people. Advance your understanding by learning from a broad tradition of putting things “in formation” to create new and wonderful ways of opening our eyes to the world….(More)”.

Artificial Unintelligence: How Computers Misunderstand the World


Book by Meredith Broussard where she “…argues that our collective enthusiasm for applying computer technology to every aspect of life has resulted in a tremendous number of poorly designed systems. We are so eager to do everything digitally—hiring, driving, paying bills, even choosing romantic partners—that we have stopped demanding that our technology actually work. Broussard, a software developer and journalist, reminds us that there are fundamental limits to what we can (and should) do with technology. With this book, she offers a guide to understanding the inner workings and outer limits of technology—and issues a warning that we should never assume that computers always get things right.

Making a case against technochauvinism—the belief that technology is always the solution—Broussard argues that it’s just not true that social problems would inevitably retreat before a digitally enabled Utopia. To prove her point, she undertakes a series of adventures in computer programming. She goes for an alarming ride in a driverless car, concluding “the cyborg future is not coming any time soon”; uses artificial intelligence to investigate why students can’t pass standardized tests; deploys machine learning to predict which passengers survived the Titanic disaster; and attempts to repair the U.S. campaign finance system by building AI software. If we understand the limits of what we can do with technology, Broussard tells us, we can make better choices about what we should do with it to make the world better for everyone….(More)”.
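Broussard's Titanic exercise is a staple introduction to machine learning: fit a classifier on passenger features and see what it learns (famously, that sex and ticket class dominate). A minimal hedged sketch of that idea, using a hand-rolled logistic regression on an invented miniature sample rather than the real Kaggle dataset or Broussard's own code:

```python
import math

# Toy rows loosely echoing Titanic features: (is_female, passenger_class, survived).
# The sample is invented for illustration, not the actual passenger manifest.
rows = [
    (1, 1, 1), (1, 2, 1), (1, 3, 1), (1, 3, 0),
    (0, 1, 1), (0, 2, 0), (0, 3, 0), (0, 3, 0),
]

def train_logistic(data, lr=0.5, epochs=2000):
    """Fit w.x + b by plain stochastic gradient descent on the log-loss."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x1, x2, y in data:
            p = 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(w, b, x1, x2):
    """Predicted survival probability for a passenger."""
    return 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))

w, b = train_logistic(rows)
# The model learns the pattern in the toy data: a first-class woman is scored
# far more likely to survive than a third-class man.
print(predict(w, b, 1, 1) > predict(w, b, 0, 3))  # expect: True
```

The point Broussard draws from exercises like this is sharper than the code: the model can only replay the regularities of 1912, which is exactly the limitation she highlights when such tools are pointed at the present.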

AI is sending people to jail—and getting it wrong


Karen Hao at MIT Technology Review: “Using historical data to train risk assessment tools could mean that machines are copying the mistakes of the past. …

AI might not seem to have a huge personal impact if your most frequent brush with machine-learning algorithms is through Facebook’s news feed or Google’s search rankings. But at the Data for Black Lives conference last weekend, technologists, legal experts, and community activists snapped things into perspective with a discussion of America’s criminal justice system. There, an algorithm can determine the trajectory of your life.

The US imprisons more people than any other country in the world. At the end of 2016, nearly 2.2 million adults were being held in prisons or jails, and an additional 4.5 million were in other correctional facilities. Put another way, 1 in 38 adult Americans was under some form of correctional supervision.

The nightmarishness of this situation is one of the few issues that unite politicians on both sides of the aisle. Under immense pressure to reduce prison numbers without risking a rise in crime, courtrooms across the US have turned to automated tools in attempts to shuffle defendants through the legal system as efficiently and safely as possible. This is where the AI part of our story begins….(More)”.

This Startup Is Challenging Google Maps—and It Needs You


Aarian Marshall at Wired: “A whole lifetime in New York City, and Christiana Ting didn’t realize just how many urgent care facilities there were until the app told her to start looking for them. “They were giving extra points for medical offices, and I found them, I think, on every block,” she says. “I’m not sure what that says about the neighborhood where I work.”

Ting was one of 761 New Yorkers who downloaded, played with, and occasionally became obsessed with an app called MapNYC this fall, vying for their share of an 8-bitcoin prize (worth about $50,000 at the time). The month-long contest, run by a new mapping startup called StreetCred, was really an experiment. StreetCred’s main research question: How do you convince regular people to build and verify mapping data?

It turns out that the maps that guide you to the nearest Arby’s, or help your Lyft driver find your house, don’t just materialize. “I took mapping for granted until I started the competition,” Ting says, even though she pulls up Google Maps at least twice a day. “But it’s such an inconvenience if the info on the map is wrong, especially in a place like New York, that’s changing all the time.”

For regular folk, detailed, reliable mapping info is helpful. For businesses, it can be crucial. Some want to be found when a map user searches for the nearest sandwich shop. Others use products that rely on base maps—think Uber, the Weather Channel, your car’s navigation system—and require up-to-date location data. “One of the huge challenges to any geographic database is its currency,” says Renee Sieber, a geographer who studies participatory mapping at McGill University. That is to say, yesterday’s map is no good to anybody doing business today.

StreetCred sees that as an opportunity. “There’s a lot of companies, none of whom I can name, who have location data, and that data needs improvement,” says Randy Meech, CEO of the small startup. (Meech’s last open-source mapping company, a Samsung subsidiary called Mapzen, shut down in January.) Maybe a client found a data set online or purchased one from another company. Either way, it’s static, and that means it’s only a matter of time before it fails to represent reality.

Google Maps, the giant in this space, has created its extensive database through years of web scraping, Street View roaming, purchasing and collecting satellite data, and both paying and asking volunteers to verify that the businesses it identifies are still in the same place. But the company doesn’t provide all of its specific location or “point of interest” data to developers—where that Thai restaurant is, or where the hiking trail starts, or where the hospital parking lot is located. When it and other mapping services like HERE Technologies, TomTom, and Foursquare do offer that intel, it can be pricey. StreetCred wants to make that info free for customers who don’t need that much data and cheaper for those that do….(More)”.

Machine Learning and the Rule of Law


Paper by Daniel L. Chen: “Predictive judicial analytics holds the promise of increasing the fairness of law. Much empirical work observes inconsistencies in judicial behavior. By predicting judicial decisions—with more or less accuracy depending on judicial attributes or case characteristics—machine learning offers an approach to detecting when judges are most likely to allow extralegal biases to influence their decision making. In particular, low predictive accuracy may identify cases of judicial “indifference,” where case characteristics (interacting with judicial attributes) do not strongly dispose a judge in favor of one or another outcome. In such cases, biases may hold greater sway, implicating the fairness of the legal system….(More)”
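The paper's core diagnostic is simple to operationalize: once a model emits an outcome probability per case, the cases where that probability sits near 0.5 are the candidates for the "indifference" zone. A minimal sketch, with invented case ids and probabilities standing in for a real model's output (this is an illustration of the idea, not Chen's implementation):

```python
# Hypothetical per-case predicted probabilities of a given ruling (invented
# numbers). Cases where the model is near 0.5 are candidates for the
# "judicial indifference" zone the paper describes.
predictions = {"case-101": 0.93, "case-102": 0.51, "case-103": 0.08, "case-104": 0.47}

def indifference_zone(probs, margin=0.1):
    """Return case ids whose predicted probability lies within `margin` of 0.5."""
    return sorted(c for c, p in probs.items() if abs(p - 0.5) <= margin)

print(indifference_zone(predictions))  # expect: ['case-102', 'case-104']
```

Flagged cases are where, on the paper's argument, extralegal factors have the most room to tip the outcome, and so where scrutiny (or procedural safeguards) would be best targeted.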

Outcomes of open government: Does an online platform improve citizens’ perception of local government?


Paper by Lisa Schmidthuber et al: “Governments all over the world have implemented citizensourcing initiatives to integrate citizens into decision-making processes. A more participative decision-making process is associated with an open government and assumed to benefit public service quality and interactive value creation. The purpose of this paper is to highlight the outcomes of open government initiatives and ask to what extent open government participation is related to perceived outcomes of open government....

Data collected from a survey among users of a citizensourcing platform, combined with platform data, are used to perform non-parametric analyses and examine the relationship between platform participation and perceived outcomes of open government....

The findings of this paper suggest that active platform usage positively relates to several outcomes perceived by citizens, such as improved information flow, increased trust in and satisfaction with local government. In contrast, repetitive participation does not significantly relate to users’ outcome evaluation….(More)”.
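A typical non-parametric comparison of this kind is the Mann-Whitney U test: compare outcome ratings of active versus passive users without assuming normality. A minimal sketch of the U statistic on invented ratings (synthetic numbers for illustration; the paper's actual tests and data are not reproduced here):

```python
# Synthetic outcome ratings (e.g. trust in local government on a 1-7 scale)
# for active and passive platform users — invented numbers for illustration.
active = [6, 7, 5, 6, 7]
passive = [4, 5, 3, 4, 6]

def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample `a`: pairwise wins, ties count 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

u = mann_whitney_u(active, passive)
# Under no group difference, U would sit near len(active)*len(passive)/2 = 12.5;
# a value well above that indicates active users report higher ratings.
print(u)  # expect: 22.5
```

In practice one would use a library routine (which also supplies the p-value), but the statistic itself is just this count of pairwise comparisons, which is why it tolerates ordinal survey scales.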

The Digitalization of Public Diplomacy


Book by Ilan Manor: “This book addresses how digitalization has influenced the institutions, practitioners and audiences of diplomacy. Throughout, the author argues that terms such as ‘digitalized public diplomacy’ or ‘digital public diplomacy’ are misleading, as they suggest that Ministries of Foreign Affairs (MFAs) are either digital or non-digital, when in fact digitalization should be conceptualized as a long-term process in which the values, norms, working procedures and goals of public diplomacy are challenged and re-defined. Subsequently, through case study examination, this book also argues that different MFAs are at different stages of the digitalization process. By adopting the term ‘the digitalization of public diplomacy’, this book will offer a new conceptual framework for investigating the impact of digitalization on the practice of public diplomacy….(More)”.

Can I Trust the Data I See? A Physician’s Concern on Medical Data in IoT Health Architectures


Conference Paper by Fariha Tasmin Jaigirdar, Carsten Rudolph, and Chris Bain: “With the increasing advancement of Internet of Things (IoT) enabled systems, smart medical devices open numerous opportunities for the healthcare sector. The success of using such devices in the healthcare industry depends strongly on secured and reliable medical data transmission. Physicians base their diagnoses on that data and prescribe medicines and/or give guidelines/instructions/treatment plans for the patients. Therefore, a physician is always concerned about the trustworthiness of medical data, because if it is not guaranteed, a savior can become an involuntary foe! This paper analyses two different scenarios to understand the real-life consequences in IoT-based healthcare (IoT-Health) applications. Appropriate sequence diagrams for both scenarios show data movement as a basis for determining necessary security requirements in each layer of IoT-Health.

We analyse the individual entities of the overall system and develop a system-wide view of trust in IoT-Health. The security analysis pinpoints the research gap in end-to-end trust and indicates the necessity to treat the whole IoT-Health system as an integrated entity. This study highlights the importance of integrated cross-layer security solutions that can deal with the heterogeneous security architectures of IoT healthcare system and finally identifies a possible solution for the open question raised in the security analysis with appropriate future research directions….(More)”.