Stefaan Verhulst

Press Release: “Capgemini Consulting, the global strategy and transformation consulting arm of the Capgemini Group, today published two new reports on the state of play of Open Data in Europe, to mark the launch of the European Data Portal. The first report, “Open Data Maturity in Europe 2015: Insights into the European state of play,” assesses Open Data maturity across countries, while the second, “Creating Value through Open Data: Study on the Impact of Re-use of Public Data Resources,” focuses on its economic value. The countries covered by these assessments are the EU28 countries plus Iceland, Liechtenstein, Norway, and Switzerland – commonly referred to as the EU28+ countries. The reports were requested by the European Commission within the framework of the Connecting Europe Facility program, which supports the deployment of European Open Data infrastructure.

Open Data refers to information collected, produced or paid for by public bodies that can be freely used, modified and shared by anyone. For the period 2016-2020, the direct market size of Open Data for Europe is estimated at EUR 325 billion. Capgemini’s study “Creating Value through Open Data” illustrates how Open Data can create economic value in multiple ways, ranging from increased market transactions and job creation from producing services and products based on Open Data, to cost savings and efficiency gains. For instance, effective use of Open Data could help save 629 million hours of unnecessary waiting time on the roads in the EU and help reduce energy consumption by 16%. The accumulated cost savings for public administrations making use of Open Data across the EU28+ in 2020 are predicted to reach EUR 1.7 billion. Reaping these benefits requires reaching a high level of Open Data maturity.

In order to improve the accessibility and the value of Open Data across European countries, the European Union has launched the Beta version of the European Data Portal. The Portal addresses the whole Data Value Chain, from data publishing to data re-use. Over 240,000 data sets from 34 European countries are referenced on the Portal. It offers seamless access to public data across Europe, with 13 content categories to organize data, ranging from health and education to transport, science and justice. Anyone, from citizens and businesses to journalists and administrations, can search, access and re-use the full data collection. A wide range of data is available, from crime records in Helsinki and labor mobility in the Netherlands to forestry maps in France and the impact of digitization in Poland…

The study, “Open Data Maturity in Europe 2015: Insights into the European state of play”, uses two key indicators: Open Data Readiness and Portal Maturity. These indicators cover both the maturity of national policies supporting Open Data and an assessment of the features made available on national data portals. The study shows that the EU28+ countries have completed just 44% of the journey towards achieving full Open Data Maturity, and there are large discrepancies across countries. A third of the European countries (32%) are leading the way with solid policies, licensing norms, good portal traffic and many local initiatives and events to promote Open Data and its re-use….(More)”
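For developers, portals of this kind are also reachable programmatically. The sketch below shows what a dataset search might look like against a CKAN-style `package_search` endpoint, which many open-data portals expose; the base URL and API path here are illustrative assumptions, not the European Data Portal’s confirmed interface.

```python
# Minimal sketch of searching an open-data portal programmatically.
# Assumes a CKAN-style `package_search` endpoint, as many open-data
# portals provide; the base URL and exact path are illustrative only.
import requests

BASE_URL = "https://www.europeandataportal.eu/data"  # assumed API root

resp = requests.get(
    f"{BASE_URL}/api/3/action/package_search",
    params={"q": "transport", "rows": 5},  # free-text query, first 5 results
)
resp.raise_for_status()
result = resp.json()["result"]

print(f"{result['count']} matching datasets")
for dataset in result["results"]:
    print("-", dataset["title"])
```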

Creating Value through Open Data

Australian government: “Earlier in 2015, Michael Thawley, Secretary of the Department of the Prime Minister and Cabinet (PM&C), commissioned an in-house study into how public sector data can be better used to achieve efficiencies for government, enable better service delivery, and be appropriately used by the private sector to stimulate economic activity…

There are four commonly used classifications of data: personal data, research data, open data and security data. Each type of data is used for different purposes and requires a different set of considerations, as the graphic below illustrates. The project focused on how the Australian Public Service manages its research data and open data, while ensuring personal data was kept appropriately secured. Security data was beyond the scope of this project.

[Graphic: 4 different types of data and their different purposes]

The project found that there are pockets of excellence across the Australian Public Service, with some agencies actively working on projects that focus on a richer analysis of linked data. However, this approach is fragmented and is subject to a number of barriers, both perceived and real. These include cultural and legislative barriers, and a data analytics skills and capability shortage across the Australian Public Service.

To overcome these barriers, the project established a roadmap to make better use of public data, comprising an initial period to build confidence and momentum across the APS, and a longer term set of initiatives to systematise the use, publishing and sharing of public data.

The report is available from the link below.

Public Sector Data Management Project

Paper by Gherghina, Sergiu and Miscoiu, Sergiu: “Constitutional reform is a tedious process that requires long periods of time, a relatively broad consensus among political actors, and often popular approval. In spite of this, Romania changed its constitution once (2003) and witnessed several unsuccessful revisions. The most recent attempt, in 2013, introduced a deliberative dimension in the form of a constitutional forum. This article investigates the legitimacy of this deliberative practice using a tri-dimensional approach: input, throughput, and output legitimacy. Our qualitative study, relying on direct observation and secondary data analysis, concludes that while input and throughput legitimacy were achieved to a great extent, output legitimacy was low….(More)”

Crowd Sourced Legislation and Politics: The Legitimacy of Constitutional Deliberation in Romania

Throughout our conversations with other countries, DFAT’s innovationXchange is seeing that many of us are looking at the issue of innovation. …There is a great deal of interest in exploring how we can share information across borders, how we use that information to trigger new ideas, and how we leverage the skills and knowledge of others to achieve better outcomes. Innovation is fast becoming a common objective, something we all aim to embed in our respective organisations, but which we know we cannot do alone. The problems we seek to solve are global, and a collaborative, innovative approach is needed to solve them….

This makes me think: is innovation the new diplomatic tool on which we can base new or enhanced relationships? Can we use the shared goal of doing things better and more cost-effectively, harnessing the knowledge and capital that sit outside governments, not only to have a better impact but to bring countries closer together in a collaborative partnership? Could these collaborative partnerships even contribute to increased regional stability?

Innovation is fuelled by collaboration – taking an idea, sharing it with others, using their knowledge and creativity to improve it, building on it, testing it, adapting and testing again. This collaborative process aligns very well with the intent behind diplomacy – the act of a state seeking to achieve its aims, in relation to those of others, through dialogue and negotiation.

This is already happening to some extent with like-minded countries, such as the UK and US. But innovation is about risk taking, trying new things and stepping outside of the familiar and comfortable. The emergence of new groupings, like MIKTA, and the increasing engagement of the private sector in partnering for social impact expand the opportunities to learn about other approaches and find complementary skills and knowledge.

This is all about making collaboration, co-creation and, through that, innovation a way of working – an approach we can take to working with other states and other organisations. While innovation is the latest buzzword in government and in the development community, it will remain just a buzzword, easily replaced by the next trend, unless we look for opportunities to work with others to co-create and innovate to solve shared problems….(More)”

Can we achieve effective economic diplomacy without innovation diplomacy?

Paper by Flyverbom, Mikkel and Madsen, Anders Klinkby and Rasche, Andreas: “This paper conceptualizes how large-scale data and algorithms condition and reshape knowledge production when addressing international development challenges. The concept of governmentality and four dimensions of an analytics of government are proposed as a theoretical framework to examine how big data is constituted as an aspiration to improve the data and knowledge underpinning development efforts. Based on this framework, we argue that big data’s impact on how relevant problems are governed is enabled by (1) new techniques of visualizing development issues, (2) linking aspects of international development agendas to algorithms that synthesize large-scale data, (3) novel ways of rationalizing knowledge claims that underlie development efforts, and (4) shifts in professional and organizational identities of those concerned with producing and processing data for development. Our discussion shows that big data problematizes selected aspects of traditional ways to collect and analyze data for development (e.g. via household surveys). We also demonstrate that using big data analyses to address development challenges raises a number of questions that can undermine its impact….(More)”

Big Data as Governmentality – Digital Traces, Algorithms, and the Reconfiguration of Data in International Development

Paper by Zuiderveen Borgesius, Frederik J. and van Eechoud, Mireille and Gray, Jonathan: “Open data are held to contribute to a wide variety of social and political goals, including strengthening transparency, public participation and democratic accountability, promoting economic growth and innovation, and enabling greater public sector efficiency and cost savings. However, releasing government data that contain personal information may threaten privacy and related rights and interests. In this paper we ask how these privacy interests can be respected, without unduly hampering benefits from disclosing public sector information. We propose a balancing framework to help public authorities address this question in different contexts. The framework takes into account different levels of privacy risks for different types of data. It also separates decisions about access and re-use, and highlights a range of different disclosure routes. A circumstance catalogue lists factors that might be considered when assessing whether, under which conditions, and how a dataset can be released. While open data remains an important route for the publication of government information, we conclude that it is not the only route, and there must be clear and robust public interest arguments in order to justify the disclosure of personal information as open data….(More)”

Open Data, Privacy, and Fair Information Principles: Towards a Balancing Framework

Geeta Padmanabhan at The Hindu: “Ippodhu, a mobile app, is all about crowd-sourced civic participation for good governance… Last week, a passer-by noticed how the large hoardings outside Vivekanandar Illam, facing Marina Beach, blocked the view of the iconic building. Enraged, he whipped out his smartphone, logged on to Ippodhu and wrote: “How is this allowed? The banners are in the walking space and we can’t see the historic building!” Ippodhu.com carried the story with pictures.

“On Ippodhu, a community information mobile application, the person complaining has the option to do more,” says Peer Mohamed, the team leader of the app/website. “He could have registered a complaint with the police, the Corporation or a relevant NGO, using the ‘Act’ option. This facility makes Ippodhu a valuable tool for beleaguered citizens to complain and puts it above other social media avenues.”

Users can choose between Tamil and English, and read the latest posts just as they would in a Twitter feed. While posting, your location is geo-tagged automatically; if you find that intrusive, you can post anonymously. There is no word limit and one can enlarge the font, write an essay, a note or a rant and post it under one of 15 categories. I decided to check out the app and created an account. My post went live in less than a minute. Then I moved to Ippodhu’s USP. I clicked ‘Act’, chose ‘civic issue’ as the category, and posted a note about flooding in my locality. “It’s on Apple and Android as just text now, but expect picture and video features soon when the circulation hits the target,” says Peer. “My team of 12 journalists curates the feeds 24/7, allowing no commercials, ads or abusive language. We want to keep it non-controversial and people-friendly.” It’s crowd-sourced citizen journalism and civic participation for good governance….(More)”

For people, by people

George I. Seffers at Signal: “U.S. intelligence agencies are in the business of predicting the future, but no one has systematically evaluated the accuracy of those predictions—until now. The intelligence community’s cutting-edge research and development agency uses a handful of predictive analytics programs to measure and improve the ability to forecast major events, including political upheavals, disease outbreaks, insider threats and cyber attacks.

The Office for Anticipating Surprise at the Intelligence Advanced Research Projects Activity (IARPA) is a place where crystal balls come in the form of software, tournaments and throngs of people. The office sponsors eight programs designed to improve predictive analytics, which uses a variety of data to forecast events. The programs all focus on incidents outside of the United States, and the information is anonymized to protect privacy. The programs are in different stages, some having recently ended as others are preparing to award contracts.

But they all have one more thing in common: They use tournaments to advance the state of the predictive analytic arts. “We decided to run a series of forecasting tournaments in which people from around the world generate forecasts about, now, thousands of real-world events,” says Jason Matheny, IARPA’s new director. “All of our programs on predictive analytics do use this tournament style of funding and evaluating research.” The Open Source Indicators program used a crowdsourcing technique in which people across the globe offered their predictions on such events as political uprisings, disease outbreaks and elections.

The data analyzed included social media trends, Web search queries and even cancelled dinner reservations—an indication that people are sick. “The methods applied to this were all automated. They used machine learning to comb through billions of pieces of data to look for that signal, that leading indicator, that an event was about to happen,” Matheny explains. “And they made amazing progress. They were able to predict disease outbreaks weeks earlier than traditional reporting.”

The recently completed Aggregative Contingent Estimation (ACE) program also used a crowdsourcing competition in which people predicted events, including whether weapons would be tested, treaties would be signed or armed conflict would break out along certain borders. Volunteers were asked to provide information about their own background and what sources they used. IARPA also tested participants’ cognitive reasoning abilities. Volunteers provided their forecasts every day, and IARPA personnel kept score. Interestingly, they discovered the “deep domain” experts were not the best at predicting events. Instead, people with a certain style of thinking came out the winners. “They read a lot, not just from one source, but from multiple sources that come from different viewpoints. They have different sources of data, and they revise their judgments when presented with new information. They don’t stick to their guns,” Matheny reveals. …
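To give a sense of how such tournaments “keep score”: forecasting competitions in this tradition are commonly evaluated with the Brier score, the accuracy measure popularized by Tetlock’s forecasting research. The sketch below is a minimal illustration of that scoring idea with hypothetical volunteers and probabilities; it is not IARPA’s actual scoring pipeline.

```python
# Minimal sketch of tournament-style forecast scoring with the Brier score.
# Illustrative only: the volunteers, question, and probabilities are invented.

def brier_score(forecast_probs, outcome_index):
    """Multi-category Brier score: sum of squared errors between the forecast
    probability vector and the realized outcome (0 = perfect, 2 = worst)."""
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast_probs)
    )

# Hypothetical forecasts from two volunteers on a yes/no question,
# e.g. "Will the treaty be signed by March 1?" (probabilities for [yes, no]).
forecasts = {
    "volunteer_a": [0.8, 0.2],
    "volunteer_b": [0.4, 0.6],
}
outcome = 0  # the event occurred ("yes")

for name, probs in forecasts.items():
    print(f"{name}: Brier score = {brier_score(probs, outcome):.3f}")
# Lower is better: volunteer_a (0.080) out-forecasts volunteer_b (0.720).
```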

The ACE research also contributed to a recently released book, Superforecasting: The Art and Science of Prediction, according to the IARPA director. The book was co-authored by Dan Gardner and Philip Tetlock, the Annenberg University Professor of psychology and management at the University of Pennsylvania, who also served as a principal investigator for the ACE program. Like ACE, the Crowdsourcing Evidence, Argumentation, Thinking and Evaluation program uses the forecasting tournament format, but it also requires participants to explain and defend their reasoning. The initiative aims to improve analytic thinking by combining structured reasoning techniques with crowdsourcing.

Meanwhile, the Foresight and Understanding from Scientific Exposition (FUSE) program forecasts science and technology breakthroughs….(More)”

Decoding the Future for National Security

Springwise: “The Food For Fines scheme enables Lexington residents to trade cans of food for a reduction on their unpaid parking ticket fine. In 2014, 14 percent of US households had unstable food resources, so it is no wonder that we have seen a number of initiatives that help distribute food among the hungry. In Minneapolis, for example, the police department is distributing healthy food boxes with nutrition advice during its patrols. Now, the Lexington Parking Authority has launched the Food For Fines scheme, during which residents can trade cans of food for a reduction on their unpaid parking ticket fine.

The drive is being run in collaboration with local food bank God’s Pantry. To participate, anyone who has an outstanding or past parking citation from LEXPARK or the Lexington Police Department can receive a USD 15 reduction in exchange for 10 cans of food….(More)”

Tinned food donations reduce parking fines

Tom Simonite in MIT Technology Review: “Software trained to know the difference between an honest mistake and intentional vandalism is being rolled out in an effort to make editing Wikipedia less psychologically bruising. It was developed by the Wikimedia Foundation, the nonprofit organization that supports Wikipedia.

One motivation for the project is a significant decline in the number of people considered active contributors to the flagship English-language Wikipedia: it has fallen by 40 percent over the past eight years, to about 30,000. Research indicates that the problem is rooted in Wikipedians’ complex bureaucracy and their often hard-line responses to newcomers’ mistakes, enabled by semi-automated tools that make deleting new changes easy (see “The Decline of Wikipedia”).

Aaron Halfaker, a senior research scientist at Wikimedia Foundation who helped diagnose that problem, is now leading the project trying to fight it, which relies on algorithms with a sense for human fallibility. His ORES system, for “Objective Revision Evaluation Service,” can be trained to score the quality of new changes to Wikipedia and judge whether an edit was made in good faith or not….

ORES can allow editing tools to direct people to review the most damaging changes. The software can also help editors treat rookie or innocent mistakes more appropriately, says Halfaker. “I suspect the aggressive behavior of Wikipedians doing quality control is because they’re making judgments really fast and they’re not encouraged to have a human interaction with the person,” he says. “This enables a tool to say, ‘If you’re going to revert this, maybe you should be careful and send the person who made the edit a message.’”
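ORES is exposed as a public scoring service, so third-party editing tools can query it per revision. The sketch below illustrates how such a tool might fetch the “damaging” and “goodfaith” model scores for an edit and decide how to respond. The endpoint shape follows ORES’s documented v3 REST API, but the revision ID, response-field details, and decision thresholds here are assumptions for illustration.

```python
# Illustrative sketch: querying the ORES scoring service for an edit's
# "damaging" and "goodfaith" model scores. Endpoint shape follows ORES's
# public v3 REST API; the revision ID and thresholds are hypothetical, and
# the response fields are assumptions based on its documented format.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"
rev_id = "34854345"  # hypothetical revision ID of a new edit

resp = requests.get(ORES_URL, params={"models": "damaging|goodfaith", "revids": rev_id})
resp.raise_for_status()
scores = resp.json()["enwiki"]["scores"][rev_id]

damaging = scores["damaging"]["score"]["probability"]["true"]
goodfaith = scores["goodfaith"]["score"]["probability"]["true"]

# A review tool might flag only edits that look damaging but well-intentioned,
# prompting a polite message to the newcomer instead of a silent revert.
if damaging > 0.8 and goodfaith > 0.5:
    print("Likely an honest mistake: review gently and message the editor.")
elif damaging > 0.8:
    print("Likely vandalism: queue for fast review.")
else:
    print("Probably fine: no action needed.")
```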

…Earlier efforts to make Wikipedia more welcoming to newcomers have been stymied by the very community that’s supposed to benefit. Wikipedians rose up in 2013 when Wikimedia made a word-processor-style editing interface the default, forcing the foundation to make it opt-in instead. To this day, the default editor uses a complicated markup language called Wikitext…(More)”

Artificial Intelligence Aims to Make Wikipedia Friendlier and Better
