Online Field Experiments: Studying Social Interactions in Context
Paper by Paolo Parigi, Jessica J. Santana and Karen S. Cook in Social Psychology Quarterly: “Thanks to the Internet and the related availability of “Big Data,” social interactions and their environmental context can now be studied experimentally. In this article, we discuss a methodology that we term the online field experiment to differentiate it from more traditional lab-based experimental designs. We explain how this experimental method can be used to capture theoretically relevant environmental conditions while also maximizing the researcher’s control over the treatment(s) of interest. We argue that this methodology is particularly well suited for social psychology because of its focus on social interactions and the factors that influence the nature and structure of these interactions. We provide one detailed example of an online field experiment used to investigate the impact of the sharing economy on trust behavior. We argue that we are fundamentally living in a new social world in which the Internet mediates a growing number of our social interactions. These highly prevalent forms of social interaction create opportunities for the development of new research designs that allow us to advance our theories of social interaction and social structure with new data sources….(More)”.
Realising the Data Revolution for Sustainable Development: Towards Capacity Development 4.0
Report by Niels Keijzer and Stephan Klingebiel for Paris21: “An ever-deepening data revolution is shaping everyday lives in many parts of the world. As just one of many mind-boggling statistics on Big Data, it has been estimated that by the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet. The benefits of the data revolution extend to different groups of people, social movements, institutions and businesses. Yet many people and countries do not have access to these benefits, and in richer countries potentially positive changes raise suspicion amongst citizens as well as concerns about privacy and confidentiality. Access to these advantages is, to a large extent, determined by levels of development and income. Despite the rapid spread of mobile phone technology that allows regions otherwise disconnected from the grid to ‘leapfrog’ in terms of using and producing data and statistics, poor people are still less likely to benefit from the dramatic changes in the field of data.
Against the background of the 2030 Agenda for Sustainable Development and its Sustainable Development Goals (SDGs), the main challenge for statistics is to manage the data revolution in support of sustainable development. The main priorities are broadening and deepening the production, dissemination and use of data and statistics; achieving them requires identifying the population groups that are most vulnerable and making governments more accountable to their citizens. In parallel, the risks accompanying the data revolution need to be mitigated and reduced, including the use of data for repression or for otherwise infringing on the privacy of citizens. In addition to representing a universal agenda that breaks away from the dichotomy of developed and developing countries, the new agenda calls for tailor-made approaches in each country or region concerned, supported by global actions. The 2030 Agenda further reflects the international community’s realisation of the need to move away from ‘business as usual’ in international support for data and statistics.
The most important driving forces shaping the data revolution are domestic (legal) frameworks and public policies across the globe. This applies not only to wealthier countries but also to developing countries, and external support cannot compensate for absent domestic leadership and investment. Technical, legal and political factors all affect whether countries are willing and able to succeed in benefiting from the data revolution. However, in both low-income and lower-middle-income countries, and to some extent in upper-middle-income countries, we can observe two constraining factors in this regard: capacities and funding. These factors are, to some degree, interrelated: if funding is not sufficiently available it might be difficult to increase the capacities required, and if capacities are insufficient, funding issues might be more challenging….(More)”
Governing through Goals
Book edited by Norichika Kanie and Frank Biermann: “In September 2015, the United Nations General Assembly adopted the Sustainable Development Goals as part of the 2030 Agenda for Sustainable Development. The Sustainable Development Goals built on and broadened the earlier Millennium Development Goals, but they also signaled a larger shift in governance strategies. The seventeen goals add detailed content to the concept of sustainable development, identify specific targets for each goal, and help frame a broader, more coherent, and transformative 2030 agenda. The Sustainable Development Goals aim to build a universal, integrated framework for action that reflects the economic, social, and planetary complexities of the twenty-first century.
This book examines in detail the core characteristics of goal setting, asking when it is an appropriate governance strategy and how it differs from other approaches; analyzes the conditions under which a goal-oriented agenda can enable progress toward desired ends; and considers the practical challenges in implementation….(More)”
What Algorithms Want
Book by Ed Finn: “We depend on—we believe in—algorithms to help us get a ride, choose which book to buy, execute a mathematical proof. It’s as if we think of code as a magic spell, an incantation to reveal what we need to know and even what we want. Humans have always believed that certain invocations—the marriage vow, the shaman’s curse—do not merely describe the world but make it. Computation casts a cultural shadow that is shaped by this long tradition of magical thinking. In this book, Ed Finn considers how the algorithm—in practical terms, “a method for solving a problem”—has its roots not only in mathematical logic but also in cybernetics, philosophy, and magical thinking.
Finn argues that the algorithm deploys concepts from the idealized space of computation in a messy reality, with unpredictable and sometimes fascinating results. Drawing on sources that range from Neal Stephenson’s Snow Crash to Diderot’s Encyclopédie, from Adam Smith to the Star Trek computer, Finn explores the gap between theoretical ideas and pragmatic instructions. He examines the development of intelligent assistants like Siri, the rise of algorithmic aesthetics at Netflix, Ian Bogost’s satiric Facebook game Cow Clicker, and the revolutionary economics of Bitcoin. He describes Google’s goal of anticipating our questions, Uber’s cartoon maps and black box accounting, and what Facebook tells us about programmable value, among other things.
If we want to understand the gap between abstraction and messy reality, Finn argues, we need to build a model of “algorithmic reading” and scholarship that attends to process, spearheading a new experimental humanities….(More)”
A Data-driven Approach to Assess the Potential of Smart Cities: The Case of Open Data for Brussels Capital Region
Paper by Miguel Angel Gomez Zotano and Hugues Bersini in Energy Procedia: “The success of smart city projects is intrinsically related to the existence of large volumes of data that could be processed to achieve their objectives. For this purpose, the plethora of data stored by public administrations becomes an incredibly rich source of insight and information due to its volume and diversity. However, it was only with the Open Government Movement that governments became concerned with the need to open their data to citizens and businesses. Thus, with the emergence of open data portals, this wealth of data enables the development of new business models. Achieving the benefits sought by making this data available triggers new challenges in coping with the diversity of sources involved. The business potential could be jeopardized by the scarcity of relevant data in the different blocks and domains that make up a city and by the lack of a common approach to data publication, in terms of format, content, etc.
This paper introduces a holistic approach that relies on the Smart City Ontology as the cornerstone to standardise and structure data. This approach, proposed as an analytical tool to assess the potential of data in a given smart city, analyses three main aspects: the availability of data, the criteria that data should fulfil to be considered eligible, and the model used to structure and organise it. The approach has been applied to the case of Brussels Capital Region, whose first results are presented and discussed in this paper. The main conclusion is that, despite its commitment to open data and smart cities, Brussels is not yet mature enough to fully exploit the real intelligence that smart cities could provide. This maturity should be achieved in the following years with the implementation of the new Brussels Smart City Strategy…(More)”.
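To make the assessment idea concrete, here is a minimal sketch of what an eligibility check along these lines might look like; the criteria, field names and sample records are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of an open-data eligibility check, loosely modelled on
# the paper's three aspects (availability, eligibility criteria, structure).
# All criteria, field names and sample records are hypothetical.

CRITERIA = {
    "machine_readable": lambda d: d["format"] in {"csv", "json", "xml"},
    "open_licence":     lambda d: d["licence"] in {"CC0", "CC-BY"},
    "has_domain":       lambda d: bool(d["domain"]),  # e.g. mobility, energy
    "recently_updated": lambda d: d["year_updated"] >= 2015,
}

def assess(dataset: dict) -> tuple[bool, list[str]]:
    """Return (eligible, failed_criteria) for one dataset record."""
    failed = [name for name, test in CRITERIA.items() if not test(dataset)]
    return (not failed, failed)

datasets = [
    {"format": "csv", "licence": "CC-BY", "domain": "mobility", "year_updated": 2016},
    {"format": "pdf", "licence": "CC-BY", "domain": "energy", "year_updated": 2013},
]

for d in datasets:
    eligible, failed = assess(d)
    print(d["domain"], "eligible" if eligible else "fails: " + ", ".join(failed))
```

Scoring each dataset against an explicit criteria table like this mirrors the paper’s separation of eligibility rules from the data itself, though the real ontology-based model is of course far richer.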
The Governance Report 2017
Report by The Hertie School of Governance: “Looking at recent developments around the world, it seems that democratic values — from freedom of association and speech to fair and free elections and a system of checks and balances — have come under threat. Experts have, however, disproportionately focused on the problems of democracy in the West, pointing to familiar sets of shortcomings and emerging deficiencies. By contrast, and with few exceptions, less attention has gone to assessing the numerous efforts and innovative activities taking place at local, national and international levels. These efforts seek to counteract backsliding and subversion by improving resilience and consolidation and by promoting the expansion of democracy, especially in an era of limited sovereignty and, frequently, limited statehood.
The Governance Report 2017 focuses on those policies, programs, and initiatives meant to address the causes of the current democratic malaise, to foster democratic resilience, and to stimulate the (re-)consolidation and development of democratic regimes. The Report’s ambition, reflecting its evidence-based approach, is to shed light on how to manage and care for democracy itself. Specifically, against the backdrop of an assessment of the state of democracy and enriched by cross-national, comparative indicators and case studies, the Report emphasizes solutions geared toward enhancing citizen participation and improving institutions in various contexts, including the rise of neo-populism. Going beyond descriptions of best practices, the Report also examines their origins, identifies the actual and potential trade-offs these solutions entail, and makes concrete recommendations to policymakers….(More)”
Access to New Data Sources for Statistics: Business Models and Incentives for the Corporate Sector
Report by Thilo Klein and Stefaan Verhulst: “New data sources, commonly referred to as “Big Data”, have attracted growing interest from National Statistical Institutes. They have the potential to complement official and more conventional statistics used, for instance, to determine progress towards the Sustainable Development Goals (SDGs) and other targets. However, it is often assumed that this type of data is readily available, which is not necessarily the case. This paper examines legal requirements and business incentives to obtain agreement on private data access, and more generally ways to facilitate the use of Big Data for statistical purposes. Using practical cases, the paper analyses the suitability of five generic data access models for different data sources and data uses in an emerging new data ecosystem. Concrete recommendations for policy action are presented in the conclusions….(More)”.
Open Data Maturity in Europe 2016
European Data Portal Report: “…the second in a series of annual studies and explores the level of Open Data Maturity in the EU28 and Norway, Switzerland and Liechtenstein – referred to as EU28+. The measurement is built on two key indicators: Open Data Readiness and Portal Maturity, covering both the level of development of national activities promoting Open Data and the level of development of national portals. In 2016, with a 28.6% increase compared to 2015, the EU28+ countries completed over 55% of their Open Data journey, showing that by 2016 a majority of the EU28+ countries had successfully developed a basic approach to addressing Open Data. The Portal Maturity level increased by 22.6 percentage points, from 41.7% to 64.3%, thanks to the development of more advanced features on country data portals. The overall Open Data Maturity assessment groups countries into four clusters: Beginners, Followers, Fast Trackers and Trend Setters. Barriers to moving Open Data forward do remain. The report concludes with a series of recommendations, providing countries with guidance to further improve Open Data maturity: countries need to raise more (political) awareness around Open Data, increase automated processes on their portals to improve the usability and re-usability of data, and organise more events and trainings to support both local and national initiatives….(More)”.
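A quick restatement of the report’s own figures clarifies the “percentage points” phrasing: the 22.6-point rise is the absolute difference between the two scores, while the corresponding relative growth is considerably larger.

$$
64.3\% - 41.7\% = 22.6\ \text{percentage points}, \qquad \frac{64.3 - 41.7}{41.7} \approx 54\%\ \text{relative growth}
$$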
Accelerating the UN’s Sustainable Development Goals through AI
“Special edition of ITU news magazine (PDF)… about:
- How Artificial Intelligence (AI) can boost sustainable development
- How to prepare for the opportunities and risks of an AI-driven society
- The AI for Good Global Summit that ITU is hosting in June with the XPRIZE Foundation…(More)”
Data Quality Tester
Publish What You Fund has launched a new online tool that allows aid and development finance publishers to independently check the quality of their data before they publish it to IATI. The aim of the Data Quality Tester – currently in Beta – is to indicate when information falls short of the specific data quality tests used to assess donors in the Aid Transparency Index. We expect it to be most useful for donors included in the Index, enabling them to monitor their own progress both during and outside of the Index cycle.
Who is the Data Quality Tester for?
The Data Quality Tester is also suitable for organisations that want to start publishing in the IATI Standard and for those that do not qualify for inclusion in the Index, or that used to be assessed but are not currently. The open source online tool is useful because:
- Both the IATI Standard and the Index tests can at times be complex, and the tool allows a quick check against them, so donor agency staff can understand any issues
- It allows publishers to internally and independently check the quality of their information before uploading to the IATI Registry, saving time and making sure that when data is uploaded, it is as good as it can be
- It provides publishers with an opportunity to assess their data against the updated Index methodology and recognise where they need to improve
The tool is now live and available to use at: http://dataqualitytester.publishwhatyoufund.org/…(More)”
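To give a flavour of what such a test involves, below is a minimal sketch of a completeness check over an IATI activities file; the core-element names follow the IATI 2.x activity standard, but the pass/fail rule and the file path are simplified, hypothetical stand-ins for the Tester’s real checks.

```python
# Minimal sketch of the kind of check the Data Quality Tester automates:
# does each activity in an IATI activities file carry a few core elements?
# Element names follow the IATI 2.x activity standard; the pass/fail rule
# and the file path are simplified stand-ins for the Index's real tests.
import xml.etree.ElementTree as ET

CORE_ELEMENTS = ["iati-identifier", "title", "description", "activity-date"]

def check_activities(path):
    root = ET.parse(path).getroot()  # <iati-activities> root element
    for activity in root.findall("iati-activity"):
        ident = activity.findtext("iati-identifier", default="(no identifier)")
        missing = [el for el in CORE_ELEMENTS if activity.find(el) is None]
        status = "ok" if not missing else "missing: " + ", ".join(missing)
        print(ident.strip() + ": " + status)

if __name__ == "__main__":
    check_activities("activities.xml")  # placeholder path
```

Running a check like this locally, before upload, is exactly the workflow the tool supports: publishers catch missing fields on their own machines rather than after the data reaches the IATI Registry.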