European Open Data Portal: A series of indicators has been selected to measure Open Data maturity across Europe. These indicators cover the level of development of national policies promoting Open Data, an assessment of the features made available on national data portals, as well as the expected impact of Open Data….(More)”
Citizen engagement in rulemaking — evidence on regulatory practices in 185 countries
Paper by Melissa Marie Johns and Valentina Saltane for the World Bank: “… presents a new database of indicators measuring the extent to which rulemaking processes are transparent and participatory across 185 countries. The data look at how citizen engagement happens in practice, including when and how governments open the policy-making process to public input. The data also capture the use of ex ante assessments to determine the possible cost of compliance with a proposed new regulation, the likely administrative burden of enforcing the regulation, and its potential environmental and social impacts. The data show that citizens have more opportunities to participate directly in the rulemaking process in developed economies than in developing ones. Differences are also apparent among regions: rulemaking processes are significantly less transparent and inclusive in Sub-Saharan Africa, the Middle East and North Africa, and South Asia on average than in Organisation for Economic Co-operation and Development high-income countries, Europe and Central Asia, and East Asia and the Pacific. In addition, ex ante impact assessments are much more common among higher-income economies than among lower-income ones. And greater citizen engagement in rulemaking is associated with higher-quality regulation, stronger democratic regimes, and less corrupt institutions….(More)”
Collective intelligence and international development
Gina Lucarelli, Tom Saunders and Eddie Copeland at Nesta: “The mountain kingdom of Lesotho, a small landlocked country in Sub-Saharan Africa, is an unlikely place to look for healthcare innovation. Yet in 2016, it became the first country in Africa to deploy the test-and-treat strategy for treating people with HIV. Rather than waiting for white blood cell counts to drop, patients begin treatment as soon as they are diagnosed. This strategy is backed by the WHO, as it has the potential to increase the number of people who are able to access treatment, consequently reducing transmission and keeping people with HIV healthy and alive for longer.
While lots of good work is underway in Lesotho, and billions have been spent on HIV programmes in the country, the percentage of the population infected with HIV has remained steady and is now almost 23%. Challenges of this scale need new ideas and better ways to adopt them.
On a recent trip to Lesotho as part of a project with the United Nations Development Group, we met various UN agencies, the World Bank, government leaders, civil society actors and local businesses, to learn about the key development issues in Lesotho and to discuss the role that ‘collective intelligence’ might play in creating better country development plans. The key question Nesta and the UN are working on is: how can we increase the impact of the UN’s work by tapping into the ideas, information and possible solutions which are distributed among many partners, the private sector, and the 2 million people of Lesotho?
…our framework of collective intelligence, a set of iterative stages which can help organisations like the UN tap into the ideas, information and possible solutions of groups and individuals not normally included in the problem-solving process. For each stage, we also presented a number of examples of how this works in practice.
Collective intelligence framework – stages and examples
- Better understanding the facts, data and experiences: New tools, from smartphones to online communities, enable researchers, practitioners and policymakers to collect much larger amounts of data much more quickly. Organisations can use this data to target their resources at the most critical issues, as well as to feed into the development of products and services that more accurately meet the needs of citizens. Examples include mPower, a clinical study which used an app to collect data about people with Parkinson's disease via surveys and smartphone sensors.
- Better development of options and ideas: Beyond data collection, organisations can use digital tools to tap into the collective brainpower of citizens to come up with better ideas and options for action. Examples include participatory budgeting platforms like “Madame Mayor, I have an idea” and challenge prizes, such as USAID’s Ebola grand challenge.
- Better, more inclusive decision making: Decision making and problem solving are usually left to experts, yet citizens are often best placed to make the decisions that will affect them. New digital tools make it easier than ever for governments to involve citizens in policymaking, planning and budgeting. Our D-CENT tools enable citizen involvement in decision making in a number of fields. Another example is the Open Medicine Project, which designs digital tools for healthcare in consultation with both practitioners and patients.
- Better oversight and improvement of what is done: From monitoring corruption to scrutinising budgets, a number of tools allow broad involvement in the oversight of public sector activity, potentially increasing accountability and transparency. The Family and Friends Test is a tool that allows NHS users in the UK to submit feedback on services they have experienced. So far, 25 million pieces of feedback have been submitted. This feedback can be used to stimulate local improvement and empower staff to carry out changes… (More)”
Bringing together the United States of data
“The U.S. Data Federation will support government-wide data standardization and data federation initiatives across both Federal agencies and local governments. This is intended to be a fundamental coordinating mechanism for a more open and interconnected digital government by profiling and supporting use-cases that demonstrate unified and coherent data architectures across disparate government agencies. These examples will highlight emerging data standards and API initiatives across all levels of government, convey the level of maturity for each effort, and facilitate greater participation by government agencies. Initiatives that may be profiled within the U.S. Data Federation include Open311, DOT’s National Transit Map, the Project Open Data metadata schema, Contact USA, and the Police Data Initiative. As part of the U.S. Data Federation, GSA will also pilot the development of reusable components needed for a successful data federation strategy including schema documentation tools, schema validation tools, and automated data aggregation and normalization capabilities. The U.S. Data Federation will provide more sophisticated and seamless opportunities on the foundation of U.S. open data initiatives by allowing the public to more easily do comparative data analysis across government bodies and create applications that work across multiple government agencies….(More)”
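To make the schema validation tooling mentioned above concrete, here is a minimal sketch in Python. The four field names are drawn from the required keys of the Project Open Data metadata schema, but the checker itself is hypothetical and far simpler than a real validator:

```python
# Hypothetical sketch of a metadata schema check. The field names are a
# subset of the required keys in the Project Open Data metadata schema;
# the validation logic itself is an illustration, not a real tool.
REQUIRED_FIELDS = {"title", "description", "identifier", "accessLevel"}

def validate_record(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for field in sorted(REQUIRED_FIELDS):
        if field not in record:
            problems.append(f"missing required field: {field}")
        elif not str(record[field]).strip():
            problems.append(f"empty value for field: {field}")
    return problems
```

In practice, a federation effort would validate against the full published JSON Schema with an off-the-shelf validator rather than a hand-rolled check like this.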
Measuring Scientific Impact Beyond Citation Counts
Robert M. Patton, Christopher G. Stahl and Jack C. Wells at DLib Magazine: “Measuring scientific progress remains elusive. There is an intuitive understanding that, in general, science is progressing forward. New ideas and theories are formed, older ideas and theories are confirmed, rejected, or modified. Progress is made. But questions such as how it is made, by whom, how broadly, or how quickly present significant challenges. Historically, scientific publications reference other publications if the former publication in some way shaped the work that was performed. In other words, one publication “impacted” a later one. The implication of this impact revolves around the intellectual content of the idea, theory, or conclusion that was formed. Several metrics, such as the h-index or journal impact factor (JIF), are often used as a means to assess whether an author, article, or journal creates an “impact” on science. The implied statement behind high values for such metrics is that the work must somehow be valuable to the community, which in turn implies that the author, article, or journal has somehow influenced the direction, development, or progress of what others in that field do. Unfortunately, the drive for increased publication revenue, research funding, or global recognition has led to a variety of external factors, completely unrelated to the quality of the work, that can be used to manipulate key metric values. In addition, advancements in computing and the data sciences have further altered the meaning of impact on science.
The remainder of this paper will highlight recent advancements in both cultural and technological factors that now influence scientific impact as well as suggest new factors to be leveraged through full content analysis of publications….(More)”
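For concreteness, the h-index the authors mention can be computed directly from a list of citation counts. This small Python sketch (not from the paper) implements the standard definition: the largest h such that h of an author's papers each have at least h citations.

```python
def h_index(citations):
    # Largest h such that h papers each have at least h citations.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Five papers cited [10, 8, 5, 4, 3] times give an h-index of 4:
# four papers have at least 4 citations, but not five with at least 5.
```

The simplicity of this computation is part of the authors' point: the metric reflects only citation counts, not why or how a work was actually used.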
Twitter, UN Global Pulse announce data partnership
PressRelease: “Twitter and UN Global Pulse today announced a partnership that will provide the United Nations with access to Twitter’s data tools to support efforts to achieve the Sustainable Development Goals, which were adopted by world leaders last year.
Every day, people around the world send hundreds of millions of Tweets in dozens of languages. This public data contains real-time information on many issues including the cost of food, availability of jobs, access to health care, quality of education, and reports of natural disasters. This partnership will allow the development and humanitarian agencies of the UN to turn these social conversations into actionable information to aid communities around the globe.
“The Sustainable Development Goals are first and foremost about people, and Twitter’s unique data stream can help us truly take a real-time pulse on priorities and concerns — particularly in regions where social media use is common — to strengthen decision-making. Strong public-private partnerships like this show the vast potential of big data to serve the public good,” said Robert Kirkpatrick, Director of UN Global Pulse.
“We are incredibly proud to partner with the UN in support of the Sustainable Development Goals,” said Chris Moody, Twitter’s VP of Data Services. “Twitter data provides a live window into the public conversations that communities around the world are having, and we believe that the increased potential for research and innovation through this partnership will further the UN’s efforts to reach the Sustainable Development Goals.”
Organizations and businesses around the world currently use Twitter data in many meaningful ways, and this unique data source enables them to leverage public information at scale to better inform their policies and decisions. These partnerships enable innovative uses of Twitter data, while protecting the privacy and safety of Twitter users.
UN Global Pulse’s new collaboration with Twitter builds on existing R&D that has shown the power of social media for social impact, like measuring the impact of public health campaigns, tracking reports of rising food prices, or prioritizing needs after natural disasters….(More)”
Impact Evaluation in Practice
Book by the World Bank Group and the Inter-American Development Bank: “The second edition of the Impact Evaluation in Practice handbook is a comprehensive and accessible introduction to impact evaluation for policymakers and development practitioners. First published in 2011, it has been used widely across the development and academic communities. The book incorporates real-world examples to present practical guidelines for designing and implementing impact evaluations. Readers will gain an understanding of impact evaluation and the best ways to use impact evaluations to design evidence-based policies and programs. The updated version covers the newest techniques for evaluating programs and includes state-of-the-art implementation advice, as well as an expanded set of examples and case studies that draw on recent development challenges. It also includes new material on research ethics and partnerships to conduct impact evaluation. The handbook is divided into four sections: Part One discusses what to evaluate and why; Part Two presents the main impact evaluation methods; Part Three addresses how to manage impact evaluations; Part Four reviews impact evaluation sampling and data collection. Case studies illustrate different applications of impact evaluations. The book links to complementary instructional material available online, including an applied case as well as questions and answers. The updated second edition will be a valuable resource for the international development community, universities, and policymakers looking to build better evidence around what works in development….(More Resources) (Download here)”
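One of the standard methods covered in impact evaluation handbooks of this kind is difference-in-differences, which compares the change in an outcome for a treated group against the change for a comparison group over the same period. A minimal sketch, with invented numbers purely for illustration:

```python
def diff_in_diff(treated_before, treated_after, control_before, control_after):
    # The treated group's change minus the comparison group's change;
    # the comparison group proxies for what would have happened
    # to the treated group without the programme.
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical outcome means: treated villages improve from 10 to 18,
# comparison villages from 10 to 14, giving an estimated effect of 4.
```

The estimate is only credible under the "parallel trends" assumption: absent the programme, both groups would have changed by the same amount.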
The mAgri Design Toolkit
“The mAgri Design Toolkit is a collection of instructions, tools, and stories to help develop and scale mobile agriculture products by applying a user-centered design approach.
Many mAgri services that have launched in emerging markets have suffered from low user adoption, despite coming from leading mobile network operators and value-added service (VAS) providers. This toolkit is one of the outcomes of a partnership between the GSMA mAgri Programme and frog, and provides operational guidance on how to bring the user-centred design approach into the product development process to better connect mAgri services with the needs of farmers and other key actors in the ecosystem….(More)”
Responsible Data in Agriculture
Report by Lindsay Ferris and Zara Rahman for GODAN: “The agriculture sector is creating increasing amounts of data, from many different sources. From tractors equipped with GPS tracking, to open data released by government ministries, data is becoming ever more valuable, as agricultural business development and global food policy decisions are being made based upon data. But the sector is also home to severe resource inequality. The largest agricultural companies make billions of dollars per year, in comparison with subsistence farmers growing just enough to feed themselves, or smallholder farmers who grow enough to sell on a year-by-year basis. When it comes to data and technology, these differences in resources translate to stark power imbalances in data access and use. The most well resourced actors are able to delve into new technologies and make the most of those insights, whereas others are unable to take any such risks or divert any of their limited resources. Access to and use of data has radically changed the business models and behaviour of some of those well resourced actors, but in contrast, those with fewer resources are receiving the same, limited access to information that they always have.
In this paper, we have approached these issues from a responsible data perspective, drawing upon the experience of the Responsible Data community, which over the past three years has created tools, questions and resources to deal with the ethical, legal, privacy and security challenges that come from new uses of data in various sectors. This piece aims to provide a broad overview of some of the responsible data challenges facing these actors, with a focus on the power imbalance between actors and how that inequality affects behaviour in the agricultural data ecosystem. What are the concerns of those with limited resources, when it comes to this new and rapidly changing data environment? In addition, what are the ethical grey areas or uncertainties that we need to address in the future? As a first attempt to answer these questions, we spoke to 14 individuals with various perspectives on the sector to understand what the challenges are for them and for the people they work with. We also carried out desk research to dive deeper into these issues, and we provide here an analysis of our findings and responsible data challenges….(More)”
How to advance open data research: Towards an understanding of demand, users, and key data
Danny Lämmerhirt and Stefaan Verhulst at IODC blog: “…Lord Kelvin’s famous dictum “If you cannot measure it, you cannot improve it” equally applies to open data. Without more evidence of how open data contributes to meeting users’ needs and addressing societal challenges, efforts and policies toward releasing and using more data may be misinformed and based upon untested assumptions.
When done well, assessments, metrics, and audits can guide both (local) data providers and users to understand, reflect upon, and change how open data is designed. What we measure, and how we measure it, is therefore decisive for advancing open data.
Back in 2014, the Web Foundation and the GovLab at NYU brought together open data assessment experts from Open Knowledge, the Organisation for Economic Co-operation and Development, the United Nations, Canada’s International Development Research Centre, and elsewhere to explore the development of common methods and frameworks for the study of open data. It resulted in a draft template or framework for measuring open data. Despite the increased awareness of the need for more evidence-based open data approaches, open data assessment methods have advanced only slowly since 2014. At the same time, governments publish more of their data openly, and more civil society groups, civil servants, and entrepreneurs employ open data for manifold ends: the broader public may detect environmental issues and advocate for policy changes, neighbourhood projects employ data to enable marginalized communities to participate in urban planning, public institutions may enhance their information exchange, and entrepreneurs embed open data in new business models.
In 2015, the International Open Data Conference roadmap made the following recommendations on how to improve the way we assess and measure open data.
- Reviewing and refining the Common Assessment Methods for Open Data framework. This framework lays out four areas of inquiry: context of open data, the data published, use practices and users, as well as the impact of opening data.
- Developing a catalogue of assessment methods to monitor progress against the International Open Data Charter (based on the Common Assessment Methods for Open Data).
- Networking researchers to exchange common methods and metrics. This helps to build methodologies that are reproducible and increase credibility and impact of research.
- Developing sectoral assessments.
In short, the IODC called for refining our assessment criteria and metrics by connecting researchers, and applying the assessments to specific areas. It is hard to tell how much progress has been made in answering these recommendations, but there is a sense among researchers and practitioners that the first two goals are yet to be fully addressed.
Instead we have seen various disparate, yet well meaning, efforts to enhance the understanding of the release and impact of open data. A working group was created to measure progress on the International Open Data Charter, which provides governments with principles for implementing open data policies. While this working group compiled a list of studies and their methodologies, it did not (yet) deepen the common framework of definitions and criteria to assess and measure the implementation of the Charter.
In addition, there is an increase in sector- and case-specific studies that are often more descriptive and context-specific in nature, yet do help meet the need for examples that illustrate the value proposition of open data.
As such, there seems to be a disconnect between top-level frameworks and on-the-ground research, preventing the sharing of common methods and distilling replicable experiences about what works and what does not….(More)”