Stefaan Verhulst
Report by the Partnership for Public Service: “The use of data analysis, rigorous evaluation and a range of other credible strategies to inform decision-making is becoming more common across government. Even so, the movement is nascent, with leading practices implemented at some agencies, but not yet widely adopted. Much more progress is necessary. In fact, both the recently enacted Foundations for Evidence-Based Policymaking Act and the recently released draft Federal Data Strategy Action Plan prioritize broader adoption of leading practices.
To support that effort, this report highlights practical steps that agencies can take to become more data-driven and evidence-based. The findings emerged from a series of workshops and interviews conducted between April 2018 and May 2019 by the Partnership for Public Service and Grant Thornton. From these sessions, we learned that the most forward-thinking agencies rely on multiple approaches, including:
• Using top-down and bottom-up approaches to build evidence-based organizations.
• Driving longer-term and shorter-term learning.
• Using existing data and new data.
• Strengthening internal capacity and creating external research practitioner partnerships.
This report describes what these strategies look like in practice, and shares real-world and replicable examples of how leading agencies have become more data-driven and evidence-based….(More)”.
National Audit Office (UK): “Data is crucial to the way government delivers services for citizens, improves its own systems and processes, and makes decisions. Our work has repeatedly highlighted the importance of evidence-based decision-making at all levels of government activity, and the problems that arise when data is inadequate.
Government recognises the value of using data more effectively, and the importance of ensuring security and public trust in how it is used. It plans to produce a new national data strategy in 2020 to position “the UK as a global leader on data, working collaboratively and openly across government”.
To achieve its ambitions, government will need to resolve fundamental challenges around how to use and share data safely and appropriately, and how to balance competing demands on public resources in a way that allows for sustained but proportionate investment in data. The future national data strategy provides the government with an opportunity to do this, building on the renewed interest and focus on the use of data within government and beyond.
Content and scope of the report
This report sets out the National Audit Office’s experience of data across government, including initial efforts to start to address the issues. From our past work we have identified three areas where government needs to establish the pre-conditions for success: clear strategy and leadership; a coherent infrastructure for managing data; and broader enablers to safeguard and support the better use of data. In this report we consider:
- the current data landscape across government (Part One);
- how government needs a clear plan and leadership to improve its use of data (Part Two);
- the quality, standards and systems needed to use data effectively (Part Three); and
- wider conditions and enablers for success (Part Four).
Concluding remarks
Past examples such as Windrush and Carer’s Allowance show how important good-quality data is, and the consequences when it is not used well. Without accurate, timely and proportionate data, government will not be able to get the best use out of public money or take the next step towards more sophisticated approaches to using data that can reap real rewards.
But despite years of effort and many well-documented failures, government has lacked clear and sustained strategic leadership on data. This has led to departments under-prioritising their own efforts to manage and improve data. There are some early signs that the situation is improving, but unless government uses the data strategy to push a sea change in strategy and leadership, it will not get the right processes, systems and conditions in place to succeed, and this strategy will be yet another missed opportunity….(More)”.
Report by Lorelei Kelly: “Congress represents a national cross section of civic voice. It is potentially the most diverse market for ideas in government and should be reaping the benefits of America’s creativity and knowledge. During our transition into the 21st century, this civic information asset — from lived experience to structured data — should fuel the digital infrastructure of a modern representative system. Yet Congress has thus far failed to tap this resource on behalf of its legislative and deliberative functions.
Today’s Congress can’t compete on digital infrastructure or modern data methods with the executive branch, the media or the private sector. To be sure, information weaponization, antique technology and Congress’ stubborn refusal to fund itself have arrested its development of a digital infrastructure. Congress is knowledge incapacitated, physically disconnected and technologically obsolete. In this condition, it cannot fulfill its First Branch duties as laid out in Article I of the U.S. Constitution.
Fortunately, changing the direction of Congress is now in sight. Before the end of January 2019, (1) the Foundations for Evidence-Based Policymaking Act became law, (2) the House created a Select Committee on Modernization, and (3) Congress began to restore its internal science and technology capacity.
Modernizing Congress lays out a plan to accelerate this institutional progress. It scopes out the challenge of including civic voice in the legislative and deliberative process. It then identifies trusted local information intermediaries who could act as key components of a modern knowledge commons in Congress. With three case studies, the report illustrates how members and staff are finding new ways to build connection and gather useful constituent input at the district level. The case studies cover an urban, a rural and a suburban district. The report concludes that while individual members are leveraging technology to connect and use new forms of civic voice from constituents, what Congress needs most is a systemwide digital infrastructure and updated institutional standards for data collection….(More)”.
Book edited by Blayne Haggart, Kathryn Henne, and Natasha Tusikov: “This book explores the interconnected ways in which the control of knowledge has become central to the exercise of political, economic, and social power. Building on the work of International Political Economy scholar Susan Strange, this multidisciplinary volume features experts from political science, anthropology, law, criminology, women’s and gender studies, and Science and Technology Studies, who consider how the control of knowledge is shaping our everyday lives. From “weaponised copyright” as a censorship tool, to the battle over control of the internet’s “guts,” to the effects of state surveillance at the Mexico–U.S. border, this book offers a coherent way to understand the nature of power in the twenty-first century…(More)”.
Clive Thompson at Wired: “…Marine litter isn’t the only hazard whose contours we can’t fully see. The United Nations has 93 indicators to measure the environmental dimensions of “sustainable development,” and amazingly, the UN found that we have little to no data on 68 percent of them—like how rapidly land is being degraded, the rate of ocean acidification, or the trade in poached wildlife. Sometimes this is because we haven’t collected it; in other cases some data exists but hasn’t been shared globally, or it’s in a myriad of incompatible formats. No matter what, we’re flying blind. “And you can’t manage something if you can’t measure it,” says David Jensen, the UN’s head of environmental peacebuilding.
In other words, if we’re going to help the planet heal and adapt, we need a data revolution. We need to build a “digital ecosystem for the environment,” as Jensen puts it.
The good news is that we’ve got the tools. If there’s one thing tech excels at (for good and ill), it’s surveillance, right? We live in a world filled with cameras and pocket computers, titanic cloud computing, and the eerily sharp insights of machine learning. And this stuff can be used for something truly worthwhile: studying the planet.
There are already some remarkable cases of tech helping to break through the fog. Consider Global Fishing Watch, a nonprofit that tracks the world’s fishing vessels, looking for overfishing. They use everything from GPS-like signals emitted by ships to satellite infrared imaging of ship lighting, plugged into neural networks. (It’s massive, cloud-scale data: over 60 million data points per day, making the AI more than 90 percent accurate at classifying what type of fishing activity a boat is engaged in.)
“If a vessel is spending its time in an area that has little tuna and a lot of sharks, that’s questionable,” says Brian Sullivan, cofounder of the project and a senior program manager at Google Earth Outreach. Crucially, Global Fishing Watch makes its data open to anyone—so now the National Geographic Society is using it to lobby for new marine preserves, and governments and nonprofits use it to target illicit fishing.
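To make the classification step concrete, here is a minimal sketch of the idea, not Global Fishing Watch’s actual pipeline: a small neural network that labels a vessel’s activity from AIS-style track features. The features (speed, course change), activity classes, and all data here are illustrative assumptions; the production system uses far richer inputs and models at cloud scale.

```python
# A toy sketch (assumed, not GFW's real pipeline): classify vessel activity
# from AIS-like features using a small neural network on synthetic data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def synth_track(kind, n=200):
    """Generate toy (speed_knots, abs_course_change_deg) samples per activity."""
    if kind == "transiting":    # steady speed, straight course
        speed, turn = rng.normal(12, 1, n), rng.normal(0, 2, n)
    elif kind == "trawling":    # slow, fairly straight tows
        speed, turn = rng.normal(4, 0.5, n), rng.normal(0, 5, n)
    else:                       # "longlining": slow with frequent turns
        speed, turn = rng.normal(6, 1, n), rng.normal(0, 25, n)
    return np.column_stack([speed, np.abs(turn)])

labels = ["transiting", "trawling", "longlining"]
X = np.vstack([synth_track(k) for k in labels])
y = np.repeat(labels, 200)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)

# Classify a new observation: 4.2 knots with small average course change.
print(clf.predict([[4.2, 3.0]]))  # likely "trawling" in this toy setup
```

The design point survives the simplification: movement patterns alone carry a strong signal about what a boat is doing, which is why track data plus a classifier scales to millions of daily observations.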
If we want better environmental data, we’ll need for-profit companies with the expertise and high-end sensors to pitch in too. Planet, a firm with an array of 140 satellites, takes daily snapshots of the entire Earth. Customers like insurance and financial firms love that sort of data. (It helps them understand weather and climate risk.) But Planet also offers it to services like Global Forest Watch, which maps deforestation and makes the information available to anyone (like activists who help bust illegal loggers). Meanwhile, Google’s skill in cloud-based data crunching helps illuminate the state of surface water: Google digitized 30 years of measurements from around the globe—extracting some from ancient magnetic tapes—then created an easy-to-use online tool that lets resource-poor countries figure out where their water needs protecting….(More)”.
Report by Amy O’Hara: “Data sharing across government agencies allows consumers, policymakers, practitioners, and researchers to answer pressing questions. Creating a data infrastructure to enable this data sharing for higher education data is challenging, however, due to legal, privacy, technical, and perception issues. To overcome these challenges, postsecondary education can learn from other domains to permit secure, responsible data access and use. Working models from both the public sector and academia show how sensitive data from multiple sources can be linked and accessed for authorized uses.
This brief describes best practices in use today and the emerging technology that could further protect future data systems, and it describes the “Five Safes” framework for controlling data access and use. To support decisions facing students, administrators, evaluators, and policymakers, a postsecondary infrastructure must support cycles of data discovery, request, access, analysis, review, and release. It must be cost-effective, secure, and efficient and, ideally, it will be highly automated, transparent, and adaptable. Other industries have successfully developed such infrastructures, and postsecondary education can learn from their experiences.
A functional data infrastructure relies on trust and control between the data providers, intermediaries, and users. The system should support equitable access for approved users and offer the ability to conduct independent analyses with scientific integrity at reasonable financial cost. Policymakers and developers should ensure the creation of expedient, convenient data access modes that allow for policy analyses. …
The “Five Safes” framework describes an approach for controlling data access and use. The five safes are: safe projects, safe people, safe settings, safe data, and safe outputs….(More)”.
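One way to read the framework is as a checklist that every data request must pass in full before access is granted. The sketch below models that reading; the five dimension names come from the brief, while the request fields and pass/fail rules are our own illustrative assumptions.

```python
# A minimal sketch of the "Five Safes" as an all-or-nothing access checklist.
# Dimension names follow the brief; the DataRequest fields are assumptions.
from dataclasses import dataclass

@dataclass
class DataRequest:
    project_approved: bool       # safe projects: approved, appropriate use
    researcher_accredited: bool  # safe people: trained, authorized user
    secure_setting: bool         # safe settings: controlled-access environment
    data_deidentified: bool      # safe data: disclosure risk minimized
    output_reviewed: bool        # safe outputs: results checked before release

def evaluate(req: DataRequest) -> dict:
    """Return a pass/fail result per dimension, plus the overall decision."""
    checks = {
        "projects": req.project_approved,
        "people":   req.researcher_accredited,
        "settings": req.secure_setting,
        "data":     req.data_deidentified,
        "outputs":  req.output_reviewed,
    }
    checks["grant_access"] = all(checks.values())
    return checks

print(evaluate(DataRequest(True, True, True, True, False)))
# -> the outputs check fails, so grant_access is False
```

In practice the five dimensions trade off against one another (for example, less de-identified data in a more secure setting), so real implementations weigh them jointly rather than as independent booleans.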
“Make FOIA Work is about re-imagining journalism through design, participation and collaboration. Faculty, staff and students at Emerson College and the Engagement Lab worked alongside the Boston Institute of Nonprofit Journalism (BINJ) and MuckRock, two independent and alternative news and information platforms and publishers, to produce a data-driven and engagement-based investigative reporting series that exposes corruption around the sales of guns in Massachusetts. Through design studios in participatory methods and data visualization, project participants created a participatory guide book for journalists, practitioners and community members on how to undertake participatory design projects with a focus on FOIA requests, community participation, and collaboration. The project also highlights the course syllabi in participatory design methods and data visualization….(More)”.
Conference Paper by Christine Meschede and Tobias Siebenlist: “Since the adoption of the United Nations’ Sustainable Development Goals (SDGs) in 2015 – an ambitious agenda to end poverty, combat environmental threats and ensure prosperity for everyone – some effort has been made regarding the adequate measuring of progress on its targets. As the crucial point is the availability of sufficient, comparable information, open data can play a key role. The coverage of open data, i.e., data that is machine-readable, freely available and reusable for everyone, is assessed by several measurement tools. We propose the use of open governmental data to make progress on the SDGs easy and transparent to measure. For this purpose, a mapping of open data categories to the SDGs is presented. Further, we argue that the SDGs need to be tackled in particular at the city level. For analyzing the current applicability of open data for measuring progress on the SDGs, we provide a small-scale case study of German open data portals and the embedded data categories and datasets. The results suggest that further standardization is needed in order to be able to use open data for comparing cities and their progress towards the SDGs….(More)”.
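To illustrate the kind of mapping the authors describe, here is a minimal sketch. The portal categories and the category-to-SDG mapping below are invented for illustration; they do not reproduce the paper’s actual mapping.

```python
# A toy sketch of mapping a city portal's open data categories onto SDGs
# and reporting which goals have any coverage. Mapping is assumed, not
# the one proposed in the paper.
PORTAL_CATEGORIES = {"air quality", "public transport", "budget", "schools"}

CATEGORY_TO_SDGS = {
    "air quality":      [3, 11, 13],  # health, sustainable cities, climate
    "public transport": [9, 11],      # infrastructure, sustainable cities
    "budget":           [16],         # accountable institutions
    "schools":          [4],          # quality education
    "energy use":       [7],          # affordable and clean energy
}

def sdg_coverage(categories):
    """Map a portal's dataset categories onto the SDGs they can help measure."""
    covered = {}
    for cat in categories:
        for goal in CATEGORY_TO_SDGS.get(cat, []):
            covered.setdefault(goal, []).append(cat)
    return dict(sorted(covered.items()))

print(sdg_coverage(PORTAL_CATEGORIES))
# e.g. {3: ['air quality'], 4: ['schools'], 9: ['public transport'], ...}
```

A shared, standardized mapping of this sort is what would let the coverage dictionaries of different cities be compared directly, which is the standardization gap the paper points to.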
Paper by Sandip Mukhopadhyay, Harry Bouwman and Mahadeo Prasad Jaiswal: “The efficient delivery of government services to the poor, or Bottom of the Pyramid (BOP), faces many challenges. A core problem is the lack of scalability, which could be solved by the rapid proliferation of platforms and associated ecosystems. Existing research involving platforms focuses on modularity, openness, ecosystem leadership and governance, as well as on their impact on innovation, scale and agility. However, existing studies fail to explore the role of platforms in scalable e-government service delivery at an empirical level. Based on an in-depth case study of the world’s largest biometric identity platform, used by millions of the poor in India, we develop a set of propositions connecting the attributes of a digital platform ecosystem to different indicators for the scalability of government service delivery. We found that modular architecture, combined with limited functionality in core modules, and open standards combined with controlled access and ecosystem governance enabled by keystone behaviour, have a positive impact on scalability. The research provides insights to policy-makers and government officials alike, particularly those in nations struggling to provide basic services to the poor and marginalised. …(More)”.
Report by the Institute for Public Relations: “Sixty-three percent of Americans view disinformation—deliberately biased and misleading information—as a “major” problem in society, on par with gun violence (63%) and terrorism (66%), according to the 2019 Institute for Public Relations Disinformation in Society Report.
The 2019 IPR Disinformation in Society Report surveyed 2,200 adults to determine the prevalence of disinformation, who is responsible for sharing disinformation, the level of trust in different information sources, and the parties responsible for combatting disinformation.
“One surprising finding was how significant of a problem both Republicans and Democrats rated disinformation,” said Dr. Tina McCorkindale, APR, President and CEO of the Institute for Public Relations. “Unfortunately, only a few organizations outside of the media literacy and news space devote resources to help fix it, including many of the perceived culprits responsible for spreading disinformation.”
More than half (51%) of the respondents said they encounter disinformation at least once a day, while 78% said they see it at least once a week. Four in five adults (80%) said they are confident in their ability to recognize false news and information. Additionally, nearly half of Americans (47%) said they “often” or “always” go to other sources to see if news and information are accurate….(More)”.