Information, Technology and Control in a Changing World: Understanding Power Structures in the 21st Century


Book edited by Blayne Haggart, Kathryn Henne, and Natasha Tusikov: “This book explores the interconnected ways in which the control of knowledge has become central to the exercise of political, economic, and social power. Building on the work of International Political Economy scholar Susan Strange, this multidisciplinary volume features experts from political science, anthropology, law, criminology, women’s and gender studies, and Science and Technology Studies, who consider how the control of knowledge is shaping our everyday lives. From “weaponised copyright” as a censorship tool, to the battle over control of the internet’s “guts,” to the effects of state surveillance at the Mexico–U.S. border, this book offers a coherent way to understand the nature of power in the twenty-first century…(More)”.

We Need a Data-Rich Picture of What’s Killing the Planet


Clive Thompson at Wired: “…Marine litter isn’t the only hazard whose contours we can’t fully see. The United Nations has 93 indicators to measure the environmental dimensions of “sustainable development,” and amazingly, the UN found that we have little to no data on 68 percent of them—like how rapidly land is being degraded, the rate of ocean acidification, or the trade in poached wildlife. Sometimes this is because we haven’t collected it; in other cases some data exists but hasn’t been shared globally, or it’s in a myriad of incompatible formats. No matter what, we’re flying blind. “And you can’t manage something if you can’t measure it,” says David Jensen, the UN’s head of environmental peacebuilding.

In other words, if we’re going to help the planet heal and adapt, we need a data revolution. We need to build a “digital ecosystem for the environment,” as Jensen puts it.

The good news is that we’ve got the tools. If there’s one thing tech excels at (for good and ill), it’s surveillance, right? We live in a world filled with cameras and pocket computers, titanic cloud computing, and the eerily sharp insights of machine learning. And this stuff can be used for something truly worthwhile: studying the planet.

There are already some remarkable cases of tech helping to break through the fog. Consider Global Fishing Watch, a nonprofit that tracks the world’s fishing vessels, looking for overfishing. They use everything from GPS-like signals emitted by ships to satellite infrared imaging of ship lighting, plugged into neural networks. (It’s massive, cloud-scale data: over 60 million data points per day, making the AI more than 90 percent accurate at classifying what type of fishing activity a boat is engaged in.)

“If a vessel is spending its time in an area that has little tuna and a lot of sharks, that’s questionable,” says Brian Sullivan, cofounder of the project and a senior program manager at Google Earth Outreach. Crucially, Global Fishing Watch makes its data open to anyone—so now the National Geographic Society is using it to lobby for new marine preserves, and governments and nonprofits use it to target illicit fishing.
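
The classification task described above is, at its core, supervised learning over engineered track features. The following sketch is purely illustrative: the features (mean speed, course variability), the synthetic data, and the small network are assumptions for demonstration, not Global Fishing Watch’s actual model or data.

```python
# Illustrative only: classifying vessel activity from AIS-style track
# features, loosely mirroring the supervised approach Global Fishing Watch
# describes. The features, synthetic data, and model are hypothetical.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1000

# Synthetic stand-in tracks: [mean speed (knots), std of course change (deg)].
# Trawling tends to be slow with frequent course changes; transiting is
# faster and straighter.
fishing = np.column_stack([rng.normal(3, 1, n), rng.normal(40, 10, n)])
transit = np.column_stack([rng.normal(12, 2, n), rng.normal(5, 2, n)])
X = np.vstack([fishing, transit])
y = np.array([1] * n + [0] * n)  # 1 = fishing, 0 = transiting

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0),
)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

At Global Fishing Watch’s scale the same idea runs over tens of millions of daily positions rather than a toy array, but the shape of the problem (track features in, activity label out) is the same.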

If we want better environmental data, we’ll need for-profit companies with the expertise and high-end sensors to pitch in too. Planet, a firm with an array of 140 satellites, takes daily snapshots of the entire Earth. Customers like insurance and financial firms love that sort of data. (It helps them understand weather and climate risk.) But Planet also offers it to services like Global Forest Watch, which maps deforestation and makes the information available to anyone (like activists who help bust illegal loggers). Meanwhile, Google’s skill in cloud-based data crunching helps illuminate the state of surface water: Google digitized 30 years of measurements from around the globe—extracting some from ancient magnetic tapes—then created an easy-to-use online tool that lets resource-poor countries figure out where their water needs protecting….(More)”.
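
The surface-water data mentioned at the end of the excerpt is also queryable programmatically: the underlying JRC Global Surface Water dataset is served through Google Earth Engine. A hedged sketch follows; it assumes an authenticated Earth Engine account, and the asset ID and band name are taken from the public data catalog and may differ by dataset version.

```python
# Hedged sketch: querying global surface-water occurrence via the Google
# Earth Engine Python API. Requires an authenticated Earth Engine account;
# the asset ID and band name may differ by dataset version.
import ee

ee.Initialize()

gsw = ee.Image("JRC/GSW1_4/GlobalSurfaceWater")
occurrence = gsw.select("occurrence")  # % of observed months with water present

# Mean water occurrence over an area of interest (a rough bounding box
# around Lake Chad, chosen purely for illustration).
aoi = ee.Geometry.Rectangle([13.0, 12.5, 15.5, 14.5])
stats = occurrence.reduceRegion(
    reducer=ee.Reducer.mean(),
    geometry=aoi,
    scale=300,       # metres per pixel
    maxPixels=1e9,
)
print(stats.getInfo())
```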

Postsecondary Data Infrastructure: What is Possible Today


Report by Amy O’Hara: “Data sharing across government agencies allows consumers, policymakers, practitioners, and researchers to answer pressing questions. Creating a data infrastructure to enable such sharing of higher education data is challenging, however, due to legal, privacy, technical, and perception issues. To overcome these challenges, postsecondary education can learn how other domains permit secure, responsible data access and use. Working models from both the public sector and academia show how sensitive data from multiple sources can be linked and accessed for authorized uses.

This brief describes best practices in use today, as well as emerging technologies that could further protect future data systems, and presents a framework, the “Five Safes,” for controlling data access and use. To support decisions facing students, administrators, evaluators, and policymakers, a postsecondary infrastructure must support cycles of data discovery, request, access, analysis, review, and release. It must be cost-effective, secure, and efficient and, ideally, it will be highly automated, transparent, and adaptable. Other industries have successfully developed such infrastructures, and postsecondary education can learn from their experiences.

A functional data infrastructure relies on trust and control between the data providers, intermediaries, and users. The system should support equitable access for approved users and offer the ability to conduct independent analyses with scientific integrity at reasonable financial cost. Policymakers and developers should ensure the creation of expedient, convenient data access modes that allow for policy analyses. …

The “Five Safes” framework describes an approach for controlling data access and use. The five safes are: safe projects, safe people, safe settings, safe data, and safe outputs….(More)”.
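
To make the framework concrete, here is a minimal sketch of the Five Safes as an access-control checklist. The dimension names come from the brief; the binary pass/fail structure is a simplification (in practice each “safe” is a graded judgment rather than a yes/no check), and the code is illustrative rather than part of the report.

```python
# Illustrative sketch: the "Five Safes" as a pre-release checklist.
# Dimension names follow the brief; the pass/fail logic is a simplification.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    safe_project: bool   # is this use of the data appropriate?
    safe_people: bool    # can the researcher be trusted to use it properly?
    safe_settings: bool  # does the access facility prevent unauthorised use?
    safe_data: bool      # is disclosure risk in the data itself minimised?
    safe_outputs: bool   # are the published results non-disclosive?

    def approve(self) -> bool:
        # A request must clear every dimension before data are released.
        return all([self.safe_project, self.safe_people, self.safe_settings,
                    self.safe_data, self.safe_outputs])

request = AccessRequest(safe_project=True, safe_people=True,
                        safe_settings=True, safe_data=True,
                        safe_outputs=False)
print(request.approve())  # False: outputs have not passed disclosure review
```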

Make FOIA Work


Make FOIA Work is about re-imagining journalism through design, participation, and collaboration. Faculty, staff, and students at Emerson College and the Engagement Lab worked alongside the Boston Institute of Nonprofit Journalism (BINJ) and MuckRock, two independent and alternative news and information platforms and publishers, to produce a data-driven, engagement-based investigative reporting series that exposes corruption around the sale of guns in Massachusetts. Through design studios in participatory methods and data visualization, project participants created a participatory guidebook for journalists, practitioners, and community members on how to undertake participatory design projects with a focus on FOIA requests, community participation, and collaboration. The project also highlights the course syllabi in participatory design methods and data visualization….(More)”.

Open Urban Data and the Sustainable Development Goals


Conference Paper by Christine Meschede and Tobias Siebenlist: “Since the adoption of the United Nations’ Sustainable Development Goals (SDGs) in 2015 – an ambitious agenda to end poverty, combat environmental threats and ensure prosperity for everyone – some effort has been made to adequately measure progress on its targets. As the crucial point is the availability of sufficient, comparable information, open data can play a key role. The coverage of open data, i.e., data that is machine-readable, freely available, and reusable by everyone, is assessed by several measurement tools. We propose the use of open governmental data to make progress towards the SDGs easy and transparent to measure. For this purpose, a mapping of open data categories to the SDGs is presented. Further, we argue that the SDGs need to be tackled in particular at the city level. To analyze the current applicability of open data for measuring progress on the SDGs, we provide a small-scale case study of German open data portals and the embedded data categories and datasets. The results suggest that further standardization is needed in order to be able to use open data for comparing cities and their progress towards the SDGs….(More)”.
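
The paper’s central device, a mapping from open data portal categories to the SDGs they could help measure, can be pictured as a simple lookup structure. The sketch below is hypothetical: the category names and SDG assignments are illustrative, not the authors’ actual mapping.

```python
# Hypothetical sketch of a category-to-SDG mapping; the assignments are
# illustrative, not the mapping presented in the paper.
CATEGORY_TO_SDGS = {
    "transport":     [9, 11],      # infrastructure; sustainable cities
    "environment":   [6, 13, 15],  # clean water; climate action; life on land
    "energy":        [7, 13],
    "health":        [3],
    "education":     [4],
    "public budget": [16],
}

def sdgs_covered(portal_categories):
    """Return the SDGs that a portal's dataset categories could help measure."""
    return sorted({sdg for cat in portal_categories
                   for sdg in CATEGORY_TO_SDGS.get(cat, [])})

print(sdgs_covered(["transport", "environment"]))  # [6, 9, 11, 13, 15]
```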

An open platform centric approach for scalable government service delivery to the poor: The Aadhaar case


Paper by Sandip Mukhopadhyay, Harry Bouwman and Mahadeo Prasad Jaiswal: “The efficient delivery of government services to the poor, or Bottom of the Pyramid (BOP), faces many challenges. A core problem is the lack of scalability, which could be solved by the rapid proliferation of platforms and associated ecosystems. Existing research involving platforms focuses on modularity, openness, ecosystem leadership and governance, as well as on their impact on innovation, scale and agility. However, existing studies fail to explore the role of platforms in scalable e-government service delivery at an empirical level. Based on an in-depth case study of the world’s largest biometric identity platform, used by millions of the poor in India, we develop a set of propositions connecting the attributes of a digital platform ecosystem to different indicators of the scalability of government service delivery. We found that modular architecture, combined with limited functionality in core modules, and open standards, combined with controlled access and ecosystem governance enabled by keystone behaviour, have a positive impact on scalability. The research provides insights to policy-makers and government officials alike, particularly those in nations struggling to provide basic services to the poor and marginalised. …(More)”.
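
The architectural finding, a core module with deliberately limited functionality exposed through open standards with controlled access, can be sketched in a few lines. The example below is hypothetical and drastically simplified; it is not the actual Aadhaar authentication API, only an illustration of the yes/no pattern of a thin identity core on which ecosystem services can build.

```python
# Hypothetical sketch of a thin identity core: the platform answers only
# yes/no and never returns the underlying record. Not the real Aadhaar API.
from dataclasses import dataclass

@dataclass(frozen=True)
class AuthRequest:
    identity_number: str  # the resident's ID
    biometric_hash: str   # hashed biometric captured at the point of service

def authenticate(req: AuthRequest, registry: dict) -> bool:
    """Limited core functionality: confirm or deny a match, nothing more.

    Keeping the core this small is what lets ecosystem services (banking,
    benefit delivery) scale on top of one minimal, standardised primitive.
    """
    return registry.get(req.identity_number) == req.biometric_hash

registry = {"0000-1111-2222": "a3f9c1"}  # illustrative stand-in store
print(authenticate(AuthRequest("0000-1111-2222", "a3f9c1"), registry))  # True
print(authenticate(AuthRequest("0000-1111-2222", "ffffff"), registry))  # False
```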

Disinformation Rated As Significant a Problem As Gun Violence and Terrorism


Report by the Institute for Public Relations: “Sixty-three percent of Americans view disinformation—deliberately biased and misleading information—as a “major” problem in society, on par with gun violence (63%) and terrorism (66%), according to the 2019 Institute for Public Relations Disinformation in Society Report.

The 2019 IPR Disinformation in Society Report surveyed 2,200 adults to determine the prevalence of disinformation, who is responsible for sharing disinformation, the level of trust in different information sources, and the parties responsible for combatting disinformation.

“One surprising finding was how significant of a problem both Republicans and Democrats rated disinformation,” said Dr. Tina McCorkindale, APR, President and CEO of the Institute for Public Relations. “Unfortunately, only a few organizations outside of the media literacy and news space devote resources to help fix it, including many of the perceived culprits responsible for spreading disinformation.”

More than half (51%) of the respondents said they encounter disinformation at least once a day, while 78% said they see it at least once a week. Four in five adults (80%) said they are confident in their ability to recognize false news and information. Additionally, nearly half of Americans (47%) said they “often” or “always” go to other sources to see if news and information are accurate….(More)”.

Access to Data in Connected Cars and the Recent Reform of the Motor Vehicle Type Approval Regulation


Paper by Wolfgang Kerber and Daniel Moeller: “The need for regulatory solutions for access to the in-vehicle data and resources of connected cars is one of the major controversial and unsolved policy issues. Last year the EU revised the Motor Vehicle Type Approval Regulation, which already included a FRAND-like solution for access to repair and maintenance information (RMI) to protect competition in the automotive aftermarkets. However, the transition to connected cars changes the technological conditions for this regulatory solution significantly. This paper analyzes the reform of the type approval regulation and shows that the regulatory solutions for access to RMI are so far insufficient to deal with the challenges that accompany increased connectivity, e.g. with regard to new remote diagnostic, repair, and maintenance services. Therefore, an important result of the paper is that the transition to connected cars will require a further reform of the rules for regulated access to RMI (especially with regard to data access, interoperability, and safety/security issues). However, our analysis also suggests that the basic approach of the current regulated access regime for RMI in the type approval regulation can serve as a model for developing general solutions to the currently unsolved problems of access to in-vehicle data and resources in the ecosystem of connected driving….(More)”.

The Public Domain


Article by Ilanah Simon Fhima: “…explores whether it is possible to identify and delineate the public domain in intellectual property law.

The article begins with a review of existing IP scholarship on the public domain. Identifying that the prevailing model of the public domain relies upon analogies to the commons, it questions how well a model based upon medieval real property holdings can be expected to capture the nuances of intangible property in the internet age. Additionally, academic focus is predominantly directed at copyright law, at the expense of other fields of intellectual endeavour, and adopts a US-centric viewpoint, which may not reflect the different settlement reached in the EU-influenced UK. The article then explores whether an alternative rhetoric of ‘no property’ – tangible objects which cannot be propertised – or a potential analogy with public rights of access to land might provide more suitable alternatives.

Ultimately, the article concludes that it is impossible to draw a complete map of the public domain in terms of what should not be propertised. Instead, it advocates for a positive conception of the public domain, based on individual uses that should always remain free, and the general public interests underlying those uses. This more flexible approach allows the law to evolve in response to specific changes in technology and social conditions, while at the same time maintaining a focus on core non-negotiable freedoms. It also considers whether the same methodology can be applied to tangible property….(More)”.

Transparency in the EU Institutions – An Overview


Paper by Gianluca Sgueo: “The concept of transparency in the public sector has existed, in various forms, for centuries. Academics, however, agree that transparency should be qualified as a modern concept. The wave of government reforms that occurred in the 1950s and 1960s fostered the culture of transparent, accessible and accountable bureaucracies. In the 1990s, following the spread of digital technologies, terms like “Government 2.0” and “open government” were coined to describe the use that public administrations made of the Internet and other digital tools to foster civic engagement, improve transparency, and enhance the efficiency of government services.

Transparency has come to the fore again over the past few years. National and supranational regulators (including the European Union) have placed transparency among the priorities in their regulatory agendas. Enhanced transparency in decision-making is considered to be a solution to the decline of trust in the public sector, a limit to the negative impact of conspiracy theories and fake news, and also a way to revitalise civic engagement.

EU institutions promote transparency across different lines of action. Examples include the ongoing debates on reforming the Union’s legislative procedure, regulating lobbying activities, making data available in open formats, and digitalising services. Also relevant is the role of the European Ombudsman in promoting a culture of transparency at the EU level.

Studies suggest that transparency and participation in public governance have a positive impact on the accountability of EU institutions, and hence on citizens’ perceptions of their activities. The present briefing offers an overview of the actions that EU institutions are implementing to foster transparency, analyzing the potential benefits and briefly discussing possible drawbacks…(More)”.