Capacities for urban transformations governance and the case of New York City
Paper by Katharina Hölscher et al: “The narrative of urban sustainability transformations epitomises the hope that urban governance can create the conditions to plan and govern cities in a way that they contribute to local and global sustainability and resilience. So far, urban governance is not delivering: novel governance approaches are emerging in cities worldwide, yet are unable to transform conventional policymaking and planning to allow for innovative, co-beneficial and long-term solutions and actions to emerge and institutionalise. We present a capacities framework for urban transformations governance, starting from the need to fulfil distinct output functions (‘what needs to happen’) for mobilising and influencing urban transformation dynamics. The framework helps to diagnose and inform urban governance for responding to disturbances (stewarding capacity), phasing-out drivers of path-dependency (unlocking capacity), creating and embedding novelties (transformative capacity) and coordinating multi-actor processes (orchestrating capacity). Our case study of climate governance in New York City exemplifies the framework’s applicability and explanatory power to identify conditions and activities facilitating transformation (governance), and to reveal gaps and barriers of these vis-à-vis the existing governance regime. Our framework thereby functions as a tool to explore what new forms of urban transformation governance are emerging, how effective these are, and how to strengthen capacities….(More)”.
Open Data Retrospective
Laura Bacon at Luminate: “Our global philanthropic organisation – previously the Government & Citizen Engagement (GCE) initiative at Omidyar Network, now Luminate – has been active in the open data space for over a decade. In that time, we have invested more than $50m in organisations and platforms that are working to advance open data’s potential, including Open Data Institute, IMCO, Open Knowledge, ITS Rio, Sunlight, GovLab, Web Foundation, Open Data Charter, and Open Government Partnership.
Ahead of our transition from GCE to Luminate last year, we wanted to take a step back and assess the field in order to cultivate a richer understanding of the evolution of open data—including its critical developments, drivers of change, and influential actors. This research would help inform our own strategy and provide valuable insight that we can share with the broader open data ecosystem.
First, what is open data? Open data is data that can be freely used, shared, and built-upon by anyone, anywhere, for any purpose. At its best, open government data can empower citizens, improve governments, create opportunities, and help solve public problems. Have you used a transport app to find out when the next bus will arrive? Or a weather app to look up a forecast? When using a real estate website to buy or rent a home, have you also reviewed its proximity to health, education, and recreational facilities or checked out neighborhood crime rates? If so, your life has been impacted by open data.
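The transport-app example rests on a simple pattern: a city publishes a machine-readable feed, and anyone can build on it. Below is a minimal sketch of that pattern in Python; the endpoint URL and JSON field names are hypothetical stand-ins (real portals vary, and many publish GTFS-Realtime or CKAN-style APIs instead).

```python
import requests

# Hypothetical open-data endpoint publishing real-time bus arrivals as JSON.
# The URL and field names are illustrative only; real city feeds differ.
FEED_URL = "https://data.example-city.gov/api/bus-arrivals"

def next_arrivals(stop_id: str, limit: int = 3) -> list[str]:
    """Return the next few expected arrival times for a given bus stop."""
    resp = requests.get(FEED_URL, params={"stop": stop_id}, timeout=10)
    resp.raise_for_status()
    arrivals = resp.json().get("arrivals", [])
    return [a["expected_time"] for a in arrivals[:limit]]

if __name__ == "__main__":
    for t in next_arrivals("stop-42"):
        print("Next bus at", t)
```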
The Open Data Retrospective
We commissioned Dalberg, a global strategic advisory firm, to conduct an Open Data Retrospective to explore: ‘how and why did the open data field evolve globally over the past decade?’ as well as ‘where is the field today?’ With the concurrent release of the report “The State of Open Data” – led by IDRC and the Open Data for Development initiative – we thought this would be a great time to make public the report we’d commissioned.
You can see Dalberg’s open data report here, and its affiliated data here. Please note, this presentation is a modification of the report. Several sections and slides have been removed for brevity and/or confidentiality. Therefore, some details about particular organisations and strategies are not included in this deck.
Evolution and impact
Dalberg’s report covers the trajectory of the open data field and characterises it as: inception (pre-2008), systematisation (2009-2010), expansion (2011-2015), and reevaluation (2016-2018). This characterisation varies by region and sector, but generally captures the evolution of the open data movement….(More)”.
Datafication, development and marginalised urban communities: an applied data justice framework
Paper by Richard Heeks et al: “The role of data within international development is rapidly expanding. However, the recency of this phenomenon means analysis has been lagging; particularly, analysis of broader impacts of real-world initiatives. Addressing this gap through a focus on data’s increasing presence in urban development, this paper makes two contributions. First – drawing from the emerging literature on ‘data justice’ – it presents an explicit, systematic and comprehensive new framework that can be used for analysis of datafication. Second, it applies the framework to four mapping initiatives in cities of the global South. These initiatives capture and visualise new data about marginalised communities: residents living in slums and other informal settlements about whom data has traditionally been lacking. Analysing across procedural, rights, instrumental and structural dimensions, it finds these initiatives deliver real incremental gains for their target communities. But it is external actors and wealthier communities that gain more; thus, increasing relative inequality….(More)”.
Political Corruption in a World in Transition
Book edited by Jonathan Mendilow and Éric Phélippeau: “This book argues that the mainstream definitions of corruption, and the key expectations they embed concerning the relationship between corruption, democracy, and the process of democratization, require reexamination. Even critics who did not consider the stable institutions and legal clarity of veteran democracies a cure-all assumed that the process of widening influence on government decision making and implementation allows non-elites to defend their interests, define the acceptable sources and uses of wealth, and demand government accountability. This has proved correct, especially insofar as ‘petty corruption’ is involved. But the assumption that corruption necessarily involves the evasion of democratic principles and a ‘market approach’ in which the corrupt seek to maximize profit does not exhaust the possible incentives for corruption, the types of behaviors involved (for obvious reasons, the tendency in the literature is to focus on bribery), or the range of situations that ‘permit’ corruption in democracies. In the effort to identify some of the problems that require recognition, and to offer a more exhaustive alternative, the chapters in this book focus on corruption in democratic settings (including NGOs and the United Nations, which have so far been largely ignored), while focusing mainly on behaviors other than bribery….(More)”.
The Age of Digital Interdependence
Report of the High-level Panel on Digital Cooperation: “The immense power and value of data in the modern economy can and must be harnessed to meet the SDGs, but this will require new models of collaboration. The Panel discussed potential pooling of data in areas such as health, agriculture and the environment to enable scientists and thought leaders to use data and artificial intelligence to better understand issues and find new ways to make progress on the SDGs. Such data commons would require criteria for establishing relevance to the SDGs, standards for interoperability, rules on access and safeguards to ensure privacy and security.
Anonymised data – information that is rendered anonymous in such a way that the data subject is not or no longer identifiable – about progress toward the SDGs is generally less sensitive and controversial than the use of personal data of the kind companies such as Facebook, Twitter or Google may collect to drive their business models, or facial and gait data that could be used for surveillance. However, personal data can also serve development goals, if handled with proper oversight to ensure its security and privacy.
For example, individual health data is extremely sensitive – but many people’s health data, taken together, can allow researchers to map disease outbreaks, compare the effectiveness of treatments and improve understanding of conditions. Aggregated data from individual patient cases was crucial to containing the Ebola outbreak in West Africa. Private and public sector healthcare providers around the world are now using various forms of electronic medical records. These help individual patients by making it easier to personalise health services, but the public health benefits require these records to be interoperable.
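The report’s point about aggregation can be made concrete with a toy sketch: individual case records reduced to counts per district and week, the kind of aggregate an outbreak map is built from. The records below are invented for illustration, not drawn from any real dataset.

```python
from collections import Counter
from datetime import date

# Toy records standing in for individual patient cases (entirely fictional).
# Aggregating to (district, ISO week) counts supports outbreak mapping
# without publishing any single patient's record.
cases = [
    {"district": "West", "onset": date(2014, 8, 4)},
    {"district": "West", "onset": date(2014, 8, 6)},
    {"district": "North", "onset": date(2014, 8, 7)},
    {"district": "West", "onset": date(2014, 8, 12)},
]

weekly_counts = Counter(
    (c["district"], c["onset"].isocalendar()[1]) for c in cases
)

for (district, week), n in sorted(weekly_counts.items()):
    print(f"district={district} ISO-week={week} cases={n}")
```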
There is scope to launch collaborative projects to test the interoperability of data, standards and safeguards across the globe. The World Health Assembly’s consideration of a global strategy for digital health in 2020 presents an opportunity to launch such projects, which could initially be aimed at global health challenges such as Alzheimer’s and hypertension.
Improved digital cooperation on a data-driven approach to public health has the potential to lower costs, build new partnerships among hospitals, technology companies, insurance providers and research institutes and support the shift from treating diseases to improving wellness. Appropriate safeguards are needed to ensure the focus remains on improving health care outcomes. With testing, experience and necessary protective measures as well as guidelines for the responsible use of data, similar cooperation could emerge in many other fields related to the SDGs, from education to urban planning to agriculture…(More)”.
The Ethics of Big Data Applications in the Consumer Sector
Paper by Markus Christen et al.: “Business applications relying on processing of large amounts of heterogeneous data (Big Data) are considered to be key drivers of innovation in the digital economy. However, these applications also pose ethical issues that may undermine the credibility of data-driven businesses. In our contribution, we discuss ethical problems that are associated with Big Data such as: How are core values like autonomy, privacy, and solidarity affected in a Big Data world? Are some data a public good? Or: Are we obliged to divulge personal data to a certain degree in order to make the society more secure or more efficient?
We answer those questions by first outlining the ethical topics that are discussed in the scientific literature and the lay media using a bibliometric approach. Second, referring to the results of expert interviews and workshops with practitioners, we identify core norms and values affected by Big Data applications—autonomy, equality, fairness, freedom, privacy, property-rights, solidarity, and transparency—and outline how they are exemplified in examples of Big Data consumer applications, for example, in terms of informational self-determination, non-discrimination, or free opinion formation. Based on use cases such as personalized advertising, individual pricing, or credit risk management we discuss the process of balancing such values in order to identify legitimate, questionable, and unacceptable Big Data applications from an ethics point of view. We close with recommendations on how practitioners working in applied data science can deal with ethical issues of Big Data….(More)”.
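By way of illustration only (this is not the authors’ actual pipeline), a first bibliometric pass of the kind described can be as simple as counting value-related terms across a corpus of abstracts. The term list mirrors the values named above; the corpus and matching logic are invented for the sketch.

```python
import re
from collections import Counter

# Values identified in the paper; the abstracts below are made up.
VALUE_TERMS = {"autonomy", "equality", "fairness", "freedom",
               "privacy", "property", "solidarity", "transparency"}

abstracts = [
    "Personalized advertising raises privacy and autonomy concerns ...",
    "Individual pricing may undermine fairness and transparency ...",
    "Credit risk scoring, solidarity and equality in insurance markets ...",
]

counts = Counter()
for text in abstracts:
    tokens = re.findall(r"[a-z]+", text.lower())
    counts.update(t for t in tokens if t in VALUE_TERMS)

for term, n in counts.most_common():
    print(f"{term}: {n}")
```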
The clinician crowdsourcing challenge: using participatory design to seed implementation strategies
Paper by Rebecca E. Stewart et al: “In healthcare settings, system and organization leaders often control the selection and design of implementation strategies even though frontline workers may have the most intimate understanding of the care delivery process, and factors that optimize and constrain evidence-based practice implementation within the local system. Innovation tournaments, a structured participatory design strategy to crowdsource ideas, are a promising approach to participatory design that may increase the effectiveness of implementation strategies by involving end users (i.e., clinicians). We utilized a system-wide innovation tournament to garner ideas from clinicians about how to enhance the use of evidence-based practices (EBPs) within a large public behavioral health system…(More)”
Study finds that a GPS outage would cost $1 billion per day
Eric Berger at Ars Technica: “….one of the most comprehensive studies on the subject has assessed the value of this GPS technology to the US economy and examined what effect a 30-day outage would have—whether it’s due to a severe space weather event or “nefarious activity by a bad actor.” The study was sponsored by the US government’s National Institute of Standards and Technology and performed by a North Carolina-based research organization named RTI International.
Economic effect
As part of the analysis, researchers spoke to more than 200 experts in the use of GPS technology for various services, from agriculture to the positioning of offshore drilling rigs to location services for delivery drivers. (If they’d spoken to me, I’d have said the value of using GPS to navigate Los Angeles freeways and side streets was incalculable). The study covered a period from 1984, when the nascent GPS network was first opened to commercial use, through 2017. It found that GPS has generated an estimated $1.4 trillion in economic benefits during that time period.
The researchers found that the largest benefit, valued at $685.9 billion, came in the “telecommunications” category, including improved reliability and bandwidth utilization for wireless networks. Telematics (efficiency gains, cost reductions, and environmental benefits through improved vehicle dispatch and navigation) ranked as the second most valuable category at $325 billion. Location-based services on smartphones was third, valued at $215 billion.
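To put those category figures in proportion, here is simple arithmetic over the numbers quoted above (shares rounded; only the figures reported in the study are used).

```python
# Figures as quoted from the RTI study, in billions of US dollars.
TOTAL_BENEFIT = 1_400.0  # ~$1.4 trillion in benefits, 1984-2017
categories = {
    "telecommunications": 685.9,
    "telematics": 325.0,
    "location-based services": 215.0,
}

for name, value in categories.items():
    share = value / TOTAL_BENEFIT * 100
    print(f"{name}: ${value:.1f}B ({share:.0f}% of total)")

remainder = TOTAL_BENEFIT - sum(categories.values())
print(f"all other categories: ~${remainder:.1f}B")
# Prints roughly 49%, 23% and 15%, leaving ~$174B across the remaining sectors.
```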
Notably, the value of GPS technology to the US economy is growing. According to the study, 90 percent of the technology’s financial impact has come since just 2010, or just 20 percent of the study period. Some sectors of the economy are only beginning to realize the value of GPS technology, or are identifying new uses for it, the report says, indicating that its value as a platform for innovation will continue to grow.
Outage impact
In the case of some adverse event leading to a widespread outage, the study estimates that the loss of GPS service would have a $1 billion per-day impact, although the authors acknowledge this is at best a rough estimate. It would likely be higher during the planting season of April and May, when farmers are highly reliant on GPS technology for information about their fields.
To assess the effect of an outage, the study looked at several different variables. Among them was “precision timing” that enables a number of wireless services, including the synchronization of traffic between carrier networks, wireless handoff between base stations, and billing management. Moreover, higher levels of precision timing enable higher bandwidth and provide access to more devices. (For example, the implementation of 4G LTE technology would have been impossible without GPS technology)….(More)”
The war to free science
Brian Resnick and Julia Belluz at Vox: “The 27,500 scientists who work for the University of California generate 10 percent of all the academic research papers published in the United States.
Their university recently put them in a strange position: Sometime this year, these scientists will not be able to directly access much of the world’s published research they’re not involved in.
That’s because in February, the UC system — one of the country’s largest academic institutions, encompassing Berkeley, Los Angeles, Davis, and several other campuses — dropped its nearly $11 million annual subscription to Elsevier, the world’s largest publisher of academic journals.
On the face of it, this seemed like an odd move. Why cut off students and researchers from academic research?
In fact, it was a principled stance that may herald a revolution in the way science is shared around the world.
The University of California decided it doesn’t want scientific knowledge locked behind paywalls, and thinks the cost of academic publishing has gotten out of control.
Elsevier owns around 3,000 academic journals, and its articles account for some 18 percent of all the world’s research output. “They’re a monopolist, and they act like a monopolist,” says Jeffrey MacKie-Mason, head of the campus libraries at UC Berkeley and co-chair of the team that negotiated with the publisher. Elsevier makes huge profits on its journals, generating billions of dollars a year for its parent company RELX.
This is a story about more than subscription fees. It’s about how a private industry has come to dominate the institutions of science, and how librarians, academics, and even pirates are trying to regain control.
The University of California is not the only institution fighting back. “There are thousands of Davids in this story,” says University of California Davis librarian MacKenzie Smith, who, like so many other librarians around the world, has been pushing for more open access to science. “But only a few big Goliaths.”…(More)”.
100 Radical Innovation Breakthroughs for the future
The Radical Innovation Breakthrough Inquirer for the European Commission: “This report provides insights on 100 emerging developments that may exert a strong impact on global value creation and offer important solutions to societal needs. We identified this set of emerging developments through a carefully designed procedure that combined machine learning algorithms and human evaluation. After successive waves of selection and refinement, the resulting 100 emerging topics were subjected to several assessment procedures, including expert consultation and analysis of related patents and publications.
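The report does not publish its selection code, but the flavour of a “machine learning plus human evaluation” procedure can be sketched: cluster short technology descriptions to surface candidate topics, then hand the clusters to experts for review and refinement. Everything below (the texts, cluster count and vectoriser settings) is an illustrative assumption, not the Commission’s actual method.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Made-up technology snippets standing in for a much larger text corpus.
docs = [
    "solid-state batteries for electric vehicles",
    "lithium-sulfur battery chemistry breakthroughs",
    "federated learning for privacy-preserving AI",
    "on-device machine learning accelerators",
    "CRISPR-based gene therapies",
    "synthetic biology for biomanufacturing",
]

# Vectorise the snippets and group them into candidate topic clusters.
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Human experts would then review, merge and rename these clusters.
for cluster in range(3):
    members = [d for d, label in zip(docs, labels) if label == cluster]
    print(f"candidate topic {cluster}: {members}")
```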
Having analysed the potential importance of each of these innovations for Europe, their current maturity and the relative strength of Europe in related R&D, we can make some general policy recommendations that follow.
However, it is important to note that our recommendations are based on the extremes of the distributions, and thus not all RIBs are named under the recommendations. Yet, the totality of the set of Radical Innovation Breakthrough (RIB) and Radical Societal Breakthrough (RSB) descriptions and their recent progress directions constitutes an important collection of intelligence material that can inform strategic planning in research and innovation policy, industry and enterprise policy, and local development policy….(More)”.