The city as platform


From the report of the 2015 Aspen Institute Roundtable on Information Technology: “In the age of ubiquitous Internet connections, smartphones and data, the future vitality of cities is increasingly based on their ability to use digital networks in intelligent, strategic ways. While we are accustomed to thinking of cities as geophysical places governed by mayors, conventional political structures and bureaucracies, this template of city governance is under great pressure to evolve. Urban dwellers now live their lives in all sorts of hyper-connected virtual spaces, pulsating with real-time information, intelligent devices, remote-access databases and participatory crowdsourcing. Expertise is distributed, not centralized. Governance is not just a matter of winning elections and assigning tasks to bureaucracies; it is about the skillful collection and curation of information as a way to create new affordances for commerce and social life.

Except among a small class of vanguard cities, however, the far-reaching implications of the “networked city” for economic development, urban planning, social life and democracy have not been explored in depth. The Aspen Institute Communications and Society Program thus convened an eclectic group of thirty experts to explore how networking technologies are rapidly changing the urban landscape in nearly every dimension. The goal was to learn how open networks, online cooperation and open data can enhance urban planning and administration, and more broadly, how they might improve economic opportunity and civic engagement. The conference, the 24th Annual Aspen Roundtable on Information Technology, also addressed the implications of new digital technologies for urban transportation, public health and safety, and socio-economic inequality….(Download the InfoTech 2015 Report)”

Exploring the economic value of open government data


Fatemeh Ahmadi Zeleti et al in Government Information Quarterly: “Business models for open data have emerged in response to the economic opportunities presented by the increasing availability of open data. However, scholarly efforts providing elaborations, rigorous analysis and comparison of open data models are very limited. This could be partly attributed to the fact that most discussions on Open Data Business Models (ODBMs) are predominantly in the practice community. This shortcoming has resulted in a growing list of ODBMs which, on closer examination, are not clearly delineated and lack clear value orientation. This has made the understanding of value creation and exploitation mechanisms in existing open data businesses difficult and challenging to transfer. Following the Design Science Research (DSR) tradition, we developed a 6-Value (6-V) business model framework as a design artifact to facilitate the explication and detailed analysis of existing ODBMs in practice. Based on the results from the analysis, we identify business model patterns and emerging core value disciplines for open data businesses. Our results not only help streamline existing ODBMs and help in linking them to the overall business strategy, but could also guide governments in developing the required capabilities to support and sustain the business models….(More)”

 

Zika Emergency Puts Open Data Policies to the Test


Larry Peiperl and Peter Hotez at PLOS: “The spreading epidemic of Zika virus, with its putative and alarming associations with Guillain-Barre syndrome and infant microcephaly, has arrived just as several initiatives have come into place to minimize delays in sharing the results of scientific research.

In September 2015, in response to concerns that research publishing practices had delayed access to crucial information in the Ebola crisis, the World Health Organization convened a consultation “[i]n recognition of the need to streamline mechanisms of data dissemination—globally and in as close to real-time as possible” in the context of public health emergencies.

Participating medical journal editors, representing PLOS, BMJ and Nature journals and NEJM, provided a statement that journals should not act to delay access to data in a public health emergency: “In such scenarios, journals should not penalize, and, indeed, should encourage or mandate public sharing of relevant data…”

In a subsequent Comment in The Lancet, authors from major research funding organizations expressed support for data sharing in public health emergencies. The International Committee of Medical Journal Editors (ICMJE), meeting in November 2015, lent further support to the principles of the WHO consultation by amending ICMJE “Recommendations” to endorse data sharing for public health emergencies of any geographic scope.

Now that WHO has declared Zika to be a Public Health Emergency of International Concern, responses from these groups in recent days appear consistent with their recent declarations.

The ICMJE has announced that “In light of the need to rapidly understand and respond to the global emergency caused by the Zika virus, content in ICMJE journals related to Zika virus is being made free to access. We urge other journals to do the same. Further, as stated in our Recommendations, in the event of a public health emergency (as defined by public health officials), information with immediate implications for public health should be disseminated without concern that this will preclude subsequent consideration for publication in a journal.” (www.icmje.org, accessed 9 February 2016)

WHO has implemented special provisions for research manuscripts relevant to the Zika epidemic that are submitted to the WHO Bulletin; such papers “will be assigned a digital object identifier and posted online in the “Zika Open” collection within 24 hours while undergoing peer review. The data in these papers will thus be attributed to the authors while being freely available for reader scrutiny and unrestricted use” under a Creative Commons Attribution License (CC BY IGO 3.0).

At PLOS, where open access and data sharing apply as a matter of course, all PLOS journals aim to expedite peer review evaluation, pre-publication posting, and data sharing from research relevant to the Zika outbreak. PLOS Currents Outbreaks offers an online platform for rapid publication of preliminary results, PLOS Neglected Tropical Diseases has committed to provide priority handling of Zika reports in general, and other PLOS journals will prioritize submissions within their respective scopes. The PLOS Zika Collection page provides central access to relevant and continually updated content from across the PLOS journals, blogs, and collaborating organizations.

Today, the Wellcome Trust has issued a statement urging journals to commit to “make all content concerning the Zika virus free to access,” and funders to “require researchers undertaking work relevant to public health emergencies to set in place mechanisms to share quality-assured interim and final data as rapidly and widely as possible, including with public health and research communities and the World Health Organisation.” Among 31 initial signatories are such journals and publishers as PLOS, Springer Nature, Science journals, The JAMA Network, eLife, the Lancet, and the New England Journal of Medicine; and funding organizations including the Bill and Melinda Gates Foundation, UK Medical Research Council, US National Institutes of Health, Wellcome Trust, and other major national and international research funders.

This policy shift prompts reconsideration of how we publish urgently needed data during a public health emergency….(More)”

Civic hacking as data activism and advocacy: A history from publicity to open government data


Andrew R Schrock in New Media and Society: “The civic hacker tends to be described as anachronistic, an ineffective “white hat” compared to more overtly activist cousins. By contrast, I argue that civic hackers’ politics emerged from a distinct historical milieu and include potentially powerful modes of political participation. The progressive roots of civic data hacking can be found in early 20th-century notions of “publicity” and the right to information movement. Successive waves of activists saw the Internet as a tool for transparency. The framing of openness shifted in meaning from information to data, weakening mechanisms for accountability even as it opened up new forms of political participation. Drawing on a year of interviews and participant observation, I suggest civic data hacking can be framed as a form of data activism and advocacy: requesting, digesting, contributing to, modeling, and contesting data. I conclude that civic hackers are utopian realists involved in the crafting of algorithmic power and discussing ethics of technology design. They may be misunderstood because open data remediates previous forms of openness. In the process, civic hackers transgress established boundaries of political participation….(More)”

Linked Open Economy: Take Full Advantage of Economic Data


Paper by Michalis N. Vafopoulos et al: “For decades, information related to public finances was out of reach for most of the people. Gradually, public budgets and tenders are becoming openly available, and global initiatives promote fiscal transparency and open product and price data. But the poor quality of economic open data undermines their potential to answer interesting questions (e.g. the efficiency of public funds and market processes). Linked Open Economy (LOE) has been developed as a top-level conceptualization that interlinks publicly available economic open data by modelling the flows incorporated in public procurement together with the market process to address complex policy issues. The LOE approach is used extensively to enrich open economic data ranging from budgets and spending to prices. Developers, professionals, public administrations and any other interested parties use and customize the LOE model to develop new systems, to enable information exchange between systems, to integrate data from heterogeneous sources and to publish open data related to economic activities….(More)”
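To make the interlinking concrete, here is a minimal Python sketch of the kind of question that becomes answerable once budgets, tenders and market prices are connected. The entities, field names and figures below are invented for illustration; they are not the actual LOE vocabulary or real fiscal data.

```python
# Hypothetical sketch: interlinking procurement records with open market
# price data, in the spirit of an LOE-style model. All names and numbers
# are illustrative, not the real LOE ontology or real budgets.

budgets = {"B2015-EDU": {"agency": "Ministry of Education", "allocated": 1_000_000}}
tenders = [
    {"id": "T-101", "budget": "B2015-EDU", "item": "laptops", "awarded": 400_000},
    {"id": "T-102", "budget": "B2015-EDU", "item": "textbooks", "awarded": 250_000},
]
market_prices = {"laptops": 380_000, "textbooks": 260_000}  # open price data

def budget_utilisation(budget_id):
    """Share of a budget line committed through awarded tenders."""
    spent = sum(t["awarded"] for t in tenders if t["budget"] == budget_id)
    return spent / budgets[budget_id]["allocated"]

def price_gap(tender):
    """Awarded amount minus open market price for the same item."""
    return tender["awarded"] - market_prices[tender["item"]]

print(budget_utilisation("B2015-EDU"))      # fraction of the budget committed
print([price_gap(t) for t in tenders])      # over/under-payment per tender
```

Neither question can be asked of a budget file or a price feed alone; it is the links between the datasets that carry the analytic value, which is the point the paper makes about poor-quality, unlinked open data.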

Open data and (15 million!) new measures of democracy


Joshua Tucker in the Washington Post: “Last month the University of Gothenburg’s V-Dem Institute released a new “Varieties of Democracy” dataset. It provides about 15 million data points on democracy, including 39 democracy-related indices. It can be accessed at v-dem.net along with supporting documentation. I asked Staffan I. Lindberg, Director of the V-Dem Institute and one of the directors of the project, a few questions about the new data. What follows is a lightly edited version of his answers.


Women’s Political Empowerment Index for Southeast Asia (Data: V-Dem data version 5; Figure: V-Dem Institute, University of Gothenburg, Sweden)

Joshua Tucker: What is democracy, and is it even really possible to have quantitative measures of democracy?

Staffan Lindberg: There is no consensus on the definition of democracy and how to measure it. The understanding of what a democracy really is varies across countries and regions. This motivates the V-Dem approach not to offer one standard definition of the concept but instead to distinguish among five different principles of democracy: Electoral, Liberal, Participatory, Deliberative, and Egalitarian democracy. All of these principles have played prominent roles in current and historical discussions about democracy. Our measurement of these principles is based on two types of data, factual data collected by assisting researchers and survey responses by country experts, which are combined using a rather complex measurement model (a “custom-designed Bayesian ordinal item response theory model”; for details see the V-Dem Methodology document)….(More)
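To see why a measurement model is needed at all, consider a toy Python sketch: two experts agree on the ordering of countries but apply the ordinal scale differently, so raw averages would conflate an expert's severity with a country's performance. Z-scoring within each expert before averaging, shown below, is a drastically simplified stand-in for V-Dem's Bayesian ordinal item response theory model; the ratings are invented.

```python
# Toy illustration (NOT V-Dem's actual model): removing each expert's
# personal baseline before combining ordinal ratings. The real model is
# a custom Bayesian ordinal IRT model; this only shows why raw averaging
# of expert scores is not enough.

from statistics import mean, pstdev

ratings = {  # expert -> {country: rating on a 0-4 ordinal scale}
    "expert_a": {"X": 3, "Y": 1, "Z": 2},  # a tough grader
    "expert_b": {"X": 4, "Y": 2, "Z": 3},  # a generous grader, same ordering
}

def latent_scores(ratings):
    """Z-score within each expert, then average across experts per country."""
    per_expert = {}
    for expert, rs in ratings.items():
        mu, sigma = mean(rs.values()), pstdev(rs.values())
        per_expert[expert] = {c: (v - mu) / sigma for c, v in rs.items()}
    countries = next(iter(ratings.values())).keys()
    return {c: mean(per_expert[e][c] for e in ratings) for c in countries}

print(latent_scores(ratings))  # both experts now agree on standardized scores
```

After standardization the two experts yield identical country scores, because their disagreement was entirely about scale use, not about the countries; the Bayesian IRT model handles this (plus varying expert reliability and thresholds) in a principled way.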

Open government data and why it matters


Australian Government: “This was a key focus of the Prime Minister’s $1.1 billion innovation package announced this month.

The Bureau of Communications Research (BCR) today released analysis of the impact of open government data, revealing its potential to generate up to $25 billion per year, or 1.5 per cent of Australia’s GDP.

‘In Australia, users can already access and re-use more than 7000 government data sets published on data.gov.au,’ said Dr Paul Paterson, Chief Economist and Head of the Bureau of Communications Research (BCR).

‘Some of the high-value data sets include geospatial/mapping data, health data, transport data, mining data, environmental data, demographics data, and real-time emergency data.

‘Many Australians are unaware of the flow-on benefits from open government data as a result of the increased innovation and informed choice it creates. For example, open data has the power to generate new careers, more efficient government revenues, improved business practices, and drive better public engagement.’

Open data dusts off the art world


Suzette Lohmeyer at GCN: “Open data is not just for spreadsheets. Museums are finding ways to convert even the provenance of artwork into open data, offering an out-of-the-box lesson in accessibility to public sector agencies. The specific use case could be of interest to government as well — many cities and states have sizeable art collections, and the General Services Administration owns more than 26,000 pieces.

Open data solving art history mysteries?

Making provenance data open and accessible gives more people information about a piece’s sometimes sordid history, including clues that might uncover evidence of Nazi confiscation.

Most art pieces have a few skeletons in their closet, or at least a backstory worthy of The History Channel. That provenance, or ownership information, has traditionally been stored in manila folders, only occasionally dusted off by art historians for academic papers or auction houses to verify authenticity. Many museums have some provenance data in collection management systems, but the narratives that tell the history of the work are often stored as semi-structured data, formatted according to the needs of individual institutions, making the information both hard to search and share across systems.

Enter Art Tracks from Pittsburgh’s Carnegie Museum of Art (CMOA) — a new open source, open data initiative that aims to turn provenance into structured data by building a suite of open source software tools so an artwork’s past can be available to museum goers, curators, researchers and software developers.
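The step from narrative to structured provenance can be illustrated with a short Python sketch. The narrative string, clause format, and field names below are hypothetical and far simpler than the real-world provenance texts Art Tracks parses; the point is only that once ownership events are structured, they become searchable and shareable across systems.

```python
# Illustrative sketch of turning a semi-structured provenance narrative
# into structured ownership events, in the spirit of Art Tracks. The
# narrative, clause grammar, and field names are all hypothetical.

import re

narrative = ("Purchased by Henry Clay Frick, Pittsburgh, 1905; "
             "gift to Carnegie Museum of Art, Pittsburgh, 1970")

def parse_provenance(text):
    """Split a narrative into clauses and extract one event per clause."""
    events = []
    for clause in text.split(";"):
        m = re.match(r"(?P<method>Purchased by|Gift to)\s+(?P<owner>[^,]+),\s*"
                     r"(?P<place>[^,]+),\s*(?P<year>\d{4})",
                     clause.strip(), re.IGNORECASE)
        if m:
            events.append(m.groupdict())
    return events

for e in parse_provenance(narrative):
    print(e["year"], e["owner"], "-", e["method"].lower())
```

Real provenance narratives are messier (uncertain dates, anonymous owners, qualifiers like “probably”), which is why Art Tracks builds a full tool suite rather than a single regex; this sketch only shows the shape of the transformation.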

 

….The Art Tracks software is all open source. The code libraries and the user-facing provenance entry tool called Elysa (E-lie-za) are all “available on GitHub for use, modification and tinkering,” Berg-Fulton explained. “That’s a newer way of working for our museum, but that openness gives others a chance to lean on our technical expertise and improve their own records and hopefully contribute back to the software to improve that as well.”

Using an open data format, Berg-Fulton said, also creates opportunities for ongoing partnerships with other experts across the museum community so that provenance becomes a constant conversation.

This is a move Berg-Fulton said CMOA has been “dying to make,” because the more people that have access to data, the more ways it can be interpreted. “When you give people data, they do cool things with it, like help you make your own records better, or interpret it in a way you’ve never thought of,” she said. “It feels like the right thing to do in light of our duty to public trust.”….(More)”

The Promise and Perils of Open Medical Data


Sharona Hoffman at the Hastings Center: “Not long ago I visited the Personal Genome Project’s website. The PGP describes its mission as “creating public genome, health, and trait data.” In the “Participant Profiles” section, I found several entries that disclosed the names of individuals along with their date of birth, sex, weight, height, blood type, race, health conditions, medications, allergies, medical procedures, and more. Other profiles did not feature names but provided all of the other details. I had no special access to this information. It is available to absolutely anyone with Internet access. The PGP is part of a trend known as “open data.” Many government and private entities have launched initiatives to compile very large data resources (also known as “big data”) and to make them available to the public. President Obama himself has endorsed open data by issuing a May 2013 executive order directing that, to the extent permitted by law, the federal government must release its data to the public in forms that make it easy to locate, access, and use.

Read more: http://www.thehastingscenter.org/Publications/HCR/Detail.aspx?id=7731

Moving from Open Data to Open Knowledge: Announcing the Commerce Data Usability Project


Jeffrey Chen, Tyrone Grandison, and Kristen Honey at the US Department of Commerce: “…in 2016, the DOC is committed to building on this momentum with new and expanded efforts to transform open data into knowledge into action.

DOC Open Data Graphic
Graphic Credit: Radhika Bhatt, Commerce Data Service

DOC has been in the business of open data for a long time. DOC’s National Oceanic and Atmospheric Administration (NOAA) alone collects and disseminates huge amounts of data that fuel the global weather economy—and this information represents just a fraction of the tens of thousands of datasets that DOC collects and manages, on topics ranging from satellite imagery to material standards to demographic surveys.

Unfortunately, far too many DOC datasets are hard to find, difficult to use, or not yet publicly available on Data.gov, the home of the U.S. government’s open data. This challenge is not exclusive to DOC; indeed, under Project Open Data, Federal agencies are working hard on various efforts to make taxpayer-funded data more easily discoverable.

CDUP screenshot

One of these efforts is DOC’s Commerce Data Usability Project (CDUP). To unlock the power of data, just making data open isn’t enough. It’s critical to make data easier to find and use—to provide information and tools that make data accessible and actionable for all users. That’s why DOC formed a public-private partnership to create CDUP, a collection of online data tutorials that provide students, developers, and entrepreneurs with the necessary context and code to start quickly extracting value from various datasets. Tutorials exist on topics such as:

  • NOAA’s Severe Weather Data Inventory (SWDI), demonstrating how to use hail data to save life and property. The tutorial helps users see that hail events often occur in the summer (late night to early morning), and in midwestern and southern states.
  • Security vulnerability data from the National Institute of Standards and Technology (NIST). The tutorial helps users see that spikes and dips in security incidents consistently occur in the same set of weeks each year.
  • Visible Infrared Imaging Radiometer Suite (VIIRS) data from the National Oceanic and Atmospheric Administration (NOAA). The tutorial helps users understand how to use satellite imagery to estimate populations.
  • American Community Survey (ACS) data from the U.S. Census Bureau. The tutorial helps users understand how nonprofits can identify communities that they want to serve based on demographic traits.
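As a flavor of what such a tutorial walks through, here is a hedged Python sketch for the ACS case: it builds a query URL against the public Census ACS API (the variable code B01003_001E, total population, is real ACS vocabulary), but it parses a hard-coded sample response rather than making a live call, so the figures shown are illustrative only.

```python
# Sketch of a CDUP-style task: query the Census Bureau ACS API for a
# demographic trait by county. The endpoint and variable code reflect the
# public ACS API; the response below is a hard-coded sample, not live data.

import json
from urllib.parse import urlencode

BASE = "https://api.census.gov/data/2014/acs5"
params = {"get": "NAME,B01003_001E", "for": "county:*", "in": "state:42"}  # PA
query_url = BASE + "?" + urlencode(params)

# The ACS API returns a header row followed by data rows; sample shown here:
sample_response = json.loads(
    '[["NAME","B01003_001E","state","county"],'
    ' ["Allegheny County, Pennsylvania","1226933","42","003"],'
    ' ["Erie County, Pennsylvania","280294","42","049"]]'
)

header, *rows = sample_response
records = [dict(zip(header, row)) for row in rows]          # row -> labeled dict
largest = max(records, key=lambda r: int(r["B01003_001E"]))  # most populous
print(largest["NAME"])
```

A nonprofit scouting communities to serve, as in the tutorial above, would swap in the demographic variables it cares about and fetch `query_url` live; the header-row-plus-data-rows shape of the response stays the same.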

In the coming months, CDUP will continue to expand with a rich, diverse set of additional tutorials….(More)