The report of the 2015 Aspen Institute Roundtable on Information Technology: “In the age of ubiquitous Internet connections, smartphones and data, the future vitality of cities is increasingly based on their ability to use digital networks in intelligent, strategic ways. While we are accustomed to thinking of cities as geophysical places governed by mayors, conventional political structures and bureaucracies, this template of city governance is under great pressure to evolve. Urban dwellers now live their lives in all sorts of hyper-connected virtual spaces, pulsating with real-time information, intelligent devices, remote-access databases and participatory crowdsourcing. Expertise is distributed, not centralized. Governance is not just a matter of winning elections and assigning tasks to bureaucracies; it is about the skillful collection and curation of information as a way to create new affordances for commerce and social life.
Except among a small class of vanguard cities, however, the far-reaching implications of the “networked city” for economic development, urban planning, social life and democracy have not been explored in depth. The Aspen Institute Communications and Society Program thus convened an eclectic group of thirty experts to explore how networking technologies are rapidly changing the urban landscape in nearly every dimension. The goal was to learn how open networks, online cooperation and open data can enhance urban planning and administration, and more broadly, how they might improve economic opportunity and civic engagement. The conference, the 24th Annual Aspen Roundtable on Information Technology, also addressed the implications of new digital technologies for urban transportation, public health and safety, and socio-economic inequality….(Download the InfoTech 2015 Report)”
Fatemeh Ahmadi Zeleti et al in Government Information Quarterly: “Business models for open data have emerged in response to the economic opportunities presented by the increasing availability of open data. However, scholarly efforts providing elaborations, rigorous analysis and comparison of open data models are very limited. This could be partly attributed to the fact that most discussions on Open Data Business Models (ODBMs) are predominantly in the practice community. This shortcoming has resulted in a growing list of ODBMs which, on closer examination, are not clearly delineated and lack clear value orientation. This has made the understanding of value creation and exploitation mechanisms in existing open data businesses difficult and challenging to transfer. Following the Design Science Research (DSR) tradition, we developed a 6-Value (6-V) business model framework as a design artifact to facilitate the explication and detailed analysis of existing ODBMs in practice. Based on the results from the analysis, we identify business model patterns and emerging core value disciplines for open data businesses. Our results not only help streamline existing ODBMs and help in linking them to the overall business strategy, but could also guide governments in developing the required capabilities to support and sustain the business models….(More)”
Larry Peiperl and Peter Hotez at PLOS: “The spreading epidemic of Zika virus, with its putative and alarming associations with Guillain-Barre syndrome and infant microcephaly, has arrived just as several initiatives have come into place to minimize delays in sharing the results of scientific research.
In September 2015, in response to concerns that research publishing practices had delayed access to crucial information in the Ebola crisis, the World Health Organization convened a consultation “[i]n recognition of the need to streamline mechanisms of data dissemination—globally and in as close to real-time as possible” in the context of public health emergencies.
Participating medical journal editors, representing PLOS, BMJ and Nature journals and NEJM, provided a statement that journals should not act to delay access to data in a public health emergency: “In such scenarios, journals should not penalize, and, indeed, should encourage or mandate public sharing of relevant data…”
In a subsequent Comment in The Lancet, authors from major research funding organizations expressed support for data sharing in public health emergencies. The International Committee of Medical Journal Editors (ICMJE), meeting in November 2015, lent further support to the principles of the WHO consultation by amending ICMJE “Recommendations” to endorse data sharing for public health emergencies of any geographic scope.
Now that WHO has declared Zika to be a Public Health Emergency of International Concern, responses from these groups in recent days appear consistent with their recent declarations.
The ICMJE has announced that “In light of the need to rapidly understand and respond to the global emergency caused by the Zika virus, content in ICMJE journals related to Zika virus is being made free to access. We urge other journals to do the same. Further, as stated in our Recommendations, in the event of a public health emergency (as defined by public health officials), information with immediate implications for public health should be disseminated without concern that this will preclude subsequent consideration for publication in a journal.” (www.icmje.org, accessed 9 February 2016)
WHO has implemented special provisions for research manuscripts relevant to the Zika epidemic that are submitted to the WHO Bulletin; such papers “will be assigned a digital object identifier and posted online in the “Zika Open” collection within 24 hours while undergoing peer review. The data in these papers will thus be attributed to the authors while being freely available for reader scrutiny and unrestricted use” under a Creative Commons Attribution License (CC BY IGO 3.0).
At PLOS, where open access and data sharing apply as a matter of course, all PLOS journals aim to expedite peer review evaluation, pre-publication posting, and data sharing from research relevant to the Zika outbreak. PLOS Currents Outbreaks offers an online platform for rapid publication of preliminary results, PLOS Neglected Tropical Diseases has committed to provide priority handling of Zika reports in general, and other PLOS journals will prioritize submissions within their respective scopes. The PLOS Zika Collection page provides central access to relevant and continually updated content from across the PLOS journals, blogs, and collaborating organizations.
Today, the Wellcome Trust has issued a statement urging journals to commit to “make all content concerning the Zika virus free to access,” and funders to “require researchers undertaking work relevant to public health emergencies to set in place mechanisms to share quality-assured interim and final data as rapidly and widely as possible, including with public health and research communities and the World Health Organisation.” Among 31 initial signatories are such journals and publishers as PLOS, Springer Nature, Science journals, The JAMA Network, eLife, the Lancet, and New England Journal of Medicine; and funding organizations including the Bill and Melinda Gates Foundation, UK Medical Research Council, US National Institutes of Health, Wellcome Trust, and other major national and international research funders.
This policy shift prompts reconsideration of how we publish urgently needed data during a public health emergency….(More)”
Andrew R Schrock in New Media and Society: “The civic hacker tends to be described as anachronistic, an ineffective “white hat” compared to more overtly activist cousins. By contrast, I argue that civic hackers’ politics emerged from a distinct historical milieu and include potentially powerful modes of political participation. The progressive roots of civic data hacking can be found in early 20th-century notions of “publicity” and the right to information movement. Successive waves of activists saw the Internet as a tool for transparency. The framing of openness shifted in meaning from information to data, weakening mechanisms for accountability even as it opened up new forms of political participation. Drawing on a year of interviews and participant observation, I suggest civic data hacking can be framed as a form of data activism and advocacy: requesting, digesting, contributing to, modeling, and contesting data. I conclude that civic hackers are utopian realists involved in the crafting of algorithmic power and discussing ethics of technology design. They may be misunderstood because open data remediates previous forms of openness. In the process, civic hackers transgress established boundaries of political participation….(More)”
Teo Kermeliotis at AlJazeera: “Back in the summer of 2015, at the height of the ongoing refugee crisis, Karolin Schwarz started noticing a disturbing pattern.
Just as refugee arrivals in her town of Leipzig, eastern Germany, began to rise, so did the frequency of rumours over supposed crimes committed by those men, women and children who had fled war and hardship to reach Europe.
As months passed by, the allegations became even more common, increasingly popping up in social media feeds and often reproduced by mainstream news outlets.
The online map featured some 240 incidents in its first week [Source: Hoaxmap/Al Jazeera]
“The stories seemed to be [orchestrated] by far-right parties and organisations and I wanted to try to find some way to help organise this – maybe find patterns and give people a tool to look up these stories [when] they were being confronted with new ones.”
And so she did.
Along with 35-year-old developer Lutz Helm, Schwarz last week launched Hoaxmap, an online platform that allows people to separate fact from fiction by debunking false rumours about supposed crimes committed by refugees.
Using an interactive system of popping dots, the map documents and categorises where those “crimes” allegedly took place. It then counters that false information with official statements from the police and local authorities, as well as news reports in which the allegations have been disproved. The debunked cases marked on the map range from thefts and assaults to manslaughter – but one of the most common topics is rape, Schwarz said….(More)”
Paul Basken at the Chronicle of Higher Education: “Thanks to what they’ve learned from university research, consultants like Matthew Kalmans have become experts in modern political persuasion. A co-founder of Applecart, a New York data firm, Mr. Kalmans specializes in shaping societal attitudes by using advanced analytical techniques to discover and exploit personal connections and friendships. His is one of a fast-growing collection of similar companies now raising millions of dollars, fattening businesses, and aiding political campaigns with computerized records of Facebook exchanges, high-school yearbooks, even neighborhood gossip.
Applecart uses that data to try to persuade people on a range of topics by finding voices they trust to deliver endorsements. “You can use this sort of technology to get people to purchase insurance at higher rates, get people to purchase a product, get people to do all sorts of other things that they might otherwise not be inclined to do,” said Mr. Kalmans, a 2014 graduate of the University of Pennsylvania. And in building such a valuable service, he’s found that the intellectual underpinnings are often free. “We are constantly reading academic papers to get ideas on how to do things better,” Mr. Kalmans said. That’s because scholars conduct the field experiments and subsequent tests that Mr. Kalmans needs to build and refine his models. “They do a lot of the infrastructural work that, frankly, a lot of commercial companies don’t have the in-house expertise to do,” he said of university researchers. Yet the story of Applecart stands in contrast to the dominant attitude and approach among university researchers themselves. Universities are full of researchers who intensively study major global problems such as environmental destruction and societal violence, then stop short when their conclusions point to the need for significant change in public behavior.
Some in academe consider that boundary a matter of principle rather than a systematic failure or oversight. “The one thing that we have to do is not be political,” Michael M. Crow, the usually paradigm-breaking president of Arizona State University, said this summer at a conference on academic engagement in public discourse. “Politics is a process that we are informing. We don’t have to be political to inform politicians or political actors.” But other academics contemplate that stance and see a missed opportunity to help convert the millions of taxpayer dollars spent on research into meaningful societal benefit. They include Dan M. Kahan, a professor of law and of psychology at Yale University who has been trying to help Florida officials cope with climate change. Mr. Kahan works with the four-county Southeast Florida Regional Climate Change Compact, which wants to redesign roads, expand public transit, and build pumping stations to prepare for harsher weather.
But Mr. Kahan says he and his Florida partners have had trouble getting enough policy makers to seriously consider the scale of the problem and the necessary solutions. It’s frustrating, Mr. Kahan said, to see so much university research devoted to work inside laboratories on problems like climate, and comparatively little spent on real-world needs such as sophisticated messaging strategies. “There really is a kind of deficit in the research relating to actually operationalizing the kinds of insights that people have developed from research,” he said. That deficit appears to stem from academic culture, said Utpal M. Dholakia, a professor of marketing at Rice University whose work involves testing people’s self-control in areas such as eating and shopping. He then draws conclusions about whether regulations or taxes aimed at changing behaviors will be effective. Companies find advanced personal behavioral data highly useful, said Mr. Dholakia, who works on the side to help retailers devise sales strategies. But his university, he said, appears more interested in seeing him publish his findings than take the time to help policy makers make real-world use of them. “My dean gets very worried if I don’t publish a lot.” …(More)
Paper by Michalis N. Vafopoulos et al: “For decades, information related to public finances was out of reach for most people. Gradually, public budgets and tenders are becoming openly available, and global initiatives promote fiscal transparency and open product and price data. But the poor quality of economic open data undermines its potential to answer interesting questions (e.g. the efficiency of public funds and market processes). Linked Open Economy (LOE) has been developed as a top-level conceptualization that interlinks publicly available economic open data by modelling the flows incorporated in public procurement together with the market process to address complex policy issues. The LOE approach is extensively used to enrich open economic data ranging from budgets and spending to prices. Developers, professionals, public administrations and any other interested parties use and customize the LOE model to develop new systems, to enable information exchange between systems, to integrate data from heterogeneous sources and to publish open data related to economic activities….(More)”
Francis Irving at LLRX: “The Humanitarian Data Exchange (HDX) is an unusual data hub. It’s made by the UN, and is successfully used by agencies, NGOs, companies, Governments and academics to share data.
They’re doing this during crises such as the Ebola epidemic and the Nepal earthquakes, and every day to build up information in between crises.
There are lots of data hubs which are used by one organisation to publish data, far fewer which are used by lots of organisations to share data. The HDX project did a bunch of things right. What were they?
Here are six lessons…
1) Do good design
HDX started with user needs research. This was expensive, and was immediately worth it because it stopped a large part of the project which wasn’t needed.
The user needs led to design work which has made the website seem simple and beautiful – particularly unusual for something from a large bureaucracy like the UN.
2) Build on existing software
When making a hub for sharing data, there’s no need to build something from scratch. Open Knowledge’s CKAN software is open source; this stuff is a commodity. HDX has developers who modify and improve it for the specific needs of humanitarian data.
3) Use experts
HDX is a great international team – the leader is in New York, most of the developers are in Romania, and there’s a data lab in Nairobi. Crucially, they bring in specific outside expertise: frog design does the user research and design work; ScraperWiki, experts in data collaboration, provide operational management.
4) Measure the right things
HDX’s metrics are about both sides of its two-sided network. Are users who visit the site actually finding and downloading data they want? Are new organisations joining to share data? They’re avoiding “vanity metrics”, taking inspiration from tech startup concepts like “pirate metrics”.
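The two-sided measurement idea above can be sketched as a toy funnel computation. Everything here is invented for illustration (the event log, action names, and the `rate` helper are not HDX's actual instrumentation):

```python
# hypothetical event log for a data hub; one record per user or organisation action
events = [
    {"user": "a", "action": "visit"}, {"user": "a", "action": "download"},
    {"user": "b", "action": "visit"},
    {"user": "c", "action": "visit"}, {"user": "c", "action": "download"},
    {"org": "ocha", "action": "join"}, {"org": "redcross", "action": "share_dataset"},
]

def rate(events, did, out_of):
    """Fraction of distinct actors who did `did` among those who did `out_of`."""
    def actors(action):
        return {e.get("user") or e.get("org") for e in events if e["action"] == action}
    return len(actors(did) & actors(out_of)) / len(actors(out_of))

# demand side of the network: do visitors actually find and download data?
print(rate(events, "download", "visit"))  # 2 of 3 distinct visitors downloaded
```

The point of a metric like this, versus a vanity metric such as raw page views, is that it only moves when the hub delivers the thing each side of the network came for.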
5) Add features specific to your community
There are endless features you can add to data hubs – most add no value, and end up a cost to maintain. HDX adds specific things valuable to its community.
For example, much humanitarian data is in “shape files”, a standard for geographical information. HDX automatically renders a beautiful map of these – essential for users who don’t have ArcGIS, and a good check for those that do.
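Shapefiles are a fixed binary layout, which is what makes automatic rendering like this feasible. As a hedged illustration of the first step a hub must do before drawing anything, here is a sketch of parsing the 100-byte `.shp` main header with only the standard library (the synthetic header bytes below are fabricated for the demo, not real humanitarian data):

```python
import struct

# shape type codes from the ESRI shapefile specification (subset)
SHAPE_TYPES = {0: "Null", 1: "Point", 3: "PolyLine", 5: "Polygon", 8: "MultiPoint"}

def parse_shp_header(header: bytes) -> dict:
    """Parse the 100-byte main header of an ESRI shapefile (.shp)."""
    if len(header) < 100:
        raise ValueError("shapefile main header is 100 bytes")
    file_code, = struct.unpack(">i", header[0:4])      # big-endian magic, always 9994
    if file_code != 9994:
        raise ValueError("not a shapefile")
    file_length, = struct.unpack(">i", header[24:28])  # length in 16-bit words
    version, shape_type = struct.unpack("<ii", header[28:36])  # little-endian fields
    xmin, ymin, xmax, ymax = struct.unpack("<4d", header[36:68])  # bounding box
    return {
        "version": version,
        "shape_type": SHAPE_TYPES.get(shape_type, str(shape_type)),
        "bbox": (xmin, ymin, xmax, ymax),
        "length_bytes": file_length * 2,
    }

# build a synthetic header describing a small polygon layer, then parse it back
demo = struct.pack(">i", 9994) + b"\x00" * 20 + struct.pack(">i", 50)
demo += struct.pack("<ii", 1000, 5)
demo += struct.pack("<8d", 103.6, 1.2, 104.1, 1.5, 0, 0, 0, 0)
info = parse_shp_header(demo)
print(info["shape_type"], info["bbox"])  # Polygon (103.6, 1.2, 104.1, 1.5)
```

The bounding box alone is enough to centre and zoom a preview map, which is roughly why a hub can offer a useful rendering without understanding every geometry record.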
6) Trust in the data
The early user research showed that trust in the data was vital. For this reason, not just anyone can come along and add data. New organisations have to apply – proving either that they’re known in humanitarian circles or that they have quality data to share. Applications are checked by hand. It’s important to get this kind of balance right – being too ideologically open or closed doesn’t work.
Conclusion
The detail of how a data sharing project is run really matters….(More)”
Linda Poon at CityLab: “It’s not only your friends and family who follow your online selfies and group photos. Scientists are starting to look at them, too, though they’re more interested in what’s around you. In bulk, photos can reveal weather patterns across multiple locations, air quality of a place over time, the dynamics of a neighborhood—all sorts of information that helps researchers study cities.
At the Nanyang Technological University in Singapore, a research group is using crowdsourced photos to create a low-cost alternative to air-pollution sensors. Called AirTick, the smartphone app they’ve designed will collect photos from users and analyze how hazy the environment looks. It’ll then check each image against official air quality data, and through machine learning the app will eventually be able to predict pollution levels based on an image alone.
AirTick creator Pan Zhengziang said in a promotional video last month that the growing concern among the public over air quality can make programs like this a success—especially in Southeast Asia, where smog has gotten so bad that governments have had to shut down schools and suspend outdoor activities. “In Singapore’s recent haze episode, around 250,000 people [have] shared their concerns via Twitter,” he said. “This has made crowdsourcing-based air quality monitoring a possibility.”…(More)”
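A heavily simplified sketch of the AirTick idea, under our own assumptions rather than the researchers' published method: score haziness from the contrast of a grayscale image, then calibrate those scores against official air-quality readings with a one-variable least-squares fit. All numbers here are invented:

```python
def haze_score(pixels):
    """Hazy scenes are low-contrast: score = 1 - (std dev of 8-bit pixels / 128)."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    return 1.0 - (var ** 0.5) / 128.0

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b: the simplest 'learning' step."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# a flat, washed-out image (low contrast) scores as hazier than a crisp one
print(haze_score([100] * 50 + [140] * 50))  # std dev 20 -> score 0.84375

# calibration: pair photo haze scores with official index readings, then predict
scores = [0.2, 0.5, 0.8]        # invented haze scores from three photos
official = [40.0, 100.0, 160.0]  # invented official readings at the same times
a, b = fit_linear(scores, official)
predict = lambda s: a * s + b
print(round(predict(0.65)))  # 130
```

A production system would use a far richer image model, but the shape of the pipeline (image feature, ground-truth pairing, fitted predictor) is the same.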
Joshua Tucker in the Washington Post: “Last month the University of Gothenburg’s V-Dem Institute released a new “Varieties of Democracy” dataset. It provides about 15 million data points on democracy, including 39 democracy-related indices. It can be accessed at v-dem.net along with supporting documentation. I asked Staffan I. Lindberg, Director of the V-Dem Institute and one of the directors of the project, a few questions about the new data. What follows is a lightly edited version of his answers.
Women’s Political Empowerment Index for Southeast Asia (Data: V-Dem data version 5; Figure: V-Dem Institute, University of Gothenburg, Sweden)
Joshua Tucker: What is democracy, and is it even really possible to have quantitative measures of democracy?
Staffan Lindberg: There is no consensus on the definition of democracy and how to measure it. The understanding of what a democracy really is varies across countries and regions. This motivates the V-Dem approach: not to offer one standard definition of the concept but instead to distinguish among five principles of democracy: Electoral, Liberal, Participatory, Deliberative, and Egalitarian democracy. All of these principles have played prominent roles in current and historical discussions about democracy. Our measurement of these principles is based on two types of data, factual data collected by assisting researchers and survey responses by country experts, which are combined using a rather complex measurement model (a “custom-designed Bayesian ordinal item response theory model”; for details see the V-Dem Methodology document)….(More)
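For readers unfamiliar with this class of model, a generic ordinal IRT measurement setup looks like the following (a sketch of the standard form, not V-Dem's exact custom specification). Expert $r$'s ordinal rating $y_{ctr}$ of country $c$ in year $t$ is tied to a latent trait $z_{ct}$:

```latex
\Pr(y_{ctr} = k) \;=\; \Phi\!\big(\gamma_{r,k} - \beta_r z_{ct}\big) \;-\; \Phi\!\big(\gamma_{r,k-1} - \beta_r z_{ct}\big)
```

where $\Phi$ is the standard normal CDF, $\beta_r$ captures expert $r$'s discrimination (how reliably their ratings track the latent trait), and the expert-specific thresholds $\gamma_{r,k}$ let different experts map the same latent scale onto ordinal categories differently. Bayesian estimation then yields both point estimates and credible intervals for each country-year score $z_{ct}$, which is what allows the combined expert data to come with measures of uncertainty.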