Stefaan Verhulst
“Crowd2Map Tanzania is a new crowdsourcing initiative aimed at creating a comprehensive map of Tanzania, including detailed depictions of all of its villages, roads and public resources (such as schools, shops, offices etc.) in OpenStreetMap and/or Google Maps, both of which are sadly rather poor at the moment. (For a convincing example, see our post about a not-so-blank-as-map-suggests Zeze village here.)
…In February 2016, Crowd2Map Tanzania was one of the 7 projects selected in the Open Seventeen challenge, which rallies the public to use open data as a means of achieving the 17 Sustainable Development Goals as proposed by the UN in September 2015! We are now excited to carry on with the help of O17 partners – Citizen Cyberlab, The GovLab, ONE and SciFabric! We’re tackling Goal 11: creating sustainable cities & communities and Goal 4: education through technology….(More)
Richard A. Berk, Susan B. Sorenson and Geoffrey Barnes in The Journal of Empirical Legal Studies: “Arguably the most important decision at an arraignment is whether to release an offender until the date of his or her next scheduled court appearance. Under the Bail Reform Act of 1984, threats to public safety can be a key factor in that decision. Implicitly, a forecast of “future dangerousness” is required. In this article, we consider in particular whether usefully accurate forecasts of domestic violence can be obtained. We apply machine learning to data on over 28,000 arraignment cases from a major metropolitan area in which an offender faces domestic violence charges. One of three possible post-arraignment outcomes is forecasted within two years: (1) a domestic violence arrest associated with a physical injury, (2) a domestic violence arrest not associated with a physical injury, and (3) no arrests for domestic violence. We incorporate asymmetric costs for different kinds of forecasting errors so that very strong statistical evidence is required before an offender is forecasted to be a good risk. When an out-of-sample forecast of no post-arraignment domestic violence arrests within two years is made, it is correct about 90 percent of the time. Under current practice within the jurisdiction studied, approximately 20 percent of those released after an arraignment for domestic violence are arrested within two years for a new domestic violence offense. If magistrates used the methods we have developed and released only offenders forecasted not to be arrested for domestic violence within two years after an arraignment, as few as 10 percent might be arrested. The failure rate could be cut nearly in half. Over a typical 24-month period in the jurisdiction studied, well over 2,000 post-arraignment arrests for domestic violence perhaps could be averted….(More)”
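The core technique here is cost-sensitive, multi-class forecasting. A minimal sketch of the idea, assuming a random forest with class weights standing in for the authors' asymmetric error costs (the labels, weights, and placeholder data below are illustrative assumptions, not the paper's actual specification):

```python
# Illustrative sketch of cost-sensitive forecasting of post-arraignment outcomes.
# Assumed labels: 0 = no DV arrest, 1 = DV arrest without injury, 2 = DV arrest with injury.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Placeholder data standing in for case features at arraignment and observed two-year outcomes.
rng = np.random.default_rng(0)
X = rng.random((28000, 20))
y = rng.integers(0, 3, 28000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Asymmetric costs: treating a future injury arrest as "no arrest" is weighted far more
# heavily than the reverse, so a "good risk" forecast requires strong statistical evidence.
forest = RandomForestClassifier(
    n_estimators=500,
    class_weight={0: 1, 1: 5, 2: 10},  # assumed cost ratios, for illustration only
    random_state=0,
)
forest.fit(X_train, y_train)

# Accuracy of the out-of-sample "no arrest" forecasts (the ~90 percent figure in the abstract).
pred = forest.predict(X_test)
no_arrest = pred == 0
print((y_test[no_arrest] == 0).mean())
```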
Joshua Wilwohl in Forbes: “A new mobile and web application will help Cambodians better track complaints registered with local governments, but part of the app’s effectiveness hinges on whether the country’s leaders are receptive to the technology.
Known as Transmit, the app works by allowing selected government and grassroots leaders to enter complaints made by citizens during routine community council meetings.
The app then sends the complaints to an online database. Once in the database, the government officials referenced by the issues can address them and indicate the status of the complaints.
The database is public and offers registered users the opportunity to comment on the complaints.
Currently, citizens register complaints with pen and paper or in a spreadsheet on an official’s computer….
Earlier this month, Pact began training officials in Pursat province to use the app and will expand training this week to local governments and community-based organizations in Kampong Cham, Battambang and Mondulkiri provinces, said Center.
But the app relies on government officials using the technology to keep the community informed about the progress of the complaints—a task that may be easier said than done in a country that is well-documented for its lack of transparency…(More)”
Tom Rosenstiel for the Brookings Center for Effective Public Management: “The path toward sustainable journalism, already challenged by a disrupted advertising business model, is also being undermined by something more unexpected—terrible data.
Analytics—another word for audience data or metrics—was supposed to offer the promise that journalists would be able to understand consumers at a deeper level. Journalism would be more connected and relevant as news people could see what audiences really wanted. Handled well, this should have helped journalists pursue what is at its core their fundamental challenge: learning how to make the significant interesting and the interesting more significant.
But a generation into the digital age, the problem associated with analytics isn’t the one that some feared—the discovery that audiences only care to be entertained and distracted. The bigger problem is that most web analytics are a mess. Designed for other purposes, the metrics used to understand publishing today offer too little information that is useful to journalists or to publishers on the business side. They mostly measure the wrong things. They also to a large extent measure things that are false or illusory.
As an example, the metric we have taken to call “unique visitors” is not what it sounds like. Unique visitors are not different people. Instead, this metric measures devices; the same person who visits a publication on a phone, a tablet, and a computer is counted as three unique visitors. If they clear their cookies, they are counted all over again. The traffic to most websites is probably overcounted by more than double, perhaps more than triple.
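The double counting is easy to see in a toy example. A minimal sketch, assuming a hypothetical visit log in which one reader uses three devices and another clears cookies partway through the month:

```python
# Toy visit log: "alice" reads on three devices; "bob" clears cookies once, so he carries
# two different cookie IDs. Field names and values are hypothetical.
visits = [
    {"person": "alice", "cookie_id": "a-phone"},
    {"person": "alice", "cookie_id": "a-tablet"},
    {"person": "alice", "cookie_id": "a-laptop"},
    {"person": "bob",   "cookie_id": "b-1"},
    {"person": "bob",   "cookie_id": "b-2"},  # same person, new cookie after clearing
]

unique_visitors = len({v["cookie_id"] for v in visits})  # what the analytics dashboard reports
actual_people   = len({v["person"] for v in visits})     # what a publisher actually wants to know

print(unique_visitors, actual_people)  # 5 vs. 2: the metric more than doubles the audience
```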
Time spent per article, in contrast, might offer a sense of depth of interest in a particular piece. But by itself it might also mean that someone stopped reading and walked away from the computer. Page views can tell a publisher how many times an individual piece of content was viewed. But views cannot tell the publisher why. Using conventional analytics, every story is an anecdote. Publishers may look at popular stories and say let’s do more like those. But they are largely inferring what “like those” means….(More)”
“The Digital Civil Society Lab at Stanford created digitalIMPACT.io to support civil society organizations in using digital data ethically, safely, and effectively. The content and tools on the site come from nonprofit and foundation partners.
digitalIMPACT.io is designed to help you learn from and share with others. The materials are provided as examples to inform your decision-making, organizational practice, and policy creation. We invite you to use and adapt what you find here, and hope you will share the practices and policies that you’ve developed. This website is only a start; real change will come as organizations integrate appropriate data management and governance throughout their work.
Digital data hold tremendous promise for civil society and they also raise new challenges. Think of digital data as both assets and liabilities. It’s time to start managing them to help you achieve your mission…. (More)”
Paper by Thomas Pellissier Tanon et al: “Collaborative knowledge bases that make their data freely available in a machine-readable form are central for the data strategy of many projects and organizations. The two major collaborative knowledge bases are Wikimedia’s Wikidata and Google’s Freebase. Due to the success of Wikidata, Google decided in 2014 to offer the content of Freebase to the Wikidata community. In this paper, we report on the ongoing transfer efforts and data mapping challenges, and provide an analysis of the effort so far. We describe the Primary Sources Tool, which aims to facilitate this and future data migrations. Throughout the migration, we have gained deep insights into both Wikidata and Freebase, and share and discuss detailed statistics on both knowledge bases….(More)”
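The migration hinges on mapping Freebase identifiers and properties onto their Wikidata counterparts. A rough sketch of that mapping step, with assumed lookup tables and placeholder Freebase MIDs (an illustration of the idea only, not the Primary Sources Tool's actual code):

```python
# Hypothetical sketch: turning a Freebase triple into a Wikidata statement candidate.
# Real mappings cover thousands of properties and rely on Freebase-ID links already
# present on Wikidata items; the tables below are illustrative placeholders.

MID_TO_QID = {"/m/0example1": "Q7259", "/m/0example2": "Q84"}  # e.g. Ada Lovelace, London
PROP_TO_PID = {"/people/person/place_of_birth": "P19"}         # place of birth

def map_triple(subject_mid: str, freebase_property: str, object_mid: str):
    """Return a (QID, PID, QID) statement candidate, or None if any part is unmapped."""
    q_subject = MID_TO_QID.get(subject_mid)
    pid = PROP_TO_PID.get(freebase_property)
    q_object = MID_TO_QID.get(object_mid)
    if None in (q_subject, pid, q_object):
        return None  # unmapped entities and properties are a central migration challenge
    return (q_subject, pid, q_object)

print(map_triple("/m/0example1", "/people/person/place_of_birth", "/m/0example2"))
# ('Q7259', 'P19', 'Q84') -- a candidate a human curator would still review before import
```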
The report of the 2015 Aspen Institute Roundtable on Information Technology: “In the age of ubiquitous Internet connections, smartphones and data, the future vitality of cities is increasingly based on their ability to use digital networks in intelligent, strategic ways. While we are accustomed to thinking of cities as geophysical places governed by mayors, conventional political structures and bureaucracies, this template of city governance is under great pressure to evolve. Urban dwellers now live their lives in all sorts of hyper-connected virtual spaces, pulsating with real-time information, intelligent devices, remote-access databases and participatory crowdsourcing. Expertise is distributed, not centralized. Governance is not just a matter of winning elections and assigning tasks to bureaucracies; it is about the skillful collection and curation of information as a way to create new affordances for commerce and social life.
Except among a small class of vanguard cities, however, the far-reaching implications of the “networked city” for economic development, urban planning, social life and democracy have not been explored in depth. The Aspen Institute Communications and Society Program thus convened an eclectic group of thirty experts to explore how networking technologies are rapidly changing the urban landscape in nearly every dimension. The goal was to learn how open networks, online cooperation and open data can enhance urban planning and administration, and more broadly, how they might improve economic opportunity and civic engagement. The conference, the 24th Annual Aspen Roundtable on Information Technology, also addressed the implications of new digital technologies for urban transportation, public health and safety, and socio-economic inequality….(Download the InfoTech 2015 Report)”
Fatemeh Ahmadi Zeleti et al in Government Information Quarterly: “Business models for open data have emerged in response to the economic opportunities presented by the increasing availability of open data. However, scholarly efforts providing elaborations, rigorous analysis and comparison of open data models are very limited. This could be partly attributed to the fact that most discussions on Open Data Business Models (ODBMs) are predominantly in the practice community. This shortcoming has resulted in a growing list of ODBMs which, on closer examination, are not clearly delineated and lack clear value orientation. This has made the understanding of value creation and exploitation mechanisms in existing open data businesses difficult and challenging to transfer. Following the Design Science Research (DSR) tradition, we developed a 6-Value (6-V) business model framework as a design artifact to facilitate the explication and detailed analysis of existing ODBMs in practice. Based on the results from the analysis, we identify business model patterns and emerging core value disciplines for open data businesses. Our results not only help streamline existing ODBMs and help in linking them to the overall business strategy, but could also guide governments in developing the required capabilities to support and sustain the business models….(More)”
David Donaldson in The Mandarin: “If we now have the technology to allow citizens to vote directly on all issues, what job remains for public servants?
While new technology may provide new options to contribute, the really important thing is governmental willingness to actually listen, says Maria Katsonis, the Victorian Department of Premier and Cabinet’s director of equality.
The balance between citizen consultation and public service expertise in decision-making remains a hot debate, with South Australian Premier Jay Weatherill warning last year that while expertise in policy is important, overzealous bureaucrats and politicians can disenfranchise citizens.
The internet is assisting government to obtain opinions from people more easily than ever before. SA, for example, has embraced the use of citizen juries in policy formation through its youSAy portal — though as yet on only some issues. Finland has experimented with digitally crowdsourcing input into the policymaking process.
The Victorian government, meanwhile, has received blowback around claims its recent announcement for a “skyrail” in Melbourne’s south-eastern suburbs went ahead with very little consultation…
Indeed, even a direct vote doesn’t mean the government is really listening to the people. A notable example of a government using a poorly designed popular vote to rubber stamp its own intentions was an online poll in Queensland on whether to cut public transport fares, which was worded to suit the government’s own predilections.
Giving citizens the tools to contribute
Katsonis said she didn’t want to “diss crowdsourcing”; governments should think about where using it might be appropriate, and where it might not. Directly crowdsourcing legislation is perhaps not the best way to use the “wisdom of the crowd”, she suggested….The use of people’s panels to inform policy and budgeting — for example at the City of Melbourne — shows some promise as one tool to improve engagement. Participants of people’s panels — which see groups of ordinary citizens being given background information about the task at hand and then asked to come up with a proposal for what to do — tend to report a higher trust in governmental processes after they’ve gained some experience of the difficulty of making those decisions.
One of the benefits of that system is the chance to give participants the tools to understand those processes for themselves, rather than going in cold, as some other direct participation tools do….
Despite the risks, processes such as citizens’ panels are still a more nuanced approach than calls for frequent referenda or the new breed of internet-based political parties, such as Australia’s Online Direct Democracy, that promise their members of parliament will vote however a majority of voters tell them to….(More)”
PLOS: “The spreading epidemic of Zika virus, with its putative and alarming associations with Guillain-Barré syndrome and infant microcephaly, has arrived just as several initiatives have come into place to minimize delays in sharing the results of scientific research.
In September 2015, in response to concerns that research publishing practices had delayed access to crucial information in the Ebola crisis, the World Health Organization convened a consultation “[i]n recognition of the need to streamline mechanisms of data dissemination—globally and in as close to real-time as possible” in the context of public health emergencies.
Participating medical journal editors, representing PLOS, BMJ and Nature journals and NEJM, provided a statement that journals should not act to delay access to data in a public health emergency: “In such scenarios, journals should not penalize, and, indeed, should encourage or mandate public sharing of relevant data…”
In a subsequent Comment in The Lancet, authors from major research funding organizations expressed support for data sharing in public health emergencies. The International Committee of Medical Journal Editors (ICMJE), meeting in November 2015, lent further support to the principles of the WHO consultation by amending ICMJE “Recommendations” to endorse data sharing for public health emergencies of any geographic scope.
Now that WHO has declared Zika to be a Public Health Emergency of International Concern, responses from these groups in recent days appear consistent with their recent declarations.
The ICMJE has announced that “In light of the need to rapidly understand and respond to the global emergency caused by the Zika virus, content in ICMJE journals related to Zika virus is being made free to access. We urge other journals to do the same. Further, as stated in our Recommendations, in the event of a public health emergency (as defined by public health officials), information with immediate implications for public health should be disseminated without concern that this will preclude subsequent consideration for publication in a journal.” (www.icmje.org, accessed 9 February 2016)
WHO has implemented special provisions for research manuscripts relevant to the Zika epidemic that are submitted to the WHO Bulletin; such papers “will be assigned a digital object identifier and posted online in the “Zika Open” collection within 24 hours while undergoing peer review. The data in these papers will thus be attributed to the authors while being freely available for reader scrutiny and unrestricted use” under a Creative Commons Attribution License (CC BY IGO 3.0).
At PLOS, where open access and data sharing apply as a matter of course, all PLOS journals aim to expedite peer review evaluation, pre-publication posting, and data sharing from research relevant to the Zika outbreak. PLOS Currents Outbreaks offers an online platform for rapid publication of preliminary results, PLOS Neglected Tropical Diseases has committed to provide priority handling of Zika reports in general, and other PLOS journals will prioritize submissions within their respective scopes. The PLOS Zika Collection page provides central access to relevant and continually updated content from across the PLOS journals, blogs, and collaborating organizations.
Today, the Wellcome Trust has issued a statement urging journals to commit to “make all content concerning the Zika virus free to access,” and funders to “require researchers undertaking work relevant to public health emergencies to set in place mechanisms to share quality-assured interim and final data as rapidly and widely as possible, including with public health and research communities and the World Health Organisation.” Among 31 initial signatories are such journals and publishers as PLOS, Springer Nature, Science journals, The JAMA Network, eLife, the Lancet, and New England Journal of Medicine; and funding organizations including Bill and Melinda Gates Foundation, UK Medical Research Council, US National Institutes of Health, Wellcome Trust, and other major national and international research funders.
This policy shift prompts reconsideration of how we publish urgently needed data during a public health emergency….(More)”