"Imagery to the Crowd"


Description: “The Humanitarian Information Unit (HIU), a division within the Office of the Geographer and Global Issues at the U.S. Department of State, is working to increase the availability of spatial data in areas experiencing humanitarian emergencies. Built from a crowdsourcing model, the new “Imagery to the Crowd” process publishes high-resolution commercial satellite imagery, purchased by the United States Government, in a web-based format that can be easily mapped by volunteers.
The digital map data generated by the volunteers are stored in a database maintained by OpenStreetMap (OSM), a UK-registered non-profit foundation, under a license that ensures the data are freely available and open for a range of uses (http://osm.org). Inspired by the success of the OSM mapping effort after the 2010 Haiti earthquake, the Imagery to the Crowd process harnesses the combined power of satellite imagery and the volunteer mapping community to help aid agencies provide informed and effective humanitarian assistance, and plan recovery and development activities.”
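The features volunteers trace from this imagery end up as ordinary OpenStreetMap data, which anyone can pull back out through OSM's public query services. As a minimal illustration of what that looks like in practice (this is not part of the HIU workflow itself, and the bounding box is a placeholder), the sketch below asks the Overpass API for building footprints mapped in a small area:

```python
# Illustrative only: fetch volunteer-digitized building footprints from
# OpenStreetMap via the public Overpass API. The bounding box below is a
# placeholder; substitute the area of interest for a given activation.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

# Bounding box order for Overpass is (south, west, north, east).
bbox = (18.50, -72.40, 18.60, -72.30)  # placeholder coordinates near Port-au-Prince

query = f"""
[out:json][timeout:60];
way["building"]({bbox[0]},{bbox[1]},{bbox[2]},{bbox[3]});
out geom;
"""

response = requests.post(OVERPASS_URL, data={"data": query})
response.raise_for_status()
buildings = response.json()["elements"]
print(f"{len(buildings)} building footprints mapped in this area")
```

Because the data carry an open license, the same query works for aid agencies, researchers, and the volunteers themselves.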
5-minute Ignite Talk about Imagery to the Crowd:

The Wise Way to Crowdsource a Manhunt


In The New Yorker: “If Reddit were looking for a model to follow, it could use NASA’s Clickworkers experiment, which in 2000-01 let tens of thousands of amateurs look at photos of Mars in order to identify craters on the planet and classify them by age. That study found that the aggregated judgments of the amateur “clickworkers” were “virtually indistinguishable from the inputs of a geologist with years of experience.”
The problem from Reddit’s perspective, of course, is that this method of sleuthing would be far less exciting for users, and would probably generate less traffic, than its current free-for-all approach. The point of the “find-the-bombers” subthread, after all, wasn’t just to find the bombers—it was also to connect and talk with others, and to feel like you were part of a virtual community. But valuable as that experience may have been for users, it also diminished the chances of the community coming up with useful information. Reddit has done an excellent job of being engaging. Now it needs to figure out if it wants to be effective”.
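The Clickworkers result rests on aggregating many independent judgments rather than trusting any single contributor. As a toy illustration of that idea (not NASA's actual method, and with invented labels and data), the sketch below combines conflicting crowd classifications of each item by majority vote:

```python
# Toy illustration of aggregating independent crowd judgments by majority
# vote; the items, labels, and counts are invented, not from Clickworkers.
from collections import Counter

# Each crater (item) has classifications submitted by many volunteers.
crowd_labels = {
    "crater_001": ["fresh", "fresh", "degraded", "fresh"],
    "crater_002": ["ghost", "degraded", "ghost", "ghost", "ghost"],
}

def aggregate(labels):
    """Return the most common label and the share of volunteers who chose it."""
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(labels)

for item, labels in crowd_labels.items():
    label, agreement = aggregate(labels)
    print(f"{item}: {label} (agreement {agreement:.0%})")
```

Independent judgments plus simple aggregation is exactly what the free-for-all discussion format discards.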

Toward an Ecological Model of Research and Development


Ben Shneiderman, the founding director of the Human-Computer Interaction Lab, in The Atlantic: “The choice between basic and applied research is a false one….The belief that basic or pure research lays the foundation for applied research was fixed in science policy circles by Vannevar Bush’s 1945 report on Science: The Endless Frontier. Unfortunately, his unsubstantiated beliefs have remained attractive to powerful advocates of basic research who seek funding for projects that may or may not advance innovation and economic growth. Shifting the policy agenda to recognize that applied research goals often trigger more effective basic research could accelerate both applied and basic research….the highest payoffs often come when there is a healthy interaction of basic and applied research (Figure 3). This ecological model also suggests that basic and applied research are embedded in a rich context of large development projects and continuing efforts to refine production & operations.”
[Figure: Shneiderman's ecological model of basic and applied research]

Open Data Research Announced


WWW Foundation Press Release: “Speaking at an Open Government Partnership reception last night in London, Sir Tim Berners-Lee, founder of the World Wide Web Foundation (Web Foundation) and inventor of the Web, unveiled details of the first ever in-depth study into how the power of open data could be harnessed to tackle social challenges in the developing world. The 14-country study is funded by Canada’s International Development Research Centre (IDRC) and will be overseen by the Web Foundation’s world-leading open data experts. An interim progress update will be made at an October 2013 meeting of the Open Government Partnership, with in-depth results expected in 2014…

Sir Tim Berners-Lee, founder of the World Wide Web Foundation and inventor of the Web, said:

“Open Data, accessed via a free and open Web, has the potential to create a better world. However, best practice in London or New York is not necessarily best practice in Lima or Nairobi.  The Web Foundation’s research will help to ensure that Open Data initiatives in the developing world will unlock real improvements in citizens’ day-to-day lives.”

José M. Alonso, program manager at the World Wide Web Foundation, added:

“Through this study, the Web Foundation hopes not only to contribute to global understanding of open data, but also to cultivate the ability of developing world researchers and development workers to understand and apply open data for themselves.”

Further details on the project, including case study outlines, are available here: http://oddc.opendataresearch.org/

From Open Data to Information Justice


Paper by Jeffrey Johnson for the Annual Conference of the Midwest Political Science Association: “This paper argues for subsuming the question of open data within a larger question of information justice. I show that there are several problems of justice that emerge as a consequence of opening data to full public accessibility, and are generally a consequence of the failure of the open data movement to understand the constructed nature of data. I examine three such problems: the embedding of social privilege in datasets as the data is constructed, the differential capabilities of data users (especially differences between citizens and “enterprise” users), and the norms that data systems impose through their function as disciplinary systems.
In each case I show that open data has the quite real potential to exacerbate rather than alleviate injustices. This necessitates a theory of information justice. I briefly suggest two complementary directions in which such a theory might be developed: one leading toward moral principles that can be used to evaluate the justness of data practices, and another exploring the practices and structures that a social movement promoting information justice might pursue.”

The Social Affordances of the Internet for Networked Individualism


Paper by NetLab (University of Toronto) scholars in the latest issue of the Journal of Computer-Mediated Communication: “We review the evidence from a number of surveys in which our NetLab has been involved about the extent to which the Internet is transforming or enhancing community. The studies show that the Internet is used for connectivity locally as well as globally, although the nature of its use varies in different countries. Internet use is adding on to other forms of communication, rather than replacing them. Internet use is reinforcing the pre-existing turn to societies in the developed world that are organized around networked individualism rather than group or local solidarities. The result has important implications for civic involvement.”

Procurement needs better data now


Howard Rolfe, procurement director for East of England NHS Collaborative Procurement Hub, in The Guardian: “Knowledge management is fundamental to any organisation and procurement in the NHS is no exception. Current systems are not joined up and don’t give the level of information that should be expected. Management in many NHS trusts cannot say how effective procurement is within their organisation because they don’t have a dashboard of information that tells them, for example, the biggest spend areas, who is placing the order, what price is paid and how that price compares.
Systems now exist that could help answer these questions and increase board and senior management focus on this area of huge spend….The time for better data is now, the opportunity is at the top of political and management agendas and the need is overwhelming. What is the solution? The provision of effective knowledge management systems is key and will facilitate improvements in information, procurement and collaborative aggregation by providing greater visibility of spend and reduction of administrative activity.”
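Rolfe's "dashboard of information" amounts to joining order-level purchasing records and aggregating them by spend area, buyer, and price paid. A minimal sketch of that kind of aggregation, using invented column names and figures rather than any real NHS dataset, might look like this:

```python
# Illustrative spend aggregation; the column names and figures are invented
# and do not reflect any real NHS procurement system.
import pandas as pd

orders = pd.DataFrame({
    "category":   ["gloves", "gloves", "syringes", "syringes"],
    "ordered_by": ["Ward A", "Ward B", "Theatre", "Ward A"],
    "unit_price": [4.10, 4.85, 0.12, 0.15],
    "quantity":   [500, 300, 10000, 4000],
})
orders["spend"] = orders["unit_price"] * orders["quantity"]

# Biggest spend areas, largest first.
by_category = orders.groupby("category")["spend"].sum().sort_values(ascending=False)

# How each buyer's price compares with the best price achieved elsewhere.
best_price = orders.groupby("category")["unit_price"].transform("min")
orders["premium_vs_best"] = orders["unit_price"] - best_price

print(by_category)
print(orders[["category", "ordered_by", "unit_price", "premium_vs_best"]])
```

The analytical step is trivial; the hard part, as the piece argues, is getting joined-up order data into one place at all.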

The Dangers of Surveillance


Paper by Neil M. Richards in Harvard Law Review. Abstract:  “From the Fourth Amendment to George Orwell’s Nineteen Eighty-Four, our culture is full of warnings about state scrutiny of our lives. These warnings are commonplace, but they are rarely very specific. Other than the vague threat of an Orwellian dystopia, as a society we don’t really know why surveillance is bad, and why we should be wary of it. To the extent the answer has something to do with “privacy,” we lack an understanding of what “privacy” means in this context, and why it matters. Developments in government and corporate practices have made this problem more urgent. Although we have laws that protect us against government surveillance, secret government programs cannot be challenged until they are discovered.
… I propose a set of four principles that should guide the future development of surveillance law, allowing for a more appropriate balance between the costs and benefits of government surveillance. First, we must recognize that surveillance transcends the public-private divide. Even if we are ultimately more concerned with government surveillance, any solution must grapple with the complex relationships between government and corporate watchers. Second, we must recognize that secret surveillance is illegitimate, and prohibit the creation of any domestic surveillance programs whose existence is secret. Third, we should recognize that total surveillance is illegitimate and reject the idea that it is acceptable for the government to record all Internet activity without authorization. Fourth, we must recognize that surveillance is harmful. Surveillance menaces intellectual privacy and increases the risk of blackmail, coercion, and discrimination; accordingly, we must recognize surveillance as a harm in constitutional standing doctrine.”

How to Clean Up Social News


David Talbot in MIT Technology Review: “New platforms for fact-checking and reputation scoring aim to better channel social media’s power in the wake of a disaster…Researchers from the Masdar Institute of Science and Technology and the Qatar Computing Research Institute plan to launch Verily, a platform that aims to verify social media information, in a beta version this summer. Verily aims to enlist people in collecting and analyzing evidence to confirm or debunk reports. As an incentive, it will award reputation points—or dings—to its contributors.
Verily will join services like Storyful that use various manual and technical means to fact-check viral information, and apps such as SwiftRiver that, among other things, let people set up filters on social media to provide more weight to trusted users in the torrent of posts following major events…Reputation scoring has worked well for e-commerce sites like eBay and Amazon and could help to clean up social media reports in some situations.”
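The article mentions awarding reputation points, or dings, to contributors whose evidence is later confirmed or debunked, but does not spell out the scoring rules. The sketch below is therefore only one plausible shape for such a system, with invented point values and weighting, not Verily's actual algorithm; it also shows how accumulated reputation could weight a contributor's future reports.

```python
# Hypothetical reputation scoring for crowdsourced verification; the point
# values and weighting scheme are invented, not Verily's actual algorithm.

class ReputationLedger:
    def __init__(self):
        self.scores = {}  # contributor -> accumulated points

    def record(self, contributor, evidence_confirmed):
        """Award points for confirmed evidence, a 'ding' for debunked evidence."""
        delta = 5 if evidence_confirmed else -3
        self.scores[contributor] = self.scores.get(contributor, 0) + delta

    def weight(self, contributor):
        """Weight a contributor's future reports by reputation, with a small floor."""
        return max(0.1, 1.0 + self.scores.get(contributor, 0) / 10.0)

ledger = ReputationLedger()
ledger.record("alice", evidence_confirmed=True)
ledger.record("alice", evidence_confirmed=True)
ledger.record("bob", evidence_confirmed=False)

for user in ("alice", "bob"):
    print(user, ledger.scores[user], round(ledger.weight(user), 2))
```

The design question such systems face is the one eBay and Amazon already answered for commerce: make good behavior pay off over time without letting early mistakes silence a contributor forever.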

Sanitation Hackathon


New York Times: “Because of the rapid spread of cellular phones, mobile technology has previously been used to address a variety of problems in the developing world, including access to financial services, health care information and education. But toilets were another matter….Building on a process that had previously been employed to address problems in supplying clean water to people in poor areas, the World Bank turned its attention to sanitation. Over six months last year, it solicited ideas from experts in the field, as well as software developers. The process culminated in early December with the actual hackathon — two days in which more than 1,000 developers gathered in 40 cities worldwide to work on their projects….After the event in Washington, the winners of the hackathon are set to travel to Silicon Valley for meetings with venture capitalists and entrepreneurs who are interested in the issue. The World Bank does not plan to invest in the projects, but hopes that others might.”
See also http://www.sanitationhackathon.org/