The Next Frontier of Engagement: Civic Innovation Labs


Maayan Dembo at Planetizen: “As described by Clayton Christensen, a professor at the Harvard Business School who developed the term “disruptive innovation,” a successful office for social innovation should employ four main tactics to accomplish its mission. First, governments should invest “in innovations that are developed and identified by citizens outside of government who better understand the problems.” Second, the office should support “‘bottom-up’ initiatives, in preference to ‘trickle-down’ philanthropy—because the societal impact of the former is typically greater.” Third, Christensen argues that the office should utilize impact metrics to measure performance and, finally, that it should also invest in social innovation outside of the non-profit sector.
Los Angeles’ most recent citizen-driven social innovation initiative, the Civic Innovation Lab, is an 11-month project aimed at prototyping new solutions for issues within the city of Los Angeles. It is supported by the HubLA, Learn Do Share, the Los Angeles City Tech Bullpen, and Innovate LA, a membership organization within the Los Angeles County Economic Development Corporation. Private and public sector support for such labs, in one of the largest cities in America, is unprecedented, and because this initiative in Los Angeles is a new mechanism explicitly supported by the public sector, it warrants a critical check on its motivations and accomplishments. Depending on its success, the Civic Innovation Lab could serve as a model for other municipalities.
The Los Angeles Civic Innovation Lab operates in three main phases: 1) workshops where citizens learn about the possibilities of Open Data and discuss what deep challenges face Los Angeles (called the “Discover, Define, Design” stage), 2) a call for solutions to solve the design challenges brought to light in the first phase, and 3) a six-month accelerator program to prototype selected solutions. I participated in the most recent Civic Innovation Lab session, a three-day workshop concluding the “Discover, Define, Design” phase….”

Future Crimes


New book by Marc Goodman: “Technological advances have benefited our world in immeasurable ways—but there is an ominous flip side. Criminals are often the earliest, and most innovative, adopters of technology, and modern times have led to modern crimes. Today’s criminals are stealing identities, draining online bank accounts and wiping out computer servers. It’s disturbingly easy to activate baby monitors to spy on families, pacemakers can be hacked to deliver a lethal jolt of electricity, and thieves are analyzing your social media in order to determine the best time for a home invasion. Meanwhile, 3D printers produce AK-47s, terrorists can download the recipe for the Ebola virus, and drug cartels are building drones. This is just the beginning of the tsunami of technological threats coming our way. In Future Crimes, Marc Goodman rips open his database of hundreds of real cases to give us front-row access to these impending perils. Reading like a sci-fi thriller, but based in startling fact, Future Crimes raises tough questions about the expanding role of technology in our lives. Future Crimes is a call to action for better security measures worldwide, but most importantly, it will empower readers to protect themselves against looming technological threats—before it’s too late.”

Digital Sociology


New book by Deborah Lupton: “We now live in a digital society. New digital technologies have had a profound influence on everyday life, social relations, government, commerce, the economy and the production and dissemination of knowledge. People’s movements in space, their purchasing habits and their online communication with others are now monitored in detail by digital technologies. We are increasingly becoming digital data subjects, whether we like it or not, and whether we choose this or not.
The sub-discipline of digital sociology provides a means by which the impact, development and use of these technologies and their incorporation into social worlds, social institutions and concepts of selfhood and embodiment may be investigated, analysed and understood. This book introduces a range of interesting social, cultural and political dimensions of digital society and discusses some of the important debates occurring in research and scholarship on these aspects. It covers the new knowledge economy and big data, reconceptualising research in the digital era, the digitisation of higher education, the diversity of digital use, digital politics and citizen digital engagement, the politics of surveillance, privacy issues, the contribution of digital devices to embodiment and concepts of selfhood and many other topics.”

The Reliability of Tweets as a Supplementary Method of Seasonal Influenza Surveillance


New Paper by Ming-Hsiang Tsou et al in the Journal of Medical Internet Research: “Existing influenza surveillance in the United States is focused on the collection of data from sentinel physicians and hospitals; however, the compilation and distribution of reports are usually delayed by up to 2 weeks. With the popularity of social media growing, the Internet is a source for syndromic surveillance due to the availability of large amounts of data. In this study, tweets, or posts of 140 characters or less, from the website Twitter were collected and analyzed for their potential as surveillance for seasonal influenza.
Objective: There were three aims: (1) to improve the correlation of tweets to sentinel-provided influenza-like illness (ILI) rates by city through filtering and a machine-learning classifier, (2) to observe correlations of tweets for emergency department ILI rates by city, and (3) to explore correlations for tweets to laboratory-confirmed influenza cases in San Diego.
Methods: Tweets containing the keyword “flu” were collected within a 17-mile radius from 11 US cities selected for population and availability of ILI data. At the end of the collection period, 159,802 tweets were used for correlation analyses with sentinel-provided ILI and emergency department ILI rates as reported by the corresponding city or county health department. Two separate methods were used to observe correlations between tweets and ILI rates: filtering the tweets by type (non-retweets, retweets, tweets with a URL, tweets without a URL), and the use of a machine-learning classifier that determined whether a tweet was “valid”, or from a user who was likely ill with the flu.
Results: Correlations varied by city but general trends were observed. Non-retweets and tweets without a URL had higher and more significant (P<.05) correlations than retweets and tweets with a URL. Correlations of tweets to emergency department ILI rates were higher than the correlations observed for sentinel-provided ILI for most of the cities. The machine-learning classifier yielded the highest correlations for many of the cities when using the sentinel-provided or emergency department ILI as well as the number of laboratory-confirmed influenza cases in San Diego. High correlation values (r=.93) with significance at P<.001 were observed for laboratory-confirmed influenza cases for most categories and tweets determined to be valid by the classifier.
Conclusions: Compared to tweet analyses in the previous influenza season, this study demonstrated increased accuracy in using Twitter as a supplementary surveillance tool for influenza: better filtering and classification methods yielded higher correlations for the 2013-2014 influenza season than had been found for tweets in the previous season, when emergency department ILI rates correlated better with tweets than sentinel-provided ILI rates did. Further investigations in the field would require expansion of the locations from which tweets are collected, as well as the availability of more ILI data…”
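As a rough illustration of the filter-and-correlate approach the abstract describes, the sketch below counts weekly tweet volume for one filtered category and correlates it with ILI rates. The file names, column names and boolean filter columns are assumptions for illustration; the authors' actual classifier and datasets are not reproduced here.

```python
# Minimal sketch of the filter-and-correlate idea from the abstract above.
# File and column names (created_at, is_retweet, has_url, week, ili_rate)
# are hypothetical; the paper's own classifier and data are not reproduced here.
import pandas as pd
from scipy.stats import pearsonr

tweets = pd.read_csv("flu_tweets.csv", parse_dates=["created_at"])   # hypothetical export
ili = pd.read_csv("sentinel_ili.csv", parse_dates=["week"])          # hypothetical ILI rates

# Filter: keep non-retweets without a URL, the category the paper found
# to correlate best with ILI rates.
filtered = tweets[(~tweets["is_retweet"]) & (~tweets["has_url"])]

# Aggregate tweet volume by week.
weekly = (filtered
          .set_index("created_at")
          .resample("W")
          .size()
          .rename("tweet_count")
          .reset_index()
          .rename(columns={"created_at": "week"}))

merged = weekly.merge(ili, on="week", how="inner")
r, p = pearsonr(merged["tweet_count"], merged["ili_rate"])
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```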

A New Ebola Crisis Page Built with Open Data


HDX team: “We are introducing a new Ebola crisis page that provides an overview of the data available in HDX. The page includes an interactive map of the worst-affected countries, the top-line figures for the crisis, a graph of cumulative Ebola cases and deaths, and over 40 datasets.
We have been working closely with UNMEER and WHO to make Ebola data available for public use. We have also received important contributions from the British Red Cross, InterAction, MapAction, the Standby Task Force, the US Department of Defense, and WFP, among others.

How we built it

The process to create this page started a couple of months ago by simply linking to existing data sites, such as Open Street Map’s geospatial data or OCHA’s common operational datasets. We then created a service by extracting the data on Ebola cases and deaths from the bi-weekly WHO situation report and making the raw files available for analysts and developers.
The OCHA Regional Office in Dakar contributed a dataset that included Ebola cases by district, which they had been collecting from reports by the national Ministries of Health since March 2014. This data was picked up by The New York Times graphics team and by Gapminder, which partnered with Google Crisis Response to add the data to the Google Public Data Explorer.

As more organizations shared Ebola datasets through HDX, users started to transform the data into useful graphs and maps. These visuals were then shared back with the wider community through the HDX gallery. We have incorporated many of these user-generated visual elements into the design of our new Ebola crisis page….”
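For readers who want to work with the raw files described above, here is a minimal sketch of producing the kind of cumulative cases-and-deaths graph featured on the crisis page. The file name and column names are hypothetical placeholders, not the exact schema of the datasets published on HDX.

```python
# Minimal sketch of turning extracted situation-report data into a cumulative
# cases-and-deaths graph. The file name and column names (date, country,
# cumulative_cases, cumulative_deaths) are assumptions for illustration.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("ebola_cases_deaths.csv", parse_dates=["date"])

# Sum over the worst-affected countries to get region-wide cumulative totals.
totals = df.groupby("date")[["cumulative_cases", "cumulative_deaths"]].sum()

ax = totals.plot(title="Cumulative Ebola cases and deaths (sketch)")
ax.set_xlabel("Date")
ax.set_ylabel("Count")
plt.tight_layout()
plt.savefig("ebola_cumulative.png")
```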
See also Hacking Ebola.

A New Taxonomy of Smart City Projects


New paper by Guido Perboli et al: “City logistics proposes an integrated vision of freight transportation systems within urban areas, aiming to optimize them as a whole in terms of efficiency, security, safety, viability and environmental sustainability. Recently, this perspective has been extended by the Smart City concept to include other aspects of city management: building, energy, environment, government, living, mobility, education, health and so on. To the best of our knowledge, a classification of Smart City projects has not yet been created. This paper introduces such a classification, highlighting success factors and analyzing new trends in Smart City projects.”

Code of Conduct: Cyber Crowdsourcing for Good


Patrick Meier at iRevolution: “There is currently no unified code of conduct for digital crowdsourcing efforts in the development, humanitarian or human rights space. As such, we propose the following principles (displayed below) as a way to catalyze a conversation on these issues and to improve and/or expand this Code of Conduct as appropriate.
This initial draft was put together by Kate Chapman, Brooke Simons and myself. The link above points to this open, editable Google Doc. So please feel free to contribute your thoughts by inserting comments where appropriate. Thank you.
An organization that launches a digital crowdsourcing project must:

  • Provide clear volunteer guidelines on how to participate in the project so that volunteers are able to contribute meaningfully.
  • Test their crowdsourcing platform prior to any project or pilot to ensure that the system will not crash due to obvious bugs.
  • Disclose the purpose of the project, exactly which entities will be using and/or have access to the resulting data, to what end exactly, over what period of time and what the expected impact of the project is likely to be.
  • Disclose whether volunteer contributions to the project will or may be used as training data in subsequent machine learning research.
  • ….

An organization that launches a digital crowdsourcing project should:

  • Share as much of the resulting data with volunteers as possible without violating data privacy or the principle of Do No Harm.
  • Enable volunteers to opt out of having their tasks contribute to subsequent machine learning research by providing the option of having their contributions withheld from such studies.
  • … “
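As an illustration only, and not part of the draft Code of Conduct itself, the disclosure and opt-out principles above could be recorded as simply as a consent flag stored with each contribution. The sketch below is a hypothetical data structure; all field names are invented for the example.

```python
# Illustrative sketch only: one way a crowdsourcing platform could record the
# consent-related principles above (disclosure of data use, opt-out from machine
# learning research). Field names are hypothetical, not part of the draft itself.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class VolunteerContribution:
    volunteer_id: str
    task_id: str
    payload: dict                      # the actual crowdsourced content
    allow_ml_training: bool = False    # opt-in/opt-out flag for ML research
    data_use_notice_version: str = ""  # which disclosure text the volunteer saw
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def ml_training_set(contributions):
    """Keep only contributions whose authors did not opt out of ML research."""
    return [c for c in contributions if c.allow_ml_training]
```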

Spain is trialling city monitoring using sound


Springwise: “There’s more traffic on today’s city streets than there ever has been, and managing it all can prove to be a headache for local authorities and transport bodies. In the past, we’ve seen the City of Calgary in Canada detect drivers’ Bluetooth signals to develop a map of traffic congestion. Now the EAR-IT project in Santander, Spain, is using acoustic sensors to measure the sounds of city streets and determine real time activity on the ground.
Launched as part of the autonomous community’s SmartSantander initiative, the experimental scheme placed hundreds of acoustic processing units around the region. These pick up the sounds being made in any given area and, when processed through an audio recognition engine, can provide data about what’s going on on the street. Smaller ‘motes’ were also developed to provide more accurate location information about each sound.
Created by members of Portugal’s UNINOVA institute and IT consultants EGlobalMark, the system was able to use city noises to detect things such as traffic congestion, parking availability and the location of emergency vehicles based on their sirens. It could then automatically trigger smart signs to display up-to-date information, for example.
The team particularly focused on a junction near the city hospital that’s a hotspot for motor accidents. Rather than force ambulance drivers to risk passing through a red light and into lateral traffic, the sensors were able to detect when and where an emergency vehicle was coming through and automatically change the lights in their favor.
The system could also be used to pick up ‘sonic events’ such as gunshots or explosions and detect their location. The researchers have also trialled an indoor version that can sense whether an elderly resident has fallen over or turn lights off when the room becomes silent.”
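A rough sketch of the event flow Springwise describes, from classified sound to an automatic response, might look like the following. The classifier output, the `lights` and `signs` controllers and the confidence threshold are placeholders, not the actual EAR-IT components.

```python
# Rough sketch of the flow described above: an acoustic event is classified and,
# if it is an ambulance siren near the monitored junction, the traffic lights are
# switched in its favour. The controller objects and threshold are placeholders;
# the real EAR-IT components are not public APIs.
from dataclasses import dataclass

@dataclass
class AcousticEvent:
    label: str        # e.g. "siren", "gunshot", "traffic"
    confidence: float
    location: str     # identifier of the sensor mote / junction that heard it

def handle_event(event: AcousticEvent, lights, signs):
    if event.label == "siren" and event.confidence > 0.8:
        # Give the emergency vehicle a green corridor at its junction.
        lights.set_priority(event.location, direction="emergency")
        signs.display(event.location, "Emergency vehicle approaching")
    elif event.label in {"gunshot", "explosion"}:
        signs.display(event.location, "Incident reported - avoid area")
```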

Hashtag Standards For Emergencies


Key Findings of a New Report by the UN Office for the Coordination of Humanitarian Affairs: “

  • The public is using Twitter for real-time information exchange and for expressing emotional support during a variety of crises, such as wildfires, earthquakes, floods, hurricanes, political protests, mass shootings, and communicable-disease tracking. By encouraging proactive standardization of hashtags, emergency responders may be able to reduce a big-data challenge and better leverage crowdsourced information for operational planning and response.
  • Twitter is the primary social media platform discussed in this Think Brief. However, the use of hashtags has spread to other social media platforms, including Sina Weibo, Facebook, Google+ and Diaspora. As a result, the ideas behind hashtag standardization may have a much larger sphere of influence than just this one platform.
  • Three hashtag standards are encouraged and discussed: early standardization of the disaster name (e.g., #Fay), reporting non-emergency needs (e.g., #PublicRep) and requesting emergency assistance (e.g., #911US).
  • As well as standardizing hashtags, emergency response agencies should encourage the public to enable Global Positioning System (GPS) when tweeting during an emergency. This will provide highly detailed information to facilitate response.
  • Non-governmental groups, national agencies and international organizations should discuss the potential added value of monitoring social media during emergencies. These groups need to agree who is establishing the standards for a given country or event, which agency disseminates these prescriptive messages, and who is collecting and validating the incoming crowdsourced reports.
  • Additional efforts should be pursued regarding how to best link crowdsourced information into emergency response operations and logistics. If this information will be collected, the teams should be ready to act on it in a timely manner.”
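To make the monitoring recommendation concrete, the sketch below filters a stream of tweets for the standardized hashtags named in the report and keeps only geotagged reports that responders could act on. The tweet dictionaries and field names are simplified stand-ins for whatever a Twitter client would return.

```python
# Sketch of the monitoring idea above: watch for standardized emergency hashtags
# and keep only geotagged reports. Tweet dictionaries are simplified stand-ins
# for a real Twitter client's output.
STANDARD_TAGS = {"#fay", "#publicrep", "#911us"}   # example hashtags from the report

def actionable_reports(tweets):
    """Yield tweets that carry a standard emergency hashtag and GPS coordinates."""
    for tweet in tweets:
        tags = {t.lower() for t in tweet.get("hashtags", [])}
        if tags & STANDARD_TAGS and tweet.get("coordinates"):
            yield {
                "text": tweet["text"],
                "tags": sorted(tags & STANDARD_TAGS),
                # GeoJSON-style coordinates are [longitude, latitude]
                "lat": tweet["coordinates"][1],
                "lon": tweet["coordinates"][0],
            }
```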

Politics, Policy and Privatisation in the Everyday Experience of Big Data in the NHS


Chapter by Andrew Goffey, Lynne Pettinger and Ewen Speed in Martin Hand and Sam Hillyard (eds.), Big Data? Qualitative Approaches to Digital Research (Studies in Qualitative Methodology, Volume 13): “This chapter explains how fundamental organisational change in the UK National Health Service (NHS) is being effected by new practices of digitised information gathering and use. It analyses the taken-for-granted IT infrastructures that lie behind digitisation and considers the relationship between digitisation and big data.
Design/methodology/approach

Qualitative research methods, including discourse analysis, ethnography of software and key informant interviews, were used. Actor-network theories, as developed by Science and Technology Studies (STS) researchers, were used to inform the research questions, data gathering and analysis. The chapter focuses on the aftermath of legislation to change the organisation of the NHS.

Findings

The chapter shows the benefits of qualitative research into specific manifestations of information technology. It explains how apparently ‘objective’ and ‘neutral’ quantitative data gathering and analysis are mediated by complex software practices. It considers the political power of claims that data is neutral.

Originality/value

The chapter provides insight into a specific case of healthcare data. It makes explicit the role of politics and the State in digitisation and shows how STS approaches can be used to understand political and technological practice.”