When Tech Culture And Urbanism Collide


John Tolva: “…We can build upon the success of the work being done at the intersection of technology and urban design, right now.

For one, the whole realm of social enterprise — for-profit startups that seek to solve real social problems — has a huge overlap with urban issues. Impact Engine in Chicago, for instance, is an accelerator squarely focused on meaningful change and profitable businesses. One of their companies, Civic Artworks, has set as its goal rebalancing the community planning process.

The Code for America Accelerator and Tumml, both located in San Francisco, morph the concept of social innovation into civic/urban innovation. The companies nurtured by CfA and Tumml are filled with technologists and urbanists working together to create profitable businesses. Like WorkHands, a kind of LinkedIn for blue collar trades. Would something like this work outside a city? Maybe. Are its effects outsized and scale-ready in a city? Absolutely. That’s the opportunity in urban innovation.

Scale is what powers the sharing economy and it thrives because of the density and proximity of cities. In fact, shared resources at critical density is one of the only good definitions for what a city is. It’s natural that entrepreneurs have overlaid technology on this basic fact of urban life to amplify its effects. Would TaskRabbit, Hailo or LiquidSpace exist in suburbia? Probably, but their effects would be minuscule and investors would get restless. The city in this regard is the platform upon which sharing economy companies prosper. More importantly, companies like this change the way the city is used. It’s not urban planning, but it is urban (re)design and it makes a difference.

A twist that many in the tech sector who complain about cities often miss is that change in a city is not the same thing as change in city government. Obviously they are deeply intertwined; change is mighty hard when it is done at cross-purposes with government leadership. But it happens all the time. Non-government actors — foundations, non-profits, architecture and urban planning firms, real estate developers, construction companies — contribute massively to the shape and health of our cities.

Often this contribution is powered through policies of open data publication by municipal governments. Open data is the raw material of a city, the vital signs of what has happened there, what is happening right now, and the deep pool of patterns for what might happen next.

Tech entrepreneurs would do well to look at the organizations and companies capitalizing on this data as the real change agents, not government itself. Even the data in many cases is generated outside government. Citizens often do the most interesting data-gathering, with tools like LocalData. The most exciting thing happening at the intersection of technology and cities today — what really makes them “smart” — is what is happening at the periphery of city government. It’s easy to belly-ache about government and certainly there are administrations that do not make data public (or shut it down), but tech companies who are truly interested in city change should know that there are plenty of examples of how to start up and do it.

And yet, the somewhat staid world of architecture and urban-scale design presents the most opportunity to a tech community interested in real urban change. While technology obviously plays a role in urban planning — 3D visual design tools like Revit and mapping services like ArcGIS are foundational for all modern firms — data analytics as a serious input to design has only been used in specialized (mostly energy-efficiency) scenarios. Where are the predictive analytics, the holistic models, the software-as-a-service providers for the brave new world of urban informatics and The Internet of Things? Technologists, it’s our move.

Something’s amiss when some city governments — rarely the vanguard in technological innovation — have more sophisticated tools for data-driven decision-making than the private sector firms who design the city. But some understand the opportunity. Vannevar Technology is working on it, as is Synthicity. There’s plenty of room for the most positive aspects of tech culture to remake the profession of urban planning itself. (Look to NYU’s Center for Urban Science and Progress and the University of Chicago’s Urban Center for Computation and Data for leadership in this space.)…”

6 New Year’s Strategies for Open Data Entrepreneurs


The GovLab’s Senior Advisor Joel Gurin: “Open Data has fueled a wide range of startups, including consumer-focused websites, business-to-business services, data-management tech firms, and more. Many of the companies in the Open Data 500 study are new ones like these. New Year’s is a classic time to start new ventures, and with 2014 looking like a hot year for Open Data, we can expect more startups using this abundant, free resource. For my new book, Open Data Now, I interviewed dozens of entrepreneurs and distilled six of the basic strategies that they’ve used.
1. Learn how to add value to free Open Data. We’re seeing an inversion of the value proposition for data. It used to be that whoever owned the data—particularly Big Data—had greater opportunities than those who didn’t. While this is still true in many areas, it’s also clear that successful businesses can be built on free Open Data that anyone can use. The value isn’t in the data itself but rather in the analytical tools, expertise, and interpretation that’s brought to bear. One oft-cited example: The Climate Corporation, which built a billion-dollar business out of government weather and satellite data that’s freely available for use.
2. Focus on big opportunities: health, finance, energy, education. A business can be built on just about any kind of Open Data. But the greatest number of startup opportunities will likely be in the four big areas where the federal government is focused on Open Data release. Last June’s Health Datapalooza showcased the opportunities in health. Companies like Opower in energy, GreatSchools in education, and Calcbench, SigFig, and Capital Cube in finance are examples in these other major sectors.
3. Explore choice engines and Smart Disclosure apps. Smart Disclosure – releasing data that consumers can use to make marketplace choices – is a powerful tool that can be the basis for a new sector of online startups. No one, it seems, has quite figured out how to make this form of Open Data work best, although sites like CompareTheMarket in the UK may be possible models. Business opportunities await anyone who can find ways to provide these much-needed consumer services. One example: Kayak, which competed in the crowded travel field by providing a great consumer interface, and which was sold to Priceline for $1.8 billion last year.
4. Help consumers tap the value of personal data. In a privacy-conscious society, more people will be interested in controlling their personal data and sharing it selectively for their own benefit. The value of personal data is just being recognized, and opportunities remain to be developed. There are business opportunities in setting up and providing “personal data vaults” and more opportunity in applying the many ways they can be used. Personal and Reputation.com are two leaders in this field.
5. Provide new data solutions to governments at all levels. Government datasets at the federal, state, and local level can be notoriously difficult to use. The good news is that these governments are now realizing that they need help. Data management for government is a growing industry, as Socrata, OpenGov, 3RoundStones, and others are finding, while companies like Enigma.io are turning government data into a more usable resource.
6. Look for unusual Open Data opportunities. Building a successful business by gathering data on restaurant menus and recipes is not an obvious route to success. But it’s working for Food Genius, whose founders showed a kind of genius in tapping an opportunity others had missed. While the big areas for Open Data are becoming clear, there are countless opportunities to build more niche businesses that can still be highly successful. If you have expertise in an area and see a customer need, there’s an increasingly good chance that the Open Data to help meet that need is somewhere to be found.”
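Gurin’s first strategy — that the value lies in the analysis layered on top of free data, not in owning the data — can be illustrated with a toy sketch. The records, station codes, and field names below are invented for illustration; real open datasets (weather observations, spending records, permits) follow the same pattern of raw rows plus a derived summary:

```python
from collections import defaultdict

def summarize_by_station(records):
    """Aggregate raw open-data rows into a per-station summary.

    Each record is a dict with 'station' and 'temp_c' keys (a hypothetical
    schema). The added value is the aggregation, not the data itself.
    """
    totals = defaultdict(lambda: [0.0, 0])  # station -> [sum, count]
    for row in records:
        acc = totals[row["station"]]
        acc[0] += row["temp_c"]
        acc[1] += 1
    return {station: total / count for station, (total, count) in totals.items()}

# Three raw observations become one actionable summary.
raw = [
    {"station": "KORD", "temp_c": 2.0},
    {"station": "KORD", "temp_c": 4.0},
    {"station": "KSFO", "temp_c": 15.0},
]
print(summarize_by_station(raw))  # {'KORD': 3.0, 'KSFO': 15.0}
```

A real business would replace the inline list with a feed from a government open-data portal, but the economics are the same: the raw rows are free to everyone, and the competitive layer is the analysis.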

Open Data in Action


Nick Sinai at the White House: “Over the past few years, the Administration has launched a series of Open Data Initiatives, which have released troves of valuable data in areas such as health, energy, education, public safety, finance, and global development…
Today, in furtherance of this exciting economic dynamic, The Governance Lab (The GovLab) —a research institution at New York University—released the beta version of its Open Data 500 project—an initiative designed to identify, describe, and analyze companies that use open government data in order to study how these data can serve business needs more effectively. As part of this effort, the organization is compiling a list of 500+ companies that use open government data to generate new business and develop new products and services.
This working list of 500+ companies, from sectors ranging from real estate to agriculture to legal services, shines a spotlight on a surprising array of innovative and creative ways that open government data is being used to grow the economy – across different company sizes, different geographies, and different industries. The project includes information about the companies and what government datasets they have identified as critical resources for their business.
Some examples from the Open Data 500 Project include:
  • Brightscope, a San Diego-based company that leverages data from the Department of Labor, the Securities and Exchange Commission, and the Census Bureau to rate consumers’ 401k plans objectively on performance and fees, so companies can choose better plans and employees can make better decisions about their retirement options.
  • AllTuition, a Chicago-based startup that provides services—powered by data from the Department of Education on Federal student financial aid programs and student loans—to help students and parents manage the financial-aid process for college, in part by helping families keep track of deadlines, and walking them through the required forms.
  • Archimedes, a San Francisco healthcare modeling and analytics company that leverages Federal open data from the National Institutes of Health, the Centers for Disease Control and Prevention, and the Centers for Medicare and Medicaid Services to provide doctors with more effective individualized treatment plans and to enable patients to make informed health decisions.
You can learn more about the project and view the list of open data companies.

See also:
Open Government Data: Companies Cash In

NYU project touts 500 top open-data firms”

Selected Readings on Data Visualization


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of data visualization was originally published in 2013.

Data visualization is a response to the ever-increasing amount of information in the world. With big data, informatics and predictive analytics, we have an unprecedented opportunity to revolutionize policy-making. Yet data by itself can be overwhelming. New tools and techniques for visualizing information can help policymakers clearly articulate insights drawn from data. Moreover, the rise of open data is enabling those outside of government to create informative and visually arresting representations of public information that can be used to support decision-making by those inside or outside governing institutions.

Annotated Selected Reading List (in alphabetical order)

Duke, D.J., K.W. Brodlie, D.A. Duce and I. Herman. “Do You See What I Mean? [Data Visualization].” IEEE Computer Graphics and Applications 25, no. 3 (2005): 6–9. http://bit.ly/1aeU6yA.

  • In this paper, the authors argue that a more systematic ontology for data visualization is needed to ensure the successful communication of meaning. “Visualization begins when someone has data that they wish to explore and interpret; the data are encoded as input to a visualization system, which may in its turn interact with other systems to produce a representation. This is communicated back to the user(s), who have to assess this against their goals and knowledge, possibly leading to further cycles of activity. Each phase of this process involves communication between two parties. For this to succeed, those parties must share a common language with an agreed meaning.”
  • The authors “believe that now is the right time to consider an ontology for visualization,” and “as visualization move from just a private enterprise involving data and tools owned by a research team into a public activity using shared data repositories, computational grids, and distributed collaboration…[m]eaning becomes a shared responsibility and resource. Through the Semantic Web, there is both the means and motivation to develop a shared picture of what we see when we turn and look within our own field.”

Friendly, Michael. “A Brief History of Data Visualization.” In Handbook of Data Visualization, 15–56. Springer Handbooks Comp.Statistics. Springer Berlin Heidelberg, 2008. http://bit.ly/17fM1e9.

  • In this paper, Friendly explores the “deep roots” of modern data visualization. “These roots reach into the histories of the earliest map making and visual depiction, and later into thematic cartography, statistics and statistical graphics, medicine and other fields. Along the way, developments in technologies (printing, reproduction), mathematical theory and practice, and empirical observation and recording enabled the wider use of graphics and new advances in form and content.”
  • Just as the visualization of data in general is far from a new practice, Friendly shows that the graphical representation of government information has a similarly long history. “The collection, organization and dissemination of official government statistics on population, trade and commerce, social, moral and political issues became widespread in most of the countries of Europe from about 1825 to 1870. Reports containing data graphics were published with some regularity in France, Germany, Hungary and Finland, and with tabular displays in Sweden, Holland, Italy and elsewhere.”

Graves, Alvaro and James Hendler. “Visualization Tools for Open Government Data.” In Proceedings of the 14th Annual International Conference on Digital Government Research, 136–145. Dg.o ’13. New York, NY, USA: ACM, 2013. http://bit.ly/1eNSoXQ.

  • In this paper, the authors argue that, “there is a gap between current Open Data initiatives and an important part of the stakeholders of the Open Government Data Ecosystem.” As it stands, “there is an important portion of the population who could benefit from the use of OGD but who cannot do so because they cannot perform the essential operations needed to collect, process, merge, and make sense of the data. The reasons behind these problems are multiple, the most critical one being a fundamental lack of expertise and technical knowledge. We propose the use of visualizations to alleviate this situation. Visualizations provide a simple mechanism to understand and communicate large amounts of data.”
  • The authors also describe a prototype of a tool to create visualizations based on OGD with the following capabilities:
    • Facilitate visualization creation
    • Exploratory mechanisms
    • Viralization and sharing
    • Repurpose of visualizations
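Graves and Hendler’s argument — that visualization lowers the expertise barrier for making sense of open data — can be sketched even without a charting library. The following toy example (all labels and counts invented) turns a frequency table into a proportional text bar chart in one step:

```python
def text_bar_chart(counts, width=20):
    """Render a {label: value} mapping as a proportional text bar chart.

    A minimal stand-in for the one-step, no-expertise-required
    visualization the authors advocate: the largest value gets a
    full-width bar and the rest are scaled against it.
    """
    peak = max(counts.values())
    lines = []
    for label, value in counts.items():
        bar = "#" * round(width * value / peak)
        lines.append(f"{label:<12}{bar} {value}")
    return "\n".join(lines)

print(text_bar_chart({"Permits": 120, "Complaints": 45, "Inspections": 80}))
```

Even a rendering this crude communicates relative magnitudes at a glance, which is the gap the paper identifies between raw open government data and the people who could benefit from it.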

Hidalgo, César A. “Graphical Statistical Methods for the Representation of the Human Development Index and Its Components.” United Nations Development Programme Human Development Reports, September 2010. http://bit.ly/166TKur.

  • In this paper for the United Nations Human Development Programme, Hidalgo argues that “graphical statistical methods could be used to help communicate complex data and concepts through universal cognitive channels that are heretofore underused in the development literature.”
  • To support his argument, representations are provided that “show how graphical methods can be used to (i) compare changes in the level of development experienced by countries (ii) make it easier to understand how these changes are tied to each one of the components of the Human Development Index (iii) understand the evolution of the distribution of countries according to HDI and its components and (iv) teach and create awareness about human development by using iconographic representations that can be used to graphically narrate the story of countries and regions.”

Stowers, Genie. “The Use of Data Visualization in Government.” IBM Center for The Business of Government, Using Technology Series, 2013. http://bit.ly/1aame9K.

  • This report seeks “to help public sector managers understand one of the more important areas of data analysis today — data visualization. Data visualizations are more sophisticated, fuller graphic designs than the traditional spreadsheet charts, usually with more than two variables and, typically, incorporating interactive features.”
  • Stowers also offers numerous examples of “visualizations that include geographical and health data, or population and time data, or financial data represented in both absolute and relative terms — and each communicates more than simply the data that underpin it. In addition to these many examples of visualizations, the report discusses the history of this technique, and describes tools that can be used to create visualizations from many different kinds of data sets.”

NESTA: 14 predictions for 2014


NESTA: “Every year, our team of in-house experts predicts what will be big over the next 12 months.
This year we set out our case for why 2014 will be the year we’re finally delivered the virtual reality experience we were promised two decades ago, the US will lose technological control of the Internet, communities will start crowdsourcing their own political representatives and we’ll be introduced to the concept of extreme volunteering – plus 10 more predictions spanning energy, tech, health, data, impact investment and social policy…
People powered data

The growing movement to take back control of personal data will reach a tipping point, says Geoff Mulgan
2014 will be the year when citizens start to take control over their own data. So far the public has accepted a dramatic increase in use of personal data because it doesn’t impinge much on freedom, and helps to give us a largely free internet.
But all of that could be about to change. Edward Snowden’s NSA revelations have fuelled a growing perception that the big social media firms are cavalier with personal data (a perception not helped by Facebook and Google’s recent moves to make tracking cookies less visible) and the Information Commissioner has described the data protection breaches of many internet firms, banks and others as ‘horrifying’.
According to some this doesn’t matter. Scott McNealy of Sun Microsystems famously dismissed the problem: “you have zero privacy anyway. Get over it.” Mark Zuckerberg claims that young people no longer worry about making their lives transparent. We’re willing to be digital chattels so long as it doesn’t do us any visible harm.
That’s the picture now. But the past isn’t always a good guide to the future. More digitally savvy young people put a high premium on autonomy and control, and don’t like being the dupes of big organisations. We increasingly live with a digital aura alongside our physical identity – a mix of trails, data, pictures. We will increasingly want to shape and control that aura, and will pay a price if we don’t.
That’s why the movement for citizen control over data has gathered momentum. It’s 30 years since Germany enshrined ‘informational self-determination’ in the constitution and other countries are considering similar rules. Organisations like Mydex and Qiy now give users direct control over a store of their personal data, part of an emerging sector of Personal Data Stores, Privacy Dashboards and even ‘Life Management Platforms’. 
In the UK, the government-backed Midata programme is encouraging firms to migrate data back to public control, while the US has introduced green, yellow and blue buttons to simplify the option of taking back your data (in energy, education and the Veterans Administration respectively). Meanwhile a parallel movement encourages people to monetise their own data – so that, for example, Tesco or Experian would have to pay for the privilege of making money out of analysing your purchases and behaviours.
When people are shown what really happens to their data now they are shocked. That’s why we may be near a tipping point. A few more scandals could blow away any remaining complacency about the near future world of ubiquitous facial recognition software (Google Glasses and the like), a world where more people are likely to spy on their neighbours, lovers and colleagues.
The crowdsourced politician

This year we’ll see the rise of the crowdsourced independent parliamentary candidate, says Brenton Caffin
…In response, existing political institutions have sought to improve feedback between the governing and the governed through the tentative embrace of crowdsourcing methods, ranging from digital engagement strategies and open government challenges to the recent stalled attempt by the Conservative Party to embrace open primaries (Iceland has been braver, designing its constitution by wiki). Though for many, these efforts are both too little and too late. The sense of frustration that no political party is listening to the real needs of people is probably part of the reason Russell Brand’s interview with Jeremy Paxman garnered nine million views in its first month on YouTube.
However a glimpse of an alternative approach may have arrived courtesy of the 2013 Australian Federal Election.
Tired of being taken for granted by the local MP, locals in the traditionally safe conservative seat of Indi embarked on a structured process of community ‘kitchen table’ conversations to articulate an independent account of the region’s needs. The community group, Voice for Indi, later nominated its chair, Cathy McGowan, as an independent candidate. It crowdfunded her campaign and built a formidable army of volunteers through a sophisticated social media operation….
The rise of ‘extreme’ volunteering

By the end of 2014 the concept of volunteering will move away from the soup kitchen and become an integral part of how our communities operate, says Lindsay Levkoff Lynn
Extreme volunteering is about regular people going beyond the usual levels of volunteering. It is a deeper and more intensive form of volunteering, and I predict we will see more of these amazing commitments of ‘people helping people’ in the years to come.
Let me give you a few early examples of what we are already starting to see in the UK:

  • Giving a whole year of your life in service of kids. That’s what City Year volunteers do – young people (18-25) dedicate a year, full-time, before university or work to support head teachers in turning around the behaviour and academics of some of the most underprivileged UK schools.
  • Giving a stranger a place to live and making them part of your family. That’s what Shared Lives Plus carers do. They ‘adopt’ an older person or a person with learning disabilities and offer them a place in their family. So instead of institutional care, families provide the full-time care – much like a ‘fostering for adults’ programme. Can you imagine inviting someone to come and live with you?…

Powering European public sector innovation – Towards a new architecture


Report of the expert group on public sector innovation: “The European Union faces an unprecedented crisis in economic growth, which has put public services under tremendous financial pressure. Many governments are also faced with long-term issues such as ageing societies, mounting social security and healthcare costs, high youth unemployment and a public service infrastructure that sometimes lags behind the needs of modern citizens and businesses. Under these conditions, innovation in public services is critical for the continued provision of such public services, in both quantity and quality. Public sector innovation can be defined as the process of generating new ideas, and implementing them to create value for society either through new or improved processes or services. The available evidence indicates that innovation in the public sector mostly happens randomly, rather than as a result of deliberate, systematic and strategic efforts. Innovation in the public sector, through strategic change, needs to become more ‘persistent’ and ‘cumulative’, in pursuit of a new and more collaborative governance model. There is a need for a new architecture for public sector innovation. Much can be done in individual Member States, regions and in local government to build capacity to innovate and to steer change processes. Innovation can emerge at all levels and innovation leadership can come from anyone. It is however the conviction of the expert group that the European institutions – including the Council of Ministers, the European Parliament, and the European Commission – can also play significant roles in fostering innovation both at European Union level and in individual Member States.”

The Documented Life


Sherry Turkle in the New York Times: “I’ve been studying people and mobile technology for more than 15 years. Until recently, it was the sharing that seemed most important. People didn’t seem to feel like themselves unless they shared a thought or feeling, even before it was clear in their mind. The new sensibility played on the Cartesian with a twist: “I share, therefore I am.”

These days, we still want to share, but now our first focus is to have, to possess, a photograph of our experience.

I interview people about their selfies. It’s how they keep track of their lives….We interrupt conversations for documentation all the time.

A selfie, like any photograph, interrupts experience to mark the moment. In this, it shares something with all the other ways we break up our day, when we text during class, in meetings, at the theater, at dinners with friends. And yes, at funerals, but also more regularly at church and synagogue services. We text when we are in bed with our partners and spouses. We watch our political representatives text during sessions.

Technology doesn’t just do things for us. It does things to us, changing not just what we do but who we are. The selfie makes us accustomed to putting ourselves and those around us “on pause” in order to document our lives. It is an extension of how we have learned to put our conversations “on pause” when we send or receive a text, an image, an email, a call. When you get accustomed to a life of stops and starts, you get less accustomed to reflecting on where you are and what you are thinking.

We don’t experience interruptions as disruptions anymore….”

Building tech-powered public services


New publication by Sarah Bickerstaffe from IPPR (UK): “Given the rapid pace of technological change and take-up by the public, it is a question of when not if public services become ‘tech-powered’. This new paper asks how we can ensure that innovations are successfully introduced and deployed.
Can technology improve the experience of people using public services, or does it simply mean job losses and a depersonalised offer to users?
Could tech-powered public services be an affordable, sustainable solution to some of the challenges of these times of austerity?
This report looks at 20 case studies of digital innovation in public services, using these examples to explore the impact of new and disruptive technologies. It considers how tech-powered public services can be delivered, focusing on the area of health and social care in particular.
We identify three key benefits of increasing the role of technology in public services: saving time, boosting user participation, and encouraging users to take responsibility for their own wellbeing.
In terms of how to successfully implement technological innovations in public services, five particular lessons stood out clearly and consistently:

  1. User-based iterative design is critical to delivering a product that solves real-world problems. It builds trust and ensures the technology works in the context in which it will be used.
  2. Public sector expertise is essential in order for a project to make the connections necessary to initial development and early funding.
  3. Access to seed and bridge funding is necessary to get projects off the ground and allow them to scale up.
  4. Strong leadership from within the public sector is crucial to overcoming the resistance that practitioners and managers often show initially.
  5. A strong business case that sets out the quality improvements and cost savings that the innovation can deliver is important to get attention and interest from public services.

The seven headline case studies in this report are:

  • Patchwork creates an elegant solution to join up professionals working with troubled families, in an effort to ensure that frontline support is truly coordinated.
  • Casserole Club links people who like cooking with their neighbours who are in need of a hot meal, employing the simplest possible technology to grow social connections.
  • ADL Smartcare uses a facilitated assessment tool to make professional expertise accessible to staff and service users without years of training, meaning they can carry out assessments together, engaging people in their own care and freeing up occupational therapists to focus where they are needed.
  • Mental Elf makes leading research in mental health freely available via social media, providing accessible summaries to practitioners and patients who would not otherwise have the time or ability to read journal articles, which are often hidden behind a paywall.
  • Patient Opinion provides an online platform for people to give feedback on the care they have received and for healthcare professionals and providers to respond, disrupting the typical complaints process and empowering patients and their families.
  • The Digital Pen and form system has saved the pilot hospital trust three minutes per patient by avoiding the need for manual data entry, freeing up clinical and administrative staff for other tasks.
  • Woodland Wiggle allows children in hospital to enter a magical woodland world through a giant TV screen, where they can have fun, socialise, and do their physiotherapy.”

Selected Readings on Crowdsourcing Data


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing data was originally published in 2013.

As institutions seek to improve decision-making through data and put public data to use to improve the lives of citizens, new tools and projects are allowing citizens to play a role in both the collection and utilization of data. Participatory sensing and other citizen data collection initiatives, notably in the realm of disaster response, are allowing citizens to crowdsource important data, often using smartphones, that would be either impossible or burdensomely time-consuming for institutions to collect themselves. Civic hacking, often performed in hackathon events, on the other hand, is a growing trend in which governments encourage citizens to transform data from government and other sources into useful tools to benefit the public good.
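The participatory-sensing pattern described above can be sketched in a few lines. The sites, readings, and field names here are invented for illustration; the point is that individual citizen reports (say, smartphone noise readings) are noisy, so a common first step is a robust per-location aggregate such as the median:

```python
from collections import defaultdict
from statistics import median

def aggregate_readings(reports):
    """Collapse crowdsourced sensor reports into one robust value per site.

    Each report is a (site, value) pair. The median damps outliers
    from faulty sensors or one-off bad readings, which matters when
    the data comes from many uncalibrated citizen devices.
    """
    by_site = defaultdict(list)
    for site, value in reports:
        by_site[site].append(value)
    return {site: median(values) for site, values in by_site.items()}

# The 90 dB outlier at the park does not drag the estimate upward.
reports = [("park", 55), ("park", 58), ("park", 90), ("plaza", 62)]
print(aggregate_readings(reports))  # {'park': 58, 'plaza': 62}
```

Real participatory-sensing platforms add location validation, timestamps, and contributor reputation on top of this core step, but the aggregation logic is the same.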

Annotated Selected Reading List (in alphabetical order)

Baraniuk, Chris. “Power Politechs.” New Scientist 218, no. 2923 (June 29, 2013): 36–39. http://bit.ly/167ul3J.

  • In this article, Baraniuk discusses civic hackers, “an army of volunteer coders who are challenging preconceptions about hacking and changing the way your government operates. In a time of plummeting budgets and efficiency drives, those in power have realised they needn’t always rely on slow-moving, expensive outsourcing and development to improve public services. Instead, they can consider running a hackathon, at which tech-savvy members of the public come together to create apps and other digital tools that promise to enhance the provision of healthcare, schools or policing.”
  • While recognizing that “civic hacking has established a pedigree that demonstrates its potential for positive impact,” Baraniuk argues that a “more rigorous debate over how this activity should evolve, or how authorities ought to engage in it” is needed.

Barnett, Brandon, Muki Hansteen Izora, and Jose Sia. “Civic Hackathon Challenges Design Principles: Making Data Relevant and Useful for Individuals and Communities.” Hack for Change, https://bit.ly/2Ge6z09.

  • In this paper, researchers from Intel Labs offer “guiding principles to support the efforts of local civic hackathon organizers and participants as they seek to design actionable challenges and build useful solutions that will positively benefit their communities.”
  • The authors’ proposed design principles are:
    • Focus on the specific needs and concerns of people or institutions in the local community. Solve their problems and challenges by combining different kinds of data.
    • Seek out data far and wide (local, municipal, state, institutional, non-profits, companies) that is relevant to the concern or problem you are trying to solve.
    • Keep it simple! This can’t be overstated. Focus [on] making data easily understood and useful to those who will use your application or service.
    • Enable users to collaborate and form new communities and alliances around data.

Buhrmester, Michael, Tracy Kwang, and Samuel D. Gosling. “Amazon’s Mechanical Turk: A New Source of Inexpensive, Yet High-Quality, Data?” Perspectives on Psychological Science 6, no. 1 (January 1, 2011): 3–5. http://bit.ly/H56lER.

  • This article examines the capability of Amazon’s Mechanical Turk to act as a source of data for researchers, in addition to its traditional role as a microtasking platform.
  • The authors examine the demographics of MTurkers and find that “(a) MTurk participants are slightly more demographically diverse than are standard Internet samples and are significantly more diverse than typical American college samples; (b) participation is affected by compensation rate and task length, but participants can still be recruited rapidly and inexpensively; (c) realistic compensation rates do not affect data quality; and (d) the data obtained are at least as reliable as those obtained via traditional methods.”
  • The paper concludes that, just as MTurk can be a strong tool for crowdsourcing tasks, data derived from MTurk can be high quality while also being inexpensive and obtained rapidly.

Goodchild, Michael F., and J. Alan Glennon. “Crowdsourcing Geographic Information for Disaster Response: a Research Frontier.” International Journal of Digital Earth 3, no. 3 (2010): 231–241. http://bit.ly/17MBFPs.

  • This article examines issues of data quality in the face of the new phenomenon of geographic information being generated by citizens, in order to examine whether this data can play a role in emergency management.
  • The authors argue that “[d]ata quality is a major concern, since volunteered information is asserted and carries none of the assurances that lead to trust in officially created data.”
  • Due to the fact that time is crucial during emergencies, the authors argue that, “the risks associated with volunteered information are often outweighed by the benefits of its use.”
  • The paper examines four wildfires in Santa Barbara in 2007–2009 to discuss current challenges with volunteered geographical data, and concludes that further research is required to answer how volunteer citizens can be used to provide effective assistance to emergency managers and responders.

Hudson-Smith, Andrew, Michael Batty, Andrew Crooks, and Richard Milton. “Mapping for the Masses: Accessing Web 2.0 Through Crowdsourcing.” Social Science Computer Review 27, no. 4 (November 1, 2009): 524–538. http://bit.ly/1c1eFQb.

  • This article describes the way in which “we are harnessing the power of web 2.0 technologies to create new approaches to collecting, mapping, and sharing geocoded data.”
  • The authors examine GMapCreator and MapTube, which allow users to do a range of map-related functions such as create new maps, archive existing maps, and share or produce bottom-up maps through crowdsourcing.
  • They conclude that “these tools are helping to define a neogeography that is essentially ‘mapping for the masses,’” while noting that there are many issues of quality, accuracy, copyright, and trust that will influence the impact of these tools on map-based communication.

Kanhere, Salil S. “Participatory Sensing: Crowdsourcing Data from Mobile Smartphones in Urban Spaces.” In Distributed Computing and Internet Technology, edited by Chittaranjan Hota and Pradip K. Srimani, 19–26. Lecture Notes in Computer Science 7753. Springer Berlin Heidelberg, 2013. https://bit.ly/2zX8Szj.

  • This paper provides a comprehensive overview of participatory sensing — a “new paradigm for monitoring the urban landscape” in which “ordinary citizens can collect multi-modal data streams from the surrounding environment using their mobile devices and share the same using existing communications infrastructure.”
  • In addition to examining a number of innovative applications of participatory sensing, Kanhere outlines the following key research challenges:
    • Dealing with incomplete samples
    • Inferring user context
    • Protecting user privacy
    • Evaluating data trustworthiness
    • Conserving energy
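One of these challenges, evaluating data trustworthiness, can be hinted at with a minimal sketch: when many citizens contribute readings for the same location, a robust consensus statistic makes it possible to flag a miscalibrated or malicious contributor. The noise readings and the 10 dB threshold below are invented for illustration.

```python
import statistics

# Hypothetical noise-level readings (dB) for one street block, contributed
# by different phones; one device is clearly miscalibrated.
readings = [62.1, 60.8, 63.4, 61.5, 95.0]

# The median is robust to a single bad contributor, unlike the mean.
consensus = statistics.median(readings)

# Flag readings far from the consensus as untrustworthy; the 10 dB
# threshold is an arbitrary choice for this sketch.
trusted = [r for r in readings if abs(r - consensus) <= 10.0]
```

Production systems weigh contributors by reputation and account for spatial and temporal context, but redundancy plus robust aggregation is the basic idea.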

When the wisdom of crowds meets the kindness of strangers


Tim Kelsey (NHS) on why patient and citizen participation is fundamental to high quality health and care services: “…But above all my priority is to improve the way in which health and care services listen to people – and can therefore act and change. The work of entrepreneurs and apps developers like Patients Like Me, Patient Opinion and iwantgreatcare confirms the benefits of real time patient and citizen participation. The challenge is to do this at scale: open the doors, invite the whole community into the job of improving our national health service. Share decision making. Everybody needs the opportunity – and should be encouraged – to participate.
In April, the NHS did something unprecedented – it launched the Friends and Family Test (FFT), the first time a health service has reported a single measure of patient satisfaction for every hospital. It asked people to say whether they would recommend local inpatient and A&E services; the results are published every month on NHS Choices. By October more than 1m people had participated and hundreds of thousands had volunteered additional real time comments and feedback to local hospitals. ‘Great news’, said David Cameron – who has championed FFT from the start – in a tweet to mark the milestone, ‘giving patients a stronger voice in the NHS’.
This is the boldest move yet to promote patient voice at volume in the NHS and to concentrate our collective focus on improvement in care. At Hillingdon Hospitals NHS Trust, patients reported they could not sleep at night so staff have launched a ‘comfort at night’ campaign and developed a protocol for patient experience ‘never events’. In Lewisham, patients complained about poor communication and staff attitudes. They now plan daily visits for each patient. In Hull, bereaved families complained they had to pay car parking fees; the Trust has now given free passes to relatives in mourning. Routine feedback enables a different kind of conversation between the patient and the clinician. It is a catalyst for change. Commissioners will have to demonstrate how they are improving FFT for local communities to qualify for Quality Premium incentives.
This kind of customer insight is fundamental to the way we make choices as consumers. The NHS is not a hotel chain, nor a city authority: but there are vital lessons it can learn from Amazon and TripAdvisor about the power of transparency and feedback. In New York, more than 90,000 people every day share their views by phone, email and tweet on rubbish collections, potholes and dangerous buildings – and the city has become safer and cleaner.
Friends and family has its critics: people worry about the potential for gaming, for example. But the evidence, after six months, is of overwhelming human benefit and that’s why every maternity unit started to offer FFT to patients in October and why every NHS service will do so from 2015. It’s also why we are now requiring that every local organisation should offer people the chance to comment on, as well as rate, services from next year (most already do).
Some people ask me how we are ensuring the focus on transparency and participation is inclusive. We have launched Care Connect, a pilot project to test how giving people access by telephone and social media could improve feedback and complaints. Recognising that digital exclusion is an issue in some of our communities, we have started a partnership with the Tinder Foundation to help 100,000 people learn how to go online for health benefit. None of these initiatives exist in isolation, nor do I see them as ‘silver bullets’. My aim is to work with and build on existing good practice to make people’s voices heard and help the NHS act on them.
In a characteristically thoughtful talk last week, MT Rainey, social activist and former marketing guru, issued this challenge to the NHS: ‘How will we make the wisdom of the crowd meet the kindness of strangers?’ How do the tools of our age – big data, the internet, the mobile phone – meet the values of our species: compassion and honesty and doing our best for others? Friends and family is a good start. We are witnessing the birth of a new knowledge economy and a new social movement. The future is open.”