The Surprising History of the Infographic


Clive Thompson in Smithsonian magazine: “As the 2016 election approaches, we’re hearing a lot about ‘red states’ and ‘blue states.’ That idiom has become so ingrained that we’ve almost forgotten where it originally came from: a data visualization.

In the 2000 presidential election, the race between Al Gore and George W. Bush was so razor close that broadcasters pored over electoral college maps—which they typically colored red and blue. What’s more, they talked about those shadings. NBC’s Tim Russert wondered aloud how George Bush would “get those remaining 61 electoral red states, if you will,” and that language became lodged in the popular imagination. America became divided into two colors—data spun into pure metaphor. Now Americans even talk routinely about “purple” states, a mental visualization of political information.

We live in an age of data visualization. Go to any news website and you’ll see graphics charting support for the presidential candidates; open your iPhone and the Health app will generate personalized graphs showing how active you’ve been this week, month or year. Sites publish charts showing how the climate is changing, how schools are segregating, how much housework mothers do versus fathers. And newspapers are increasingly finding that readers love “dataviz”: In 2013, the New York Times’ most-read story for the entire year was a visualization of regional accents across the United States. It makes sense. We live in an age of Big Data. If we’re going to understand our complex world, one powerful way is to graph it.

But this isn’t the first time we’ve discovered the pleasures of making information into pictures. Over a hundred years ago, scientists and thinkers found themselves drowning in their own flood of data—and to help understand it, they invented the very idea of infographics.

**********

The idea of visualizing data is old: After all, that’s what a map is—a representation of geographic information—and we’ve had maps for about 8,000 years. But it was rare to graph anything other than geography. Only a few examples exist: Around the 11th century, a now-anonymous scribe created a chart of how the planets moved through the sky. By the 18th century, scientists were warming to the idea of arranging knowledge visually. The British polymath Joseph Priestley produced a “Chart of Biography,” plotting the lives of about 2,000 historical figures on a timeline. A picture, he argued, conveyed the information “with more exactness, and in much less time, than it [would take] by reading.”

Still, data visualization was rare because data was rare. That began to change rapidly in the early 19th century, because countries began to collect—and publish—reams of information about their weather, economic activity and population. “For the first time, you could deal with important social issues with hard facts, if you could find a way to analyze it,” says Michael Friendly, a professor of psychology at York University who studies the history of data visualization. “The age of data really began.”

An early innovator was the Scottish inventor and economist William Playfair. As a teenager he apprenticed to James Watt, the Scottish inventor who perfected the steam engine. Playfair was tasked with drawing up patents, which required him to develop excellent drafting and picture-drawing skills. After he left Watt’s lab, Playfair became interested in economics, convinced that he could use his facility for illustration to make data come alive.

“An average political economist would have certainly been able to produce a table for publication, but not necessarily a graph,” notes Ian Spence, a psychologist at the University of Toronto who’s writing a biography of Playfair. Playfair, who understood both data and art, was perfectly positioned to create this new discipline.

In one famous chart, he plotted the price of wheat in the United Kingdom against the cost of labor. People often complained about the high cost of wheat and thought wages were driving the price up. Playfair’s chart showed this wasn’t true: Wages were rising much more slowly than the cost of the product.

Playfair’s trade-balance time-series chart, published in his Commercial and Political Atlas, 1786 (Wikipedia)

“He wanted to discover,” Spence notes. “He wanted to find regularities or points of change.” Playfair’s illustrations often look amazingly modern: In one, he drew pie charts—his invention, too—and lines that compared the size of various countries’ populations against their tax revenues. Once again, the chart produced a new, crisp analysis: The British paid far higher taxes than citizens of other nations.

Neurology was not yet a robust science, but Playfair seemed to intuit some of its principles. He suspected the brain processed images more readily than words: A picture really was worth a thousand words. “He said things that sound almost like a 20th-century vision researcher,” Spence adds. Data, Playfair wrote, should “speak to the eyes”—because they were “the best judge of proportion, being able to estimate it with more quickness and accuracy than any other of our organs.” A really good data visualization, he argued, “produces form and shape to a number of separate ideas, which are otherwise abstract and unconnected.”

Soon, intellectuals across Europe were using data visualization to grapple with the travails of urbanization, such as crime and disease….(More)”

Directory of crowdsourcing websites


Directory by Donelle McKinley: “…Here is just a selection of websites for crowdsourcing cultural heritage. Websites are actively crowdsourcing unless indicated with an asterisk… The directory is organized by the type of crowdsourcing process involved, using the typology for crowdsourcing in the humanities developed by Dunn & Hedges (2012). In their study they explain that ‘a process is a sequence of tasks, through which an output is produced by operating on an asset’. For example, the Your Paintings Tagger website is for the process of tagging, which is an editorial task. The assets being tagged are images, and the output of the project is metadata, which makes the images easier to discover, retrieve and curate.

Transcription

Alexander Research Library, Wanganui Library* (NZ) Transcription of index cards from 1840 to 2002.

Ancient Lives*, University of Oxford (UK) Transcription of papyri from Greco-Roman Egypt.

AnnoTate, Tate Britain (UK) Transcription of artists’ diaries, letters and sketchbooks.

Decoding the Civil War, The Huntington Library, Abraham Lincoln Presidential Library and Museum & North Carolina State University (USA). Transcription and decoding of Civil War telegrams from the Thomas T. Eckert Papers.

DIY History, University of Iowa Libraries (USA) Transcription of historical documents.

Emigrant City, New York Public Library (USA) Transcription of handwritten mortgage and bond ledgers from the Emigrant Savings Bank records.

Field Notes of Laurence M. Klauber, San Diego Natural History Museum (USA) Transcription of field notes by the celebrated herpetologist.

Notes from Nature Transcription of natural history museum records.

Measuring the ANZACs, Archives New Zealand and Auckland War Memorial Museum (NZ). Transcription of first-hand accounts of NZ soldiers in WW1.

Old Weather (UK) Transcription of Royal Navy ships’ logs from the early twentieth century.

Scattered Seeds, Heritage Collections, Dunedin Public Libraries (NZ) Transcription of index cards for Dunedin newspapers, 1851-1993.

Shakespeare’s World, Folger Shakespeare Library (USA) & Oxford University Press (UK). Transcription of handwritten documents by Shakespeare’s contemporaries. Identification of words that have yet to be recorded in the authoritative Oxford English Dictionary.

Smithsonian Digital Volunteers Transcription Center (USA) Transcription of multiple collections.

Transcribe Bentham, University College London (UK) Transcription of historical manuscripts by philosopher and reformer Jeremy Bentham.

What’s on the menu? New York Public Library (USA) Transcription of historical restaurant menus. …

(Full Directory).

The Billions We’re Wasting in Our Jails


Stephen Goldsmith and Jane Wiseman in Governing: “By using data analytics to make decisions about pretrial detention, local governments could find substantial savings while making their communities safer….

Few areas of local government spending present better opportunities for dramatic savings than those that surround pretrial detention. Cities and counties are wasting more than $3 billion a year, and often inducing crime and job loss, by holding the wrong people while they await trial. The problem: Only 10 percent of jurisdictions use risk data analytics when deciding which defendants should be detained.

As a result, dangerous people are out in our communities, while many who could be safely in the community are behind bars. Vast numbers of people accused of petty offenses spend their pretrial detention time jailed alongside hardened convicts, learning from them how to be better criminals….

In this era of big data, analytics not only can predict and prevent crime but also can discern who should be diverted from jail to treatment for underlying mental health or substance abuse issues. Avoided costs aggregating in the billions could be better spent on detaining high-risk individuals, expanding mental health and substance abuse treatment, and funding more police officers and other public safety services.

Jurisdictions that do use data to make pretrial decisions have achieved not only lower costs but also greater fairness and lower crime rates. Washington, D.C., releases 85 percent of defendants awaiting trial. Compared to the national average, those released in D.C. are two and a half times more likely to remain arrest-free and one and a half times as likely to show up for court.

Louisville, Ky., implemented risk-based decision-making using a tool developed by the Laura and John Arnold Foundation and now releases 70 percent of defendants before trial. Those released have turned out to be twice as likely to return to court and to stay arrest-free as those in other jurisdictions. Mesa County, Colo., and Allegheny County, Pa., both have achieved significant savings from reduced jail populations due to data-driven release of low-risk defendants.
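
To make “risk data analytics” concrete, here is a minimal sketch of a rules-based pretrial risk score, assuming a handful of commonly cited case factors. The factor names, thresholds and weights are hypothetical illustrations, not the Arnold Foundation tool or any jurisdiction’s actual instrument:

```python
# Illustrative only: hypothetical factors and weights, not any real pretrial tool.
def pretrial_risk_score(age_at_arrest, prior_felony_convictions,
                        prior_failures_to_appear, current_charge_violent):
    """Return a coarse 0-10 risk score from a few case facts."""
    score = 0
    if age_at_arrest < 23:                        # younger defendants score higher
        score += 2
    score += min(prior_felony_convictions, 3)     # cap the contribution of priors
    score += min(prior_failures_to_appear * 2, 4)
    if current_charge_violent:
        score += 1
    return score

# A defendant with no priors and a non-violent charge scores 0 (low risk),
# which under a risk-based policy would support release before trial.
print(pretrial_risk_score(age_at_arrest=35, prior_felony_convictions=0,
                          prior_failures_to_appear=0, current_charge_violent=False))
```

In practice, tools of this kind are validated against outcome data (failure to appear, rearrest) rather than hand-set, but the structure is the same: a small number of case facts mapped to a score that informs a release decision.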

Data-driven approaches are beginning to produce benefits not only in the area of pretrial detention but throughout the criminal justice process. Dashboards now in use in a handful of jurisdictions allow not only administrators but also the public to see court waiting times by offender type and to identify and address processing bottlenecks….(More)”

Nudging for Success


Press Release: “A groundbreaking report published today by ideas42 reveals several innovations that college administrators and policymakers can leverage to significantly improve college graduation rates at a time when completion is more out of reach than ever for millions of students.

The student path through college to graduation day is strewn with subtle, often invisible barriers that, over time, hinder students’ progress and cause some of them to drop out entirely. In Nudging for Success: Using Behavioral Science to Improve the Postsecondary Student Journey, ideas42 focuses on simple, low-cost ways to combat these unintentional obstacles and support student persistence and success at every stage in the college experience, from pre-admission to post-graduation. Teams worked with students, faculty and administrators at colleges around the country.

Even for students whose tuition is covered by financial aid, whose academic preparation is exemplary, and who are able to commit themselves full-time to their education, the subtle logistical and psychological sticking points can have a huge impact on their ability to persist and fully reap the benefits of a higher education.

Less than 60% of full-time students graduate from four-year colleges within six years, and less than 30% graduate from community colleges within three years. Myriad factors are often cited as deterrents to finishing school, such as the cost of tuition or the need to juggle family and work obligations, but behavioral science and the results of this report demonstrate that lesser-known dynamics like self-perception are also at play.

From increasing financial aid filing to fostering positive friend groups and a sense of belonging on campus, the 16 behavioral solutions outlined in Nudging for Success represent the potential for significant impact on the student experience and persistence. At Arizona State University, sending behaviorally designed email reminders to students and parents about the Free Application for Federal Student Aid (FAFSA) priority deadline increased submissions by 72% and led to an increase in grant awards. Freshman retention among low-income, first-generation, underrepresented, or other students most at risk of dropping out increased by 10% at San Francisco State University with the use of a testimonial video, self-affirming exercises, and monthly messaging aimed at first-time students.

“This evidence demonstrates how behavioral science can be the key to uplifting millions of Americans through education,” said Alissa Fishbane, Managing Director at ideas42. “By approaching the completion crisis from the whole experience of students themselves, administrators and policymakers have the opportunity to reduce the number of students who start, but do not finish, college—students who take on the financial burden of tuition but miss out on the substantial benefits of earning a degree.”

The results of this work drive home the importance of examining the college experience from the student perspective and through the lens of human behavior. College administrators and policymakers can replicate these gains at institutions across the country to make it simpler for students to complete the degree they started in ways that are often easier and less expensive to implement than existing alternatives—paving the way to stronger economic futures for millions of Americans….(More)”

In Your Neighborhood, Who Draws the Map?


Lizzie MacWillie at NextCity: “…By crowdsourcing neighborhood boundaries, residents can put themselves on the map in critical ways.

Why does this matter? Neighborhoods are the smallest organizing element in any city. A strong city is made up of strong neighborhoods, where the residents can effectively advocate for their needs. A neighborhood boundary marks off a particular geography and calls out important elements within that geography: architecture, street fabric, public spaces and natural resources, to name a few. Putting that line on a page lets residents begin to identify needs and set priorities. Without boundaries, there’s no way to know where to start.

Knowing a neighborhood’s boundaries and unique features allows a group to list its assets. What buildings have historic significance? What shops and restaurants exist? It also helps highlight gaps: What’s missing? What does the neighborhood need more of? What is there already too much of? Armed with this detailed inventory, residents can approach a developer, city council member or advocacy group with hard numbers on what they know their neighborhood needs.

With a precisely defined geography, residents living in a food desert can point to developable vacant land that’s ideal for a grocery store. They can also cite how many potential grocery shoppers live within the neighborhood.
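
In data terms, a crowdsourced neighborhood boundary is simply a polygon, and the question of how many potential shoppers live inside it is a point-in-polygon count. A minimal sketch follows; the boundary coordinates and household locations are invented for illustration, not drawn from any real neighborhood:

```python
# Minimal sketch: a neighborhood boundary as a polygon, households as points.
# All coordinates are invented for illustration.
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the point (x, y) inside polygon [(x1, y1), ...]?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Toggle on each polygon edge that a rightward ray from (x, y) crosses.
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

neighborhood = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]  # boundary vertices
households = [(1.2, 1.1), (3.8, 2.9), (5.0, 1.0)]                # household locations

inside_count = sum(point_in_polygon(x, y, neighborhood) for x, y in households)
print(f"{inside_count} of {len(households)} households fall inside the boundary")
```

Run against parcel or census point data instead of made-up coordinates, the same test is what lets residents attach hard numbers (shoppers within the boundary, vacant parcels within it) to their case for a grocery store.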

In addition to enabling residents to organize within the neighborhood, staking a claim to a neighborhood, putting it on a map and naming it, can help a neighborhood control its own narrative and tell its story — so someone else doesn’t.

Our neighborhood map project was started in part as a response to consistent misidentification of Dallas neighborhoods by local media, which appears to be particularly common in stories about majority-minority neighborhoods. This kind of oversight can contribute to a false narrative about a place, especially when the news is about crime or violence, and takes away from residents’ ability to tell their story and shape their neighborhood’s future. Even worse is when neighborhoods are completely left off of the map, as if they have no story at all to tell.

Neighborhood mapping can also counter narrative hijacking like I’ve seen in my hometown of Brooklyn, where realtor-driven neighborhood rebranding has led to areas being renamed. These places have their own unique identities and histories, yet longtime residents saw names changed so that real estate sellers could capitalize on increasing property values in adjacent trendy neighborhoods.

Cities across the country — including Dallas, Boston, New York, Chicago, Portland and Seattle — have crowdsourced mapping projects people can contribute to. For cities lacking such an effort, tools like Google Map Maker have been effective….(More)”.

Using Behavioral Science to Combat Climate Change


Cass R. Sunstein and Lucia A. Reisch in the Oxford Research Encyclopedia of Climate Science (Forthcoming): “Careful attention to choice architecture promises to open up new possibilities for reducing greenhouse gas emissions – possibilities that go well beyond, and that may supplement or complement, the standard tools of economic incentives, mandates, and bans. How, for example, do consumers choose between climate-friendly products or services and alternatives that are potentially damaging to the climate but less expensive? The answer may well depend on the default rule. Indeed, climate-friendly default rules may well be a more effective tool for altering outcomes than large economic incentives. The underlying reasons include the power of suggestion; inertia and procrastination; and loss aversion. If well-chosen, climate-friendly defaults are likely to have large effects in reducing the economic and environmental harms associated with various products and activities. In deciding whether to establish climate-friendly defaults, choice architects (subject to legal constraints) should consider both consumer welfare and a wide range of other costs and benefits. Sometimes that assessment will argue strongly in favor of climate-friendly defaults, particularly when both economic and environmental considerations point in their direction. Notably, surveys in the United States and Europe show that majorities in many nations are in favor of climate-friendly defaults….(More)”

Transforming governance: how can technology help reshape democracy?


Research Briefing by Matt Leighninger: “Around the world, people are asking how we can make democracy work in new and better ways. We are frustrated by political systems in which voting is the only legitimate political act, concerned that many republics don’t have the strength or appeal to withstand authoritarian figures, and disillusioned by the inability of many countries to address the fundamental challenges of health, education and economic development.

We can no longer assume that the countries of the global North have ‘advanced’ democracies, and that the nations of the global South simply need to catch up. Citizens of these older democracies have increasingly lost faith in their political institutions; Northerners cherish their human rights and free elections, but are clearly looking for something more. Meanwhile, in the global South, new regimes based on a similar formula of rights and elections have proven fragile and difficult to sustain. And in Brazil, India and other Southern countries, participatory budgeting and other valuable democratic innovations have emerged. The stage is set for a more equitable, global conversation about what we mean by democracy.

How can we adjust our democratic formulas so that they are more sustainable, powerful, fulfilling – and, well, democratic? Some of the parts of this equation may come from the development of online tools and platforms that help people to engage with their governments, with organisations and institutions, and with each other. Often referred to collectively as ‘civic technology’ or ‘civic tech’, these tools can help us map public problems, help citizens generate solutions, gather input for government, coordinate volunteer efforts, and help neighbours remain connected. If we want to create democracies in which citizens have meaningful roles in shaping public decisions and solving public problems, we should be asking a number of questions about civic tech, including:

  • How can online tools best support new forms of democracy?
  • What are the examples of how this has happened?
  • What are some variables to consider in comparing these examples?
  • How can we learn from each other as we move forward?

This background note has been developed to help democratic innovators explore these questions and examine how their work can provide answers….(More)”

What Can Civic Tech Learn From Social Movements?


Stacy Donohue at Omidyar Network: “…In order to spur creative thinking about how the civic tech sector could be accelerated and expanded, we looked to Purpose, a public benefit corporation that works with NGOs, philanthropies, and brands on movement-building strategies. We wanted to explore what we might learn from taking the work that Purpose has done mapping the progress of 21st-century social movements and applying its methodology to civic tech.

So why consider viewing civic tech through the lens of 21st-century movements? Movements are engines of change in society that enable citizens to create new and better paths to engage with government and to seek recourse on issues that matter to millions of people. At first glance, civic tech doesn’t appear to be a movement in the purest sense of the term, but on closer inspection, it does share some fundamental characteristics. Like a movement, civic tech is mission-driven, is focused on making change that benefits the public, and in most cases enables better public input into decision making.

We believe that better understanding the essential components of movements, and observing the ways in which civic tech does or does not behave like one, can yield insights on how we as a civic tech community can collectively drive the sector forward….

The report Engines of Change: What Civic Tech Can Learn From Social Movements… provides a lot of rich insight and detail, which we invite everyone to explore. Meanwhile, we have summarized five key findings:

  1. Grassroots activity is expanding across the US – Activity is no longer centralized around San Francisco and New York; it’s rapidly growing and spreading across the US – in fact, there was an 81% increase in the number of cities hosting civic tech Meetups from 2013 to 2015, and 45 of 50 states had at least one Meetup on civic tech in 2015.
  2. Talk is turning to action – We are walking the talk. One way we can see this is that growth in civic tech Twitter discussion is highly correlated with the growth in GitHub contributions to civic tech projects and related Meetup events. Between 2013 and 2015, over 8,500 people contributed code to GitHub civic tech projects and there were over 76,000 Meetups for civic tech events.
  3. There is an engaged core, but it is very small in number – As with most social movements, civic tech has a definite core of highly engaged evangelists, advocates and entrepreneurs who are driving conversations, activity, and events, and this core is growing. The number of Meetup groups holding multiple events a quarter grew by 136% from 2013 to 2015. And likewise there was a 60% growth in Engaged Tweeters during this period. However, this level of activity is dwarfed by other movements such as climate action.
  4. Civic tech is growing but still lacking scale – There are many positive indications of growth in civic tech; for example, the combination of nonprofit and for-profit funding to the sector increased by almost 120% over the period. But while growth compares favorably to other movements, again the scale just isn’t there.
  5. Common themes, but no shared vision or identity – Purpose examined the extent to which civic tech exhibits and articulates a shared vision or identity around which members of a movement can rally. What they found is that far fewer people are discussing the same shared set of themes. Two themes – Open Data and Government Transparency – are resonating and gaining traction across the sector and could therefore form the basis of a common identity for civic tech.

While each of these insights is important in its own right and requires action to move the sector forward, the main thing that strikes us is the need for a coherent and clearly articulated vision and sense of shared identity for civic tech…

Read the full report: Engines of Change: What Civic Tech Can Learn From Social Movements

Explore the data tool here….(More)”

Using Innovation and Technology to Improve City Services


IBM Center for the Business of Government: “In this report, Professor Greenberg examines a dozen cities across the United States that have award-winning reputations for using innovation and technology to improve the services they provide to their residents. She explores a variety of success factors associated with effective service delivery at the local level, including:

  • The policies, platforms, and applications that cities use for different purposes, such as public engagement, streamlining the issuance of permits, and emergency response
  • How cities can successfully partner with third parties, such as nonprofits, foundations, universities, and private businesses to improve service delivery using technology
  • The types of business cases that can be presented to mayors and city councils to support various changes proposed by innovators in city government

Professor Greenberg identifies a series of trends that drive cities to undertake innovations, such as the increased use of mobile devices by residents. Based on cities’ responses to these trends, she offers a set of findings and specific actions that city officials can act upon to create innovation agendas for their communities. Her report also presents case studies for each of the dozen cities in her review. These cases provide a real-world context, which will allow interested leaders in other cities to see how their own communities might approach similar innovation initiatives.

This report builds on two other IBM Center reports: A Guide for Making Innovation Offices Work, by Rachel Burstein and Alissa Black, and The Persistence of Innovation in Government: A Guide for Public Servants, by Sandford Borins, which examines the use of awards to stimulate innovation in government.

We hope that government leaders who are interested in innovations using technology to improve services will benefit from the governance models and tools described in this report, as they consider how best to leverage innovation and technology initiatives to serve residents more effectively and efficiently….(More)”

Is artificial intelligence key to dengue prevention?


BreakDengue: “Dengue fever outbreaks are increasing in both frequency and magnitude. Not only that, the number of countries that could potentially be affected by the disease is growing all the time.

This growth has led to renewed efforts to address the disease, and a pioneering Malaysian researcher was recently recognized for his efforts to harness the power of big data and artificial intelligence to accurately predict dengue outbreaks.

Dr. Dhesi Baha Raja received the Pistoia Alliance Life Science Award at King’s College London in April of this year for developing a disease prediction platform that employs technology and data to give people advance warning of disease outbreaks. The medical doctor and epidemiologist has spent years working to develop AIME (Artificial Intelligence in Medical Epidemiology)…

It relies on a complex algorithm that analyzes a wide range of data collected by local government as well as satellite image recognition systems. Over 20 variables, such as weather, wind speed, wind direction, thunderstorms, solar radiation and rainfall schedules, are included and analyzed. Population models and geographical terrain are also included. The ultimate result of this intersection between epidemiology, public health and technology is a map that clearly illustrates the probability and location of the next dengue outbreak.
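
The article does not publish AIME’s algorithm, so the following is only a hedged sketch of how an environmental-feature model of this general kind can turn one location’s observations into an outbreak probability. The feature names echo the variables listed above, but the weights and the logistic form are assumptions made for illustration, not Dr. Dhesi’s actual model:

```python
import math

# Illustrative only: invented weights and features, not AIME's actual model.
WEIGHTS = {"rainfall_mm": 0.012, "mean_temp_c": 0.09,
           "wind_speed_kmh": -0.03, "solar_radiation": -0.01,
           "recent_cases": 0.25}
BIAS = -4.0

def outbreak_probability(observations):
    """Logistic score: weighted sum of a cell's features squashed into 0-1."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in observations.items())
    return 1.0 / (1.0 + math.exp(-z))

# Recent observations for one map grid cell (made-up numbers).
cell = {"rainfall_mm": 220, "mean_temp_c": 29, "wind_speed_kmh": 8,
        "solar_radiation": 18, "recent_cases": 6}
print(f"Predicted outbreak risk for this cell: {outbreak_probability(cell):.0%}")
```

Scoring every grid cell this way and shading cells by probability yields the kind of risk map the article describes.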

The ground-breaking platform can predict dengue fever outbreaks up to two or three months in advance, with an accuracy approaching 88.7 per cent and within a 400m radius. Dr. Dhesi has just returned from Rio de Janeiro, where the platform was employed in a bid to fight dengue in advance of this summer’s Olympics. In Brazil, its perceived accuracy was around 84 per cent, whereas in Malaysia it was over 88 per cent – giving it an average accuracy of 86.37 per cent.

The web-based application has been tested in two states within Malaysia, Kuala Lumpur and Selangor, and the first-ever mobile app is due to be deployed across Malaysia soon. Once its capability is adequately tested there, it will be rolled out globally. Dr. Dhesi’s team is working closely with mobile digital service provider Webe on this.

Making the app free to download will ensure the service becomes accessible to all, Dr. Dhesi explains.
“With the web-based application, this could only be used by public health officials and agencies. We recognized the need for us to democratize this health service to the community, and the only way to do this is to provide the community with the mobile app.”
This will also enable the gathering of even greater knowledge on the possibility of dengue outbreaks in high-risk areas, as well as monitoring the changing risks as people move to different areas, he adds….(More)”