Smart Cities – International Case Studies


“These case studies were developed by the Inter-American Development Bank (IDB), in association with the Korea Research Institute for Human Settlements (KRIHS).

Anyang, Korea Anyang, a city of 600,000 near Seoul, has gained international recognition for the smart city project it has been implementing incrementally since 2003. The initiative began with a Bus Information System to improve convenience for citizens, and has since expanded into a wider Intelligent Transport System as well as integrated crime and disaster prevention. Anyang, which received a 2012 Presidential Award in Korea, is considered a smart city benchmark and hosts a large number of international visits. Anyang’s Integrated Operation and Control Center (IOCC) acts as the platform that gathers, analyzes and distributes information on mobility, disaster management and crime. Anyang is currently using big data for policy development and continues to expand its smart city services into areas such as waste and air quality management. Download Anyang case study

Medellín, Colombia Medellín is a city that went from being known for its security problems to being an international reference point for technological and social innovation, urban transformation, equity, and citizen participation. This report shows how Medellín has implemented a series of strategies that have made it a smart city, building capacity and organizational structure in the entities that manage mobility, the environment, and security. These initiatives have also created mechanisms to communicate and interact with citizens in order to promote continuous improvement of smart services.

Through the program “MDE: Medellín Smart City,” Medellín is implementing projects to create free Internet access zones, community centers, the Mi-Medellín co-creation portal, open data, online transactions, and other services. Another strategy is the creation of the Smart Mobility System, which, through the use of technology, has reduced the number of accidents, improved mobility, and shortened incident response times. Download Medellín case study

Namyangju, Korea

Orlando, U.S.

Pangyo, Korea

Rio de Janeiro, Brazil… 

Santander, Spain

Singapore

Songdo, Korea

Tel Aviv, Israel…(More)”

OpenData.Innovation: an international journey to discover innovative uses of open government data


Nesta: “This paper by Mor Rubinstein (Open Knowledge International), Josh Cowls and Corinne Cath (Oxford Internet Institute) explores the methods and motivations behind innovative uses of open government data in five specific country contexts – Chile, Argentina, Uruguay, Israel, and Denmark – and considers how the insights it uncovers might be adopted in a UK context.

Through a series of interviews with ‘social hackers’ and open data practitioners and experts in countries with recognised open government data ‘hubs’, the authors encountered a diverse range of practices and approaches in how actors in different sectors of society make innovative uses of open government data. This diversity also demonstrated how contextual factors shape the opportunities and challenges for impactful open government data use.

Based on insights from these international case studies, the paper offers a number of recommendations – around community engagement, data literacy and practices of opening data – which aim to help governments and citizens unlock greater knowledge exchange and social impact through open government data….(More)”

Reforms to improve U.S. government accountability


Alexander B. Howard and Patrice McDermott in Science: “Five decades after the United States first enacted the Freedom of Information Act (FOIA), Congress has voted to make the first major reforms to the statute since 2007. President Lyndon Johnson signed the first FOIA on 4 July 1966, enshrining in law the public’s right of access to information from executive branch government agencies. Scientists and others around the world can use the FOIA to learn what the U.S. government has done in its policies and practices. The proposed reforms should be a net benefit to public understanding of the scientific process and knowledge, increasing scientists’ access to archival materials and reducing the likelihood of science and scientists being suppressed by official secrecy or bureaucracy.

Although the FOIA has been important for accountability, reform is sorely needed. An analysis of the 15 federal government agencies that received the most FOIA requests found poor to abysmal compliance rates (1, 2). In 2016, the Associated Press found that the Obama Administration had set a new record for unfulfilled FOIA requests (3). Although that has to be considered in the context of a rise in request volume without commensurate increases in resources to address them, researchers have found that most agencies simply ignore routine requests for travel schedules (4). An audit of 165 federal government agencies found that only 40% complied with the E-FOIA Act of 1996; just 67 of them had online libraries that were regularly updated with a substantial number of documents released under FOIA (5).

In the face of growing concerns about compliance, FOIA reform was one of the few recent instances of bicameral bipartisanship in Congress, with the House and Senate each passing bills this spring with broad support. Now that Congress has moved to send the Senate bill to the president to sign into law, implementation of specific provisions will bear close scrutiny, including the potential impact of disclosure on scientists who work in or with government agencies (6). The proposed revisions to the FOIA statute would improve how government discloses information to the public, while leaving intact exemptions for privacy, proprietary information, deliberative documents, and national security.

Features of Reforms

One of the major reforms in the House and Senate bills was to codify the “presumption of openness” outlined by President Obama the day after he took office in January 2009, when he declared that FOIA should be administered with a clear presumption: in the face of doubt, openness would prevail. This presumption of openness was affirmed by U.S. Attorney General Holder in March 2009. Although these declarations have had limited effect in the agencies (as described above), codifying them into law is crucial, not only to ensure that the presumption remains executive branch policy after this president leaves office but also to give requesters legal backing beyond an executive order….(More)”

The Surprising History of the Infographic


Clive Thompson at the Smithsonian magazine: “As the 2016 election approaches, we’re hearing a lot about “red states” and “blue states.” That idiom has become so ingrained that we’ve almost forgotten where it originally came from: a data visualization.

In the 2000 presidential election, the race between Al Gore and George W. Bush was so razor close that broadcasters pored over electoral college maps—which they typically colored red and blue. What’s more, they talked about those shadings. NBC’s Tim Russert wondered aloud how George Bush would “get those remaining 61 electoral red states, if you will,” and that language became lodged in the popular imagination. America became divided into two colors—data spun into pure metaphor. Now Americans even talk routinely about “purple” states, a mental visualization of political information.

We live in an age of data visualization. Go to any news website and you’ll see graphics charting support for the presidential candidates; open your iPhone and the Health app will generate personalized graphs showing how active you’ve been this week, month or year. Sites publish charts showing how the climate is changing, how schools are segregating, how much housework mothers do versus fathers. And newspapers are increasingly finding that readers love “dataviz”: In 2013, the New York Times’ most-read story for the entire year was a visualization of regional accents across the United States. It makes sense. We live in an age of Big Data. If we’re going to understand our complex world, one powerful way is to graph it.

But this isn’t the first time we’ve discovered the pleasures of making information into pictures. Over a hundred years ago, scientists and thinkers found themselves drowning in their own flood of data—and to help understand it, they invented the very idea of infographics.

**********

The idea of visualizing data is old: After all, that’s what a map is—a representation of geographic information—and we’ve had maps for about 8,000 years. But it was rare to graph anything other than geography. Only a few examples exist: Around the 11th century, a now-anonymous scribe created a chart of how the planets moved through the sky. By the 18th century, scientists were warming to the idea of arranging knowledge visually. The British polymath Joseph Priestley produced a “Chart of Biography,” plotting the lives of about 2,000 historical figures on a timeline. A picture, he argued, conveyed the information “with more exactness, and in much less time, than it [would take] by reading.”

Still, data visualization was rare because data was rare. That began to change rapidly in the early 19th century, because countries began to collect—and publish—reams of information about their weather, economic activity and population. “For the first time, you could deal with important social issues with hard facts, if you could find a way to analyze it,” says Michael Friendly, a professor of psychology at York University who studies the history of data visualization. “The age of data really began.”

An early innovator was the Scottish inventor and economist William Playfair. As a teenager he apprenticed to James Watt, the Scottish inventor who perfected the steam engine. Playfair was tasked with drawing up patents, which required him to develop excellent drafting and picture-drawing skills. After he left Watt’s lab, Playfair became interested in economics and grew convinced that he could use his facility for illustration to make data come alive.

“An average political economist would have certainly been able to produce a table for publication, but not necessarily a graph,” notes Ian Spence, a psychologist at the University of Toronto who’s writing a biography of Playfair. Playfair, who understood both data and art, was perfectly positioned to create this new discipline.

In one famous chart, he plotted the price of wheat in the United Kingdom against the cost of labor. People often complained about the high cost of wheat and thought wages were driving the price up. Playfair’s chart showed this wasn’t true: Wages were rising much more slowly than the cost of the product.

Playfair’s trade-balance time-series chart, published in his Commercial and Political Atlas, 1786 (Wikipedia)

“He wanted to discover,” Spence notes. “He wanted to find regularities or points of change.” Playfair’s illustrations often look amazingly modern: In one, he drew pie charts—his invention, too—and lines that compared the size of various countries’ populations against their tax revenues. Once again, the chart produced a new, crisp analysis: The British paid far higher taxes than citizens of other nations.

Neurology was not yet a robust science, but Playfair seemed to intuit some of its principles. He suspected the brain processed images more readily than words: A picture really was worth a thousand words. “He said things that sound almost like a 20th-century vision researcher,” Spence adds. Data, Playfair wrote, should “speak to the eyes”—because they were “the best judge of proportion, being able to estimate it with more quickness and accuracy than any other of our organs.” A really good data visualization, he argued, “produces form and shape to a number of separate ideas, which are otherwise abstract and unconnected.”

Soon, intellectuals across Europe were using data visualization to grapple with the travails of urbanization, such as crime and disease….(More)”

Directory of crowdsourcing websites


Directory by Donelle McKinley: “…Here is just a selection of websites for crowdsourcing cultural heritage. Websites are actively crowdsourcing unless indicated with an asterisk…The directory is organized by the type of crowdsourcing process involved, using the typology for crowdsourcing in the humanities developed by Dunn & Hedges (2012). In their study they explain that “a process is a sequence of tasks, through which an output is produced by operating on an asset”. For example, the Your Paintings Tagger website is for the process of tagging, which is an editorial task. The assets being tagged are images, and the output of the project is metadata, which makes the images easier to discover, retrieve and curate.
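Dunn & Hedges’ asset/process/output framing maps naturally onto a simple data model. Here is a minimal illustrative sketch in Python; the class and field names are my own shorthand, not part of the typology itself:

```python
from dataclasses import dataclass

@dataclass
class CrowdsourcingProject:
    """Dunn & Hedges (2012): a process operates on an asset to produce an output."""
    name: str
    process: str  # e.g. "tagging", "transcription"
    asset: str    # e.g. "images", "handwritten documents"
    output: str   # e.g. "metadata", "searchable text"

# The Your Paintings Tagger example from the text:
tagger = CrowdsourcingProject(
    name="Your Paintings Tagger",
    process="tagging",
    asset="images",
    output="metadata",
)
print(f"{tagger.name}: {tagger.process} of {tagger.asset} -> {tagger.output}")
```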

Transcription

Alexander Research Library, Wanganui Library * (NZ) Transcription of index cards from 1840 to 2002.

Ancient Lives*, University of Oxford (UK) Transcription of papyri from Greco-Roman Egypt.

AnnoTate, Tate Britain (UK) Transcription of artists’ diaries, letters and sketchbooks.

Decoding the Civil War, The Huntington Library, Abraham Lincoln Presidential Library and Museum & North Carolina State University (USA). Transcription and decoding of Civil War telegrams from the Thomas T. Eckert Papers.

DIY History, University of Iowa Libraries (USA) Transcription of historical documents.

Emigrant City, New York Public Library (USA) Transcription of handwritten mortgage and bond ledgers from the Emigrant Savings Bank records.

Field Notes of Laurence M. Klauber, San Diego Natural History Museum (USA) Transcription of field notes by the celebrated herpetologist.

Notes from Nature Transcription of natural history museum records.

Measuring the ANZACs, Archives New Zealand and Auckland War Memorial Museum (NZ). Transcription of first-hand accounts of NZ soldiers in WW1.

Old Weather (UK) Transcription of Royal Navy ships’ logs from the early twentieth century.

Scattered Seeds, Heritage Collections, Dunedin Public Libraries (NZ) Transcription of index cards for Dunedin newspapers, 1851-1993.

Shakespeare’s World, Folger Shakespeare Library (USA) & Oxford University Press (UK). Transcription of handwritten documents by Shakespeare’s contemporaries. Identification of words that have yet to be recorded in the authoritative Oxford English Dictionary.

Smithsonian Digital Volunteers Transcription Center (USA) Transcription of multiple collections.

Transcribe Bentham, University College London (UK) Transcription of historical manuscripts by philosopher and reformer Jeremy Bentham.

What’s on the menu? New York Public Library (USA) Transcription of historical restaurant menus. …

(Full Directory).

The Billions We’re Wasting in Our Jails


Stephen Goldsmith and Jane Wiseman in Governing: “By using data analytics to make decisions about pretrial detention, local governments could find substantial savings while making their communities safer….

Few areas of local government spending present better opportunities for dramatic savings than those that surround pretrial detention. Cities and counties are wasting more than $3 billion a year, and often inducing crime and job loss, by holding the wrong people while they await trial. The problem: Only 10 percent of jurisdictions use risk data analytics when deciding which defendants should be detained.

As a result, dangerous people are out in our communities, while many who could be safely in the community are behind bars. Vast numbers of people accused of petty offenses spend their pretrial detention time jailed alongside hardened convicts, learning from them how to be better criminals….

In this era of big data, analytics can not only predict and prevent crime but also discern who should be diverted from jail to treatment for underlying mental health or substance abuse issues. The billions in avoided costs could be better spent on detaining high-risk individuals, expanding mental health and substance abuse treatment, and funding more police officers and other public safety services.

Jurisdictions that do use data to make pretrial decisions have achieved not only lower costs but also greater fairness and lower crime rates. Washington, D.C., releases 85 percent of defendants awaiting trial. Compared to the national average, those released in D.C. are two and a half times more likely to remain arrest-free and one and a half times as likely to show up for court.

Louisville, Ky., implemented risk-based decision-making using a tool developed by the Laura and John Arnold Foundation and now releases 70 percent of defendants before trial. Those released have turned out to be twice as likely to return to court and to stay arrest-free as those in other jurisdictions. Mesa County, Colo., and Allegheny County, Pa., both have achieved significant savings from reduced jail populations due to data-driven release of low-risk defendants.
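To make “risk data analytics” concrete, here is a deliberately simplified sketch of how a pretrial risk score might be computed and thresholded. The factors, weights, and cutoff are hypothetical illustrations only, not the Arnold Foundation tool or any jurisdiction’s actual instrument:

```python
# Purely illustrative pretrial risk sketch; the factors, weights,
# and threshold below are hypothetical, not any real tool's values.
def risk_score(prior_failures_to_appear: int,
               prior_violent_convictions: int,
               pending_charges: int) -> int:
    # Weight history of non-appearance and violence more heavily
    # than the simple count of pending charges.
    return (2 * prior_failures_to_appear
            + 3 * prior_violent_convictions
            + 1 * pending_charges)

def recommendation(score: int, threshold: int = 5) -> str:
    # Low scores map to release; higher scores to detention or
    # supervision, with a judge making the final call.
    return "release" if score < threshold else "detain or supervise"

score = risk_score(prior_failures_to_appear=1,
                   prior_violent_convictions=0,
                   pending_charges=1)
print(score, recommendation(score))  # -> 3 release
```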

Data-driven approaches are beginning to produce benefits not only in the area of pretrial detention but throughout the criminal justice process. Dashboards now in use in a handful of jurisdictions allow not only administrators but also the public to see court waiting times by offender type and to identify and address processing bottlenecks….(More)”

Nudging for Success


Press Release: “A groundbreaking report published today by ideas42 reveals several innovations that college administrators and policymakers can leverage to significantly improve college graduation rates, at a time when completion is more out of reach than ever for millions of students.

The student path through college to graduation day is strewn with subtle, often invisible barriers that, over time, hinder students’ progress and cause some of them to drop out entirely. In Nudging for Success: Using Behavioral Science to Improve the Postsecondary Student Journey, ideas42 focuses on simple, low-cost ways to combat these unintentional obstacles and support student persistence and success at every stage in the college experience, from pre-admission to post-graduation. Teams worked with students, faculty and administrators at colleges around the country.

Even for students whose tuition is covered by financial aid, whose academic preparation is exemplary, and who are able to commit themselves full-time to their education, the subtle logistical and psychological sticking points can have a huge impact on their ability to persist and fully reap the benefits of a higher education.

Less than 60% of full-time students graduate from four-year colleges within six years, and less than 30% graduate from community colleges within three years. Many factors are cited as deterrents to finishing school, such as the cost of tuition or the need to juggle family and work obligations, but behavioral science and the results of this report demonstrate that lesser-known dynamics like self-perception are also at play.

From increasing financial aid filing to fostering positive friend groups and a sense of belonging on campus, the 16 behavioral solutions outlined in Nudging for Success represent the potential for significant impact on the student experience and persistence. At Arizona State University, sending behaviorally-designed email reminders to students and parents about the Free Application for Federal Student Aid (FAFSA) priority deadline increased submissions by 72% and led to an increase in grant awards. Freshman retention among low-income, first generation, under-represented or other students most at risk of dropping out increased by 10% at San Francisco State University with the use of a testimonial video, self-affirming exercises, and monthly messaging aimed at first-time students.

“This evidence demonstrates how behavioral science can be the key to uplifting millions of Americans through education,” said Alissa Fishbane, Managing Director at ideas42. “By approaching the completion crisis from the whole experience of students themselves, administrators and policymakers have the opportunity to reduce the number of students who start, but do not finish, college—students who take on the financial burden of tuition but miss out on the substantial benefits of earning a degree.”

The results of this work drive home the importance of examining the college experience from the student perspective and through the lens of human behavior. College administrators and policymakers can replicate these gains at institutions across the country to make it simpler for students to complete the degree they started in ways that are often easier and less expensive to implement than existing alternatives—paving the way to stronger economic futures for millions of Americans….(More)”

In Your Neighborhood, Who Draws the Map?


Lizzie MacWillie at NextCity: “…By crowdsourcing neighborhood boundaries, residents can put themselves on the map in critical ways.

Why does this matter? Neighborhoods are the smallest organizing element in any city. A strong city is made up of strong neighborhoods, where the residents can effectively advocate for their needs. A neighborhood boundary marks off a particular geography and calls out important elements within that geography: architecture, street fabric, public spaces and natural resources, to name a few. Putting that line on a page lets residents begin to identify needs and set priorities. Without boundaries, there’s no way to know where to start.

Knowing a neighborhood’s boundaries and unique features allows a group to list its assets. What buildings have historic significance? What shops and restaurants exist? It also helps highlight gaps: What’s missing? What does the neighborhood need more of? What is there already too much of? Armed with this detailed inventory, residents can approach a developer, city council member or advocacy group with hard numbers on what they know their neighborhood needs.

With a precisely defined geography, residents living in a food desert can point to developable vacant land that’s ideal for a grocery store. They can also cite how many potential grocery shoppers live within the neighborhood.
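In software terms, a precisely defined geography is simply a polygon, and a question like “how many potential grocery shoppers live within the neighborhood” becomes a point-in-polygon count. Below is a minimal sketch using the shapely library; the boundary and household coordinates are made up for illustration:

```python
# A crowdsourced neighborhood boundary as a polygon, used to count
# points (e.g. households) inside it. Coordinates are hypothetical.
from shapely.geometry import Point, Polygon

neighborhood = Polygon([
    (-96.800, 32.780),  # (longitude, latitude) corners of the boundary
    (-96.790, 32.780),
    (-96.790, 32.770),
    (-96.800, 32.770),
])

households = [
    Point(-96.795, 32.775),  # inside the boundary
    Point(-96.770, 32.775),  # outside
    Point(-96.798, 32.772),  # inside
]

inside = sum(1 for h in households if neighborhood.contains(h))
print(f"{inside} of {len(households)} households fall within the boundary")
```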

In addition to enabling residents to organize within the neighborhood, staking a claim to it, by putting it on a map and naming it, can help a neighborhood control its own narrative and tell its story, so that someone else doesn’t.

Our neighborhood map project was started in part as a response to consistent misidentification of Dallas neighborhoods by local media, which appears to be particularly common in stories about majority-minority neighborhoods. This kind of oversight can contribute to a false narrative about a place, especially when the news is about crime or violence, and takes away from residents’ ability to tell their story and shape their neighborhood’s future. Even worse is when neighborhoods are completely left off of the map, as if they have no story at all to tell.

Neighborhood mapping can also counter narrative hijacking like I’ve seen in my hometown of Brooklyn, where realtor-driven neighborhood rebranding has led to areas being renamed. These places have their own unique identities and histories, yet longtime residents saw names changed so that real estate sellers could capitalize on increasing property values in adjacent trendy neighborhoods.

Cities across the country — including Dallas, Boston, New York, Chicago, Portland and Seattle — have crowdsourced mapping projects people can contribute to. For cities lacking such an effort, tools like Google Map Maker have been effective….(More)”.

Using Behavioral Science to Combat Climate Change


Cass R. Sunstein and Lucia A. Reisch in the Oxford Research Encyclopedia of Climate Science (Forthcoming): “Careful attention to choice architecture promises to open up new possibilities for reducing greenhouse gas emissions – possibilities that go well beyond, and that may supplement or complement, the standard tools of economic incentives, mandates, and bans. How, for example, do consumers choose between climate-friendly products or services and alternatives that are potentially damaging to the climate but less expensive? The answer may well depend on the default rule. Indeed, climate-friendly default rules may well be a more effective tool for altering outcomes than large economic incentives. The underlying reasons include the power of suggestion; inertia and procrastination; and loss aversion. If well-chosen, climate-friendly defaults are likely to have large effects in reducing the economic and environmental harms associated with various products and activities. In deciding whether to establish climate-friendly defaults, choice architects (subject to legal constraints) should consider both consumer welfare and a wide range of other costs and benefits. Sometimes that assessment will argue strongly in favor of climate-friendly defaults, particularly when both economic and environmental considerations point in their direction. Notably, surveys in the United States and Europe show that majorities in many nations are in favor of climate-friendly defaults….(More)”
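The mechanics of a default rule are easy to illustrate: if some fraction of consumers stick with whatever option is preselected (through suggestion, inertia, or loss aversion), then flipping the default shifts aggregate outcomes even when underlying preferences are unchanged. A toy sketch, with hypothetical inertia and preference rates:

```python
# Toy illustration of default effects; the stick-with-default and
# prefer-green rates are hypothetical, chosen only for illustration.
def green_enrollment(default_is_green: bool,
                     stick_with_default: float = 0.7,
                     prefer_green: float = 0.4) -> float:
    """Fraction enrolled in a climate-friendly tariff: passive
    default-keepers plus active choosers who prefer the green option."""
    active = 1.0 - stick_with_default
    if default_is_green:
        return stick_with_default + active * prefer_green
    return active * prefer_green

print(f"Green default:    {green_enrollment(True):.0%} enrolled")   # 82%
print(f"Standard default: {green_enrollment(False):.0%} enrolled")  # 12%
```

With identical preferences on both lines, the climate-friendly default yields far higher enrollment, which is the sense in which a well-chosen default can outperform a price incentive.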

Transforming governance: how can technology help reshape democracy?


Research Briefing by Matt Leighninger: “Around the world, people are asking how we can make democracy work in new and better ways. We are frustrated by political systems in which voting is the only legitimate political act, concerned that many republics don’t have the strength or appeal to withstand authoritarian figures, and disillusioned by the inability of many countries to address the fundamental challenges of health, education and economic development.

We can no longer assume that the countries of the global North have ‘advanced’ democracies, and that the nations of the global South simply need to catch up. Citizens of these older democracies have increasingly lost faith in their political institutions; Northerners cherish their human rights and free elections, but are clearly looking for something more. Meanwhile, in the global South, new regimes based on a similar formula of rights and elections have proven fragile and difficult to sustain. And in Brazil, India and other Southern countries, participatory budgeting and other valuable democratic innovations have emerged. The stage is set for a more equitable, global conversation about what we mean by democracy.

How can we adjust our democratic formulas so that they are more sustainable, powerful, fulfilling – and, well, democratic? Some of the parts of this equation may come from the development of online tools and platforms that help people to engage with their governments, with organisations and institutions, and with each other. Often referred to collectively as ‘civic technology’ or ‘civic tech’, these tools can help us map public problems, help citizens generate solutions, gather input for government, coordinate volunteer efforts, and help neighbours remain connected. If we want to create democracies in which citizens have meaningful roles in shaping public decisions and solving public problems, we should be asking a number of questions about civic tech, including:

  • How can online tools best support new forms of democracy?
  • What are the examples of how this has happened?
  • What are some variables to consider in comparing these examples?
  • How can we learn from each other as we move forward?

This background note has been developed to help democratic innovators explore these questions and examine how their work can provide answers….(More)”