Cities in the 21st Century


Book edited by Oriol Nel-lo and Renata Mele: “Cities in the 21st Century provides an overview of contemporary urban development. Written by more than thirty leading academic specialists from different countries, it offers information on and analysis of the global network of cities, changes in urban form, environmental problems, the role of technologies and knowledge, socioeconomic developments, and, finally, the challenge of urban governance.

In the mid-20th century, architect and planner Josep Lluís Sert wondered if cities could survive; in the early 21st century, we see that cities have not only survived but have grown as never before. Cities today are engines of production and trade, forges of scientific and technological innovation, and crucibles of social change. Urbanization is a major driver of change in contemporary societies; it is a process that involves acute social inequalities and serious environmental problems, but also offers opportunities to move towards a future of greater prosperity, environmental sustainability, and social justice.

With case studies on thirty cities in five continents and a selection of infographics illustrating these dynamic cities, this edited volume is an essential resource for planners and students of urbanization and urban change….(More)”

The Problem With Evidence-Based Policies


Ricardo Hausmann at Project Syndicate: “Many organizations, from government agencies to philanthropic institutions and aid organizations, now require that programs and policies be “evidence-based.” It makes sense to demand that policies be based on evidence and that such evidence be as good as possible, within reasonable time and budgetary limits. But the way this approach is being implemented may be doing a lot of harm, impairing our ability to learn and improve on what we do.

The current so-called “gold standard” of what constitutes good evidence is the randomized controlled trial, or RCT, an idea that started in medicine two centuries ago, moved to agriculture, and became the rage in economics during the past two decades. Its popularity is based on the fact that it addresses key problems in statistical inference.

For example, rich people wear fancy clothes. Would distributing fancy clothes to poor people make them rich? This is a case where correlation (between clothes and wealth) does not imply causation.

Harvard graduates get great jobs. Is Harvard good at teaching – or just at selecting smart people who would have done well in life anyway? This is the problem of selection bias.

RCTs address these problems by randomly assigning those participating in the trial to receive either a “treatment” or a “placebo” (thereby creating a “control” group). By observing how the two groups differ after the intervention, the effectiveness of the treatment can be assessed. RCTs have been conducted on drugs, micro-loans, training programs, educational tools, and myriad other interventions….
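The logic of random assignment can be sketched in a few lines. This is a toy simulation with invented numbers (a baseline outcome of 50 ± 10 and a true treatment effect of 5), not any real trial: because assignment is random, the two groups are alike on average, so the difference in group means recovers the treatment effect without selection bias.

```python
import random

def run_rct(population_size=10_000, true_effect=5.0, seed=42):
    """Simulate a randomized controlled trial: random assignment makes the
    groups comparable, so the difference in means estimates the effect."""
    rng = random.Random(seed)
    treated, control = [], []
    for _ in range(population_size):
        baseline = rng.gauss(50, 10)      # outcome the person would have anyway
        if rng.random() < 0.5:            # random assignment is the key step
            treated.append(baseline + true_effect)
        else:
            control.append(baseline)
    # average treatment effect estimate: difference in group means
    return sum(treated) / len(treated) - sum(control) / len(control)

print(round(run_rct(), 2))  # close to the true effect of 5.0
```

With a large enough sample, the estimate lands near the true effect; with observational data (the fancy-clothes example above), no such guarantee exists.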

In economics, RCTs have been all the rage, especially in the field of international development, despite critiques by the Nobel laureate Angus Deaton, Lant Pritchett, and Dani Rodrik, who have attacked the inflated claims of RCTs’ proponents. One serious shortcoming is external validity. Lessons travel poorly: If an RCT finds that giving micronutrients to children in Guatemala improves their learning, should you give micronutrients to Norwegian children?

My main problem with RCTs is that they make us think about interventions, policies, and organizations in the wrong way. Unlike the two or three designs that get tested slowly by RCTs (like putting tablets or flipcharts in schools), most social interventions have millions of design possibilities, and outcomes depend on complex combinations among them. This leads to what the complexity scientist Stuart Kauffman calls a “rugged fitness landscape.”

Getting the right combination of parameters is critical. This requires that organizations implement evolutionary strategies that are based on trying things out and learning quickly about performance through rapid feedback loops, as suggested by Matt Andrews, Lant Pritchett and Michael Woolcock at Harvard’s Center for International Development.
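The contrast between the two approaches can be made concrete with a toy sketch of a Kauffman-style NK landscape and the trial-and-error search described above: each design parameter's contribution to performance depends on its neighbours, which makes the landscape rugged, and a rapid feedback loop (tweak one parameter, keep the change only if measured performance improves) climbs it far faster than testing a handful of fixed designs. All parameters and numbers here are illustrative, not from the article.

```python
import random
from itertools import product

def make_landscape(n, k, rng):
    """One random contribution table per parameter, keyed by that
    parameter's value plus the values of its k neighbours."""
    return [{combo: rng.random() for combo in product((0, 1), repeat=k + 1)}
            for _ in range(n)]

def fitness(design, tables, k):
    n = len(design)
    total = 0.0
    for i, table in enumerate(tables):
        key = tuple(design[(i + j) % n] for j in range(k + 1))
        total += table[key]
    return total / n   # mean contribution, in (0, 1)

def hill_climb(n=12, k=3, steps=400, seed=7):
    """Evolutionary search with a rapid feedback loop: try a one-parameter
    tweak, keep it only if measured performance improves."""
    rng = random.Random(seed)
    tables = make_landscape(n, k, rng)
    design = [rng.randint(0, 1) for _ in range(n)]
    best = fitness(design, tables, k)
    for _ in range(steps):
        i = rng.randrange(n)
        design[i] ^= 1                      # flip one design choice
        score = fitness(design, tables, k)
        if score > best:
            best = score                    # keep the improvement
        else:
            design[i] ^= 1                  # revert: the feedback says no
    return best

print(round(hill_climb(), 3))
```

Because interactions between parameters create many local peaks, an organization that only evaluates two or three fixed designs samples a vanishingly small corner of the space; iterated tweak-and-measure explores it.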

RCTs may be appropriate for clinical drug trials. But for a remarkably broad array of policy areas, the RCT movement has had an impact equivalent to putting auditors in charge of the R&D department. That is the wrong way to design things that work. Only by creating organizations that learn how to learn, as so-called lean manufacturing has done for industry, can we accelerate progress….(More)”

Data Collaboratives: Matching Demand with Supply of (Corporate) Data to Solve Public Problems


Blog by Stefaan G. Verhulst, Iryna Susha and Alexander Kostura: “Data Collaboratives refer to a new form of collaboration, beyond the public-private partnership model, in which participants from different sectors (private companies, research institutions, and government agencies) share data to help solve public problems. Several of society’s greatest challenges — from climate change to poverty — require greater access to big (but not always open) data sets, more cross-sector collaboration, and increased capacity for data analysis. Participants at the workshop and breakout session explored the various ways in which data collaboratives can help meet these needs.

Matching supply and demand of data emerged as one of the most important and overarching issues facing the big and open data communities. Participants agreed that more experimentation is needed so that new, innovative and more successful models of data sharing can be identified.

How to discover and enable such models? When asked how the international community might foster greater experimentation, participants indicated the need to develop the following:

· A responsible data framework that serves to build trust in sharing data. It would be based upon existing frameworks but would also accommodate emerging technologies and practices, and would need to be sensitive to public opinion and perception.

· Increased insight into different business models that may facilitate the sharing of data. As experimentation continues, the data community should map emerging practices and models of sharing so that successful cases can be replicated.

· Capacity to tap into the potential value of data. On the demand side, capacity refers to the ability to pose good questions, understand current data limitations, and seek new data sets responsibly. On the supply side, this means seeking shared value in collaboration, thinking creatively about public use of private data, and establishing norms of responsibility around security, privacy, and anonymity.

· A transparent stock of available data supply, including an inventory of what corporate data exist that can match multiple demands and that are shared through established networks and new collaborative institutional structures.

· Mapping emerging practices and models of sharing. Corporate data offers value not only for humanitarian action (which was a particular focus at the conference) but also for a variety of other domains, including science, agriculture, health care, urban development, environment, media and arts, and others. Gaining insight into the practices that emerge across sectors could broaden the spectrum of what is feasible and how.
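The “transparent stock” idea above is, at its simplest, an inventory keyed by the demands each dataset can serve. A minimal sketch, with entirely hypothetical dataset names and domain tags, shows how supply could be matched to a demand query:

```python
# Hypothetical inventory: corporate datasets tagged by domains they could serve.
inventory = {
    "telecom mobility traces": {"urban development", "health care", "humanitarian action"},
    "satellite crop imagery": {"agriculture", "environment"},
    "retail price feeds": {"science", "media and arts"},
}

def match(demand_domain):
    """Return the datasets whose tags cover the demanded domain."""
    return sorted(name for name, tags in inventory.items()
                  if demand_domain in tags)

print(match("urban development"))
```

An actual inventory would add licensing terms, access conditions, and responsible-data constraints to each entry, but the matching logic stays the same: demand is a query over a shared, transparent supply catalogue.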

In general, it was felt that understanding the business models underlying data collaboratives is of utmost importance in order to achieve win-win outcomes for both private and public sector players. Moreover, issues of public perception and trust were raised as important concerns of government organizations participating in data collaboratives….(More)”

Technology and the Future of Cities


Mark Gorenberg, Craig Mundie, Eric Schmidt and Marjory Blumenthal at PCAST: “Growing urbanization presents the United States with an opportunity to showcase its innovation strength, grow its exports, and help to improve citizens’ lives – all at once. Seizing this triple opportunity will involve a concerted effort to develop and apply new technologies to enhance the way cities work for the people who live there.

A new report released today by the President’s Council of Advisors on Science and Technology (PCAST), Technology and the Future of Cities, lays out why now is a good time to promote technologies for cities: more (and more diverse) people are living in cities; people are increasingly open to different ways of using space, living, working, and traveling across town; physical infrastructures for transportation, energy, and water are aging; and a wide range of innovations are in reach that can yield better infrastructures and help in the design and operation of city services.

There are also new ways to collect and use information to design and operate systems and services. Better use of information can help make the most of limited resources – whether city budgets or citizens’ time – and help make sure that the neediest as well as the affluent benefit from new technology.

Although the vision of technology’s promise applies city-wide, PCAST suggests that a practical way for cities to adopt infrastructural and other innovation is by starting in a discrete area  – a district, the dimensions of which depend on the innovation in question. Experiences in districts can help inform decisions elsewhere in a given city – and in other cities. PCAST urges broader sharing of information about, and tools for, innovation in cities.

Such sharing is already happening in isolated pockets focused on either specific kinds of information or recipients of specific kinds of funding. A more comprehensive City Web, achieved through broader interconnection, could inform and impel urban innovation. A systematic approach to developing open-data resources for cities is recommended, too.

PCAST recommends a variety of steps to make the most of the Federal Government’s engagement with cities. To begin, it calls for more – and more effective – coordination among Federal agencies that are key to infrastructural investments in cities.  Coordination across agencies, of course, is the key to place-based policy. Building on the White House Smart Cities Initiative, which promotes not only R&D but also deployment of IT-based approaches to help cities solve challenges, PCAST also calls for expanding research and development coordination to include the physical, infrastructural technologies that are so fundamental to city services.

A new era of city design and city life is emerging. If the United States steers Federal investments in cities in ways that foster innovation, the impacts can be substantial. The rest of the world has also seen the potential, with numerous cities showcasing different approaches to innovation. The time to aim for leadership in urban technologies and urban science is now….(More)”

Public-Private Partnerships for Statistics: Lessons Learned, Future Steps


Report by Nicholas Robin, Thilo Klein and Johannes Jütting for Paris 21: “Non-official sources of data, big data in particular, are currently attracting enormous interest in the world of official statistics. An impressive body of work focuses on how different types of big data (telecom data, social media, sensors, etc.) can be used to fill specific data gaps, especially with regard to the post-2015 agenda and the associated technology challenges. The focus of this paper is on a different aspect, but one that is of crucial importance: what are the perspectives of the commercial operations and national statistical offices which respectively produce and might use this data, and which incentives, business models and protocols are needed in order to leverage non-official data sources within the official statistics community?

Public-private partnerships (PPPs) offer significant opportunities such as cost effectiveness, timeliness, granularity and new indicators, but also present a range of challenges that need to be surmounted. These comprise technical difficulties, risks related to data confidentiality, as well as a lack of incentives.

Nevertheless, a number of collaborative projects have already emerged and can be classified into four ideal types: namely the in-house production of statistics by the data provider, the transfer of private data sets to the end user, the transfer of private data sets to a trusted third party for processing and/or analysis, and the outsourcing of national statistical office functions (the only model which is not centred around a data-sharing dimension). In developing countries, a severe lack of resources and particular statistical needs (to adopt a system-wide approach within national statistical systems and fill statistical gaps which are relevant to national development plans) highlight the importance of harnessing the private sector’s resources and point to the most holistic models (in-house and third party), in which the private sector contributes to the processing and analysis of data. The following key lessons are drawn from four case studies….(More)”

Counting down to ‘Evaluating Digital Citizen Engagement: A practical guide’


Matt Haikin at Aptivate: “Last year, Aptivate led a consortium of researchers and practitioners to explore the role of technology in citizen-engagement and participation in the development sector, and how to evaluate the success of such activities…

The guide was researched, developed and written by the multidisciplinary team of Matt Haikin (Aptivate), Savita Bailur (now at Caribou Digital), Evangelia Berdou (IDS), Claudia Lopes (now at Africa’s Voices), Jonathan Dudding (ICA:UK) and Martin Belcher (now at Palladium Group).

The result – ‘Evaluating Digital Citizen Engagement: A practical guide’ will be published in electronic form on the World Bank’s Open Knowledge Repository any day now.

The Guide forms part of the recommended reading for the World Bank’s high-profile Coursera course Citizen Engagement: A Game Changer for Development? …So what can you expect to find in the Guide…

  • Practical tools and guidelines for use in evaluating or designing activities in the expanding field of digital citizen engagement
  • Resources for anyone seeking to better understand the role of digital technology in citizen engagement.
  • Five ‘lenses’ you can use to explore different perspectives through which digital citizen engagement might be viewed (Objective, Control, Participation, Technology, Effects)
  • Detailed advice and tips specific to technology and citizen engagement through every stage of a typical evaluation lifecycle (Scoping, Designing, Planning & Implementing, Analysing, Sharing, Reflecting & Learning)
  • Toolkits to help you design your own research questions and evaluation designs …(More)”

The city as platform


The report of the 2015 Aspen Institute Roundtable on Information Technology: “In the age of ubiquitous Internet connections, smartphones and data, the future vitality of cities is increasingly based on their ability to use digital networks in intelligent, strategic ways. While we are accustomed to thinking of cities as geophysical places governed by mayors, conventional political structures and bureaucracies, this template of city governance is under great pressure to evolve. Urban dwellers now live their lives in all sorts of hyper-connected virtual spaces, pulsating with real-time information, intelligent devices, remote-access databases and participatory crowdsourcing. Expertise is distributed, not centralized. Governance is not just a matter of winning elections and assigning tasks to bureaucracies; it is about the skillful collection and curation of information as a way to create new affordances for commerce and social life.

Except among a small class of vanguard cities, however, the far-reaching implications of the “networked city” for economic development, urban planning, social life and democracy have not been explored in depth. The Aspen Institute Communications and Society Program thus convened an eclectic group of thirty experts to explore how networking technologies are rapidly changing the urban landscape in nearly every dimension. The goal was to learn how open networks, online cooperation and open data can enhance urban planning and administration, and more broadly, how they might improve economic opportunity and civic engagement. The conference, the 24th Annual Aspen Roundtable on Information Technology, also addressed the implications of new digital technologies for urban transportation, public health and safety, and socio-economic inequality….(Download the InfoTech 2015 Report)”

Sticky-note strategy: How federal innovation labs borrow from Silicon Valley


Carten Cordell in the Federal Times: “The framework for an integrated security solution in the Philippines is built on a bedrock of sticky notes. So is the strategy for combating piracy in East Africa and a handful of other plans that Zvika Krieger is crafting in a cauldron of collaboration within the State Department.

More specifically, Krieger, a senior adviser for strategy within the department’s Bureau of Political-Military Affairs, is working in the bureau’s Strategy Lab, just one pocket of federal government where a Silicon Valley playbook for innovation is being used to develop policy solutions….

Krieger and a host of other policy thinkers learned a new way to channel innovation for policy solutions called human-centered design, or design thinking. While arguably new in government, the framework has long been in use by the tech sector to design products that will serve the needs of their customers. The strategy of group thinking towards a policy — which is more what these innovation labs seek to achieve — has been used before as well….Where the government has started to use HCD is in developing new policy solutions within a multifaceted group of stakeholders that can contribute a well-rounded slate of expertise. The product is a strategy that is developed from the creative thoughts of a team of experts, rather than a single specialized source….

The core tenet of HCD is to establish a meritocracy of ideas that is both empathetic of thought and immune to hierarchy. In order to get innovative solutions for a complex problem, Krieger forms a team of experts and stakeholders. He then mixes in outside thought leaders he calls “wild cards” to give the group outside perspective.

The delicate balance opens discussion, and the mix of ideas ultimately forms a strategy for handling the problem. That strategy might involve a technology; but it could also be a new partnership, a new function within an office, or a new acquisition program. Because the team comprises multiple experts, it can navigate the complexity more thoroughly, and the wild cards can offer their expertise to provide solutions the stakeholders may not have considered….

Human-centered design has been working its way through pockets of the federal government for a few years now. The Office of Personnel Management opened its Innovation Lab in 2012 and was tasked with improving the USAJobs website. The Department of Health and Human Services opened the IDEA Lab in 2013 to address innovation in its mission. The Department of Veterans Affairs has a Center of Innovation to identify new approaches to meet the current and future needs of veterans, and the departments of Defense and State both have innovation labs tackling policy solutions.

The concept is gaining momentum. This fall, the Obama administration released a strategy report calling for a network of innovation labs throughout federal agencies to develop new policy solutions through HCD.

“I think the word is spreading. It’s kind of like a whisper campaign, in the most positive way,” said an administration official with knowledge of innovation labs and HCD strategies, who was not authorized to speak to the press. “I think, again, the only constraint here is that we don’t have enough of them to be able to imbue this knowledge across government. We need many more people.”

A March 2014 GAO report said that the OPM Innovation Lab had not developed consistent performance targets that would allow it to assess the success of its projects. The report recommended more consistent milestones to assess progress, which the agency addressed through a series of pilot programs….

In the State Department’s Bureau of Educational and Cultural Affairs, an innovation lab called the Collaboratory is in its second year of existence, using HCD strategies to improve projects like the Fulbright program and other educational diplomacy efforts.

The Education Diplomacy initiative, for example, used HCD to devise ways to increase education access abroad using State resources. Defining U.S. embassies as the end user, the Collaboratory then analyzed the areas of need at the installations and began crafting policies.

“We identified a couple of areas where we thought we could make substantial gains quite quickly and in a budget-neutral way,” Collaboratory Deputy Director Paul Kruchoski said. The process allowed multiple stakeholders like the U.S. Agency for International Development, Peace Corps and the Department of Education to help craft the policy and create what Kruchoski called “feedback loops” to refine throughout the embassies…(More)”


How to Hold Governments Accountable for the Algorithms They Use


In Slate: “In 2015 more than 59 million Americans received some form of benefit from the Social Security Administration, not just for retirement but also for disability or as a survivor of a deceased worker. It’s a behemoth of a government program, and keeping it solvent has preoccupied the Office of the Chief Actuary of the Social Security Administration for years. That office makes yearly forecasts of key demographic (such as mortality rates) or economic (for instance, labor force participation) factors that inform how policy can or should change to keep the program on sound financial footing. But a recent Harvard University study examined several of these forecasts and found that they were systematically biased—underestimating life expectancy and implying that funds were on firmer financial ground than warranted. The procedures and methods that the SSA uses aren’t open for inspection either, posing challenges to replicating and debugging those predictive algorithms.
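The bias the Harvard study describes is at bottom a calibration question: do forecast errors scatter around zero, or do they cluster on one side? A toy check with invented numbers (these are not SSA or Harvard figures) makes the distinction between noise and systematic bias concrete:

```python
# Hypothetical forecasts vs. later observed values (illustrative numbers only).
forecast_life_expectancy = [76.1, 76.4, 76.8, 77.0, 77.3]
observed_life_expectancy = [76.9, 77.2, 77.5, 77.9, 78.1]

# Error = observed minus forecast; a mean far from zero with errors of a
# consistent sign indicates systematic bias, not random noise.
errors = [o - f for f, o in zip(forecast_life_expectancy,
                                observed_life_expectancy)]
mean_error = sum(errors) / len(errors)
all_same_sign = all(e > 0 for e in errors) or all(e < 0 for e in errors)

print(f"mean error: {mean_error:+.2f} years, systematic: {all_same_sign}")
```

A consistently positive error here means life expectancy is being underestimated year after year, exactly the pattern that makes a pension program look healthier than it is. Replicating such a check, of course, requires the forecasting procedures to be open for inspection in the first place.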

Whether forecasting the solvency of social programs, waging a war, managing national security, doling out justice and punishment, or educating the populace, government has a lot of decisions to make—and it’s increasingly using algorithms to systematize and scale that bureaucratic work. In the ideal democratic state, the electorate chooses a government that provides social goods and exercises its authority via regulation. The government is legitimate to the extent that it is held accountable to the citizenry. Though as the SSA example shows, tightly held algorithms pose issues of accountability that grind at the very legitimacy of the government itself.

One of the immensely useful abilities of algorithms is to rank and prioritize huge amounts of data, turning a messy pile of items into a neat and orderly list. In 2013 the Obama administration announced that it would be getting into the business of ranking colleges, helping the citizens of the land identify and evaluate the “best” educational opportunities. But two years later, the idea of ranking colleges had been neutered, traded in for what amounts to a data dump of educational statistics called the College Scorecard. The human influences, subjective factors, and methodological pitfalls involved in quantifying education into rankings would be numerous. Perhaps the government sensed that any ranking would be dubious—that it would be riddled with questions of what data was used and how various statistical factors were weighted. How could the government make such a ranking legitimate in the eyes of the public and of the industry that it seeks to hold accountable?
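The weighting problem is concrete. With invented scores for three hypothetical colleges on just two factors, the “best” college depends entirely on how the factors are weighted, which is a subjective, contestable choice:

```python
# Hypothetical normalized scores for three made-up colleges on two factors.
colleges = {
    "College A": {"graduation_rate": 0.95, "net_cost_score": 0.40},
    "College B": {"graduation_rate": 0.70, "net_cost_score": 0.90},
    "College C": {"graduation_rate": 0.85, "net_cost_score": 0.60},
}

def rank(weights):
    """Order colleges by their weighted-sum score, best first."""
    def score(metrics):
        return sum(weights[factor] * value for factor, value in metrics.items())
    return sorted(colleges, key=lambda c: score(colleges[c]), reverse=True)

# Two defensible weightings, two different "best" colleges.
print(rank({"graduation_rate": 0.8, "net_cost_score": 0.2}))
print(rank({"graduation_rate": 0.3, "net_cost_score": 0.7}))
```

Prioritizing graduation rate puts College A on top; prioritizing affordability reverses the order entirely. Nothing in the data dictates which weighting is “correct,” which is precisely why a government-issued ranking invites the legitimacy questions raised above.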

That’s a complicated question that goes far beyond college rankings. But whatever the end goal, government needs to develop protocols for opening up algorithmic black boxes to democratic processes.

Transparency offers one promising path forward. Let’s consider the new risk-assessment algorithm that the state of Pennsylvania is developing to help make criminal sentencing decisions. Unlike some other states that are pursuing algorithmic criminal justice using proprietary systems, the level of transparency around the Pennsylvania Risk Assessment Project is laudable, with several publicly available in-depth reports on the development of the system….(More)”