Data Mining Reveals the Four Urban Conditions That Create Vibrant City Life


Emerging Technology from the arXiv: “A lack of evidence in city planning has ruined cities all over the world. But data-mining techniques are finally revealing the rules that make cities successful, vibrant places to live. …Back in 1961, the gradual decline of many city centers in the U.S. began to puzzle urban planners and activists alike. One of them, the urban sociologist Jane Jacobs, began a widespread and detailed investigation of the causes and published her conclusions in The Death and Life of Great American Cities, a controversial book that proposed four conditions that are essential for vibrant city life.

Jacobs’s conclusions have become hugely influential. Her ideas have had a significant impact on the development of many modern cities such as Toronto and New York City’s Greenwich Village. However, her ideas have also attracted criticism because of the lack of empirical evidence to back them up, a problem that is widespread in urban planning.
Today, that looks set to change thanks to the work of Marco De Nadai at the University of Trento and a few pals, who have developed a way to gather urban data that they use to test Jacobs’s conditions and how they relate to the vitality of city life. The new approach heralds a new age of city planning in which planners have an objective way of assessing city life and working out how it can be improved.
In her book, Jacobs argues that vibrant activity can only flourish in cities when the physical environment is diverse. This diversity, she says, requires four conditions. The first is that city districts must serve more than two functions so that they attract people with different purposes at different times of the day and night. Second, city blocks must be small with dense intersections that give pedestrians many opportunities to interact. The third condition is that buildings must be diverse in terms of age and form to support a mix of low-rent and high-rent tenants. By contrast, an area with exclusively new buildings can only attract businesses and tenants wealthy enough to support the cost of new building. Finally, a district must have a sufficient density of people and buildings.

While Jacobs’s arguments are persuasive, her critics say there is little evidence to show that these factors are linked with vibrant city life. That changed last year when urban scientists in Seoul, South Korea, published the results of a 10-year study of pedestrian activity in the city at unprecedented resolution. This work successfully tested Jacobs’s ideas for the first time.
However, the data was gathered largely through pedestrian surveys, a process that is time-consuming, costly, and generally impractical for use in most modern cities.
De Nadai and co have come up with a much cheaper and quicker alternative using a new generation of city databases and the way people use social media and mobile phones. The new databases include OpenStreetMap, the collaborative mapping tool; census data, which records populations and building use; land use data, which uses satellite images to classify land use according to various categories; Foursquare data, which records geographic details about personal activity; and mobile-phone records showing the number and frequency of calls in an area.
De Nadai and co gathered this data for six cities in Italy—Rome, Naples, Florence, Bologna, Milan, and Palermo.
Their analysis is straightforward. The team used mobile-phone activity as a measure of urban vitality and land-use records, census data, and Foursquare activity as a measure of urban diversity. Their goal was to see how vitality and diversity are correlated in the cities they studied. The results make for interesting reading….(More)
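The core of that analysis, correlating a vitality signal against a diversity signal across districts, can be sketched in a few lines. The district scores below are invented for illustration, not the paper's real data, and the diversity measure is collapsed into a single number per district for simplicity:

```python
import math

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length sequences.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-district scores:
# vitality  = normalized mobile-phone activity
# diversity = normalized mix of land-use, census, and Foursquare signals
vitality  = [0.30, 0.55, 0.62, 0.41, 0.78, 0.25]
diversity = [0.28, 0.50, 0.70, 0.35, 0.80, 0.22]

print(round(pearson(vitality, diversity), 3))
```

A coefficient near 1 would indicate that more diverse districts also show more phone activity, which is the kind of relationship the team set out to measure.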

Open Data Impact: When Demand and Supply Meet


Stefaan Verhulst and Andrew Young at the GovLab: “Today, in “Open Data Impact: When Demand and Supply Meet,” the GovLab and Omidyar Network release key findings about the social, economic, cultural and political impact of open data. The findings are based on 19 detailed case studies of open data projects from around the world. These case studies were prepared in order to address an important shortcoming in our understanding of when, and how, open data works. While there is no shortage of enthusiasm for open data’s potential, nor of conjectural estimates of its hypothetical impact, few rigorous, systematic analyses exist of its concrete, real-world impact…. The 19 case studies that inform this report, all of which can be found at Open Data’s Impact (odimpact.org), a website specially set up for this project, were chosen for their geographic and sectoral representativeness. They seek to go beyond the descriptive (what happened) to the explanatory (why it happened, and what is the wider relevance or impact)….

In order to achieve the potential of open data and scale the impact of the individual projects discussed in our report, we need a better – and more granular – understanding of the enabling conditions that lead to success. We found 4 central conditions (“4Ps”) that play an important role in ensuring success:


  • Partnerships: Intermediaries and data collaboratives play an important role in ensuring success, allowing for enhanced matching of supply and demand of data.
  • Public infrastructure: Developing open data as a public infrastructure, open to all, enables wider participation, and a broader impact across issues and sectors.
  • Policies: Clear policies regarding open data, including those promoting regular assessments of open data projects, are also critical for success.
  • Problem definition: Open data initiatives that have a clear target or problem definition have more impact and are more likely to succeed than those with vaguely worded statements of intent or unclear reasons for existence. 

Core Challenges

Finally, the success of a project is also determined by the obstacles and challenges it confronts. Our research uncovered 4 major challenges (“4Rs”) confronting open data initiatives across the globe:


  • Readiness: A lack of readiness or capacity (evident, for example, in low Internet penetration or technical literacy rates) can severely limit the impact of open data.
  • Responsiveness: Open data projects are significantly more likely to be successful when they remain agile and responsive—adapting, for instance, to user feedback or early indications of success and failure.
  • Risks: For all its potential, open data does pose certain risks, notably to privacy and security; a greater, more nuanced understanding of these risks will be necessary to address and mitigate them.
  • Resource Allocation: While open data projects can often be launched cheaply, those projects that receive generous, sustained and committed funding have a better chance of success over the medium and long term.

Toward a Next Generation Open Data Roadmap

The report we release today concludes with ten recommendations for policymakers, advocates, users, funders and other stakeholders in the open data community. For each step, we include a few concrete methods of implementation – ways to translate the broader recommendation into meaningful impact.

Together, these 10 recommendations and their means of implementation amount to what we call a “Next Generation Open Data Roadmap.” This roadmap is just a start, and we plan to continue fleshing it out in the near future. For now, it offers a way forward. It is our hope that this roadmap will help guide future research and experimentation so that we can continue to better understand how the potential of open data can be fulfilled across geographies, sectors and demographics.

Additional Resources

In conjunction with the release of our key findings paper, we are also launching an “Additional Resources” section on the Open Data’s Impact website. The goal of that section is to provide context on our case studies, and to point in the direction of other, complementary research. It includes the following elements:

  • A “repository of repositories,” including other compendiums of open data case studies and sources;
  • A compilation of some popular open data glossaries;
  • A number of open data research publications and reports, with a particular focus on impact;
  • A collection of open data definitions and a matrix of analysis to help assess those definitions….(More)

How to Crowdsource the Syrian Cease-Fire


Colum Lynch at Foreign Policy: “Can the wizards of Silicon Valley develop a set of killer apps to monitor the fragile Syria cease-fire without putting foreign boots on the ground in one of the world’s most dangerous countries?

They’re certainly going to try. The “cessation of hostilities” in Syria brokered by the United States and Russia last month has sharply reduced the levels of violence in the war-torn country and sparked a rare burst of optimism that it could lead to a broader cease-fire. But if the two sides lay down their weapons, the international community will face the challenge of monitoring the battlefield to ensure compliance without deploying peacekeepers or foreign troops. The emerging solution: using crowdsourcing, drones, satellite imaging, and other high-tech tools.

The high-level interest in finding a technological solution to the monitoring challenge was on full display last month at a closed-door meeting convened by the White House that brought together U.N. officials, diplomats, digital cartographers, and representatives of Google, DigitalGlobe, and other technology companies. Their assignment was to brainstorm ways of using high-tech tools to keep track of any future cease-fires from Syria to Libya and Yemen.

The off-the-record event came as the United States, the U.N., and other key powers struggle to find ways of enforcing cease-fires in places like Syria at a time when there is little political will to run the risk of sending foreign forces or monitors to such dangerous places. The United States has turned to high-tech weapons like armed drones as weapons of war; it now wants to use similar systems to help enforce peace.

Take the Syria Conflict Mapping Project, a geomapping program developed by the Atlanta-based Carter Center, a nonprofit founded by former U.S. President Jimmy Carter and his wife, Rosalynn, to resolve conflict and promote human rights. The project has developed an interactive digital map that tracks military formations by government forces, Islamist extremists, and more moderate armed rebels in virtually every disputed Syrian town. It is now updating its technology to monitor cease-fires.

The project began in January 2012 because of a single 25-year-old intern, Christopher McNaboe. McNaboe realized it was possible to track the state of the conflict by compiling disparate strands of publicly available information — including the shelling and aerial bombardment of towns and rebel positions — from YouTube, Twitter, and other social media sites. The project has since developed a mapping program using software provided by Palantir Technologies, a Palo Alto-based big data company that does contract work for U.S. intelligence and defense agencies, from the CIA to the FBI….

Walter Dorn, an expert on technology in U.N. peace operations who attended the White House event, said he had promoted what he calls a “coalition of the connected.”

The U.N. or other outside powers could start by tracking social media sites, including Twitter and YouTube, for reports of possible cease-fire violations. That information could then be verified by “seeded crowdsourcing” — that is, reaching out to networks of known advocates on the ground — and technological monitoring through satellite imagery or drones.
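Dorn's "coalition of the connected" pipeline can be sketched as a simple corroboration filter: a social-media report only counts as a confirmed violation once it is backed by both a trusted local contact and technical monitoring. All field names and records below are invented for illustration:

```python
# Hypothetical incoming reports of possible cease-fire violations.
reports = [
    {"id": 1, "source": "twitter", "crowd_confirmed": True,  "imagery_confirmed": True},
    {"id": 2, "source": "youtube", "crowd_confirmed": True,  "imagery_confirmed": False},
    {"id": 3, "source": "twitter", "crowd_confirmed": False, "imagery_confirmed": False},
]

def confirmed_violations(reports):
    # Require both seeded-crowdsourcing corroboration (known advocates
    # on the ground) and satellite/drone imagery before confirming.
    return [r["id"] for r in reports
            if r["crowd_confirmed"] and r["imagery_confirmed"]]

print(confirmed_violations(reports))  # only report 1 clears both checks
```

The design choice mirrors the article's sequence: cheap, noisy social-media signals cast a wide net, while the more expensive verification steps are applied only to reports that survive each stage.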

Matthew McNabb, the founder of First Mile Geo, a start-up which develops geolocation technology that can be used to gather data in conflict zones, has another idea. McNabb, who also attended the White House event, believes “on-demand” technologies like SurveyMonkey, which provides users a form to create their own surveys, can be applied in conflict zones to collect data on cease-fire violations….(More)

Innovation Prizes in Practice and Theory


Paper by Michael J. Burstein and Fiona Murray: “Innovation prizes in reality are significantly different from innovation prizes in theory. The former are familiar from popular accounts of historical prizes like the Longitude Prize: the government offers a set amount for a solution to a known problem, like £20,000 for a method of calculating longitude at sea. The latter are modeled as compensation to inventors in return for donating their inventions to the public domain. Neither the economic literature nor the policy literature that led to the 2010 America COMPETES Reauthorization Act — which made prizes a prominent tool of government innovation policy — provides a satisfying justification for the use of prizes, nor does either literature address their operation. In this article, we address both of these problems. We use a case study of one canonical, high-profile innovation prize — the Progressive Insurance Automotive X Prize — to explain how prizes function as institutional means to achieve exogenously defined innovation policy goals in the face of significant uncertainty and information asymmetries. Focusing on the structure and function of actual innovation prizes as an empirical matter enables us to make three theoretical contributions to the current understanding of prizes. First, we offer a stronger normative justification for prizes grounded in their status as a key institutional arrangement for solving a specified innovation problem. Second, we develop a model of innovation prize governance and then situate that model in the administrative state, as a species of “new governance” or “experimental” regulation. Third, we derive from those analyses a novel framework for choosing among prizes, patents, and grants, one in which the ultimate choice depends on a trade-off between the efficacy and scalability of the institutional solution….(More)”

Innovating for pro-poor services: Why politics matter


Nathaniel Mason, Clare Cummings and Julian Doczi for ODI insights: “To solve sustainable development challenges, such as the provision of universal access to basic services, we need new ideas, as well as old ideas applied in new ways and new places. The pace of global innovation, particularly digital innovation, is generating optimism, positioning the world at the start of the ‘Fourth Industrial Revolution’. Innovation can make basic services cheaper, more accessible, more relevant and more desirable for poor people. However, we also know few innovations lead to sustainable, systemic change. The barriers to this are often political – including problems related to motivation, power and collective action. Yet, just as political factors can prevent innovations from being widely adopted, politically smart approaches can help in navigating and mitigating these challenges. And, because innovations can alter the balance of power in societies and markets, they can both provoke new and challenging politics themselves and also help unlock systemic political change. When and why does politics affect innovation? What does this mean for donors, foundations and impact investors backing innovations for development?…(More)

Technology and politics: The signal and the noise


Special Issue of The Economist: “…The way these candidates are fighting their campaigns, each in his own way, is proof that politics as usual is no longer an option. The internet and the availability of huge piles of data on everyone and everything are transforming the democratic process, just as they are upending many industries. They are becoming a force in all kinds of things, from running election campaigns and organising protest movements to improving public policy and the delivery of services. This special report will argue that, as a result, the relationship between citizens and those who govern them is changing fundamentally.

Incongruous though it may seem, the forces that are now powering the campaign of Mr Trump—as well as that of Bernie Sanders, the surprise candidate on the Democratic side (Hillary Clinton is less of a success online)—were first seen in full cry during the Arab spring in 2011. The revolution in Egypt and other Arab countries was not instigated by Twitter, Facebook and other social-media services, but they certainly helped it gain momentum. “The internet is an intensifier,” says Marc Lynch of George Washington University, a noted scholar of the protest movements in the region…..

However, this special report will argue that, in the longer term, online crusading and organising will turn out to matter less to politics in the digital age than harnessing those ever-growing piles of data. The internet and related technologies, such as smart phones and cloud computing, make it cheap and easy not only to communicate but also to collect, store and analyse immense quantities of information. This is becoming ever more important in influencing political outcomes.

America’s elections are a case in point. Mr Cruz with his data savvy is merely following in the footsteps of Barack Obama, who won his first presidential term with the clever application of digital know-how. Campaigners are hoovering up more and more digital information about every voting-age citizen and stashing it away in enormous databases. With the aid of complex algorithms, these data allow campaigners to decide, say, who needs to be reminded to make the trip to the polling station and who may be persuaded to vote for a particular candidate.
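The decision the report describes, who gets a turnout reminder and who gets persuasion outreach, can be illustrated with a deliberately toy scoring rule. The field names, probabilities, and thresholds below are invented for this sketch and do not come from any real campaign system:

```python
# Hypothetical voter-file records with modeled probabilities.
voters = [
    {"name": "A", "turnout_prob": 0.35, "support_prob": 0.80},
    {"name": "B", "turnout_prob": 0.90, "support_prob": 0.55},
    {"name": "C", "turnout_prob": 0.85, "support_prob": 0.20},
    {"name": "D", "turnout_prob": 0.40, "support_prob": 0.45},
]

def classify(v):
    # Likely supporters who may not vote get a turnout reminder;
    # likely voters who are undecided get persuasion outreach;
    # everyone else is skipped to conserve campaign resources.
    if v["support_prob"] >= 0.6 and v["turnout_prob"] < 0.5:
        return "remind"
    if v["turnout_prob"] >= 0.6 and 0.4 <= v["support_prob"] < 0.6:
        return "persuade"
    return "skip"

plan = {v["name"]: classify(v) for v in voters}
print(plan)
```

Real campaign models are far more elaborate, but the logic is the same: direct scarce contact effort to the voters where it is most likely to change the outcome.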

No hiding place

In the case of protest movements, the waves of collective action leave a big digital footprint. Using ever more sophisticated algorithms, governments can mine these data. That is changing the balance of power. In the event of another Arab spring, autocrats would not be caught off guard again because they are now able to monitor protests and intervene when they consider it necessary. They can also identify and neutralise the most influential activists. Governments that were digitally blind when the internet first took off in the mid-1990s now have both a telescope and a microscope.

But data are not just changing campaigns and political movements; they affect how policy is made and public services are offered. This is most visible at local-government level. Cities have begun to use them for everything from smoothing traffic flows to identifying fire hazards. Having all this information at their fingertips is bound to change the way these bureaucracies work, and how they interact with citizens. This will not only make cities more efficient, but provide them with data and tools that could help them involve their citizens more.

This report will look at electoral campaigns, protest movements and local government in turn. Readers will note that most of the examples quoted are American and that most of the people quoted are academics. That is because the study of the interrelationship between data and politics is relatively new and most developed in America. But it is beginning to spill out from the ivory towers, and is gradually spreading to other countries.

The growing role of technology in politics raises many questions. How much of a difference, for instance, do digitally enabled protest surges really make? Many seem to emerge from nowhere, then crash almost as suddenly, defeated by hard political realities and entrenched institutions. The Arab spring uprising in Egypt is one example. Once the incumbent president, Hosni Mubarak, was toppled, the coalition that brought him down fell apart, leaving the stage to the old powers, first the Muslim Brotherhood and then the armed forces.

In party politics, some worry that the digital targeting of voters might end up reducing the democratic process to a marketing exercise. Ever more data and better algorithms, they fret, could lead politicians to ignore those unlikely to vote for them. And in cities it is not clear that more data will ensure that citizens become more engaged….(More)

See also:

Crowdsourced Health


Book by Elad Yom-Tov: “Most of us have gone online to search for information about health. What are the symptoms of a migraine? How effective is this drug? Where can I find more resources for cancer patients? Could I have an STD? Am I fat? A Pew survey reports more than 80 percent of American Internet users have logged on to ask questions like these. But what if the digital traces left by our searches could show doctors and medical researchers something new and interesting? What if the data generated by our searches could reveal information about health that would be difficult to gather in other ways? In this book, Elad Yom-Tov argues that Internet data could change the way medical research is done, supplementing traditional tools to provide insights not otherwise available. He describes how studies of Internet searches have, among other things, already helped researchers track the side effects of prescription drugs, understand the information needs of cancer patients and their families, and recognize some of the causes of anorexia.
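One simple version of the drug-side-effect analysis the book describes is measuring how often a drug name co-occurs with a symptom in anonymized search queries. The queries, drug name, and symptom below are invented examples for illustration only:

```python
# Hypothetical anonymized search queries; "drugx" is a made-up drug name.
queries = [
    "drugx headache",
    "drugx dosage",
    "drugx nausea side effect",
    "migraine treatment",
    "drugx nausea",
]

def cooccurrence_rate(queries, drug, symptom):
    # Fraction of queries mentioning the drug that also mention the symptom.
    drug_queries = [q for q in queries if drug in q]
    if not drug_queries:
        return 0.0
    return sum(symptom in q for q in drug_queries) / len(drug_queries)

print(cooccurrence_rate(queries, "drugx", "nausea"))  # 2 of 4 drugx queries
```

Researchers would then compare such rates before and after a drug's release, or against a baseline symptom rate, to flag candidate side effects for proper clinical follow-up.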

Yom-Tov shows that the information collected can benefit humanity without sacrificing individual privacy. He explains why people go to the Internet with health questions; for one thing, it seems to be a safe place to ask anonymously about such matters as obesity, sex, and pregnancy. He describes the detrimental effects of “pro-anorexia” online content; shows how computer scientists can scour search-engine data to improve public health by, for example, identifying risk factors for disease and centers of contagion; and explains how analyses of the ways people deal with upsetting diagnoses can help doctors to treat patients and patients to understand their conditions….(More)

“Big data” and “open data”: What kind of access should researchers enjoy?


Paper by Gilles Chatellier, Vincent Varlet, and Corinne Blachier-Poisson in Thérapie: “The healthcare sector is currently facing a new paradigm, the explosion of “big data”. Coupled with advances in computer technology, the field of “big data” appears promising, allowing us to better understand the natural history of diseases, to follow up on the implementation of new technologies (devices, drugs) and to participate in precision medicine, etc. Data sources are multiple (medical and administrative data, electronic medical records, data from rapidly developing technologies such as DNA sequencing, connected devices, etc.) and heterogeneous while their use requires complex methods for accurate analysis. Moreover, faced with this new paradigm, we must determine who could (or should) have access to which data, how to combine collective interest and protection of personal data and how to finance in the long-term both operating costs and databases interrogation. This article analyses the opportunities and challenges related to the use of open and/or “big data”, … (More)”

The Social Intranet: Insights on Managing and Sharing Knowledge Internally


Paper by Ines Mergel for IBM Center for the Business of Government: “While much of the federal government lags behind, some agencies are pioneers in the internal use of social media tools.  What lessons and effective practices do they have to offer other agencies?

“Social intranets,” Dr. Mergel writes, “are in-house social networks that use technologies – such as automated newsfeeds, wikis, chats, or blogs – to create engagement opportunities among employees.”  They also include the use of internal profile pages that help people identify expertise and interest (similar to Facebook or LinkedIn profiles), and that are used in combination with other social intranet tools such as on-line communities or newsfeeds.

The report documents four case studies of government use of social intranets – two federal government agencies (the Department of State and the National Aeronautics and Space Administration) and two cross-agency networks (the U.S. Intelligence Community and the Government of Canada).

The author observes: “Most enterprise social networking platforms fail,” but identifies what causes these failures and how successful social intranet initiatives can avoid that fate and thrive.  She offers a series of insights for successfully implementing social intranets in the public sector, based on her observations and case studies. …(More)”

On-demand service could be Uber for blood collection


Springwise: Collecting blood for clinical and medical testing can be an arduous process, involving lots of travel, with the added annoyance of patients not turning up to appointments, and time wasted with doctors and phlebotomists.

But now a new phlebotomy service, Iggbo, is looking to change that. Blood collectors can become freelancers through the new on-demand app; rather than employing a full-time phlebotomist, doctors can use the app to hire a freelancer who is paid per collection. The phlebotomists can collect samples at patients’ homes, rather than having patients wait around for someone to take their blood.

Physicians can book clinical blood donations and Iggbo sends out the request via its smartphone app to its freelance phlebotomists. Iggbo also sends a number of reminders using text and email to ensure patients remember their appointments, and helps to book a new phlebotomist if something falls through.

The app even features an Uber-like ratings system, and Iggbo says it carefully vets all its collectors to ensure the service is safe and efficient. The company says the service will improve the efficiency of the healthcare system for blood testing and collection, and in doing so reduce overall costs….(More)”