The Economist on how “Data are slowly changing the way cities operate… Waiting for a bus on a drizzly winter morning is miserable. But for London commuters Citymapper, an app, makes it a little more bearable. Users enter their destination into a search box and a range of different ways to get there pop up, along with real-time information about when a bus will arrive or when the next Tube will depart. The app is an example of how data are changing the way people view and use cities. Local governments are gradually starting to catch up.
Nearly all big British cities have started to open up access to their data. On October 23rd the second version of the London Datastore, a huge trove of information on everything from crime statistics to delays on the Tube, was launched. In April Leeds City council opened an online “Data Mill” which contains raw data on such things as footfall in the city centre, the number of allotment sites or visits to libraries. Manchester also releases chunks of data on how the city region operates.
Mostly these websites act as tools for developers and academics to play around with. Since the first Datastore was launched in 2010, around 200 apps, such as Citymapper, have sprung up. Other initiatives have followed. “Whereabouts”, which also launched on October 23rd, is an interactive map by the Future Cities Catapult, a non-profit group, and the Greater London Authority (GLA). It uses 235 data sets, some 150 of them from the Datastore, from the age and occupation of London residents to the number of pubs or types of restaurants in an area. In doing so it suggests a different picture of London neighbourhoods based on eight different categories (see map, and its website: whereaboutslondon.org)….”
Research Handbook On Transparency
New book edited by Padideh Ala’i and Robert G. Vaughn: ‘“Transparency” has multiple, contested meanings. This broad-ranging volume accepts that complexity and thoughtfully contrasts alternative views through conceptual pieces, country cases, and assessments of policies such as freedom of information laws, whistleblower protections, financial disclosure, and participatory policymaking procedures.’
– Susan Rose-Ackerman, Yale University Law School, US
In the last two decades transparency has become a ubiquitous and stubbornly ambiguous term. Typically understood to promote rule of law, democratic participation, anti-corruption initiatives, human rights, and economic efficiency, transparency can also legitimate bureaucratic power, advance undemocratic forms of governance, and aid in global centralization of power. This path-breaking volume, comprising original contributions on a range of countries and environments, exposes the many faces of transparency by allowing readers to see the uncertainties, inconsistencies and surprises contained within the current conceptions and applications of the term….
The expert contributors identify the goals, purposes and ramifications of transparency while presenting both its advantages and shortcomings. Through this framework, they explore transparency from a number of international and comparative perspectives. Some chapters emphasize cultural and national aspects of the issue, with country-specific examples from China, Mexico, the US and the UK, while others focus on transparency within global organizations such as the World Bank and the WTO. A number of relevant legal considerations are also discussed, including freedom of information laws, financial disclosure of public officials and whistleblower protection…”
Ambulance Drone is a flying first aid kit that could save lives
Springwise: “When a medical emergency takes place, the response time can make all the difference between a life saved and a life lost. Unfortunately, ambulances can get stuck in traffic and on average they arrive 10 minutes after the emergency call has been made, in which time a cardiac arrest victim may have already succumbed to a lack of oxygen to the brain. We’ve already seen Germany’s Defikopter use drones to ensure defibrillators are on scene by the time a medical professional arrives, but now the Ambulance Drone is an all-purpose medical toolkit that can be automatically flown to any emergency situation and used to guide bystanders through non-technical lifesaving procedures.
Created by Alec Momont, a graduate of the Delft University of Technology, the drone is custom designed to deliver aid in the event of an emergency. Inside, it houses a compact defibrillator, medication and CPR aids, as well as other essential supplies for the layperson to use while they wait for a medical professional. The idea is that those at the scene can phone emergency services as normal, giving their location. An ambulance and the Ambulance Drone are despatched immediately, with the drone capable of arriving in around 1 minute.
Once it’s there, the call can be transferred to the drone, which has in-built speakers. This frees the caller’s hands to perform tasks such as placing the victim in the recovery position and preparing the defibrillator, with vocal guidance from the emergency response team. The team can see live video of the event to make sure that any procedures are completed correctly, as well as passing on relevant info to the approaching ambulance…”
Ebola’s Information Paradox
It was a full seven days after Baby Lewis became ill, and four days after the Soho residents began dying in large numbers, before the outbreak warranted the slightest mention in the London papers, a few short lines indicating that seven people had died in the neighborhood. (The report understated the growing death toll by an order of magnitude.) It took two entire weeks before the press began treating the outbreak as a major news event for the city.
Within Soho, the information channels were equally unreliable. Rumors spread throughout the neighborhood that the entire city had succumbed at the same casualty rate, and that London was facing a catastrophe on the scale of the Great Fire of 1666. But this proved to be nothing more than rumor. Because the Soho crisis had originated with a single-point source — the poisoned well — its range was limited compared with its intensity. If you lived near the Broad Street well, you were in grave danger. If you didn’t, you were likely to be unaffected.
Compare this pattern of information flow to the way news spreads now. On Thursday, Craig Spencer, a New York doctor, was given a diagnosis of Ebola after presenting with a high fever, and the entire world learned of the test result within hours of the patient himself learning it. News spread with similar velocity several weeks ago with the Dallas Ebola victim, Thomas Duncan. In a sense, it took news of the cholera outbreak a week to travel the 20 blocks from Soho to Fleet Street in 1854; today, the news travels at nearly the speed of light, as data traverses fiber-optic cables. Thanks to that technology, the news channels have been on permanent Ebola watch for weeks now, despite the fact that, as the joke went on Twitter, more Americans have been married to Kim Kardashian than have died in the United States from Ebola.
As societies and technologies evolve, the velocities with which disease and information spread vary. The tremendous population density of London in the 19th century enabled the cholera bacterium to spread through a neighborhood with terrifying speed, while the information about that terror moved more slowly. This was good news for the mental well-being of England’s wider population, which was spared the anxiety of following the death count as if it were a stock ticker. But it was terrible from a public health standpoint; the epidemic had largely faded before the official institutions of public health even realized the magnitude of the outbreak….
Information travels faster than viruses do now. This is why we are afraid. But this is also why we are safe.”
European Union Open Data Portal
By providing easy and free access to data, the portal aims to promote their innovative use and unleash their economic potential. It also aims to help foster the transparency and the accountability of the institutions and other bodies of the EU.
The EU Open Data Portal is managed by the Publications Office of the European Union. Implementation of the EU’s open data policy is the responsibility of the Directorate-General for Communications Networks, Content and Technology of the European Commission.
The portal provides a metadata catalogue giving access to data from the institutions and other bodies of the EU. To facilitate reuse, these metadata are based on common encoding rules and standardized vocabularies. To learn more, see Linked Data.
Data are available in both human- and machine-readable formats for immediate reuse. You will also find a selection of applications built around EU data. To learn more, see Applications.
How can I reuse these data?
As a general principle, you can reuse data free of charge, provided that the source is acknowledged (see legal notice). Specific conditions on reuse, related mostly to the protection of third-party intellectual property rights, apply to a small number of datasets. A link to these conditions is displayed on the relevant data pages.
How can I participate in the portal?
Another important goal of the portal is to engage with the user community around EU open data. You can participate by:
- suggesting datasets,
- giving your feedback and suggestions, and
- sharing your apps or the use you have made of the data from the portal.
On policy and delivery
Speech by Mike Bracken (gov.uk): “…most of the work the civil service does goes unseen, or at least unheralded. But whether it’s Ebola screens, student loans, renewing your car tax, or a thousand other things, that work is vital to everyone in the UK.
Often that work is harder than it needs to be.
I don’t think anyone disagrees that the civil service needs reform. It’s the nature of that reform I want to talk about today.
The Internet has changed everything. Digital is the technological enabler of this century. And, in any sector you care to name, it’s been the lifeblood of organisations that have embraced it, and a death sentence for those that haven’t. If you take away one thing today, please make it this: government is not immune to the seismic changes that digital technology has brought to bear.
The Internet is changing the organising principle of every industry it touches, mostly for the better: finance, retail, media, transport, energy. Some industries refuse to change their organising principle. The music industry was dominated by producers – the record labels – now it’s dominated by digital distribution – like Spotify and their ilk.
Others, like airlines, have rapidly changed how they work internally, and are organised radically differently in order to serve users in a digital age. British Airways used to have over 80 ticket types, with departments and hierarchies competing to attract users. Now it has a handful, and the organisation is digital first and much simpler. These changes are invisible to the majority, but that doesn’t make them any less significant.
Twenty-five years into the era of digital transformation, the Internet has a 100% track record of making industries simpler for users while forcing organisations to fundamentally change how they’re structured. These characteristics are not going away. Yet the effect on the civil service has been, until very recently, marginal.
This is because we deferred our digital development by grouping digital services into enormous, multi-year IT contracts, or what we refer to as ‘Big IT’. Or in short, we gave away our digital future to the IT crowd. While most large organisations reversed these arrangements we have only recently separated our future strategy – digital literacy and digital service provision – from the same contracts that handle commodity technology. By clinging to this model for 15 years, we have created a huge problem for everyone involved in delivery and policy.
Today I want to talk about two things.
The first is delivery, because I believe delivery to users, not policy, should be the organising principle of a reformed civil service.
And the second is skills, and why it’s time for the civil service to put digital skills at the heart of the machine….”
Chicago uses big data to save itself from urban ills
Aviva Rutkin in the New Scientist: “This year in Chicago, some kids will get lead poisoning from the paint or pipes in their homes. Some restaurants will cook food in unsanitary conditions and, here and there, a street corner will be suddenly overrun with rats. These kinds of dangers are hard to avoid in a city of more than 2.5 million people. The problem is, no one knows for certain where or when they will pop up.
The Chicago city government is hoping to change that by knitting powerful predictive models into its everyday city inspections. Its latest project, currently in pilot tests, analyses factors such as home inspection records and census data, and uses the results to guess which buildings are likely to cause lead poisoning in children – a problem that affects around 500,000 children in the US each year. The idea is to identify trouble spots before kids are exposed to dangerous lead levels.
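In toy form, a model of this kind can be sketched as a simple risk score over a few plausible predictors, used to rank buildings for inspection. The features, weights, and data below are invented for illustration; they are not Chicago's actual model.

```python
# Toy sketch: ranking buildings for lead-hazard inspection by a simple
# risk score. Feature names, weights, and records are illustrative
# assumptions, not Chicago's real predictive model.

def lead_risk_score(building):
    """Combine a few plausible predictors into a single score."""
    score = 0.0
    if building["year_built"] < 1978:        # lead paint was banned in the US in 1978
        score += 2.0
    score += 1.5 * building["past_violations"]   # prior inspection violations
    score += 3.0 * building["poverty_rate"]      # census-tract poverty share, 0..1
    return score

buildings = [
    {"id": "A", "year_built": 1962, "past_violations": 3, "poverty_rate": 0.4},
    {"id": "B", "year_built": 1995, "past_violations": 0, "poverty_rate": 0.1},
    {"id": "C", "year_built": 1950, "past_violations": 1, "poverty_rate": 0.6},
]

# Inspect highest-risk buildings first.
ranked = sorted(buildings, key=lead_risk_score, reverse=True)
print([b["id"] for b in ranked])  # prints ['A', 'C', 'B']
```

The real system would learn such weights from historical blood-test and inspection outcomes rather than hard-coding them; the point here is only the score-and-rank pattern that turns records into an inspection queue.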
“We are able to prevent problems instead of just respond to them,” says Jay Bhatt, chief innovation officer at the Chicago Department of Public Health. “These models are just the beginning of the use of predictive analytics in public health and we are excited to be at the forefront of these efforts.”
Chicago’s projects are based on the thinking that cities already have what they need to raise their municipal IQ: piles and piles of data. In 2012, city officials built WindyGrid, a platform that collected data like historical facts about buildings and up-to-date streams such as bus locations, tweets and 911 calls. The project was designed as a proof of concept and was never released publicly but it led to another, called Plenario, that allowed the public to access the data via an online portal.
The experience of building those tools has led to more practical applications. For example, one tool matches calls to the city’s municipal hotline complaining about rats with conditions that draw rats to a particular area, such as excessive moisture from a leaking pipe, or with an increase in complaints about garbage. This allows officials to proactively deploy sanitation crews to potential hotspots. It seems to be working: last year, resident requests for rodent control dropped by 15 per cent.
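A minimal sketch of that matching idea, with invented complaint categories, area codes, and an arbitrary threshold, might look like this:

```python
from collections import Counter

# Toy sketch of flagging likely rat hotspots: areas where recent
# hotline complaints about rat-attracting conditions (garbage, leaks)
# cluster. The data and threshold are invented for illustration.

complaints = [
    ("60614", "rats"), ("60614", "garbage"), ("60614", "leaking pipe"),
    ("60622", "garbage"), ("60614", "garbage"), ("60622", "rats"),
]

precursors = {"garbage", "leaking pipe"}  # conditions that draw rats
counts = Counter(area for area, kind in complaints if kind in precursors)

# Deploy sanitation crews proactively where precursor complaints cluster.
hotspots = [area for area, n in counts.items() if n >= 3]
print(hotspots)  # prints ['60614']
```

Chicago's actual tool works over geocoded 311 records and richer condition data; the sketch only shows the join-and-threshold pattern behind proactive deployment.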
Some predictions are trickier to get right. Charlie Catlett, director of the Urban Center for Computation and Data in Chicago, is investigating an old axiom among city cops: that violent crime tends to spike when there’s a sudden jump in temperature. But he’s finding it difficult to test its validity in the absence of a plausible theory for why it might be the case. “For a lot of things about cities, we don’t have that underlying theory that tells us why cities work the way they do,” says Catlett.
Still, predictive modelling is maturing, as other cities succeed in using it to tackle urban ills…. Such efforts can be a boon for cities, making them more productive, efficient and safe, says Rob Kitchin of Maynooth University in Ireland, who helped launch a real-time data site for Dublin last month called the Dublin Dashboard. But he cautions that there’s a limit to how far these systems can aid us. Knowing that a particular street corner is likely to be overrun with rats tomorrow doesn’t address what caused the infestation in the first place. “You might be able to create a sticking plaster or be able to manage it more efficiently, but you’re not going to be able to solve the deep structural problems….”
Traversing Digital Babel
New book by Alon Peled: “The computer systems of government agencies are notoriously complex. New technologies are piled on older technologies, creating layers that call to mind an archaeological dig. Obsolete programming languages and closed mainframe designs offer barriers to integration with other agency systems. Worldwide, these unwieldy systems waste billions of dollars, keep citizens from receiving services, and even—as seen in interoperability failures on 9/11 and during Hurricane Katrina—cost lives. In this book, Alon Peled offers a groundbreaking approach for enabling information sharing among public sector agencies: using selective incentives to “nudge” agencies to exchange information assets. Peled proposes the establishment of a Public Sector Information Exchange (PSIE), through which agencies would trade information.
After describing public sector information sharing failures and the advantages of incentivized sharing, Peled examines the U.S. Open Data program, and the gap between its rhetoric and results. He offers examples of creative public sector information sharing in the United States, Australia, Brazil, the Netherlands, and Iceland. Peled argues that information is a contested commodity, and draws lessons from the trade histories of other contested commodities—including cadavers for anatomical dissection in nineteenth-century Britain. He explains how agencies can exchange information as a contested commodity through a PSIE program tailored to an individual country’s needs, and he describes the legal, economic, and technical foundations of such a program. Touching on issues from data ownership to freedom of information, Peled offers pragmatic advice to politicians, bureaucrats, technologists, and citizens for revitalizing critical information flows.”
The Role of Open Data in Choosing a Neighborhood
PlaceILive Blog: “To what extent is it important to get familiar with our environment?
If we think about how the world around us has changed over the years, it is not surprising that, while walking to work, we might come across new little shops, restaurants, or gas stations we had never noticed before. Likewise, how many times have we wandered about for hours just to find a green space for a run, only to discover that the one we found was even more polluted than other urban areas?
Citizens are not always properly informed about how the places they live in are evolving. That is why it is crucial for people to have constantly up-to-date, accurate information about the neighborhood they have chosen or are going to choose.
London is clear evidence of how transparency in providing data is fundamental to succeeding as a Smart City.
The GLA’s London Datastore, for instance, is a public platform of datasets with up-to-date figures on the city’s main services, as well as on residents’ lifestyles and environmental risks. These data are then made more easily accessible to the community through the London Dashboard.
The value of freely available information is also demonstrated by the integration of maps, which are an efficient means of geolocation. A map on which it is easy to find all the services you need nearby can be a decisive aid in the search for a location.
(source: Smart London Plan)
The Open Data Index, published by the Open Knowledge Foundation in 2013, is another useful tool for data retrieval: it ranks countries by scores based on the openness and availability of key datasets, such as transport timetables and national statistics.
Here it is possible to check UK Open Data Census and US City Open Data Census.
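The Index's scoring approach can be illustrated with a toy checklist: each dataset earns weighted points for the openness criteria it meets, and a country's score aggregates across datasets. The criteria and weights below are assumptions for illustration, not the Index's actual rubric.

```python
# Toy sketch of an Open Data Index-style score: a dataset earns
# weighted points for each openness criterion it satisfies. The
# criteria and weights are illustrative, not the real methodology.

CRITERIA = {
    "exists": 5,
    "machine_readable": 15,
    "openly_licensed": 30,
    "free_of_charge": 15,
    "available_online": 5,
}

def dataset_score(flags):
    """Sum the weights of the criteria this dataset meets."""
    return sum(weight for name, weight in CRITERIA.items() if flags.get(name))

# Hypothetical assessment of a country's transport timetables dataset.
timetables = {"exists": True, "machine_readable": True,
              "openly_licensed": False, "available_online": True}
print(dataset_score(timetables))  # prints 25
```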
As noted, making open data available and easily findable online has not only been a success for US cities but has also benefited app makers and civic hackers. Lauren Reid, a spokesperson for Code for America, told Government Technology: “The more data we have, the better picture we have of the open data landscape.”
That, on the whole, is what PlaceILive puts the greatest effort into: fostering a new awareness of the environment by providing free information, in order to help citizens choose the best place to live.
The outcome is simple: the website’s homepage lets visitors type in an address of interest and displays an overview of the neighborhood’s parameter evaluations and a Life Quality Index calculated for every point on the map.
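PlaceILive does not publish its formula, but an index of this kind can be sketched as a weighted average of normalized neighborhood scores. The categories, weights, and scores below are assumptions for illustration only.

```python
# Toy sketch of a "Life Quality Index": a weighted average of
# normalized neighborhood scores (0-100). The categories and weights
# are invented for illustration; PlaceILive's formula is not public.

WEIGHTS = {"safety": 0.3, "transport": 0.25, "health": 0.25, "leisure": 0.2}

def life_quality_index(scores):
    """Weighted average of category scores; weights sum to 1."""
    return sum(WEIGHTS[category] * scores[category] for category in WEIGHTS)

# Hypothetical scores for one point on the map.
neighborhood = {"safety": 70, "transport": 85, "health": 60, "leisure": 90}
print(round(life_quality_index(neighborhood), 2))
```

Because the weights sum to 1, the index stays on the same 0-100 scale as the inputs, which makes points on the map directly comparable.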
Searching for the nearest medical institutions, schools, or ATMs thus becomes quick and clear, as does surveying general information about the community. Moreover, the data’s reliability and accessibility are constantly reviewed by a strong team of professionals with expertise in data analysis, mapping, IT architecture and global markets.
For the moment the company’s work is focused on London, Berlin, Chicago, San Francisco and New York, with the longer-term goal of covering more than 200 cities.
In the US City Open Data Census, San Francisco achieved the highest score, proof of the city’s labour in putting technological expertise at everyone’s disposal and in fulfilling users’ needs through meticulous selection of datasets. The city is building on this success with a new investment: a partnership with the University of Chicago on a data analytics dashboard for sustainability performance statistics, named the Sustainable Systems Framework, which is expected to be released in beta by the end of the first quarter of 2015.
Another remarkable contribution to the spread of open data comes from the Bartlett Centre for Advanced Spatial Analysis (CASA) at University College London (UCL); Oliver O’Brien, a researcher in the UCL Department of Geography and a software developer at CASA, is one of the contributors to this cause.
Among his products, one interesting accomplishment is London’s CityDashboard, a control panel of real-time spatial data reports. The web page also lets users visualize all the data on a simplified map and view the dashboards of other UK cities.
In addition, his Bike Share Map offers a live global view of bicycle-sharing systems in over a hundred cities around the world; bike sharing has recently drawn greater public attention as a distinctive form of transport, above all in Europe and China….”
Atlas of Cities
New book edited by Paul Knox: “More than half the world’s population lives in cities, and that proportion is expected to rise to three-quarters by 2050. Urbanization is a global phenomenon, but the way cities are developing, the experience of city life, and the prospects for the future of cities vary widely from region to region. The Atlas of Cities presents a unique taxonomy of cities that looks at different aspects of their physical, economic, social, and political structures; their interactions with each other and with their hinterlands; the challenges and opportunities they present; and where cities might be going in the future.
Each chapter explores a particular type of city—from the foundational cities of Greece and Rome and the networked cities of the Hanseatic League, through the nineteenth-century modernization of Paris and the industrialization of Manchester, to the green and “smart” cities of today. Expert contributors explore how the development of these cities reflects one or more of the common themes of urban development: the mobilizing function (transport, communication, and infrastructure); the generative function (innovation and technology); the decision-making capacity (governance, economics, and institutions); and the transformative capacity (society, lifestyle, and culture)….
Table of Contents; Introduction [PDF]”