How Big Data is Helping to Tackle Climate Change


Bernard Marr at DataInformed: “Climate scientists have been gathering a great deal of data for a long time, but the analytics technology able to make sense of it has caught up only comparatively recently. Now that cloud, distributed storage, and massive amounts of processing power are affordable for almost everyone, those data sets are being put to use. On top of that, the growing number of Internet of Things devices we carry around is adding to the amount of data we collect. And the rise of social media means more and more people are reporting environmental data and uploading photos and videos of their environment, which also can be analyzed for clues.

Perhaps one of the most ambitious projects that employ big data to study the environment is Microsoft’s Madingley, which is being developed with the intention of creating a simulation of all life on Earth. The project already provides a working simulation of the global carbon cycle, and it is hoped that, eventually, everything from deforestation to animal migration, pollution, and overfishing will be modeled in a real-time “virtual biosphere.” Just a few years ago, the idea of a simulation of the entire planet’s ecosphere would have seemed like ridiculous, pie-in-the-sky thinking. But today it’s something into which one of the world’s biggest companies is pouring serious money. Microsoft is doing this because it believes that analytical technology has finally caught up with the ability to collect and store data.

Another data giant that is developing tools to facilitate analysis of climate and ecological data is EMC. Working with scientists at Acadia National Park in Maine, the company has developed platforms to pull in crowd-sourced data from citizen science portals such as eBird and iNaturalist. This allows park administrators to monitor the impact of climate change on wildlife populations as well as to plan and implement conservation strategies.

Last year, the United Nations, under its Global Pulse data analytics initiative, launched the Big Data Climate Challenge, a competition aimed at promoting innovative data-driven climate change projects. Among the first to receive recognition under the program is Global Forest Watch, which combines satellite imagery, crowd-sourced witness accounts, and public datasets to track deforestation around the world, a practice believed to be a leading man-made cause of climate change. The project has been promoted as a way for ethical businesses to ensure that their supply chains are not complicit in deforestation.

Other initiatives are targeted at a more personal level, for example by analyzing the transit routes available for an individual journey in Google Maps and recommending the option with the lowest carbon emissions.
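
To make the idea concrete, here is a minimal sketch, in Python, of the kind of per-route emissions comparison such a tool might perform. The emission factors, route legs, and function names are illustrative assumptions, not details taken from the article or from Google Maps.

```python
# Illustrative sketch only: compare candidate routes by estimated CO2.
# The per-passenger-km emission factors below are rough assumed values,
# not figures from the article or from any routing service.

EMISSION_FACTORS_KG_PER_KM = {
    "car": 0.19,
    "bus": 0.10,
    "rail": 0.04,
    "bicycle": 0.0,
    "walking": 0.0,
}

def estimate_emissions(distance_km, mode):
    """Estimate CO2 in kilograms for one leg of a journey."""
    return distance_km * EMISSION_FACTORS_KG_PER_KM[mode]

def recommend_route(routes):
    """Return the lowest-emission route name and the totals for all routes.

    `routes` maps a route name to a list of (distance_km, mode) legs.
    """
    totals = {
        name: sum(estimate_emissions(d, m) for d, m in legs)
        for name, legs in routes.items()
    }
    return min(totals, key=totals.get), totals

if __name__ == "__main__":
    routes = {
        "drive": [(12.0, "car")],
        "bus + walk": [(10.5, "bus"), (1.0, "walking")],
        "rail + bike": [(9.0, "rail"), (2.5, "bicycle")],
    }
    best, totals = recommend_route(routes)
    for name, kg in totals.items():
        print(f"{name}: {kg:.2f} kg CO2")
    print("Lowest-emission option:", best)
```

A production service would pull distances and modes from a routing API and use published emission factors rather than the placeholder values above.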

The idea of “smart cities” is central to the concept of the Internet of Things – the idea that everyday objects and tools are becoming increasingly connected, interactive, and intelligent, and capable of communicating with each other independently of humans. Many of the ideas put forward by smart-city pioneers are grounded in climate awareness, such as reducing carbon dioxide emissions and energy waste across urban areas. Smart metering allows utility companies to increase or restrict the flow of electricity, gas, or water to reduce waste and ensure adequate supply at peak periods. Public transport can be efficiently planned to avoid wasted journeys and provide a reliable service that will encourage citizens to leave their cars at home.

These examples raise an important point: It’s apparent that data – big or small – can tell us if, how, and why climate change is happening. But, of course, this is only really valuable to us if it also can tell us what we can do about it. Some projects, such as Weathersafe, which helps coffee growers adapt to changing weather patterns and soil conditions, are designed to help humans deal with climate change. Others are designed to tackle the problem at the root, by highlighting the factors that cause it in the first place and showing us how we can change our behavior to minimize damage….(More)”

Data-Driven Innovation: Big Data for Growth and Well-Being


“A new OECD report on data-driven innovation finds that countries could be getting much more out of data analytics in terms of economic and social gains if governments did more to encourage investment in “Big Data” and promote data sharing and reuse.

The migration of economic and social activities to the Internet and the advent of the Internet of Things – along with dramatically lower costs of data collection, storage and processing and rising computing power – mean that data analytics is increasingly driving innovation and is potentially an important new source of growth.

The report suggests countries act to seize these benefits by training more and better data scientists, reducing barriers to cross-border data flows, and encouraging investment in business processes that incorporate data analytics.

Few companies outside of the ICT sector are changing internal procedures to take advantage of data. For example, data gathered by companies’ marketing departments is not always used by other departments to drive decisions and innovation. And in particular, small and medium-sized companies face barriers to the adoption of data-related technologies such as cloud computing, partly because they have difficulty implementing organisational change due to limited resources, including the shortage of skilled personnel.

At the same time, governments will need to anticipate and address the disruptive effects of big data on the economy and overall well-being, as issues as broad as privacy, jobs, intellectual property rights, competition and taxation will be impacted. Read the Policy Brief

TABLE OF CONTENTS
Preface
Foreword
Executive summary
The phenomenon of data-driven innovation
Mapping the global data ecosystem and its points of control
How data now drive innovation
Drawing value from data as an infrastructure
Building trust for data-driven innovation
Skills and employment in a data-driven economy
Promoting data-driven scientific research
The evolution of health care in a data-rich environment
Cities as hubs for data-driven innovation
Governments leading by example with public sector data

 

Revolution Delayed: The Impact of Open Data on the Fight against Corruption


Report by RiSSC – Research Centre on Security and Crime (Italy): “In recent years, demand for Open Data has picked up steam among stakeholders seeking to increase the transparency and accountability of the public sector. Governments are supporting the supply of Open Data to achieve social and economic benefits, a return on investment, and political consensus.

While it is self-evident that Open Data contributes to greater transparency – as it makes data more available and easier for the public and governments to use – its impact on fighting corruption largely depends on the ability to analyse it and to develop initiatives that trigger both social accountability mechanisms and government responsiveness against illicit or inappropriate behaviours.

To date, the Open Data revolution against corruption is delayed. The impact of Open Data on the prevention and repression of corruption, and on the development of anti-corruption tools, appears to be limited, and the return on investment is not yet forthcoming. Evidence remains anecdotal, and a better understanding of the mechanisms and dynamics of using Open Data against corruption is needed.

The overall objective of this exploratory study is to provide evidence on the results achieved by Open Data so far, together with recommendations to the European Commission and Member States’ authorities on implementing effective anti-corruption strategies based on transparency and openness, in order to unlock the potential impact of the “Open Data revolution” against corruption.

The project has explored the legal framework and the status of implementation of Open Data policies in four EU countries – Italy, the United Kingdom, Spain, and Austria. The TACOD project has searched for evidence of Open Data’s role in law enforcement cooperation, anti-corruption initiatives, public campaigns, and investigative journalism against corruption.

RiSSC – Research Centre on Security and Crime (Italy), the University of Oxford and the University of Nottingham (United Kingdom), Transparency International (Italy and United Kingdom), the Institute for Conflict Resolution (Austria), and Blomeyer&Sanz (Spain) carried out the research between January 2014 and February 2015, under an agreement with the European Commission – DG Migration and Home Affairs. The project was coordinated by RiSSC, with the support of a European Working Group of Experts chaired by Prof. Richard Rose and an external evaluator, Mr. Andrea Menapace, and it has benefited from the contribution of many experts, activists, and representatives of institutions in the four countries….(More)

Open governance systems: Doing more with more


Paper by Jeremy Millard in Government Information Quarterly: “This paper tackles many of the important issues and discussions taking place in Europe and globally about the future of the public sector and how it can use Information and Communication Technology (ICT) to respond innovatively and effectively to some of the acute societal challenges arising from the financial crisis as well as other deeper rooted global problems. These include inequality, poverty, corruption and migration, as well as climate change, loss of habitat and the ageing society. A conceptual framework for open governance systems enabled by ICT is proposed, drawing on evidence and examples from around the world as well as a critical appraisal of both academic and grey literature. The framework constructs a system of open assets, open services and open engagement, and this is used to move the e-government debate forward from a preoccupation with lean and small governments which ‘do more with less’ to examine the potential for open governance systems to also ‘do more with more’. This is achieved by enabling an open government and open public sector, as part of this open governance system, to ‘do more by leveraging more’ of the existing assets and resources across the whole of society, and not just within the public sector, many of which are unrealised and untapped, so in effect are ‘wasted’. The paper argues that efficiencies and productivity improvements are essential at all levels and across all actors, as is maximising both public and private value, but that they must also be seen at the societal level where trade-offs and interactions are required, and not only at the individual actor level….(More)”

5 tech trends that will transform governments


Zac Bookman at the World Economic Forum: “…The public sector today looks a bit like the consumer industry of 1995 and the enterprise space in 2005: it is at the beginning of a large-scale digital metamorphosis. The net result will be years of saved time, better decisions and stronger communities.

Here are five trends that will define this transformation in the coming decade:

  1. Real-time operations

Many industries in the global economy already operate in real time. ….

Governments are different. They often access accurate data only on a monthly or quarterly basis, even though they make critical decisions every day. This will change with software deployments that help governments unleash and use current data to make more informed decisions about how they can allocate public resources effectively.

  2. Smarter cities

Studies on human migration patterns indicate that more people are moving to cities. By 2025, an estimated 60% of the world’s population will live in an urban centre. High rates of urbanization will force cities to use their existing resources more efficiently. Networked infrastructures – including roads, phone lines, cable networks, satellites and the internet – will be important parts of the solution to this challenge….For example, MIT and Copenhagen recently collaborated on an electric-hybrid bike wheel that monitors pollution, road conditions and traffic. The wheel allows cities to monitor their environments at a level that was previously unfeasible with cheap sensors and manual labour, offering a quantum leap in networking capability without using further human or capital resources.

  3. Increased citizen engagement

Smart networks are wonderful things, but cities need to guard themselves against making efficiency a sacred cow. There is inherent tension between the ideals of democracy and efficiency, between the openness of platforms that encourage engagement and centralized systems. Rather than focus solely on making everything smart, cities will have to focus on slowing down and improving the quality of life.

These considerations will cause cities to increase citizen engagement. Transparency is a subset of this goal. Open data platforms, such as data.gov and data.gov.uk, host troves of machine-readable government information that allow communities to target and solve problems for which governments do not have the bandwidth. Crowdfunding platforms, such as neighbor.ly, allow citizens to participate in the civic process by enabling them to invest in local capital projects. These types of civic tech platforms will continue to grow, and they will be vital to the health of future democracies.

  4. 21st-century reporting software for governments

The information technology that powers government is notoriously antiquated. …

New reporting technology, such as the system from OpenGov, will automatically pull and display data from governments’ accounting systems. These capabilities empower employees to find information in seconds that would have previously taken hours, days or even weeks to find. They will expand inter-departmental collaboration on core functions, such as budgeting. And they will also allow governments to compare themselves with other governments. In the next decade, advanced reporting software will save billions of dollars by streamlining processes, improving decisions and offering intelligent insights across the expenditure spectrum.

  5. Inter-governmental communication

The internet was conceived as a knowledge-sharing platform. Over the past few decades, technologists have developed tools such as Google and Wikipedia to aid the flow of information on the web and enable ever greater knowledge sharing. Today, you can find nearly any piece of information in a matter of seconds. Governments, however, have not benefited from the rapid development of such tools for their industry, and most information sharing still occurs offline, over email, or on small chat forums. Tools designed specifically for government data will allow governments to embrace the inherent knowledge-sharing infrastructure of the internet….(More)”

‘Airbnb for refugees’ group overwhelmed by offers of help


From The Guardian: “A German group which matchmakes citizens willing to share their homes with refugees said it had been overwhelmed by offers of support, with plans in the works for similar schemes in other European countries.

The Berlin-based Refugees Welcome, which has been described as an “Airbnb for refugees”, has helped people fleeing from Afghanistan, Burkina Faso, Mali, Nigeria, Pakistan, Somalia and Syria.

More than 780 Germans have signed up to the Refugees Welcome website and 26 people have been placed in private homes so far. Two of the site’s founders, Jonas Kakoschke, 31, and Mareike Geiling, 28, live with 39-year-old Bakari, a refugee from Mali, whom they are helping with German classes while he waits for a work permit.

A spokesman said the project’s growing success has now led to offers of help to set up similar schemes in other EU countries, including Greece, Portugal and the UK, with a comparable project in Austria already up and running since January.

Over the weekend, thousands of Icelanders offered to accommodate Syrian refugees in their own homes in an open letter to the government about the migration crisis….(More)”

Citizen Urban Science


New report by Anthony Townsend and Alissa Chisholm at the Cities of Data Project: “Over the coming decades, the world will continue to urbanize rapidly amidst an historic migration of computing power off the desktop, unleashing new opportunities for data collection that reveal how cities function. In a recent report, Making Sense of the Science of Cities (bit.ly/sciencecities), we described an emerging global research movement that seeks to establish a new urban science built atop this new infrastructure of instruments. But will this new intellectual venture be an inclusive endeavor? What role is there for the growing ranks of increasingly well-equipped and well-informed citizen volunteers and amateur investigators to work alongside professional scientists? How are researchers, activists and city governments exploring that potential today? Finally, what can be done to encourage and accelerate experimentation?

This report examines three case studies that provide insight into emerging models of citizen science, highlighting the possibilities of citizen-university-government collaborative research, and the important role of open data platforms to enable these partnerships….(More)”

Algorithmic Citizenship


Citizen-Ex: “Algorithmic Citizenship is a new form of citizenship, one where your citizenship, and therefore both your allegiances and your rights, are constantly being questioned, calculated, and rewritten.

Most people are assigned a citizenship at birth, in one of two ways. You may receive your citizenship from the place you’re born, which is called jus soli, or the right of soil. If you’re born in a place, that’s where you’re a citizen of. This is true in a lot of North and South America, for example – but not much of the rest of the world. Or you may receive your citizenship from the country your parents are citizens of, which is called jus sanguinis, or the right of blood. Everybody is supposed to have a citizenship, although millions of stateless people do not, as a result of war, migration or the collapse of existing states. Many people also change citizenship over the course of their life, through various legal mechanisms. Some countries allow you to hold more than one citizenship at once, and some do not.

Having a citizenship means that you have a place in the world, an allegiance to a state. That state is supposed to guarantee you certain rights, like freedom from arrest, imprisonment, torture, or surveillance – depending on which state you belong to. Hannah Arendt famously said that “citizenship is the right to have rights”. To tamper with one’s citizenship is to endanger one’s most fundamental rights. Without citizenship, we have no rights at all.

Algorithmic Citizenship is a form of citizenship which is not assigned at birth, or through complex legal documents, but through data. Like other computerised processes, it can happen at the speed of light, and it can happen over and over again, constantly revising and recalculating. It can split a single citizenship into an infinite number of sub-citizenships, and count and weight them over time to produce combinations of affiliations to different states.

Citizen Ex calculates your Algorithmic Citizenship based on where you go online. Every site you visit is counted as evidence of your affiliation to a particular place, and added to your constantly revised Algorithmic Citizenship. Because the internet is everywhere, you can go anywhere – but because the internet is real, this also has consequences….(More)”
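
Read literally, the description above amounts to a simple weighting scheme. The toy sketch below is one possible reading of it, not Citizen Ex’s actual code: each visited site counts toward the country it is associated with, newer visits weigh more, and the totals are normalised into fractional sub-citizenships.

```python
# Toy illustration of the idea described above, not Citizen Ex's algorithm:
# each site visit counts toward a country, newer visits are weighted more
# heavily, and the totals are normalised so the affiliations sum to 1.
from collections import defaultdict

def algorithmic_citizenship(visit_countries, decay=0.95):
    """Compute fractional affiliations from a chronological list of
    country codes, one per site visit, with the most recent visit last."""
    weights = defaultdict(float)
    for age, country in enumerate(reversed(visit_countries)):
        weights[country] += decay ** age  # recent visits count for more
    total = sum(weights.values()) or 1.0
    return {c: w / total for c, w in sorted(weights.items())}

if __name__ == "__main__":
    browsing_history = ["US", "US", "GB", "DE", "US", "GB", "US"]
    for country, share in algorithmic_citizenship(browsing_history).items():
        print(f"{country}: {share:.1%}")
```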

Colombia’s Data-Driven Fight Against Crime


One Monday in 1988, El Mundo newspaper of Medellín, Colombia, reported, as it did every Monday, on the violent deaths in the city of two million people over the weekend. An article giving an hour-by-hour description of the deaths from Saturday night to Sunday night was remarkable for, among other things, the journalist’s skill in finding different ways to report a murder. “Someone took the life of Luís Alberto López at knife point … Luís Alberto Patiño ceased to exist with a bullet in his head … Mario Restrepo turned up dead … An unidentified person killed Néstor Alvarez with three shots.” In reporting 27 different murders, the author repeated his phrasing only once.

….What Guerrero did to make Cali safer was remarkable because it worked, and because of the novelty of his strategy. Before becoming mayor, Guerrero was not a politician, but a Harvard-trained epidemiologist who was president of the Universidad del Valle in Cali. He set out to prevent murder the way a doctor prevents disease. What public health workers are doing now to stop the spread of Ebola, Guerrero did in Cali to stop the spread of violence.

Although his ideas have now been used in dozens of cities throughout Latin America, they are worth revisiting because they are not employed in the places that need them most. The most violent places in Latin America are Honduras, El Salvador and Guatemala — indeed, they are among the most violent countries in the world not at war. The wave of youth migration to the United States is from these countries, and the refugees are largely fleeing violence.

One small municipality in El Salvador, Santa Tecla, has employed Cali’s strategies for about 10 years, and the homicide rate there has dropped. But Santa Tecla is an anomaly. Most of the region’s cities have not tried to do what Guerrero did — and they are failing to protect their citizens….

Guerrero went on to spread his ideas. Working with the Pan-American Health Organization and the Inter-American Development Bank, he took his epidemiological methods to 18 other countries.

“The approach was very low-cost and pragmatic,” said Joan Serra Hoffman, a senior specialist in crime and violence prevention in Latin America and the Caribbean at the World Bank. “You could see it was conceived by someone who was an academic and a policy maker. It can be fully operational for between $50,000 and $80,000.”…

Mapping the Age of Every Building in Manhattan


Kriston Capps at CityLab: “The Harlem Renaissance was the epicenter of new movements in dance, poetry, painting, and literature, and its impact still registers in all those art forms. If you want to trace the Harlem Renaissance, though, best look to Harlem itself.
Many if not most of the buildings in Harlem today rose between 1900 and 1940—and a new mapping tool called Urban Layers reveals exactly where and when. Harlem boasts very few of the oldest buildings in Manhattan today, but it does represent the island’s densest concentration of buildings constructed during the Great Migration.
Thanks to Morphocode’s Urban Layers, it’s possible to locate nearly every 19th-century building still standing in Manhattan today. That’s just one of the things that you can isolate with the map, which combines two New York City building datasets (PLUTO and Building Footprints) and Mapbox GL JS vector technology to generate an interactive architectural history.
So, looking specifically at Harlem again (with some of the Upper West Side thrown in for good measure), it’s easy to see that very few of the buildings that went up between 1765 and 1860 still stand today….”
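
The same kind of question can be asked of the raw data directly. The sketch below filters a local extract of the PLUTO dataset by construction year; the file name is hypothetical, and the Borough, YearBuilt, and Address columns follow PLUTO’s published schema.

```python
# Rough sketch of the kind of query Urban Layers answers, run against a
# hypothetical local CSV extract of NYC's PLUTO dataset.
import pandas as pd

pluto = pd.read_csv("pluto_manhattan.csv")  # hypothetical file name

# Manhattan lots whose buildings are recorded as built between 1765 and 1860.
pre_civil_war = pluto[
    (pluto["Borough"] == "MN") & (pluto["YearBuilt"].between(1765, 1860))
]

print(len(pre_civil_war), "lots with buildings dated 1765-1860")
print(pre_civil_war[["Address", "YearBuilt"]].head())
```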