Stefaan Verhulst
Book edited by Peris-Ortiz, Marta, Bennett, Dag, and Pérez-Bustamante Yábar, Diana: “This volume provides the most current research on smart cities. Specifically, it focuses on the economic development and sustainability of smart cities and examines how to transform older industrial cities into sustainable smart cities. It aims to identify the role of the following elements in the creation and management of smart cities:
- Citizen participation and empowerment
- Value creation mechanisms
- Public administration
- Quality of life and sustainability
- Democracy
- ICT
- Private initiatives and entrepreneurship
Regardless of their size, all cities are ultimately agglomerations of people and institutions. Agglomeration economies make it possible to attain minimum efficiencies of scale in the organization and delivery of services. However, the economic benefits do not constitute the main advantage of a city. A city’s status rests on three dimensions: (1) political impetus, which is the result of citizens’ participation and the public administration’s agenda; (2) applications derived from technological advances (especially in ICT); and (3) cooperation between public and private initiatives in business development and entrepreneurship. These three dimensions determine which resources are necessary to create smart cities. But a smart city, ideal in the way it channels and resolves technological, social and economic-growth issues, requires many additional elements to function at a high-performance level, such as culture (an environment that empowers and engages citizens) and physical infrastructure designed to foster competition and collaboration, encourage new ideas and actions, and set the stage for new business creation. …(More)”.
A survey and primer by John S. Davis II and Osonde Osoba at Rand: “Anonymization or de-identification techniques are methods for protecting the privacy of subjects in sensitive data sets while preserving the utility of those data sets. The efficacy of these methods has come under repeated attacks as the ability to analyze large data sets becomes easier. Several researchers have shown that anonymized data can be reidentified to reveal the identity of the data subjects via approaches such as so-called “linking.” In this report, we survey the anonymization landscape of approaches for addressing re-identification and we identify the challenges that still must be addressed to ensure the minimization of privacy violations. We also review several regulatory policies for disclosure of private data and tools to execute these policies….(More)”.
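The "linking" attack the report refers to can be illustrated with a minimal sketch. The data, field names, and records below are hypothetical, not drawn from the report: an "anonymized" dataset (names stripped) is joined to a public dataset on quasi-identifiers such as ZIP code, birth year, and sex, and any unique match re-identifies a subject.

```python
# Hypothetical illustration of a "linking" re-identification attack:
# a de-identified medical dataset is joined to a public voter roll
# on quasi-identifiers (zip, birth_year, sex).

anonymized_records = [
    {"zip": "02138", "birth_year": 1945, "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_year": 1972, "sex": "M", "diagnosis": "asthma"},
]

public_roll = [
    {"name": "A. Smith", "zip": "02138", "birth_year": 1945, "sex": "F"},
    {"name": "B. Jones", "zip": "02139", "birth_year": 1990, "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

def link(anonymized, public):
    """Re-identify anonymized rows whose quasi-identifiers match
    exactly one row in the public dataset."""
    matches = []
    for record in anonymized:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [p for p in public
                      if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match reveals an identity
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(link(anonymized_records, public_roll))
```

Here the first record matches exactly one voter-roll entry, so a name is attached to a diagnosis despite the dataset being "anonymized" — the failure mode that techniques like k-anonymity are designed to guard against.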
Book by Victoria Gordon, Jeffery L. Osgood, Jr., Daniel Boden: “Although citizen engagement is a core public service value, few public administrators receive training on how to share leadership with people outside the government. Participatory Budgeting in the United States serves as a primer for those looking to understand a classic example of participatory governance, engaging local citizens in examining budgetary constraints and priorities before making recommendations to local government. Utilizing case studies and an original set of interviews with community members, elected officials, and city employees, this book provides a rare window onto the participatory budgeting process through the words and experiences of the very individuals involved. The central themes that emerge from these fascinating and detailed cases focus on three core areas: creating the participatory budgeting infrastructure; increasing citizen participation in participatory budgeting; and assessing and increasing the impact of participatory budgeting. This book provides students, local government elected officials, practitioners, and citizens with a comprehensive understanding of participatory budgeting and straightforward guidelines to enhance the process of civic engagement and democratic values in local communities….(More)”
Magpi: “As mobile devices have gotten less and less expensive – and as millions worldwide have climbed out of poverty – it’s become quite common to see a mobile phone in every person’s hand, or at least in every family. This means we can utilize approaches to data collection that were simply not possible before….
In our Remote Data Collection Guide, we discuss these new technologies and:
- The key benefits of remote data collection in each of three different situations.
- The direct impact of remote data collection on reducing the cost of your efforts.
- How to start the process of choosing the right option for your needs….(More)”
The Centre for Public Impact: “… we believe that the touchstone for any government should be the results it achieves for its citizens: its public impact. To help power the journey from idea to impact, we have developed the Public Impact Fundamentals, a systematic attempt to understand what makes a successful policy outcome and describe what can be done to maximise the chances of achieving public impact.
We have worked closely with the most senior academics from the world’s leading public policy schools, as well as senior government officials from across the globe. We have sought to develop a framework underpinned by cutting-edge thinking from academia and tested by government officials so that it can be immediately usable.
We have found that three things are fundamental to improved public impact: Legitimacy, Policy and Action. Legitimacy – the underlying support for a policy and the attempts to achieve it; Policy – the design quality of policies intended to achieve impact; and Action – translation of policies into real-world effect. Within each Fundamental are three elements, which collectively contribute to performance and lead to improved public impact.
We did not develop the Fundamentals to be a universal and prescriptive list – instead, we are interested to see whether they are consistent with the day-to-day activities of practitioners. We anticipate that once they are deployed in real-world scenarios, new and interesting uses will develop. Practitioners might find the Public Impact Fundamentals useful for self-assessments, forward planning or progress tracking. We look forward to working with policymakers to refine the uses of the Public Impact Fundamentals. (Full Report)”
DataFloq: “The earth has been having a difficult time for quite a while now. Deforestation is still happening at a large scale across the globe; in Brazil alone, 40,200 hectares were deforested in the past year. The Great Pacific Garbage Patch is still growing, and smog in Beijing is more common than a clear day. Unfortunately, none of this is new. A possible solution, however, is. For several years now, scientists, companies and governments have been turning to Big Data to solve such problems or even prevent them from happening. It turns out that Big Data can help save the earth and, if done correctly, this could lead to significant results in the coming years. Let’s have a look at some fascinating use cases of how Big Data can contribute:
Monitoring Biodiversity Across the Globe
Conservation International, a non-profit environmental organization with a mission to protect nature and its biodiversity, crunches vast amounts of image data to monitor biodiversity around the world. At 16 important sites across the continents, they have installed over 1,000 smart cameras. Thanks to motion sensors, these cameras capture images as soon as an animal passing by triggers the sensor. The cameras at each site cover approximately 2,000 square kilometres…. They automatically determine which species appear in the images and enrich the data with other information, such as climate, flora and fauna, and land-use data, to better understand how animal populations change over time…. the Wildlife Picture Index (WPI) Analytics System, a project dashboard and analytics tool for visualizing user-friendly, near real-time, data-driven insights on biodiversity. The WPI monitors ground-dwelling tropical medium and large mammals and birds, species that are important economically, aesthetically and ecologically.
Using Satellite Imagery to Combat Deforestation
Mapping deforestation is becoming a lot easier today thanks to Big Data. Imagery analytics allows environmentalists and policy makers to monitor, almost in real time, the status of forests around the globe with the help of satellite imagery. New tools like Global Forest Watch use massive amounts of high-resolution NASA satellite imagery to help conservation organizations, governments and concerned citizens monitor deforestation in “near-real time.”…
But that’s not all. Planet Labs has developed a tiny satellite that they are currently sending into space, dozens at a time. Each satellite measures only 10 by 10 by 30 centimeters but is outfitted with the latest technology. They aim to create a high-resolution image of every spot on the earth, updated daily. Once available, this will generate massive amounts of data that they will open source for others to develop applications that will improve the earth.
Monitoring and Predicting with Smart Oceans
Oceans cover more than two thirds of the earth, and they too can be monitored. Earlier this year, IBM Canada and Ocean Networks Canada announced a three-year program to better understand British Columbia’s oceans. Using the latest technology and sensors, they want to predict offshore accidents, natural disasters and tsunamis, and forecast the impact of these incidents. Using hundreds of cabled marine sensors, they are capable of monitoring waves, currents, water quality and vessel traffic in some of the major shipping channels…. These are just three examples of how Big Data can help save the planet. There are of course many more fascinating examples, and here is a list of 10 such use cases….(More)”
Mark Rockwell at FCW: “The internet of things is tracking Hurricane Matthew. As the monster storm draws a bead on the south Atlantic coast after wreaking havoc in the Caribbean, its impact will be measured by a sensor network deployed by the U.S. Geological Survey.
USGS hurricane response crews are busy installing two kinds of sensors in areas across four states where the agency expects the storm to hit hardest. The information the sensors collect will help with disaster recovery efforts and critical weather forecasts for the National Weather Service and the Federal Emergency Management Agency.
As is the case with most things these days, the storm will be tracked online.
The information collected will be distributed live on the USGS Flood Viewer to help federal and state officials gauge the extent of the storm’s damage as it passes through each area.
FEMA, which tasked USGS with the sensor distribution, is also talking with other federal and state officials further up the Atlantic coastline about whether the equipment is needed there. Recent forecasts call for Matthew to take a sharp easterly turn and head out to sea as it reaches the North Carolina coast.
USGS crews are installing storm-surge sensors at key sites along the coasts of North Carolina, South Carolina, Georgia and Florida in anticipation of the storm, said Brian McCallum, associate director for data at the USGS South Atlantic Water Science Center.
In all, USGS is deploying more than 300 additional weather and condition sensors, he told FCW in an interview on Oct. 5.
The devices come in two varieties. The first are 280 storm surge sensors, set out in protective steel tubes lashed to piers, bridges and other solid structures in the storm’s projected path. The low-cost devices will provide the highest density of storm data, such as depth and duration of the storm surge, McCallum said. The devices won’t communicate their information in real time, however; McCallum said USGS crews will come in behind the storm to upload the sensor data to the Internet.
The second set of sensors, however, could be thought of as the storm’s “live tweets.” USGS is installing 25 rapid-deployment gauges to augment its existing collection of sensors and fill in gaps along the coast….(More)”
Book edited by Thakuriah, Piyushimita (Vonu), Tilahun, Nebiyou, and Zellner, Moira: “… introduces the latest thinking on the use of Big Data in the context of urban systems, including research and insights on human behavior, urban dynamics, resource use, sustainability and spatial disparities, where it promises improved planning, management and governance in the urban sectors (e.g., transportation, energy, smart cities, crime, housing, urban and regional economies, public health, public engagement, urban governance and political systems), as well as Big Data’s utility in decision-making, and development of indicators to monitor economic and social activity, and for urban sustainability, transparency, livability, social inclusion, place-making, accessibility and resilience…(More)”
Book by Stella Z. Theodoulou and Ravi K. Roy: “In a modern democratic nation, everyday life is shaped by the decisions of those who manage and administer public policies. This Very Short Introduction provides a practical insight into the development and delivery of the decisions that shape how individuals, and society as a whole, live and interact.
- Covers all areas of public administration, including public safety, social welfare, public transport and state provided education
- Offers a global perspective, drawing on real case studies taken from a wide array of countries
- Considers the issues and challenges which confront the public sector worldwide….(More)”
Julie Simon at NESTA: “Democratic theory has tended to take a pretty dim view of people and their ability to make decisions. Many political philosophers believe that people are, at best, uninformed and, at worst, ignorant and incompetent. This view is a common justification for our system of representative democracy – people can’t be trusted to make decisions, so this responsibility should fall to those who have the expertise, knowledge or intelligence to do so.
Think back to what Edmund Burke said on the subject in his speech to the Electors of Bristol in 1774: “Your representative owes you, not his industry only, but his judgement; and he betrays, instead of serving you, if he sacrifices it to your opinion.” He reminds us that “government and legislation are matters of reason and judgement, and not of inclination”. Others, like the journalist Charles Mackay, author of Extraordinary Popular Delusions and the Madness of Crowds, a book on economic bubbles and crashes, took an even more damning view of the crowd’s capacity to exercise either judgement or reason.
The thing is, if you believe that ‘the crowd’ isn’t wise then there isn’t much point in encouraging participation – these sorts of activities can only ever be tokenistic or a way of legitimising the decisions taken by others.
There are then those political philosophers who effectively argue that citizens’ incompetence doesn’t matter. They argue that the aggregation of views – through voting – eliminates ‘noise’ which enables you to arrive at optimal decisions. The larger the group, the better its decisions will be. The corollary of this view is that political decision making should involve mass participation and regular referenda – something akin to the Swiss model.
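This aggregation argument has a formal counterpart in the Condorcet jury theorem: if each voter is independently correct with probability better than chance, a simple majority of a larger group is correct more often than any individual. A small simulation sketches the effect (the 60% individual accuracy and group sizes are arbitrary, illustrative parameters, not figures from the literature):

```python
import random

def majority_accuracy(individual_accuracy, group_size, trials=20000):
    """Estimate how often a simple majority of independent voters,
    each correct with probability `individual_accuracy`, reaches
    the correct decision."""
    correct = 0
    for _ in range(trials):
        # Count votes for the correct option in one simulated decision.
        votes = sum(random.random() < individual_accuracy
                    for _ in range(group_size))
        if votes > group_size / 2:
            correct += 1
    return correct / trials

random.seed(0)
# With modestly competent voters, larger groups decide correctly
# far more reliably than any single individual.
for n in (1, 11, 101):
    print(n, round(majority_accuracy(0.6, n), 3))
```

Running this shows accuracy climbing steeply with group size – the statistical intuition behind the "larger the group, the better its decisions" claim. Note that the theorem assumes independent votes and better-than-chance individuals; correlated errors or systematically misinformed voters can reverse the result.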
Another standpoint is to say that there is wisdom within crowds – it’s just that it’s domain specific, unevenly distributed and quite hard to transfer. This idea was put forward by Friedrich Hayek in his seminal 1945 essay The Use of Knowledge in Society, in which he argues that:
“…the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form, but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is thus not merely a problem of how to allocate ‘given’ resources… it is a problem of the utilization of knowledge not given to anyone in its totality”.
Hayek argued that it was for this reason that central planning couldn’t work since no central planner could ever aggregate all the knowledge distributed across society to make good decisions.
More recently, Eric Von Hippel built on these foundations by introducing the concept of information stickiness; information is ‘sticky’ if it is costly to move from one place to another. One type of information that is frequently ‘sticky’ is information about users’ needs and preferences.[1] This helps to account for why manufacturers tend to develop innovations which are incremental – meeting already identified needs – and why so many organisations are engaging users in their innovation processes: if knowledge about needs and tools for developing new solutions can be co-located in the same place (i.e. the user) then the cost of transferring sticky information is eliminated…..
There is growing evidence on how crowdsourcing can be used by governments to solve clearly defined technical, scientific or informational problems. Evidently there are significant needs and opportunities for governments to better engage citizens to solve these types of problems.
There’s also a growing body of evidence on how digital tools can be used to support and promote collective intelligence….
So, the critical task for public officials is to have greater clarity over the purpose of engagement – in order to better understand which methods of engagement should be used and what kinds of groups should be targeted.
At the same time, the central question for researchers is when and how to tap into collective intelligence: what tools and approaches can be used when we’re looking at arenas which are often sites of contestation? Should this input be limited to providing information and expertise to be used by public officials or representatives, or should these distributed experts exercise some decision making power too? And when we’re dealing with value based judgements when should we rely on large scale voting as a mechanism for making ‘smarter’ decisions and when are deliberative forms of engagement more appropriate? These are all issues we’re exploring as part of our ongoing programme of work on democratic innovations….(More)”