Paper by Guy Grossman, Melina Platas and Jonathan Rodden: “We examine the effect on service delivery outcomes of a new information communication technology (ICT) platform that allows citizens to send free and anonymous messages to local government officials, thus reducing the cost and increasing the efficiency of communication about public services. In particular, we use a field experiment to assess the extent to which the introduction of this ICT platform improved monitoring by the district, effort by service providers, and inputs at service points in health, education and water in Arua District, Uganda. Despite relatively high levels of system uptake, enthusiasm of district officials, and anecdotal success stories, we find evidence of only marginal and uneven short-term improvements in health and water services, and no discernible long-term effects. Relatively few messages from citizens provided specific, actionable information about service provision within the purview and resource constraints of district officials, and users were often discouraged by officials’ responses. Our findings suggest that for crowd-sourced ICT programs to move from isolated success stories to long-term accountability enhancement, the quality and specific content of reports and responses provided by users and officials is centrally important….(More)”.
The Mobility Space Report: What the Street!?
“What the Street!? grew out of the question “How do new and old mobility concepts change our cities?”, raised by Michael Szell and Stephan Bogner during their residency at moovel lab. With support from the lab team, they set out to wrangle data from cities around the world to develop and design this unique Mobility Space Report.
What the Street!? was built with open-source software and resources. Thanks to the OpenStreetMap contributors and many other open projects, we could put together the puzzle of urban mobility space seen above….
If you take a snapshot of Berlin from space at a typical time of day, you see 60,000 cars on the streets and 1,200,000 cars parked. Why are so many cars parked? Because cars are used only 36 minutes per day; the other 97.5% of the time they just stand around unused. In Berlin, these 1.2 million parking spots take up the area of 64,000 playgrounds, or the area of four Central Parks.
If you look around the world, wasted public space is not particular to Berlin – many cities have the same problem. But why is so much space wasted in the first place? How “fair” is the distribution of space towards other forms of mobility like bikes and trams? Is there an arrogance of space? If so, how could we raise awareness or even improve the situation?
Who “owns” the city?
Let us first look at how much space there is in a city for moving around, and how it is allocated between bikes, rails, and cars. With What the Street!? – The Mobility Space Report, we set out to provide a public tool for exploring this urban mobility space and to answer our questions systematically, interactively, and above all, in a fun way. Inspired by recently developed techniques in data visualization of unrolling, packing, and ordering irregular shapes, we packed and rolled all mobility spaces into rectangular bins to visualize the areas they take up.
How do you visualize the total area taken by parking spaces? – You pack them tightly.
How do you visualize the total area taken by streets and tracks? – You roll them up tightly.…(More)”.
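To make the packing idea concrete, here is a minimal sketch of one naive way to pack parking-spot rectangles into a fixed-width strip and read off the total area. The shelf-packing heuristic, spot dimensions and strip width are illustrative assumptions, not the project's actual algorithm or data.

```python
# A minimal sketch of the "pack them tightly" idea: a naive shelf-packing
# heuristic that stacks parking-spot rectangles into a strip of fixed width.
# Spot size, strip width, and the heuristic itself are illustrative
# assumptions, not What the Street!?'s actual method.

def shelf_pack_height(rects, bin_width):
    """Pack (w, h) rectangles left-to-right into shelves; return total height used."""
    rects = sorted(rects, key=lambda r: r[1], reverse=True)  # tallest first
    x = shelf_y = shelf_h = 0.0
    for w, h in rects:
        if x + w > bin_width:       # current shelf is full: open a new one
            shelf_y += shelf_h
            x = shelf_h = 0.0
        x += w
        shelf_h = max(shelf_h, h)
    return shelf_y + shelf_h

# Berlin's ~1.2 million parked cars, each spot roughly 2 m x 5 m:
spots = [(2.0, 5.0)] * 1_200_000
height = shelf_pack_height(spots, bin_width=1000.0)  # pack into a 1 km wide strip
print(f"strip: 1000 m x {height:.0f} m (~{1000 * height / 1e6:.0f} km^2 of parking)")
```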
Crowdsourcing website is helping volunteers save lives in hurricane-hit Houston
By Monday morning, the 27-year-old developer, sitting in his leaky office, had slapped together an online mapping tool to track stranded residents. A day later, nearly 5,000 people had registered to be rescued, and 2,700 of them were safe.
If there’s a silver lining to Harvey, it’s the flood of civilian volunteers such as Marchetti who have joined the rescue effort. It became pretty clear shortly after the storm started pounding Houston that the city would need their help. The heavy rains quickly outstripped authorities’ ability to respond. People watched water levels rise around them while they waited on hold to get connected to a 911 dispatcher. Desperate local officials asked owners of high-water vehicles and boats to help collect their fellow citizens trapped on second stories and roofs.
In the past, disaster volunteers have relied on social media and Zello, an app that turns your phone into a walkie-talkie, to organize. … Harvey’s magnitude, both in terms of damage and the number of people anxious to pitch in, also overwhelmed those grassroots organizing methods, says Marchetti, who spent the first days after the storm hit monitoring Facebook and Zello to figure out what was needed where.
“The channels were just getting overloaded with people asking ‘Where do I go?’” he says. “We’ve tried to cut down on the level of noise.”
The idea behind his project, Houstonharveyrescue.com, is simple. The map lets people in need register their location. They are asked to include details—for example, if they’re sick or have small children—and their cell phone numbers.
The army of rescuers, who can also register on the site, can then easily spot the neediest cases. A team of 100 phone dispatchers follows up with those wanting to be rescued, and can send mass text messages with important information. An algorithm weeds out any repeats.
It might be one of the first crowdsourced rescue missions in the US, and could be a valuable blueprint for future disaster volunteers. (For a similar civilian-led effort outside the US, look at Tijuana’s Strategic Committee for Humanitarian Aid, a Facebook group that sprouted last year when the Mexican border city was overwhelmed by a wave of Haitian immigrants.)…(More)”.
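The article only says that "an algorithm weeds out any repeats." A minimal sketch of one plausible dedup rule, assuming requests carry a phone number and coordinates, might look like this; the field names and distance threshold are invented for illustration, not the site's actual code.

```python
# A minimal sketch (not the site's actual code) of how a rescue registry
# might weed out repeats: normalize phone numbers, then treat a new request
# as a duplicate if the same number already registered within ~100 m.

import math
import re

def norm_phone(raw):
    digits = re.sub(r"\D", "", raw)
    return digits[-10:]  # keep the 10-digit US number

def close(a, b, meters=100):
    # rough equirectangular distance; accurate enough at city scale
    dx = (a[1] - b[1]) * 111_320 * math.cos(math.radians(a[0]))
    dy = (a[0] - b[0]) * 110_540
    return math.hypot(dx, dy) <= meters

def dedupe(requests):
    seen = {}      # phone -> list of (lat, lon) already registered
    unique = []
    for req in requests:
        spots = seen.setdefault(norm_phone(req["phone"]), [])
        if any(close((req["lat"], req["lon"]), s) for s in spots):
            continue   # same caller, same place: a repeat
        spots.append((req["lat"], req["lon"]))
        unique.append(req)
    return unique

reqs = [
    {"phone": "(713) 555-0100", "lat": 29.7604, "lon": -95.3698},
    {"phone": "713-555-0100",  "lat": 29.7605, "lon": -95.3699},  # repeat
    {"phone": "713-555-0199",  "lat": 29.7810, "lon": -95.3500},
]
print(len(dedupe(reqs)))  # -> 2
```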
The Tech Revolution That’s Changing How We Measure Poverty
Alvin Etang Ndip at the World Bank: “The world has an ambitious goal to end extreme poverty by 2030. But, without good poverty data, it is impossible to know whether we are making progress, or whether programs and policies are reaching those who are the most in need.
Countries, often in partnership with the World Bank Group and other agencies, measure poverty and wellbeing using household surveys that help give policymakers a sense of who the poor are, where they live, and what is holding back their progress. Household data collection, once a paper-and-pencil exercise, is being revolutionized by technology, and the World Bank is tapping into this potential to produce more and better poverty data….
“Technology can be harnessed in three different ways,” says Utz Pape, an economist with the World Bank. “It can help improve data quality of existing surveys, it can help to increase the frequency of data collection to complement traditional household surveys, and can also open up new avenues of data collection methods to improve our understanding of people’s behaviors.”
As technology is changing the field of data collection, researchers are continuing to find new ways to build on the power of mobile phones and tablets.
The World Bank’s Pulse of South Sudan initiative, for example, takes tablet-based data collection a step further. In addition to conducting the household survey, the enumerators also record a short, personalized testimonial with the people they are interviewing, revealing a first-person account of the situation on the ground. Such testimonials allow users to put a human face on data and statistics, giving a fuller picture of the country’s experience.
Real-time data through mobile phones
At the same time, more and more countries are generating real-time data through high-frequency surveys, capitalizing on the proliferation of mobile phones around the world. The World Bank’s Listening to Africa (L2A) initiative has piloted the use of mobile phones to regularly collect information on living conditions. The approach combines face-to-face surveys with follow-up mobile phone interviews to collect data that allow researchers to monitor well-being.
The initiative hands out mobile phones and solar chargers to all respondents. To minimize the risk of people dropping out, the respondents are given credit top-ups to stay in the program. From monitoring health care facilities in Tanzania to collecting data on frequency of power outages in Togo, the initiative has been rolled out in six countries and has been used to collect data on a wide range of areas. …
Technology-driven data collection efforts haven’t been restricted to the Africa region alone. In fact, the approach was piloted early in Peru and Honduras with the Listening 2 LAC program. In Europe and Central Asia, the World Bank has rolled out the Listening to Tajikistan program, which was designed to monitor the impact of the Russian economic slowdown in 2014 and 2015. Initially a six-month pilot, the initiative has now been in operation for 29 months, and a partnership with UNICEF and JICA has ensured that data collection can continue for the next 12 months. Given the volume of data, the team is currently working to create a multidimensional fragility index, where one can monitor a set of well-being indicators – ranging from food security to quality jobs and public services – on a monthly basis…
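The piece does not describe the Bank's methodology for the fragility index. As a hedged illustration, a composite index over monthly indicators could be assembled by normalizing each indicator and averaging, as in this sketch; the indicator names, signs and equal weighting are assumptions.

```python
# A hedged sketch of a multidimensional fragility index: min-max normalize
# each monthly indicator so that higher = more fragile, then average.
# Indicator names, signs, and equal weighting are illustrative assumptions,
# not the World Bank's methodology.

def minmax(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def fragility_index(households, signs):
    """signs: {indicator: +1 if higher values mean worse off, -1 otherwise}."""
    cols = {}
    for name, sign in signs.items():
        col = minmax([h[name] for h in households])
        cols[name] = col if sign > 0 else [1 - v for v in col]
    n = len(signs)
    return [sum(cols[name][i] for name in signs) / n
            for i in range(len(households))]

households = [
    {"meals_skipped": 4, "employed_adults": 0, "water_hours": 2},
    {"meals_skipped": 0, "employed_adults": 2, "water_hours": 20},
]
signs = {"meals_skipped": +1, "employed_adults": -1, "water_hours": -1}
print(fragility_index(households, signs))  # first household is more fragile
```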
There are other initiatives, such as in Mexico where the World Bank and its partners are using satellite imagery and survey data to estimate how many people live below the poverty line down to the municipal level, or guiding data collectors using satellite images to pick a representative sample for the Somali High Frequency Survey. However, despite the innovation, these initiatives are not intended to replace traditional household surveys, which still form the backbone of measuring poverty. When better integrated, they can prove to be a formidable set of tools for data collection to provide the best evidence possible to policymakers….(More)”
Chicago police see less violent crime after using predictive code
Jon Fingas at Engadget: “Law enforcement has been trying predictive policing software for a while now, but how well does it work when it’s put to a tough test? Potentially very well, according to Chicago police. The city’s 7th District police report that their use of predictive algorithms helped reduce the number of shootings by 39 percent year-over-year in the first 7 months of 2017, with murders dropping by 33 percent. Three other districts didn’t witness as dramatic a change, but they still saw 15 to 29 percent reductions in shootings and a corresponding 9 to 18 percent drop in murders.
It mainly comes down to knowing where and when to deploy officers. One of the tools used in the 7th District, HunchLab, blends crime statistics with socioeconomic data, weather info and business locations to determine where crimes are likely to happen. Other tools (such as the Strategic Subjects List and ShotSpotter) look at gang affiliation, drug arrest history and gunfire detection sensors.
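As a rough illustration of that "where and when to deploy" logic, the sketch below scores map grid cells from a weighted blend of feature layers and picks the top cells to patrol. The features and weights are invented here; tools like HunchLab fit theirs to historical data.

```python
# A rough illustration (not HunchLab) of blended risk scoring: each grid
# cell carries normalized feature layers, a weighted sum gives its risk,
# and patrols go to the top-k cells.

WEIGHTS = {"recent_shootings": 0.55, "bars_nearby": 0.20,
           "foreclosures": 0.15, "warm_weekend_night": 0.10}

def risk(cell):
    return sum(w * cell[f] for f, w in WEIGHTS.items())

def patrol_plan(cells, k=2):
    return sorted(cells, key=risk, reverse=True)[:k]

cells = [
    {"id": "A1", "recent_shootings": 0.9, "bars_nearby": 0.3,
     "foreclosures": 0.6, "warm_weekend_night": 1.0},
    {"id": "B4", "recent_shootings": 0.1, "bars_nearby": 0.8,
     "foreclosures": 0.2, "warm_weekend_night": 1.0},
    {"id": "C2", "recent_shootings": 0.0, "bars_nearby": 0.1,
     "foreclosures": 0.1, "warm_weekend_night": 1.0},
]
for cell in patrol_plan(cells):
    print(cell["id"], round(risk(cell), 2))  # highest-risk cells first
```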
If the performance holds, it’ll suggest that predictive policing can save lives when crime rates are particularly high, as they have been on Chicago’s South Side. However, both the Chicago Police Department and academics are quick to stress that algorithms are just one part of a larger solution. Officers still have to be present, and this doesn’t tackle the underlying issues that cause crime, such as limited access to education and a lack of economic opportunity. Still, any successful reduction in violence is bound to be appreciated….(More)”.
Rise of the Government Chatbot
Zack Quaintance at Government Technology: “A robot uprising has begun, except instead of overthrowing mankind so as to usher in a bleak yet efficient age of cold judgment and colder steel, this uprising is one of friendly robots (so far).
Which is all an alarming way to say that many state, county and municipal governments across the country have begun to deploy relatively simple chatbots, aimed at helping users get more out of online public services such as a city’s website, pothole reporting and open data. These chatbots have been installed in recent months in a diverse range of places including Kansas City, Mo.; North Charleston, S.C.; and Los Angeles — and by many indications, there is an accompanying wave of civic tech companies that are offering this tech to the public sector.
They range from simple to complex in scope, and most of the jurisdictions currently using them say they are doing so on somewhat of a trial or experimental basis. That’s certainly the case in Kansas City, where the city now has a Facebook chatbot to help users get more out of its open data portal.
“The idea was never to create a final chatbot that was super intelligent and amazing,” said Eric Roche, Kansas City’s chief data officer. “The idea was let’s put together a good effort, and put it out there and see if people find it interesting. If they use it, get some lessons learned and then figure out — either in our city, or with developers, or with people like me in other cities, other chief data officers and such — and talk about the future of this platform.”
Roche developed Kansas City’s chatbot earlier this year by working after hours with Code for Kansas City, the local Code for America brigade. He did so because, in the four-plus years the city’s open data program has been active, there have been regular concerns that the info available through it was hard to navigate, search and use for average citizens who aren’t data scientists and don’t work for the city (a common issue currently being addressed by many jurisdictions). The idea behind the Facebook chatbot is that Roche can program it with a host of answers to the most prevalent questions, enabling it to both help interested users and save him time for other work….
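A minimal sketch of that "canned answers to prevalent questions" pattern, with invented questions, answers and a keyword-overlap matching rule (not Kansas City's actual bot), could look like this:

```python
# A minimal sketch of the "canned answers" pattern: match an incoming
# message against keyword sets and reply with a stored answer. Questions,
# answers, and the matching rule are invented for illustration.

import re

FAQ = [
    ({"dataset", "download"}, "You can download any dataset as CSV from the open data portal."),
    ({"pothole", "report"},   "Report potholes through the city's 311 service."),
    ({"crime", "map"},        "Crime data is updated daily on the open data portal."),
]
FALLBACK = "Sorry, I don't know that one yet -- a human will follow up."

def reply(message):
    words = set(re.findall(r"[a-z]+", message.lower()))
    # pick the entry whose keywords overlap the message the most
    answer, score = max(((ans, len(kw & words)) for kw, ans in FAQ),
                        key=lambda pair: pair[1])
    return answer if score > 0 else FALLBACK

print(reply("How do I report a pothole?"))   # 311 answer
print(reply("Where is the bus schedule?"))   # fallback
```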
In North Charleston, S.C., the city has adopted a text-based chatbot, which goes beyond common 311-style interfaces by allowing users to report potholes or any other lapses in city services they may notice. It also allows them to ask questions, which it subsequently answers by crawling city websites and replying with relevant links, said Ryan Johnson, the city’s public relations coordinator.
North Charleston has done this by partnering with a local tech startup that has deep roots in the area’s local government. The company is called Citibot …
With Citibot, residents can report a pothole at 2 a.m., or they can get info about street signs or trash pickup sent right to their phones.
There are also more complex chatbot technologies taking hold at both the city and state levels, in Los Angeles and Mississippi, to be exact.
Mississippi’s chatbot is called Missi, and its capabilities are vast and nuanced. Residents can even use it for help submitting online payments. It’s accessible by clicking a small chat icon on the side of the website.
Back in May, Los Angeles rolled out Chip, or City Hall Internet Personality, on the Los Angeles Business Assistance Virtual Network. The chatbot aims to assist visitors by operating as a 24/7 digital assistant for visitors to the site, helping them navigate it and better understand its services by answering their inquiries. It is capable of presenting info from anywhere on the site, and it can even go so far as helping users fill out forms or set up email alerts….(More)”
Algorithmic Transparency for the Smart City
Paper by Robert Brauneis and Ellen P. Goodman: “Emerging across many disciplines are questions about algorithmic ethics – about the values embedded in artificial intelligence and big data analytics that increasingly replace human decisionmaking. Many are concerned that an algorithmic society is too opaque to be accountable for its behavior. An individual can be denied parole or denied credit, fired or not hired, for reasons she will never know and that cannot be articulated. In the public sector, the opacity of algorithmic decisionmaking is particularly problematic both because governmental decisions may be especially weighty, and because democratically-elected governments bear special duties of accountability. Investigative journalists have recently exposed the dangerous impenetrability of algorithmic processes used in the criminal justice field – dangerous because the predictions they make can be both erroneous and unfair, with none the wiser.
We set out to test the limits of transparency around governmental deployment of big data analytics, focusing our investigation on local and state government use of predictive algorithms. It is here, in local government, that algorithmically-determined decisions can be most directly impactful. And it is here that stretched agencies are most likely to hand over the analytics to private vendors, which may make design and policy choices out of the sight of the client agencies, the public, or both. To see just how impenetrable the resulting “black box” algorithms are, we filed 42 open records requests in 23 states seeking essential information about six predictive algorithm programs. We selected the most widely-used and well-reviewed programs, including those developed by for-profit companies, nonprofits, and academic/private sector partnerships. The goal was to see if, using the open records process, we could discover what policy judgments these algorithms embody, and could evaluate their utility and fairness.
To do this work, we identified what meaningful “algorithmic transparency” entails. We found that in almost every case, it wasn’t provided. Over-broad assertions of trade secrecy were a problem. But contrary to conventional wisdom, they were not the biggest obstacle. It will not usually be necessary to release the code used to execute predictive models in order to dramatically increase transparency. We conclude that publicly-deployed algorithms will be sufficiently transparent only if (1) governments generate appropriate records about their objectives for algorithmic processes and subsequent implementation and validation; (2) government contractors reveal to the public agency sufficient information about how they developed the algorithm; and (3) public agencies and courts treat trade secrecy claims as the limited exception to public disclosure that the law requires. Although it would require a multi-stakeholder process to develop best practices for record generation and disclosure, we present what we believe are eight principal types of information that such records should ideally contain….(More)”.
Smart or dumb? The real impact of India’s proposal to build 100 smart cities
In promoting this objective, the Indian government gave the example of a large development in the island city of Mumbai, Bhendi Bazaar. There, 3-5 storey housing would be replaced with towers of 40 to 60 storeys to increase density. This has come to be known as “vertical with a vengeance”.
We have obtained details of the proposed project from the developer and the municipal authorities. Using an extended urban metabolism model, which measures the impacts of the built environment, we have assessed its overall impact. We determined how the flows of materials and energy will change as a result of the redevelopment.
Our research shows that the proposal is neither smart nor sustainable.
Measuring impacts
The Indian government clearly defined what it meant by “smart”. Over half of the 11 objectives are environmental and are main components of the metabolism of a city. These include adequate water and sanitation, assured electricity, efficient transport, reduced air pollution and resource depletion, and sustainability.
We collected data from various primary and secondary sources. This included physical surveys during site visits, local government agencies, non-governmental organisations, the construction industry and research.
We then made three-dimensional models of the existing and proposed developments to establish morphological changes, including building heights, street widths, parking provision, roof areas, open space, landscaping and other aspects of built form.
Demographic changes (population density, total population) were based on census data, the developer’s calculations and an assessment of available space. Such information about the magnitude of the development and the associated population changes allowed us to analyse the additional resources required as well as the environmental impact….
Case studies such as Bhendi Bazaar provide an example of plans for increased density and urban regeneration. However, they do not offer an answer to the challenge of limited infrastructure to support the resource requirements of such developments.
The results of our research indicate significant adverse impacts on the environment. They show that the metabolism increases at a greater rate than the population grows. On this basis, this proposed development for Mumbai, or the other 99 cities, should not be called smart or sustainable.
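To make that claim concrete, the comparison amounts to checking growth factors: if each resource flow grows faster than the population, per-capita metabolism rises. Here is a sketch with invented placeholder numbers, not the study's data.

```python
# A sketch of the growth-factor comparison behind the finding that
# "the metabolism increases at a greater rate than the population grows".
# All numbers are invented placeholders, not the study's data.

before = {"population": 12_000, "water_m3_day": 1_600,
          "energy_MWh_day": 55, "waste_t_day": 6.0}
after  = {"population": 30_000, "water_m3_day": 5_200,
          "energy_MWh_day": 210, "waste_t_day": 21.0}

pop_growth = after["population"] / before["population"]
print(f"population grows {pop_growth:.1f}x")
for flow in ("water_m3_day", "energy_MWh_day", "waste_t_day"):
    growth = after[flow] / before[flow]
    verdict = "outpaces population" if growth > pop_growth else "keeps pace"
    print(f"{flow}: {growth:.1f}x ({verdict})")
```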
With policies that aim to prevent urban sprawl, cities will inevitably grow vertically. But with high-rise housing comes dependence on centralised flows of energy, water supplies and waste disposal. Dependency in turn leads to vulnerability and insecurity….(More)”.
The hidden costs of open data
Sara Friedman at GCN: “As more local governments open their data for public use, the emphasis is often on “free” — using open source tools to freely share already-created government datasets, often with pro bono help from outside groups. But according to a new report, there are unforeseen costs when it comes to pushing government datasets out on public-facing platforms — especially when geospatial data is involved.
The research, led by University of Waterloo professor Peter A. Johnson and McGill University professor Renee Sieber, was conducted as part of a Geothink.ca partnership research grant and explores the direct and indirect costs of open data.
Costs related to data collection, publishing, data sharing, maintenance and updates are increasingly driving governments to third-party providers to help with hosting, standardization and analytical tools for data inspection, the researchers found. GIS implementation also has associated costs to train staff, develop standards, create valuations for geospatial data, connect data to various user communities and get feedback on challenges.
First, open data can create a “smoke and mirrors” situation where insufficient resources are put toward deploying open data for government use. Users then experience “transaction costs” when it comes to working in specialist data formats that need additional skills, training and software to use.
Second, the level of investment and quality of open data can lead to “material benefits and social privilege” for communities that devote resources to providing more comprehensive platforms.
While there are some open source data platforms, the majority of solutions are proprietary and charged on a pro-rata basis, which can present a challenge for cities with larger, poor populations compared to smaller, wealthier cities. Issues also arise when governments try to combine their data sets, leading to increased costs to reconcile problems.
The third problem revolves around the private sector pushing for the release of data sets that can benefit their business objectives. Companies could push for the release of high-value sets, such as real-time transit data, to help with their product development goals. This can divert attention from low-value sets, such as those detailing municipal services or installations, that could have a bigger impact on residents “from a civil society perspective.”
If communities decide to release the low-value sets first, Johnson and Sieber think the focus can then be shifted to high-value sets that can help recoup the costs of developing the platforms.
Lastly, the report finds inadvertent consequences could result from tying open data resources to private-sector companies. Public-private open data partnerships could lead to infrastructure problems that prevent data from being widely shared, and help private companies in developing their bids for public services….
Johnson and Sieber encourage communities to ask the following questions before investing in open data:
- Who are the intended constituents for this open data?
- What is the purpose behind the structure for providing this data set?
- Does this data enable the intended users to meet their goals?
- How are privacy concerns addressed?
- Who sets the priorities for release and updates?…(More)”
Waste Is Information
Book by Dietmar Offenhuber: “Waste is material information. Landfills are detailed records of everyday consumption and behavior; much of what we know about the distant past we know from discarded objects unearthed by archaeologists and interpreted by historians. And yet the systems and infrastructures that process our waste often remain opaque. In this book, Dietmar Offenhuber examines waste from the perspective of information, considering emerging practices and technologies for making waste systems legible and how the resulting datasets and visualizations shape infrastructure governance. He does so by looking at three waste tracking and participatory sensing projects in Seattle, São Paulo, and Boston.
Offenhuber expands the notion of urban legibility—the idea that the city can be read like a text—to introduce the concept of infrastructure legibility. He argues that infrastructure governance is enacted through representations of the infrastructural system, and that these representations stem from the different stakeholders’ interests, which drive their efforts to make the system legible. The Trash Track project in Seattle used sensor technology to map discarded items through the waste and recycling systems; the Forager project looked at the informal organization processes of waste pickers working for Brazilian recycling cooperatives; and mobile systems designed by the city of Boston allowed residents to report such infrastructure failures as potholes and garbage spills. Through these case studies, Offenhuber outlines an emerging paradigm of infrastructure governance based on a complex negotiation among users, technology, and the city….(More)”.