“Make FOIA Work is about re-imagining journalism through design, participation and collaboration. Faculty, staff and students at Emerson College and the Engagement Lab worked alongside the Boston Institute of Nonprofit Journalism (BINJ) and MuckRock, two independent and alternative news and information platforms and publishers, to produce a data-driven and engagement-based investigative reporting series that exposes corruption around the sale of guns in Massachusetts. Through design studios in participatory methods and data visualization, project participants created a participatory guidebook for journalists, practitioners and community members on how to undertake participatory design projects with a focus on FOIA requests, community participation, and collaboration. The project also highlights the course syllabi in participatory design methods and data visualization….(More)”.
Open Urban Data and the Sustainable Development Goals
Conference Paper by Christine Meschede and Tobias Siebenlist: “Since the adoption of the United Nations’ Sustainable Development Goals (SDGs) in 2015 – an ambitious agenda to end poverty, combat environmental threats and ensure prosperity for everyone – some effort has been made toward adequately measuring progress on its targets. As the crucial point is the availability of sufficient, comparable information, open data can play a key role. The coverage of open data, i.e., data that is machine-readable, freely available and reusable for everyone, is assessed by several measurement tools. We propose the use of open governmental data to make the achievement of the SDGs easy and transparent to measure. For this purpose, a mapping of the open data categories to the SDGs is presented. Further, we argue that the SDGs need to be tackled in particular at the city level. For analyzing the current applicability of open data for measuring progress on the SDGs, we provide a small-scale case study on German open data portals and the embedded data categories and datasets. The results suggest that further standardization is needed in order to be able to use open data for comparing cities and their progress towards the SDGs….(More)”.
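The kind of mapping the paper proposes can be sketched in a few lines: given the categories a city publishes on its open data portal, look up which SDGs each category speaks to and report the coverage. The category-to-SDG assignments below are hypothetical placeholders, not the mapping presented in the paper.

```python
# Illustrative sketch only: the category-to-SDG assignments are hypothetical,
# not the mapping presented in the paper.
CATEGORY_TO_SDGS = {
    "air quality": {3, 11, 13},      # health, sustainable cities, climate action
    "public transport": {9, 11},     # infrastructure, sustainable cities
    "energy consumption": {7, 13},   # affordable energy, climate action
    "social welfare": {1, 10},       # no poverty, reduced inequalities
}

def sdg_coverage(portal_categories):
    """Return the set of SDGs touched by a portal's open data categories."""
    covered = set()
    for category in portal_categories:
        covered |= CATEGORY_TO_SDGS.get(category, set())
    return covered

# A fictional city portal publishing three of the categories above.
city_portal = ["air quality", "public transport", "social welfare"]
covered = sdg_coverage(city_portal)
print(f"SDGs covered: {sorted(covered)} ({len(covered)} of 17)")
```

Applied across many municipal portals, a coverage measure of this kind is what would make city-level progress on the SDGs comparable, which is where the paper argues further standardization is needed.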
Capacities for urban transformations governance and the case of New York City
Paper by Katharina Hölscher et al.: “The narrative of urban sustainability transformations epitomises the hope that urban governance can create the conditions to plan and govern cities in ways that contribute to local and global sustainability and resilience. So far, urban governance is not delivering: novel governance approaches are emerging in cities worldwide, yet are unable to transform conventional policymaking and planning to allow for innovative, co-beneficial and long-term solutions and actions to emerge and institutionalise. We present a capacities framework for urban transformations governance, starting from the need to fulfil distinct output functions (‘what needs to happen’) for mobilising and influencing urban transformation dynamics. The framework helps to diagnose and inform urban governance for responding to disturbances (stewarding capacity), phasing out drivers of path-dependency (unlocking capacity), creating and embedding novelties (transformative capacity) and coordinating multi-actor processes (orchestrating capacity). Our case study of climate governance in New York City exemplifies the framework’s applicability and explanatory power to identify conditions and activities facilitating transformation (governance), and to reveal gaps and barriers of these vis-à-vis the existing governance regime. Our framework thereby functions as a tool to explore what new forms of urban transformation governance are emerging, how effective these are, and how to strengthen capacities….(More)”.
Study finds that a GPS outage would cost $1 billion per day
Eric Berger at Ars Technica: “….one of the most comprehensive studies on the subject has assessed the value of this GPS technology to the US economy and examined what effect a 30-day outage would have—whether it’s due to a severe space weather event or “nefarious activity by a bad actor.” The study was sponsored by the US government’s National Institute of Standards and Technology and performed by a North Carolina-based research organization named RTI International.
Economic effect
As part of the analysis, researchers spoke to more than 200 experts in the use of GPS technology for various services, from agriculture to the positioning of offshore drilling rigs to location services for delivery drivers. (If they’d spoken to me, I’d have said the value of using GPS to navigate Los Angeles freeways and side streets was incalculable). The study covered a period from 1984, when the nascent GPS network was first opened to commercial use, through 2017. It found that GPS has generated an estimated $1.4 trillion in economic benefits during that time period.
The researchers found that the largest benefit, valued at $685.9 billion, came in the “telecommunications” category, including improved reliability and bandwidth utilization for wireless networks. Telematics (efficiency gains, cost reductions, and environmental benefits through improved vehicle dispatch and navigation) ranked as the second most valuable category at $325 billion. Location-based services on smartphones ranked third, valued at $215 billion.
Notably, the value of GPS technology to the US economy is growing. According to the study, 90 percent of the technology’s financial impact has come since 2010, a span covering just 20 percent of the study period. Some sectors of the economy are only beginning to realize the value of GPS technology, or are identifying new uses for it, the report says, indicating that its value as a platform for innovation will continue to grow.
Outage impact
In the case of some adverse event leading to a widespread outage, the study estimates that the loss of GPS service would have a $1 billion-per-day impact, although the authors acknowledge this is at best a rough estimate. It would likely be higher during the planting season of April and May, when farmers are highly reliant on GPS technology for information about their fields.
To assess the effect of an outage, the study looked at several different variables. Among them was “precision timing” that enables a number of wireless services, including the synchronization of traffic between carrier networks, wireless handoff between base stations, and billing management. Moreover, higher levels of precision timing enable higher bandwidth and provide access to more devices. (For example, the implementation of 4G LTE technology would have been impossible without GPS technology)….(More)”
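A rough back-of-envelope relating the figures quoted above, assuming the post-2010 share of benefits accrued evenly from 2010 to 2017 (an assumption of this sketch, not the study's): the average daily benefit since 2010 works out to a bit under half a billion dollars, which puts the $1 billion-per-day outage estimate in context.

```python
# Back-of-envelope only, using the figures quoted above; assumes the
# post-2010 share of benefits accrued evenly over 2010-2017.
total_benefit = 1.4e12        # $1.4 trillion in benefits, 1984-2017
share_since_2010 = 0.90       # 90 percent of the impact came after 2010
days_2010_2017 = 8 * 365      # roughly eight years

avg_daily_benefit = total_benefit * share_since_2010 / days_2010_2017
outage_cost_per_day = 1e9     # study's rough per-day estimate for a 30-day outage

print(f"Average daily benefit since 2010: ~${avg_daily_benefit / 1e9:.2f} billion")
print(f"Estimated 30-day outage cost: ~${30 * outage_cost_per_day / 1e9:.0f} billion")
```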
Of Governance and Revenue: Participatory Institutions and Tax Compliance in Brazil
Paper by Michael Touchton, Brian Wampler and Tiago C. Peixoto: “Traditionally, governments seek to mobilize tax revenues by expanding their enforcement of existing tax regimes and facilitating tax payments. However, enforcement and facilitation can be costly and produce diminishing marginal returns if citizens are unwilling to pay their taxes. This paper addresses gaps in knowledge about tax compliance by asking a basic question: what explains why citizens and businesses comply with tax rules? To answer this question, the paper shows how the voluntary adoption of two different types of participatory governance institutions influences municipal tax collection in Brazil. Municipalities that voluntarily adopt participatory institutions collect significantly higher levels of taxes than similar municipalities without these institutions. The paper provides evidence that moves scholarship on tax compliance beyond enforcement and facilitation paradigms, while offering a better assessment of the role of local democratic institutions for government performance and tax compliance….(More)”.
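The comparison at the heart of the abstract, adopters of participatory institutions versus otherwise similar municipalities, can be illustrated with a toy sketch on synthetic data. The variables, the matching on population alone, and the effect size below are assumptions made for illustration; they are not the paper's data or estimation strategy.

```python
# Toy illustration of an adopter vs. matched non-adopter comparison on
# synthetic data; not the paper's data or estimation strategy.
import random
from statistics import mean

random.seed(0)

def make_municipality(adopted):
    population = random.randint(10_000, 500_000)
    base = 120 + (25 if adopted else 0)   # assumed adopter premium, per capita
    return {"population": population, "adopted": adopted,
            "tax_per_capita": random.gauss(base, 15)}

municipalities = [make_municipality(adopted=(i % 3 == 0)) for i in range(300)]
adopters = [m for m in municipalities if m["adopted"]]
non_adopters = [m for m in municipalities if not m["adopted"]]

# Crude "similar municipality": the non-adopter closest in population.
matched = [min(non_adopters, key=lambda m: abs(m["population"] - a["population"]))
           for a in adopters]

gap = (mean(a["tax_per_capita"] for a in adopters)
       - mean(m["tax_per_capita"] for m in matched))
print(f"Adopters collect about {gap:.1f} more per capita than matched peers")
```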
Information Sharing as a Dimension of Smartness: Understanding Benefits and Challenges in Two Megacities
Paper by J. Ramon Gil-Garcia, Theresa A. Pardo, and Manuel De Tuya: “Cities around the world are facing increasingly complex problems that frequently require collaboration and information sharing across agency boundaries. In our view, information sharing can be seen as an important dimension of what is recently being called smartness in cities, one that enables improvements in decision making and day-to-day operations in urban settings. Unfortunately, what many city managers are learning is that there are important challenges to sharing information both within their city and with others.

Based on nonemergency service integration initiatives in New York City and Mexico City, this article examines important benefits from and challenges to information sharing in the context of what the participants characterize as smart city initiatives, particularly in large metropolitan areas. The research question guiding this study is: To what extent do previous findings about information sharing hold in the context of city initiatives, particularly in megacities? The results provide evidence on the importance of specific characteristics of cities and megalopolises and how they affect the benefits and challenges of information sharing. For instance, cities seem to have more managerial flexibility than other jurisdictions such as state governments. In addition, megalopolises have most of the technical skills and financial resources needed for information sharing, so these challenges are not as relevant as in other local governments….(More)”.
How Organizations with Data and Technology Skills Can Play a Critical Role in the 2020 Census
Blog Post by Kathryn L.S. Pettit and Olivia Arena: “The 2020 Census is less than a year away, and it’s facing new challenges that could result in an inaccurate count. The proposed inclusion of a citizenship question, the lack of comprehensive and unified messaging, and the new internet-response option could worsen the undercount of vulnerable and marginalized communities and deprive these groups of critical resources.
The US Census Bureau aims to count every US resident. But some groups are more likely to be missed than others. Communities of color, immigrants, young children, renters, people experiencing homelessness, and people living in rural areas have long been undercounted in the census. Because the census count is used to apportion federal funding and draw legislative districts for political seats, an inaccurate count means that these populations receive less than their fair share of resources and representation.
Local governments and community-based organizations have begun forming Complete Count Committees, coalitions of trusted community voices established to encourage census responses, to achieve a more accurate count in 2020. Local organizations with data and technology skills—like civic tech groups, libraries, technology training organizations, and data intermediaries—can harness their expertise to help these coalitions achieve a complete count.
As the coordinator of the National Neighborhood Indicators Partnership (NNIP), we are learning about 2020 Census mobilization in communities across the country. We have found that data and technology groups are natural partners in this work; they understand what is at risk in 2020, are embedded in communities as trusted data providers, and can amplify the importance of the census.
Threats to a complete count
The proposed citizenship question, currently being challenged in court, would likely suppress the count of immigrants and households in immigrant communities in the US. Though federal law prohibits the Census Bureau from disclosing individual-level data, even to other agencies, people may still be skeptical about the confidentiality of the data or generally distrust the government. Acknowledging these fears is important for organizations partnering in outreach to vulnerable communities.
Another potential hurdle is that, for the first time, the Census Bureau will encourage people to complete their census forms online (though answering by mail or phone will still be options). Though a high-tech census could be more cost-effective, the digital divide, compounded by underfunding of the Census Bureau that limited initial testing of new methods and outreach, could worsen the undercount….(More)”.
Applying crowdsourcing techniques in urban planning: A bibliometric analysis of research and practice prospects
Paper by Pinchao Liao et al. in Cities: “Urban planning requires more public involvement and larger group participation to achieve scientific and democratic decision making. Crowdsourcing is a novel approach to gathering information, encouraging innovation and facilitating group decision-making. Unfortunately, although previous research has theoretically explored the utility of crowdsourcing applied to urban planning, real-world applications and empirical studies using practical data remain rare. This study aims to identify the prospects for implementing crowdsourcing in urban planning through a bibliometric analysis of current research.
First, a database and keyword lists based on peer-reviewed journal articles were developed. Second, semantic analysis was applied to quantify co-occurrence frequencies of terms in the articles based on the keyword lists, and a semantic network was built from these frequencies. Then, cluster analysis was conducted to identify major and correlated research topics, and bursting key terms were analyzed and explained chronologically. Lastly, future research and practical trends were discussed.
The major contribution of this study is identifying crowdsourcing as a novel urban planning method, which can strengthen government capacities by involving public participation, i.e., turning governments into task givers. Regarding future patterns, the application of crowdsourcing in urban planning is expected to expand to transportation, public health and environmental issues. It is also indicated that the use of crowdsourcing requires governments to adjust urban planning mechanisms….(More)”.
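The pipeline the abstract describes, counting keyword co-occurrences, building a semantic network and clustering it into topics, can be sketched roughly as below. The keyword lists are hypothetical stand-ins for the paper's database, and greedy modularity clustering is just one plausible choice of clustering method, not necessarily the one the authors used.

```python
# Illustrative sketch of a co-occurrence network and topic clustering;
# the keyword lists are hypothetical, not the paper's database.
from itertools import combinations
from collections import Counter

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Each entry stands in for the keyword list of one peer-reviewed article.
article_keywords = [
    ["crowdsourcing", "urban planning", "public participation"],
    ["crowdsourcing", "transportation", "GIS"],
    ["urban planning", "public participation", "e-government"],
    ["crowdsourcing", "public health", "urban planning"],
]

# Count how often each pair of keywords appears in the same article.
pair_counts = Counter()
for keywords in article_keywords:
    for a, b in combinations(sorted(set(keywords)), 2):
        pair_counts[(a, b)] += 1

# Build the weighted co-occurrence (semantic) network.
G = nx.Graph()
for (a, b), weight in pair_counts.items():
    G.add_edge(a, b, weight=weight)

# Cluster the network into candidate research topics.
for i, topic in enumerate(greedy_modularity_communities(G, weight="weight"), 1):
    print(f"Topic {i}: {sorted(topic)}")
```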
How to use data for good — 5 priorities and a roadmap
Stefaan Verhulst at apolitical: “…While the overarching message emerging from these case studies was promising, several barriers were identified that, if not addressed systematically, could undermine the potential of data science to address critical public needs and limit the opportunity to scale the practice more broadly.
Below, we summarise the five priorities for the field moving forward that emerged from the workshop.
1. Become People-Centric
Much of the data currently used for drawing insights involves or is generated by people.
These insights have the potential to impact people’s lives in many positive and negative ways. Yet, the people and the communities represented in this data are largely absent when practitioners design and develop data for social good initiatives.
To ensure data is a force for positive social transformation (i.e., that it addresses real people’s needs and impacts lives in a beneficial way), we need to experiment with new ways to engage people at the design, implementation, and review stages of data initiatives beyond simply asking for their consent.
As we explain in our People-Led Innovation methodology, different segments of people can play multiple roles ranging from co-creation to commenting, reviewing and providing additional datasets.
The key is to ensure their needs are front and center, and that data science for social good initiatives seek to address questions related to real problems that matter to society-at-large (a key concern that led The GovLab to instigate the 100 Questions Initiative).
2. Establish Data About the Use of Data (for Social Good)
Many data for social good initiatives remain fledgling.
As currently designed, the field often struggles with translating sound data projects into positive change. As a result, many potential stakeholders—private sector and government “owners” of data as well as public beneficiaries—remain unsure about the value of using data for social good, especially against the background of high risks and transaction costs.
The field needs to overcome such limitations if data insights and their benefits are to spread. For that, we need hard evidence about data’s positive impact. Ironically, the field is held back by an absence of good data on the use of data—a lack of reliable empirical evidence that could guide new initiatives.
The field needs to prioritise developing a far more solid evidence base and “business case” to move data for social good from a good idea to reality.
3. Develop End-to-End Data Initiatives
Too often, data for social good initiatives focus on the “data-to-knowledge” pipeline without addressing how to move “knowledge into action.”
As such, the impact remains limited and many efforts never reach an audience that can actually act upon the insights generated. Unless we become more sophisticated in our efforts to provide end-to-end projects and take “data from knowledge to action,” the positive impact of data will be limited….
4. Invest in Common Trust and Data Steward Mechanisms
For data for social good initiatives (including data collaboratives) to flourish and scale, there must be substantial trust between all parties involved, and amongst the public-at-large.
Establishing such a platform of trust requires each actor to invest in developing essential trust mechanisms such as data governance structures, contracts, and dispute resolution methods. Today, designing and establishing these mechanisms take tremendous time, energy, and expertise. These high transaction costs result from the lack of common templates and the need to design governance structures from scratch each time…
5. Build Bridges Across Cultures
As C.P. Snow famously described in his lecture on “The Two Cultures and the Scientific Revolution,” we must bridge the “two cultures” of science and humanism if we are to solve the world’s problems….
To implement these five priorities we will need experimentation at the operational but also the institutional level. This involves the establishment of “data stewards” within organisations that can accelerate data for social good initiatives in a responsible manner, integrating the five priorities above….(More)”
The Right to the Datafied City: Interfacing the Urban Data Commons
Chapter by Michiel de Lange in The Right to the Smart City: “The current datafication of cities raises questions about what Lefebvre and many after him have called “the right to the city.” In this contribution, I investigate how the use of data for civic purposes may strengthen the “right to the datafied city,” that is, the degree to which different people engage and participate in shaping urban life and culture, and experience a sense of ownership. The notion of the commons acts as the prism to see how data may serve to foster this participatory “smart citizenship” around collective issues. This contribution critically engages with recent attempts to theorize the city as a commons. Instead of seeing the city as a whole as a commons, it proposes a more fine-grained perspective of the “commons-as-interface.” The “commons-as-interface,” it is argued, productively connects urban data to the human-level political agency implied by “the right to the city” through processes of translation and collectivization. The term is applied to three short case studies, to analyze how these processes engender a “right to the datafied city.” The contribution ends by considering the connections between two seemingly opposed discourses about the role of data in the smart city – the cybernetic view versus a humanist view. It is suggested that the commons-as-interface allows for more detailed investigations of mediation processes between data, human actors, and urban issues….(More)”.