Paper by Jennifer Larson et al. for Political Networks Workshops & Conference 2016: “Pinning down the role of social ties in the decision to protest has been notoriously elusive, largely due to data limitations. The era of social media and its global use by protesters offers an unprecedented opportunity to observe real-time social ties and online behavior, though often without an attendant measure of real-world behavior. We collect data on Twitter activity during the 2015 Charlie Hebdo protests in Paris which, unusually, record both real-world protest attendance and high-resolution network structure. We specify a theory of participation in which an individual’s decision depends on her exposure to others’ intentions, and network position determines exposure. Our findings are strong and consistent with this theory, showing that, relative to comparable Twitter users, protesters are significantly more connected to one another via direct, indirect, triadic, and reciprocated ties. These results offer the first large-scale empirical support for the claim that social network structure influences protest participation….(More)”
The Racist Algorithm?
Anupam Chander in the Michigan Law Review (2017 Forthcoming): “Are we on the verge of an apartheid by algorithm? Will the age of big data lead to decisions that unfairly favor one race over others, or men over women? At the dawn of the Information Age, legal scholars are sounding warnings about the ubiquity of automated algorithms that increasingly govern our lives. In his new book, The Black Box Society: The Hidden Algorithms Behind Money and Information, Frank Pasquale forcefully argues that human beings are increasingly relying on computerized algorithms that make decisions about what information we receive, how much we can borrow, where we go for dinner, or even whom we date. Pasquale’s central claim is that these algorithms will mask invidious discrimination, undermining democracy and worsening inequality. In this review, I rebut this prominent claim. I argue that any fair assessment of algorithms must be made against their alternative. Algorithms are certainly obscure and mysterious, but often no more so than the committees or individuals they replace. The ultimate black box is the human mind. Relying on contemporary theories of unconscious discrimination, I show that the consciously racist or sexist algorithm is less likely than the consciously or unconsciously racist or sexist human decision-maker it replaces. The principal problem of algorithmic discrimination lies elsewhere, in a process I label viral discrimination: algorithms trained or operated on a world pervaded by discriminatory effects are likely to reproduce that discrimination.
I argue that the solution to this problem lies in a kind of algorithmic affirmative action. This would require training algorithms on data that includes diverse communities and continually assessing the results for disparate impacts. Instead of insisting on race or gender neutrality and blindness, this would require decision-makers to approach algorithmic design and assessment in a race and gender conscious manner….(More)“
What Can Civic Tech Learn From Social Movements?
Stacy Donohue at Omidyar Network: “…In order to spur creative thinking about how the civic tech sector could be accelerated and expanded, we looked to Purpose, a public benefit corporation that works with NGOs, philanthropies, and brands on movement building strategies. We wanted to explore what we might learn from taking the work that Purpose has done mapping the progress of 21st century social movements and applying its methodology to civic tech.
So why consider viewing civic tech using the lens of 21st century movements? Movements are engines of change in society that enable citizens to create new and better paths to engage with government and to seek recourse on issues that matter to millions of people. At first glance, civic tech doesn’t appear to be a movement in the purest sense of the term, but on closer inspection, it does share some fundamental characteristics. Like a movement, civic tech is mission driven, is focused on making change that benefits the public, and in most cases enables better public input into decision making.
We believe that better understanding the essential components of movements, and observing the ways in which civic tech does or does not behave like one, can yield insights on how we as a civic tech community can collectively drive the sector forward….
The report Engines of Change: What Civic Tech Can Learn From Social Movements… provides a great deal of rich insight and detail, which we invite everyone to explore. Meanwhile, we have summarized five key findings:
- Grassroots activity is expanding across the US – Activity is no longer centralized around San Francisco and New York; it’s rapidly growing and spreading across the US – in fact, there was an 81% increase in the number of cities hosting civic tech MeetUps from 2013 to 2015, and 45 of 50 states had at least one MeetUp on civic tech in 2015.
- Talk is turning to action – We are walking the talk. One way we can see this is that growth in civic tech Twitter discussion is highly correlated with growth in GitHub contributions to civic tech projects and related MeetUp events. Between 2013 and 2015, over 8,500 people contributed code to civic tech projects on GitHub and there were over 76,000 civic tech MeetUp events.
- There is an engaged core, but it is very small in number – As with most social movements, civic tech has a definite core of highly engaged evangelists, advocates, and entrepreneurs driving conversations, activity, and events, and this core is growing. The number of MeetUp groups holding multiple events a quarter grew by 136% between 2013 and 2015, and likewise there was 60% growth in Engaged Tweeters during this time period. However, this level of activity is dwarfed by other movements such as climate action.
- Civic tech is growing but still lacking scale – There are many positive indications of growth in civic tech; for example, the combination of nonprofit and for-profit funding to the sector increased by almost 120% over the period. But while growth compares favorably to other movements, again the scale just isn’t there.
- Common themes, but no shared vision or identity – Purpose examined the extent to which civic tech exhibits and articulates a shared vision or identity around which members of a movement can rally. It found that relatively few people are discussing the same shared set of themes. Two themes – Open Data and Government Transparency – are resonating and gaining traction across the sector and could therefore form the basis of a common identity for civic tech.
While each of these insights is important in its own right and requires action to move the sector forward, the main thing that strikes us is the need for a coherent and clearly articulated vision and sense of shared identity for civic tech…
Read the full report: Engines of Change: What Civic Tech Can Learn From Social Movements
City of Copenhagen launches data marketplace
Sarah Wray at TMForum: “The City of Copenhagen has launched its City Data Exchange to make public and private data accessible to power innovation.
The City Data Exchange is a new service to create a ‘marketplace for data’ from public and private data providers and allow monetization. The platform has been developed by Hitachi Insight Group.
“Data is the fuel powering our digital world, but in most cities it is unused,” said Hans Lindeman, Senior Vice President, Hitachi Insight Group, EMEA. “Even where data sits in public, freely accessible databases, the cost of extracting and processing it can easily outweigh the benefits.”
The City of Copenhagen is using guidelines for a data format that is safe and secure, ensures privacy, and makes data easy to use. The City Data Exchange will only accept data that has been fully anonymized by the data supplier, for example.
According to Hitachi Insight Group, “All of this spares organizations the trouble and cost of extracting and processing data from multiple sources. At the same time, proprietary data can now become a business resource that can be monetized outside an organization.”
To demonstrate how data from the City Data Exchange could be used, Hitachi Insight Group is developing two applications:
- Journey Insight, which helps citizens in the region to track their transportation usage over time and understand the carbon footprint of their travel
- Energy Insight, which allows both households and businesses to see how much energy they use.
Both are set for public launch later this year.
Another example of how data marketplaces can enable innovation is the Mind My Business mobile app, developed by Vizalytics. It brings together all the data that can affect a retailer — from real-time information on how construction or traffic issues can hurt the footfall of a business, to timely reminders about taxes to pay or new regulations to meet. The “survival app for shopkeepers” makes full use of all the relevant data sources brought together by the City Data Exchange.
The platform will offer data in different categories such as: city life, infrastructure, climate and environment, business data and economy, demographics, housing and buildings, and utilities usage. It aims to meet the needs of local government, city planners, architects, retailers, telecoms networks, utilities, and all other companies and organizations who want to understand what makes Copenhagen, its businesses and its citizens tick.
“Smart cities need smart insights, and that’s only possible if everybody has all the facts at their disposal. The City Data Exchange makes that possible; it’s the solution that will help us all to create better public spaces and — for companies in Copenhagen — to offer better services and create jobs,” said Frank Jensen, the Lord Mayor of Copenhagen.
The City Data Exchange is currently offering raw data to its customers, and later this year will add analytical tools. The cost of gathering and processing the data will be recovered through subscription and service fees, which are expected to be much lower than the cost any company or city would face in performing the work of extracting, collecting and integrating the data by themselves….(More)”
Evolving the IRB: Building Robust Review for Industry Research
Molly Jackman & Lauri Kanerva at Wash. & Lee L. Rev. Online: “Increasingly, companies are conducting research so that they can make informed decisions about what products to build and what features to change. These data-driven insights enable companies to make responsible decisions that will improve people’s experiences with their products. Importantly, companies must also be responsible in how they conduct research. Existing ethical guidelines for research do not always robustly address the considerations that industry researchers face. For this reason, companies should develop principles and practices around research that are appropriate to the environments in which they operate, taking into account the values set out in law and ethics. This paper describes the research review process designed and implemented at Facebook, including the training employees receive, and the steps involved in evaluating proposed research. We emphasize that there is no one-size-fits-all model of research review that can be applied across companies, and that processes should be designed to fit the contexts in which the research is taking place. However, we hope that general principles can be extracted from Facebook’s process that will inform other companies as they develop frameworks for research review that serve their needs….(More)”.
Are we too obsessed with data?
Lauren Woodman of Nethope: “Data: Everyone’s talking about it, everyone wants more of it….
Still, I’d posit that we’re too obsessed with data. Not just us in the humanitarian space, of course, but everyone. How many likes did that Facebook post get? How many airline miles did I fly last year? How many hours of sleep did I get last week?…
The problem is that data by itself isn’t that helpful: information is.
We need to develop a new obsession: making sure that data is actionable, that it is relevant to the context in which we work, and that we use data as effectively as we collect it.
In my talk at ICT4D, I referenced the example of 7-Eleven in Japan. In the 1970s, 7-Eleven in Japan became independent from its parent, Southland Corporation. The CEO had to build a viable business in a tough economy. Every month, each store manager would receive reams of data, but it wasn’t effective until the CEO stripped out the noise and provided just four critical data points that had the greatest relevance to drive the local purchasing that each store was empowered to do on their own.
Those points – what sold the day before, what sold the same day a year ago, what sold the last time the weather was the same, and what other stores sold the day before – were transformative. Within a year, 7-Eleven had turned a corner, and for 30 years, remained the most profitable retailer in Japan. It wasn’t about the Big Data; it was figuring out what data was relevant, actionable and empowered local managers to make nimble decisions.
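The anecdote is as much about computation as curation: once the right questions are chosen, the four signals can be pulled from raw sales records in a few lines. Below is a minimal sketch in Python, with entirely hypothetical stores, dates, weather labels, and sales figures:

```python
from datetime import date, timedelta

# Hypothetical daily sales for one product line: (store, day, weather, units)
records = [
    ("store_a", date(2015, 6, 1),  "rain",  40),  # yesterday
    ("store_a", date(2014, 6, 2),  "sunny", 35),  # same day a year ago
    ("store_a", date(2015, 5, 30), "sunny", 50),  # last sunny day
    ("store_b", date(2015, 6, 1),  "rain",  60),  # another store, yesterday
]

def four_signals(records, store, today, weather_today):
    """The four numbers a store manager would see, per the anecdote."""
    sold = {(s, d): u for s, d, w, u in records}
    yesterday = today - timedelta(days=1)
    # 1. What sold at this store yesterday
    sold_yesterday = sold.get((store, yesterday))
    # 2. What sold on the same day a year ago
    sold_last_year = sold.get((store, today.replace(year=today.year - 1)))
    # 3. What sold the last time the weather was the same as today
    same_weather = sorted(d for s, d, w, _ in records
                          if s == store and w == weather_today and d < today)
    sold_same_weather = sold[(store, same_weather[-1])] if same_weather else None
    # 4. What other stores sold yesterday (summed)
    sold_elsewhere = sum(u for s, d, _, u in records
                         if s != store and d == yesterday)
    return sold_yesterday, sold_last_year, sold_same_weather, sold_elsewhere

signals = four_signals(records, "store_a", date(2015, 6, 2), "sunny")
# signals == (40, 35, 50, 60)
```

The point of the sketch is that the transformation from data to information is cheap once the relevant questions are fixed; the hard work is choosing which four questions to ask.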
For our sector to get there, we need to do the front-end work that transforms our data into information that we can use. That, after all, is where the magic happens.
A few examples provide more clarity as to why this is so critical.
We know that adaptive decision-making requires access to real-time data. By knowing what is happening in real time, or near-real time, we can adjust our approaches and interventions to be most impactful. But to do so, our data has to be accessible to those who are empowered to make decisions. To achieve that, we have to make investments in training, infrastructure, and capacity-building at the organizational level. But in the nonprofit sector, such investments are rarely supported by donors and fall beyond the limited unrestricted funding available to most organizations. As a result, the sector has so far been able to take only limited steps towards effective data usage, hampering our ability to transform the massive amounts of data we have into useful information.
Another big question about data, particularly in the humanitarian space, is whether it should be open, closed or somewhere in between. Privacy is certainly paramount, and for some types of data the need for close protection is very clear. For many other types of data, however, the rules are far less clear. Every country has its own rules about how data can and cannot be used or shared, and more work is needed to provide clarity and predictability so that appropriate data-sharing can evolve.
And perhaps more importantly, we need to think about not just the data, but the use cases. Most of us would agree, for example, that sharing information during a crisis situation can be hugely beneficial to the people and the communities we serve – but in a world where rules are unclear, that ambiguity limits what we can do with the data we have. Here again, the context in which data will be used is critically important.
Finally, all of us in the sector have to realize that the journey to transforming data into information is one we’re on together. We have to be willing to give and take. Having data is great; sharing information is better. Sometimes, we have to co-create that basis to ensure we all benefit….(More)”
Revealing Cultural Ecosystem Services through Instagram Images
Paper by Paulina Guerrero, Maja Steen Møller, Anton Stahl Olafsson, and Bernhard Snizek on “The Potential of Social Media Volunteered Geographic Information for Urban Green Infrastructure Planning and Governance”: “With the prevalence of smartphones, new ways of engaging citizens and stakeholders in urban planning and governance are emerging. The technologies in smartphones allow citizens to act as sensors of their environment, producing and sharing rich spatial data useful for new types of collaborative governance set-ups. Data derived from Volunteered Geographic Information (VGI) can support accessible, transparent, democratic, inclusive, and locally-based governance situations of interest to planners, citizens, politicians, and scientists. However, there are still uncertainties about how to actually conduct this in practice. This study explores how social media VGI can be used to document spatial tendencies regarding citizens’ uses and perceptions of urban nature with relevance for urban green space governance. Via the hashtag #sharingcph, created by the City of Copenhagen in 2014, VGI data consisting of geo-referenced images were collected from Instagram, categorised according to their content and analysed according to their spatial distribution patterns. The results show specific spatial distributions of the images and main hotspots. Many possibilities and much potential of using VGI for generating, sharing, visualising and communicating knowledge about citizens’ spatial uses and preferences exist, but as a tool to support scientific and democratic interaction, VGI data is challenged by practical, technical and ethical concerns. More research is needed in order to better understand the usefulness and application of this rich data source to governance….(More)”
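The hotspot mapping described in the paper can be approximated with a simple grid-binning pass over geotagged posts. Below is a minimal sketch using invented Copenhagen coordinates; the real study also categorised image content, which this omits:

```python
from collections import Counter

# Hypothetical geo-referenced Instagram posts: (latitude, longitude)
posts = [
    (55.6761, 12.5683), (55.6760, 12.5689), (55.6763, 12.5681),  # one cluster
    (55.6867, 12.6078),                                          # an outlier
]

def hotspots(points, cell_deg=0.005):
    """Snap each point to a lat/lon grid cell and rank cells by post count."""
    cells = Counter(
        (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)
        for lat, lon in points
    )
    return cells.most_common()

ranked = hotspots(posts)
top_cell, count = ranked[0]  # the cluster of three posts wins
```

At city scale one would likely rank cells by unique contributors rather than raw post counts, since a single prolific account can otherwise manufacture a hotspot, a version of the practical and ethical concerns the authors raise.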
Leveraging ‘big data’ analytics in the public sector
Pandula Gamage in Public Money & Management: “This article examines the opportunities presented by effectively harnessing big data in the public sector context. The article is exploratory and reviews both academic- and practitioner-oriented literature related to big data developments. The findings suggest that big data will have an impact on the future role of public sector organizations in functional areas. However, the author also reveals that there are challenges to be addressed by governments in adopting big data applications. To realize the benefits of big data, policy-makers need to: invest in research; create incentives for private and public sector entities to share data; and set up programmes to develop appropriate skills….(More)”
Is artificial intelligence key to dengue prevention?
BreakDengue: “Dengue fever outbreaks are increasing in both frequency and magnitude. Not only that, the number of countries that could potentially be affected by the disease is growing all the time.
This growth has led to renewed efforts to address the disease, and a pioneering Malaysian researcher was recently recognized for his efforts to harness the power of big data and artificial intelligence to accurately predict dengue outbreaks.
Dr. Dhesi Baha Raja received the Pistoia Alliance Life Science Award at King’s College London in April of this year, for developing a disease prediction platform that employs technology and data to give people prior warning of when disease outbreaks occur. The medical doctor and epidemiologist has spent years working to develop AIME (Artificial Intelligence in Medical Epidemiology)…
it relies on a complex algorithm, which analyses a wide range of data collected by local government and also satellite image recognition systems. Over 20 variables such as weather, wind speed, wind direction, thunderstorm, solar radiation and rainfall schedule are included and analyzed. Population models and geographical terrain are also included. The ultimate result of this intersection between epidemiology, public health and technology is a map, which clearly illustrates the probability and location of the next dengue outbreak.
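AIME’s actual algorithm is not public, so the intersection of variables described above can only be illustrated schematically. In the toy logistic risk score below, every variable name, weight, and threshold is invented for the sketch:

```python
import math

# Illustrative weekly observations for one 400m grid cell (invented values).
observation = {
    "rainfall_mm": 120.0,
    "mean_temp_c": 29.5,
    "humidity_pct": 85.0,
    "cases_last_week": 4,
}

# Made-up coefficients standing in for a trained model's parameters.
WEIGHTS = {
    "rainfall_mm": 0.012,
    "mean_temp_c": 0.08,
    "humidity_pct": 0.02,
    "cases_last_week": 0.3,
}
BIAS = -6.0

def outbreak_probability(obs):
    """Logistic combination of environmental and epidemiological inputs."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in obs.items())
    return 1.0 / (1.0 + math.exp(-z))

risk = outbreak_probability(observation)
flagged = risk > 0.5  # cells above a threshold would be coloured on the risk map
```

A real system would learn such weights from historical outbreak data and validate them out of sample rather than fixing them by hand; the sketch only shows how many heterogeneous inputs collapse into a single per-cell probability for the map.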
The ground-breaking platform can predict dengue fever outbreaks up to two or three months in advance, with an accuracy approaching 88.7 per cent and within a 400m radius. Dr. Dhesi has just returned from Rio de Janeiro, where the platform was employed in a bid to fight dengue in advance of this summer’s Olympics. In Brazil, its perceived accuracy was around 84 per cent, whereas in Malaysia it was over 88 per cent – giving it an average accuracy of 86.37 per cent.
The web-based application has been tested in two states within Malaysia, Kuala Lumpur, and Selangor, and the first ever mobile app is due to be deployed across Malaysia soon. Once its capability is adequately tested there, it will be rolled out globally. Dr. Dhesi’s team are working closely with mobile digital service provider Webe on this.
By making the app free to download, this will ensure the service becomes accessible to all, Dr Dhesi explains.
“With the web-based application, this could only be used by public health officials and agencies. We recognized the need for us to democratize this health service to the community, and the only way to do this is to provide the community with the mobile app.”
This will also enable the gathering of even greater knowledge on the possibility of dengue outbreaks in high-risk areas, as well as monitoring the changing risks as people move to different areas, he adds….(More)”
Code and the City
Code and the City explores the extent and depth of the ways in which software mediates how people work, consume, communicate, travel and play. The reach of these systems is set to become even more pervasive through efforts to create smart cities: cities that employ ICTs to underpin and drive their economy and governance. Yet, despite the roll-out of software-enabled systems across all aspects of city life, the relationship between code and the city has barely been explored from a critical social science perspective. This collection of essays seeks to fill that gap, and offers an interdisciplinary examination of the relationship between software and contemporary urbanism.
This book will be of interest to those researching or studying smart cities and urban infrastructure….(More)”.