Civic hacking as data activism and advocacy: A history from publicity to open government data


Andrew R Schrock in New Media and Society: “The civic hacker tends to be described as anachronistic, an ineffective “white hat” compared to more overtly activist cousins. By contrast, I argue that civic hackers’ politics emerged from a distinct historical milieu and include potentially powerful modes of political participation. The progressive roots of civic data hacking can be found in early 20th-century notions of “publicity” and the right to information movement. Successive waves of activists saw the Internet as a tool for transparency. The framing of openness shifted in meaning from information to data, weakening mechanisms of accountability even as it opened up new forms of political participation. Drawing on a year of interviews and participant observation, I suggest civic data hacking can be framed as a form of data activism and advocacy: requesting, digesting, contributing to, modeling, and contesting data. I conclude civic hackers are utopian realists involved in the crafting of algorithmic power and discussing ethics of technology design. They may be misunderstood because open data remediates previous forms of openness. In the process, civic hackers transgress established boundaries of political participation….(More)”

Hoaxmap: Debunking false rumours about refugee ‘crimes’


Teo Kermeliotis at Al Jazeera: “Back in the summer of 2015, at the height of the ongoing refugee crisis, Karolin Schwarz started noticing a disturbing pattern.

Just as refugee arrivals in her town of Leipzig, eastern Germany, began to rise, so did the frequency of rumours over supposed crimes committed by those men, women and children who had fled war and hardship to reach Europe.

As months passed by, the allegations became even more common, increasingly popping up in social media feeds and often reproduced by mainstream news outlets.

The online map featured some 240 incidents in its first week [Source: Hoaxmap/Al Jazeera]

 

“The stories seemed to be [orchestrated] by far-right parties and organisations and I wanted to try to find some way to help organise this – maybe find patterns and give people a tool to look up these stories [when] they were being confronted with new ones.”

And so she did.

Along with 35-year-old developer Lutz Helm, Schwarz last week launched Hoaxmap, an online platform that allows people to separate fact from fiction by debunking false rumours about supposed crimes committed by refugees.

Using an interactive system of popping dots, the map documents and categorises where those “crimes” allegedly took place. It then counters that false information with official statements from the police and local authorities, as well as news reports in which the allegations have been disproved. The debunked cases marked on the map range from thefts and assaults to manslaughter – but one of the most common topics is rape, Schwarz said….(More)”

Data Could Help Scholars Persuade, If Only They Were Willing to Use It


Paul Basken at the Chronicle of Higher Education: “Thanks to what they’ve learned from university research, consultants like Matthew Kalmans have become experts in modern political persuasion. A co-founder of Applecart, a New York data firm, Mr. Kalmans specializes in shaping societal attitudes by using advanced analytical techniques to discover and exploit personal connections and friendships. His is one of a fast-growing collection of similar companies now raising millions of dollars, fattening businesses, and aiding political campaigns with computerized records of Facebook exchanges, high-school yearbooks, even neighborhood gossip.

Applecart uses that data to try to persuade people on a range of topics by finding voices they trust to deliver endorsements. “You can use this sort of technology to get people to purchase insurance at higher rates, get people to purchase a product, get people to do all sorts of other things that they might otherwise not be inclined to do,” said Mr. Kalmans, a 2014 graduate of the University of Pennsylvania. And in building such a valuable service, he’s found that the intellectual underpinnings are often free. “We are constantly reading academic papers to get ideas on how to do things better,” Mr. Kalmans said. That’s because scholars conduct the field experiments and subsequent tests that Mr. Kalmans needs to build and refine his models. “They do a lot of the infrastructural work that, frankly, a lot of commercial companies don’t have the in-house expertise to do,” he said of university researchers. Yet the story of Applecart stands in contrast to the dominant attitude and approach among university researchers themselves. Universities are full of researchers who intensively study major global problems such as environmental destruction and societal violence, then stop short when their conclusions point to the need for significant change in public behavior.

Some in academe consider that boundary a matter of principle rather than a systematic failure or oversight. “The one thing that we have to do is not be political,” Michael M. Crow, the usually paradigm-breaking president of Arizona State University, said this summer at a conference on academic engagement in public discourse. “Politics is a process that we are informing. We don’t have to be political to inform politicians or political actors.” But other academics contemplate that stance and see a missed opportunity to help convert the millions of taxpayer dollars spent on research into meaningful societal benefit. They include Dan M. Kahan, a professor of law and of psychology at Yale University who has been trying to help Florida officials cope with climate change. Mr. Kahan works with the four-county Southeast Florida Regional Climate Change Compact, which wants to redesign roads, expand public transit, and build pumping stations to prepare for harsher weather.

But Mr. Kahan says he and his Florida partners have had trouble getting enough policy makers to seriously consider the scale of the problem and the necessary solutions. It’s frustrating, Mr. Kahan said, to see so much university research devoted to work inside laboratories on problems like climate, and comparatively little spent on real-world needs such as sophisticated messaging strategies. “There really is a kind of deficit in the research relating to actually operationalizing the kinds of insights that people have developed from research,” he said.

That deficit appears to stem from academic culture, said Utpal M. Dholakia, a professor of marketing at Rice University whose work involves testing people’s self-control in areas such as eating and shopping. He then draws conclusions about whether regulations or taxes aimed at changing behaviors will be effective. Companies find advanced personal behavioral data highly useful, said Mr. Dholakia, who works on the side to help retailers devise sales strategies. But his university, he said, appears more interested in seeing him publish his findings than take the time to help policy makers make real-world use of them. “My dean gets very worried if I don’t publish a lot.” …(More)

Linked Open Economy: Take Full Advantage of Economic Data


Paper by Michalis N. Vafopoulos et al: “For decades, information related to public finances was out of reach for most of the people. Gradually, public budgets and tenders are becoming openly available and global initiatives promote fiscal transparency and open product and price data. But the poor quality of economic open data undermines their potential to answer interesting questions (e.g. the efficiency of public funds and market processes). Linked Open Economy (LOE) has been developed as a top-level conceptualization that interlinks the publicly available economic open data by modelling the flows incorporated in public procurement together with the market process to address complex policy issues. The LOE approach is extensively used to enrich open economic data ranging from budgets and spending to prices. Developers, professionals, public administrations and any other interested party use and customize the LOE model to develop new systems, to enable information exchange between systems, to integrate data from heterogeneous sources and to publish open data related to economic activities….(More)”
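The interlinking the LOE paper describes can be pictured with a toy triple store: budget lines, tenders, and market prices become subject-predicate-object triples whose links can be followed across datasets. This is an illustrative sketch only; all identifiers and amounts below are invented, and a real LOE deployment would use RDF vocabularies rather than Python tuples.

```python
# Toy linked-data graph: each triple links one economic dataset to another.
# Every identifier and figure here is hypothetical.
triples = [
    ("budget:road-2016", "allocates", 500_000),
    ("tender:T-42", "fundedBy", "budget:road-2016"),
    ("tender:T-42", "awardedAmount", 480_000),
    ("tender:T-42", "buys", "product:asphalt"),
    ("product:asphalt", "marketPricePerTonne", 95),
]

def objects(subject, predicate):
    """Return all objects linked from `subject` via `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# A cross-dataset question of the kind LOE targets:
# how much of the budget line did the awarded tender consume?
budget = objects("tender:T-42", "fundedBy")[0]
allocated = objects(budget, "allocates")[0]
spent = objects("tender:T-42", "awardedAmount")[0]
share = spent / allocated
```

Because budgets, tenders and prices share identifiers, one query can cross datasets that were published separately, which is the core of the "interlinking" idea.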

6 lessons from sharing humanitarian data


Francis Irving at LLRX: “The Humanitarian Data Exchange (HDX) is an unusual data hub. It’s made by the UN, and is successfully used by agencies, NGOs, companies, Governments and academics to share data.

They’re doing this during crises such as the Ebola epidemic and the Nepal earthquakes, and every day to build up information in between crises.

There are lots of data hubs which are used by one organisation to publish data, far fewer which are used by lots of organisations to share data. The HDX project did a bunch of things right. What were they?

Here are six lessons…

1) Do good design

HDX started with user needs research. This was expensive, and was immediately worth it because it stopped a large part of the project which wasn’t needed.

The user needs led to design work which has made the website seem simple and beautiful – particularly unusual for something from a large bureaucracy like the UN.

HDX front page

2) Build on existing software

When making a hub for sharing data, there’s no need to make something from scratch. Open Knowledge’s CKAN software is open source; this stuff is a commodity. HDX has developers who modify and improve it for the specific needs of humanitarian data.

ckan

3) Use experts

HDX is a great international team – the leader is in New York, most of the developers are in Romania, there’s a data lab in Nairobi. Crucially, they bring in specific outside expertise: frog design do the user research and design work; ScraperWiki, experts in data collaboration, provide operational management.

ScraperWiki logo

4) Measure the right things

HDX’s metrics are about both sides of its two-sided network. Are users who visit the site actually finding and downloading data they want? Are new organisations joining to share data? They’re avoiding “vanity metrics”, taking inspiration from tech startup concepts like “pirate metrics”.

HDX metrics

5) Add features specific to your community

There are endless features you can add to data hubs – most add no value, and end up as a cost to maintain. HDX adds specific things valuable to its community.

For example, much humanitarian data is in “shape files”, a standard for geographical information. HDX automatically renders a beautiful map of these – essential for users who don’t have ArcGIS, and a good check for those that do.

Syrian border crossing

6) Trust in the data

The early user research showed that trust in the data was vital. For this reason, not just anyone can come along and add data. New organisations have to apply – proving either that they’re known in humanitarian circles, or that they have quality data to share. Applications are checked by hand. It’s important to get this kind of balance right – being too ideologically open or closed doesn’t work.

Apply HDX

Conclusion

The detail of how a data sharing project is run really matters….(More)”

Donating Your Selfies to Science


Linda Poon at CityLab: “It’s not only your friends and family who follow your online selfies and group photos. Scientists are starting to look at them, too, though they’re more interested in what’s around you. In bulk, photos can reveal weather patterns across multiple locations, air quality of a place over time, the dynamics of a neighborhood—all sorts of information that helps researchers study cities.

At the Nanyang Technological University in Singapore, a research group is using crowdsourced photos to create a low-cost alternative to air-pollution sensors. Called AirTick, the smartphone app they’ve designed will collect photos from users and analyze how hazy the environment looks. It’ll then check each image against official air quality data, and through machine learning, the app will eventually be able to predict pollution levels based on an image alone.

AirTick creator Pan Zhengziang said in a promotional video last month that the growing concern among the public over air quality can make programs like this a success—especially in Southeast Asia, where smog has gotten so bad that governments have had to shut down schools and suspend outdoor activities.  “In Singapore’s recent haze episode, around 250,000 people [have] shared their concerns via Twitter,” he said. “This has made crowdsourcing-based air quality monitoring a possibility.”…(More)”
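The pipeline described above – score the haziness of a photo, then calibrate those scores against official air-quality readings so that future photos alone suffice – can be sketched in a few lines. This is a toy illustration, not AirTick's actual method: the haze score, the linear calibration, and the synthetic data are all assumptions.

```python
import numpy as np

def haze_score(image: np.ndarray) -> float:
    """Crude haze proxy: hazy scenes tend to be brighter and lower-contrast,
    so combine mean luminance with inverse contrast.
    (Illustrative only; AirTick's actual image features are not public.)"""
    luminance = image.mean()
    contrast = image.std()
    return luminance / (contrast + 1e-6)

def fit_calibration(scores, official_readings):
    """Least-squares line mapping haze scores to an official air-quality
    index, standing in for the app's learned model."""
    a, b = np.polyfit(scores, official_readings, 1)
    return lambda s: a * s + b

# Synthetic stand-in data: hazier photos (higher score) pair with worse
# official readings, plus noise. Real training data would be crowdsourced.
rng = np.random.default_rng(0)
scores = rng.uniform(1.0, 10.0, 50)
official = 20 * scores + rng.normal(0, 5, 50)

predict = fit_calibration(scores, official)
```

Once calibrated, `predict(haze_score(photo))` gives a pollution estimate with no sensor in the loop, which is the app's low-cost premise.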

Open data and (15 million!) new measures of democracy


Joshua Tucker in the Washington Post: “Last month the University of Gothenburg’s V-Dem Institute released a new “Varieties of Democracy” dataset. It provides about 15 million data points on democracy, including 39 democracy-related indices. It can be accessed at v-dem.net along with supporting documentation. I asked Staffan I. Lindberg, Director of the V-Dem Institute and one of the directors of the project, a few questions about the new data. What follows is a lightly edited version of his answers.


Women’s Political Empowerment Index for Southeast Asia (Data: V-Dem data version 5; Figure: V-Dem Institute, University of Gothenburg, Sweden)

Joshua Tucker: What is democracy, and is it even really possible to have quantitative measures of democracy?

Staffan Lindberg: There is no consensus on the definition of democracy and how to measure it. The understanding of what a democracy really is varies across countries and regions. This motivates the V-Dem approach not to offer one standard definition of the concept but instead to distinguish among five different principles of democracy: Electoral, Liberal, Participatory, Deliberative, and Egalitarian democracy. All of these principles have played prominent roles in current and historical discussions about democracy. Our measurement of these principles is based on two types of data, factual data collected by assisting researchers and survey responses by country experts, which are combined using a rather complex measurement model (a “custom-designed Bayesian ordinal item response theory model”; for details see the V-Dem Methodology document)….(More)
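The combination step Lindberg mentions – pooling ordinal ratings from experts who differ in how harshly they rate into a single latent score – can be illustrated with a deliberately crude sketch. This is not the V-Dem measurement model: the ratings below are invented, and the simple severity adjustment stands in for what a Bayesian ordinal item response theory model estimates properly (expert thresholds, reliabilities, and uncertainty).

```python
import numpy as np

# Hypothetical ratings: rows are country experts, columns are countries.
# Each expert scores, say, "freedom of expression" on a 0-4 ordinal scale,
# but experts differ in severity.
ratings = np.array([
    [4, 3, 1, 0],   # lenient expert
    [3, 2, 1, 0],   # moderate expert
    [3, 2, 0, 0],   # harsh expert
], dtype=float)

# Crude latent-trait estimate: subtract each expert's average severity
# (relative to the grand mean), then average across experts.
severity = ratings.mean(axis=1, keepdims=True) - ratings.mean()
adjusted = ratings - severity
latent = adjusted.mean(axis=0)
```

Even this toy version shows why raw averages mislead: a country rated only by harsh experts would look less democratic than it is until severity is modelled out.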

Big data’s big role in humanitarian aid


Mary K. Pratt at Computerworld: “Hundreds of thousands of refugees streamed into Europe in 2015 from Syria and other Middle Eastern countries. Some estimates put the number at nearly a million.

The sheer volume of people overwhelmed European officials, who not only had to handle the volatile politics stemming from the crisis, but also had to find food, shelter and other necessities for the migrants.

Sweden, like many of its European Union counterparts, was taking in refugees. The Swedish Migration Board, which usually sees 2,500 asylum seekers in an average month, was accepting 10,000 per week.

“As you can imagine, with that number, it requires a lot of buses, food, registration capabilities to start processing all the cases and to accommodate all of those people,” says Andres Delgado, head of operational control, coordination and analysis at the Swedish Migration Board.

Despite the dramatic spike in refugees coming into the country, the migration agency managed the intake — hiring extra staff, starting the process of procuring housing early, getting supplies ready. Delgado credits a good part of that success to his agency’s use of big data and analytics that let him predict, with a high degree of accuracy, what was heading his way.

“Without having that capability, or looking at the tool every day, to assess every need, this would have crushed us. We wouldn’t have survived this,” Delgado says. “It would have been chaos, actually — nothing short of that.”

The Swedish Migration Board has been using big data and analytics for several years, as it seeks to gain visibility into immigration trends and what those trends will mean for the country…./…

“Can big data give us peace? I think the short answer is we’re starting to explore that. We’re at the very early stages, where there are shining examples of little things here and there. But we’re on that road,” says Kalev H. Leetaru, creator of the GDELT Project, or the Global Database of Events, Language and Tone, which describes itself as a comprehensive “database of human society.”

The topic is gaining traction. A 2013 report, “New Technology and the Prevention of Violence and Conflict,” from the International Peace Institute, highlights uses of telecommunications technology, including data, in several crisis situations around the world. The report emphasizes the potential these technologies hold in helping to ease tensions and address problems.

The report’s conclusion offers this idea: “Big data can be used to identify patterns and signatures associated with conflict — and those associated with peace — presenting huge opportunities for better-informed efforts to prevent violence and conflict.”

That’s welcome news to Noel Dickover. He’s the director of PeaceTech Data Networks at the PeaceTech Lab, which was created by the U.S. Institute of Peace (USIP) to advance USIP’s work on how technology, media and data help reduce violent conflict around the world.

Such work is still in the nascent stages, Dickover says, but people are excited about its potential. “We have unprecedented amounts of data on human sentiment, and we know there’s value there,” he says. “The question is how to connect it.”

Dickover is working on ways to do just that. One example is the Open Situation Room Exchange (OSRx) project, which aims to “empower greater collective impact in preventing or mitigating serious violent conflicts in particular arenas through collaboration and data-sharing.”…(More)

Improving government effectiveness: lessons from Germany


Tom Gash at Global Government Forum: “All countries face their own unique challenges but advanced democracies also have much in common: the global economic downturn, aging populations, increasingly expensive health and pension spending, and citizens who remain as hard to please as ever.

At an event last week in Bavaria, attended by representatives of Bavaria’s governing party, the Christian Social Union (CSU) and their guests, it also became clear that there is a growing consensus that governments face another common problem. They have relied for too long on traditional legislation and regulation to drive change. The consensus was that simply prescribing in law what citizens and companies can and can’t do will not solve the complex problems governments are facing, that governments cannot legislate their way to improved citizen health, wealth and wellbeing….

…a number of developments …from which both UK and international policymakers and practitioners can learn to improve government effectiveness.

  1. Behavioural economics: The Behavioural Insights Team (BIT), which spun out of government in 2013 and is the subject of a new book by one of its founders and former IfG Director of Research, David Halpern, is being watched carefully by many countries abroad. Some are using its services, while others – including the New South Wales Government in Australia – are building their own skills in this area. BIT and others using similar principles have shown that using insights from social psychology – alongside an experimental approach – can help save money and improve outcomes. Well known successes include increasing the tax take through changing the wording of reminder letters (work led by another IfG alumnus, Mike Hallsworth) and increasing pension take-up through auto-enrolment.
  2. Market design: There is an emerging field of study which examines how algorithms can be used to better match people with services they need – particularly in cases where it is unfair or morally repugnant to allow a free market to operate. Alvin Roth, the Harvard professor and Nobel prize winner, writes about these ‘matching markets’ in his book Who Gets What and Why – in which he also explains how the approach can ensure that more kidneys reach compatible recipients, and children find the right education.
  3. Big data: Large datasets can now be mined far more effectively, whether it is to analyse crime patterns to spot where police patrols might be useful or to understand crowd flows on public transport. The use of real-time information allows far more sophisticated deployment of public sector resources, better targeted at demand and need, and better tailored to individual preferences.
  4. Transparency: Transparency has the potential to enhance both the accountability and effectiveness of governments across the world – as shown in our latest Whitehall Monitor Annual Report. The UK government is considered a world-leader for its transparency – but there are still areas where progress has stalled, including in transparency over the costs and performance of privately provided public services.
  5. New management models: There is a growing realisation that new methods are best harnessed when supported by effective management. The Institute’s work on civil service reform highlights a range of success factors from past reforms in the UK – and the benefits of clear mechanisms for setting priorities and sticking to them, as is being attempted by government’s new(ish) Implementation Taskforces and the Departmental Implementation Units currently cropping up across Whitehall. I looked overseas for a different model that clearly aligns government activities behind citizens’ concerns – in this case the example of the single non-emergency number system operating in New York City and elsewhere. This system supports a powerful, highly responsive, data-driven performance management regime. But like many performance management regimes it can risk a narrow and excessively short-term focus – so such tools must be combined with the mind-set of system stewardship that the Institute has long championed in its policymaking work.
  6. Investment in new capability: It is striking that all of these developments are supported by technological change and research insights developed outside government. But to embed new approaches in government, there appear to be benefits to incubating new capacity, either in specialist departmental teams or at the centre of government….(More)”
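The ‘matching markets’ of point 2 rest on the Gale-Shapley deferred acceptance algorithm, the canonical mechanism behind the school-choice and kidney-exchange designs Roth describes. Below is a minimal sketch with invented preference lists and one-seat “schools”; the names, capacities, and complete preference lists are illustrative assumptions.

```python
def deferred_acceptance(student_prefs, school_prefs):
    """Gale-Shapley deferred acceptance, students proposing.
    Each school holds at most one student here for simplicity, and all
    preference lists are assumed complete."""
    rank = {school: {st: i for i, st in enumerate(prefs)}
            for school, prefs in school_prefs.items()}
    free = list(student_prefs)            # students not yet tentatively matched
    next_choice = {st: 0 for st in student_prefs}
    held = {}                             # school -> tentatively held student
    while free:
        st = free.pop()
        school = student_prefs[st][next_choice[st]]
        next_choice[st] += 1
        current = held.get(school)
        if current is None:
            held[school] = st             # seat empty: hold the proposer
        elif rank[school][st] < rank[school][current]:
            held[school] = st             # school prefers the newcomer
            free.append(current)          # displaced student proposes again
        else:
            free.append(st)               # rejected: try the next choice
    return {st: school for school, st in held.items()}

# Hypothetical instance: three students, three one-seat schools.
match = deferred_acceptance(
    {"a": ["X", "Y", "Z"], "b": ["Y", "X", "Z"], "c": ["X", "Y", "Z"]},
    {"X": ["b", "a", "c"], "Y": ["a", "b", "c"], "Z": ["a", "b", "c"]},
)
```

The result is stable: no student and school would both prefer each other over their assigned match, which is the property that makes the mechanism usable where prices cannot allocate.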

The Point of Collection


Essay by Mimi Onuoha: “The conceptual, practical, and ethical issues surrounding “big data” and data in general begin at the very moment of data collection. Particularly when the data concern people, not enough attention is paid to the realities entangled within that significant moment and spreading out from it.

I try to do some disentangling here, through five theses around data collection — points that are worth remembering, communicating, thinking about, dwelling on, and keeping in mind, if you have anything to do with data on a daily basis (read: all of us) and want to do data responsibly.

1. Data sets are the results of their means of collection.

It’s easy to forget that the people collecting a data set, and how they choose to do it, directly determines the data set….

2. As we collect more data, we prioritize things that fit patterns of collection.

Or as Rob Kitchin and Martin Dodge say in Code/Space, “The effect of abstracting the world is that the world starts to structure itself in the image of the capta and the code.” Data emerges from a world that is increasingly software-mediated, and software thrives on abstraction. It flattens out individual variations in favor of types and models….

3. Data sets outlive the rationale for their collection.

Spotify can come up with a list of reasons why having access to users’ photos, locations, microphones, and contact lists can improve the music streaming experience. But the reasons why they decide these forms of data might be useful can be less important than the fact that they have the data itself. This is because the needs or desires influencing the decisions to collect some type of data often eventually disappear, while the data produced as a result of those decisions have the potential to live for much longer. The data are capable of shifting and changing according to specific cultural contexts and of playing different roles than those they might initially have been intended for….

4. Corollary: Especially combined, data sets reveal far more than intended.

We sometimes fail to realize that data sets, both on their own and combined with others, can be used to do far more than what they were originally intended for. You can make inferences from one data set that result in conclusions in completely different realms. Facebook, by having huge amounts of data on people and their networks, could make reasonable hypotheses regarding people’s sexual orientations….

5. Data collection is a transaction that is the result of an invisible relationship.

This is a frame — connected to my first point — useful for understanding how to think about data collection on the whole:

Every data set involving people implies subjects and objects, those who collect and those who make up the collected. It is imperative to remember that on both sides we have human beings….(More)”