Twitter as a data source: An overview of tools for journalists


Wasim Ahmed at Data Driven Journalism: “Journalists may wish to use data from social media platforms in order to provide greater insight and context to a news story. For example, journalists may wish to examine the contagion of hashtags and whether they are capable of achieving political or social change. Moreover, newsrooms may also wish to tap into social media posts during unfolding crisis events. For example, to find out who tweeted about a crisis event first, and to empirically examine the impact of social media.

Furthermore, Twitter users and accounts such as WikiLeaks may operate outside the constraints of traditional journalism, and it therefore becomes important to have tools and mechanisms in place to examine these kinds of influential users. For example, it was found that those who were backing Marine Le Pen on Twitter could have been users with an affinity for Donald Trump.

There remain a number of different methods for analysing social media data. Take text analytics, for example, which can include using sentiment analysis to sort social media posts in bulk into categories of feeling, such as positive, negative, or neutral. Or machine learning, which can automatically assign social media posts to a number of different topics.
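
To make the first of these concrete, below is a minimal sketch of bulk sentiment labelling in Python. It assumes the tweets have already been collected (for example via the Twitter API) into a list of plain-text strings, uses NLTK's off-the-shelf VADER model, and applies score thresholds that are purely illustrative.

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-off download of the VADER lexicon

# Hypothetical, already-collected tweets
tweets = [
    "Loving the response to #OpenData day, great turnout!",
    "Another rail strike. Commute ruined again.",
    "Council meeting starts at 7pm tonight.",
]

analyzer = SentimentIntensityAnalyzer()

def label(text: str) -> str:
    """Map VADER's compound score to a coarse positive/negative/neutral label."""
    score = analyzer.polarity_scores(text)["compound"]
    if score >= 0.05:
        return "positive"
    if score <= -0.05:
        return "negative"
    return "neutral"

for tweet in tweets:
    print(label(tweet), "|", tweet)
```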

There are other methods such as social network analysis, which examines online communities and the relationships between them. A number of qualitative methodologies also exist, such as content analysis and thematic analysis, which can be used to manually label social media posts. From a journalistic perspective, network analysis via tools such as NodeXL may be a useful starting point, because it can quickly provide an overview of influential Twitter users alongside an overview of the topic.
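
NodeXL is a point-and-click tool, but the kind of overview it produces can be sketched in a few lines of Python with the networkx library: build a directed "who mentions whom" graph from the tweets and rank accounts by how often others mention them. The mention data below is invented for illustration.

```python
import networkx as nx

# Hypothetical (user, mentioned_user) pairs extracted from tweet entities
mentions = [
    ("alice", "city_hall"), ("bob", "city_hall"), ("carol", "city_hall"),
    ("city_hall", "alice"), ("dave", "bob"), ("erin", "city_hall"),
]

G = nx.DiGraph()
G.add_edges_from(mentions)

# In-degree centrality: accounts mentioned by many different users score
# highly, a quick (if crude) proxy for influence within the conversation.
ranking = sorted(nx.in_degree_centrality(G).items(),
                 key=lambda kv: kv[1], reverse=True)

for user, score in ranking[:5]:
    print(f"{user}: {score:.2f}")
```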

From an industry standpoint, there has been much focus on gaining insight into users’ personalities, through services such as IBM Watson’s Personality Insights service. This uses linguistic analytics to derive intrinsic personality insights, such as emotions like anxiety, self-consciousness, and depression. This information can then be used by marketers to target certain products; for example, anti-anxiety medication to users who are more anxious…(An overview of tools for 2017).”

UK government watchdog examining political use of data analytics


“Given the big data revolution, it is understandable that political campaigns are exploring the potential of advanced data analysis tools to help win votes,” Elizabeth Denham, the information commissioner, writes on the ICO’s blog. However, “the public have the right to expect” that this takes place in accordance with existing data protection laws, she adds.

Political parties are able to use Facebook to target voters with different messages, tailoring the advert to recipients based on their demographic. In the 2015 UK general election, the Conservative party spent £1.2 million on Facebook campaigns and the Labour party £16,000. It is expected that Labour will vastly increase that spend for the general election on 8 June….

Political parties and third-party companies are allowed to collect data from sites like Facebook and Twitter that lets them tailor these ads to broadly target different demographics. However, if those ads target identifiable individuals, they run afoul of the law….(More)”

Eliminating the Human


I suspect that we almost don’t notice this pattern because it’s hard to imagine what an alternative focus of tech development might be. Most of the news we get barraged with is about algorithms, AI, robots and self-driving cars, all of which fit this pattern. There are indeed many technological innovations underway that have nothing to do with eliminating human interaction from our lives (CRISPR-Cas9 in genetics, new films that can efficiently and cheaply cool houses, and quantum computing, to name a few), but what we read about most, and what touches us daily, is the trajectory towards less human involvement. Note: I don’t consider chat rooms and product reviews as “human interaction”; they’re mediated and filtered by a screen.

I am not saying these developments are not efficient and convenient; this is not a judgement regarding the services and technology. I am simply noticing a pattern and wondering if that pattern means there are other possible roads we could be going down, and that the way we’re going is not in fact inevitable, but is (possibly unconsciously) chosen.

Here are some examples of tech that allows for less human interaction…

Lastly, “Social” media: social “interaction” that isn’t really social.

While the appearance on social networks is one of connection—as Facebook and others frequently claim—the fact is a lot of social media is a simulation of real social connection. As has been in evidence recently, social media actually increases divisions amongst us by amplifying echo effects and allowing us to live in cognitive bubbles. We are fed what we already like or what our similarly inclined friends like… or more likely now what someone has paid for us to see in an ad that mimics content. In this way, we actually become less connected except to those in our group…..

Many transformative movements in the past succeeded based on leaders, agreed-upon principles and organization. Although social media is a great tool for rallying people and bypassing government channels, it does not guarantee eventual success.

Social media is not really social—ticking boxes and having followers and getting feeds is NOT being social—it’s a screen simulation of human interaction. Human interaction is much more nuanced and complicated than what happens online. Engineers like things that are quantifiable. Smells, gestures, expression, tone of voice, etc. etc.—in short, all the various ways we communicate are VERY hard to quantify, and those are often how we tell if someone likes us or not….

To repeat what I wrote above—humans are capricious, erratic, emotional, irrational and biased in what sometimes seem like counterproductive ways. I’d argue that though those might seem like liabilities, many of those attributes actually work in our favor. Many of our emotional responses have evolved over millennia, and they are based on the probability that our responses, often prodded by an emotion, will more likely than not offer the best way to deal with a situation….

Our random accidents and odd behaviors are fun—they make life enjoyable. I’m wondering what we’re left with when there are fewer and fewer human interactions. Remove humans from the equation and we are less complete as people or as a society. “We” do not exist as isolated individuals—we as individuals are inhabitants of networks, we are relationships. That is how we prosper and thrive….(More)”.

Open Data Barometer 2016


Open Data Barometer: “Produced by the World Wide Web Foundation as a collaborative work of the Open Data for Development (OD4D) network and with the support of the Omidyar Network, the Open Data Barometer (ODB) aims to uncover the true prevalence and impact of open data initiatives around the world. It analyses global trends, and provides comparative data on countries and regions using an in-depth methodology that combines contextual data, technical assessments and secondary indicators.

Covering 115 jurisdictions in the fourth edition, the Barometer ranks governments on:

  • Readiness for open data initiatives.
  • Implementation of open data programmes.
  • Impact that open data is having on business, politics and civil society.

After three successful editions, the fourth marks another step towards becoming a global policymaking tool with a participatory and inclusive process and a strong regional focus. This year’s Barometer includes an assessment of government performance in fulfilling the Open Data Charter principles.

The Barometer is a truly global and collaborative effort, with input from more than 100 researchers and government representatives. It takes over six months and more than 10,000 hours of research work to compile. During this process, we address more than 20,000 questions and respond to more than 5,000 comments and suggestions.

The ODB global report is a summary of some of the most striking findings. The full data and methodology are available, and are intended to support secondary research and inform better decisions for the progression of open data policies and practices across the world…(More)”.

A DARPA Perspective on Artificial Intelligence


DARPA: “What’s the ground truth on artificial intelligence (AI)? In this video, John Launchbury, the Director of DARPA’s Information Innovation Office (I2O), attempts to demystify AI–what it can do, what it can’t do, and where it is headed. Through a discussion of the “three waves of AI” and the capabilities required for AI to reach its full potential, John provides analytical context to help understand the roles AI already has played, does play now, and could play in the future. (Slides can be downloaded here)….”

An AI Ally to Combat Bullying in Virtual Worlds


Simon Parkin at MIT Technology Review: “In any fictionalized universe, the distinction between playful antagonism and earnest harassment can be difficult to discern. Name-calling between friends playing a video game together is often a form of camaraderie. Between strangers, however, similar words assume a different, more troublesome quality. Being able to distinguish between the two is crucial for any video-game maker that wants to foster a welcoming community.

Spirit AI hopes to help developers support players and discourage bullying behavior with an abuse detection and intervention system called Ally. The software monitors interactions between players—what people are saying to each other and how they are behaving—through the available actions within a game or social platform. It’s able to detect verbal harassment and also nonverbal provocation—for example, one player stalking another’s avatar or abusing reporting tools.

“We’re looking at interaction patterns, combined with natural-language classifiers, rather than relying on a list of individual keywords,” explains Ruxandra Dariescu, one of Ally’s developers. “Harassment is a nuanced problem.”

When Ally identifies potentially abusive behavior, it checks to see if the potential abuser and the other player have had previous interactions. Where Ally differs from existing moderation software is that rather than simply send an alert to the game’s developers, it is able to send a computer-controlled virtual character to check in with the player—one that, through Spirit AI’s natural-language tools, is able to converse in the game’s tone and style (see “A Video-Game Algorithm to Solve Online Abuse”)….(More)”.
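
Spirit AI has not published Ally's internals, but the general approach described above (scoring text with a natural-language classifier, then weighing that score against the interaction history between two players) can be sketched roughly as follows. Everything here, from the toxicity stub to the weights and thresholds, is an assumption made for illustration rather than a description of the actual product.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    recipient: str
    text: str

def toxicity(text: str) -> float:
    """Placeholder for a trained natural-language classifier (returns 0.0-1.0)."""
    # A real system would call a model here; this stub returns a fixed
    # mid-range score so the example runs end to end.
    return 0.6

def flag_harassment(history, threshold=0.5):
    """Flag (sender, recipient) pairs whose combined text + behaviour score is high."""
    pair_counts = Counter((m.sender, m.recipient) for m in history)
    flagged = set()
    for m in history:
        # Repeatedly targeting the same player raises the score...
        repeat_factor = min(pair_counts[(m.sender, m.recipient)] / 5.0, 1.0)
        score = toxicity(m.text) * (0.5 + 0.5 * repeat_factor)
        # ...while a two-way exchange looks more like banter between friends.
        if pair_counts[(m.recipient, m.sender)] > 0:
            score *= 0.5
        if score >= threshold:
            flagged.add((m.sender, m.recipient))
    return flagged

history = [Message("griefer42", "newplayer", "you are terrible, quit")] * 6
print(flag_harassment(history))  # {('griefer42', 'newplayer')}
```

In a real system the intervention step, such as the computer-controlled character checking in with the targeted player, would hang off the flagged pairs rather than a simple print.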

E-Democracy for Smart Cities


Book by Vinod Kumar: “…highlights the rightful role of citizens, as per the constitution of the country, to participate in the governance of a smart city using electronic means such as high-speed fiber-optic networks, the internet, and mobile computing, as well as the Internet of Things, which have the ability to transform the dominant role of citizens and technology in smart cities. These technologies can transform the way in which business is conducted, the interaction and interface with citizens and academic institutions, and improve interactions between business, industry, and city government…(More)”.

The cloud, the crowd, and the city: How new data practices reconfigure urban governance?


Introduction to the Special Issue of Big Data & Society: “The urban archetype of the flâneur, so central to the concept of modernity, can now experience the city in ways unimaginable one hundred years ago. Strolling around Paris, the contemporary flâneur might stop to post pictures of her discoveries on Instagram, simultaneously identifying points of interest to the rest of her social network and broadcasting her location (perhaps unknowingly). The café she visits might be in the middle of a fundraising campaign through a crowdfunding site such as Kickstarter, and she might be invited to tweet to her followers in exchange for a discount on her pain au chocolat. As she ambles about Paris, the route of her stroll is captured by movement sensors positioned on top of street lights, and this data—aggregated with that of thousands of other pedestrians—could be used by the City of Paris to sync up transit schedules. And if those schedules were not convenient, she might tap Uber to whisk her home to her threadbare pension booked on AirBnB.

This vignette attests to the transformation of the urban experience through technology-enabled platforms that allow for the quick mobilization and exchange of information, public services, surplus capacity, entrepreneurial energy, and money. However, these changes have implicated more than just consumers, as multiple technologies have been taken up in urban governance processes through platforms variously labeled as Big Data, crowd sourcing, or the sharing economy. These systems combine inexpensive data collection and cloud-based storage, distributed social networks, geotagged locational sensing, mobile access (often through “app” platforms), and new collaborative entrepreneurship models to radically alter how the needs of urban residents are identified and how services are delivered and consumed in so-called “smart cities” (Townsend, 2013). Backed by Big Data, smart city initiatives have made inroads into urban service provision and policy in areas such as e-government and transparency, new forms of public-private partnerships through “urban lab” arrangements, or models such as impact investing, civic hacking, or tactical urbanism (cf. Karvonen and van Heur, 2014; Kitchin, 2014; Swyngedouw, 2005).

In the rhetoric used by their boosters, the vision and practice of these technologies “disrupts” existing markets by harnessing the power of “the crowd”—a process fully evident in sectors such as taxi (Uber/Lyft), hoteling (AirBnB), and finance (peer-to-peer lending). However, the notion of disruption has also targeted government bureaucracies and public services, with new initiatives seeking to insert crowd mechanisms or characteristics—at once self-organizing and collectively rational (Brabham, 2008)—into public policy. These mechanisms envision reconfiguring the traditional relationship of public powers with planning and governance by vesting data collection and problem-solving in crowd-like institutional arrangements that are partially or wholly outside the purview of government agencies. While scholars are used to talking about “governance beyond-the-state” (Swyngedouw, 2005) in terms of privatization and a growing scope for civil society organizations, technological intermediation potentially changes the scale and techniques of governance as well as its relationship to sovereign authority.

For instance, civic crowdfunding models have emerged as new means of organizing public service provision and funding community economic development by embracing both market-like bidding mechanisms and social-network technologies to distribute responsibility for planning and financing socially desirable investments to laypeople (Brickstarter, 2012; Correia de Freitas and Amado, 2013; Langley and Leyshon, 2016). Other practices are even more radical in their scope. Toronto’s Urban Repair Squad—an offshoot of the aptly named Critical Mass bike happenings—urges residents to take transportation planning into their own hands and paint their own bike lanes. Their motto: “They say city is broke. We fix. No charge.” (All that is missing is the snarky “you’re welcome” at the end.)

Combined, these emerging platforms and practices are challenging the tactics, capabilities, and authorizations employed to define and govern urban problems. This special theme of Big Data & Society picks up these issues, interrogating the emergence of digital platforms and smart city initiatives that rely on both the crowd and the cloud (new on-demand, internet-based technologies that store and process data) to generate and fold Big Data into urban governance. The papers contained herein were presented as part of a one-day symposium held at the University of Illinois at Chicago (UIC) in April 2015 and sponsored by UIC’s Department of Urban Planning and Policy. Setting aside the tired narratives of individual genius and unstoppable technological progress, workshop participants sought to understand why these practices and platforms have recently gained popularity and what their implementation might mean for cities. Papers addressed numerous questions: How have institutional supports and political-economic contexts facilitated the ascendance of “crowd” and “cloud” models within different spheres of urban governance? How do their advocates position them relative to imaginaries of state or market failure/dysfunction? What kinds of assumptions and expectations are embedded in the design and operation of these platforms and practices? What kinds of institutional reconfigurations have been spurred by the push to adopt smart city initiatives? How is information collected through these initiatives being used to advance particular policy agendas? Who is likely to benefit from them?…(More)”.

The Smart City Concept in the 21st Century


Essay by Mircea Eremia, Lucian Toma and Mihai Sanduleac in Procedia Engineering: “The quality of life was significantly improved in the last century mainly as regards the access to services. However, the heavy industrialization and the increasing population in the urban areas have been a big challenge for administrators, architects and urban planners. This paper provides a brief presentation of the evolution of the “smart city” term and the most representative characteristics of it. Furthermore, various alternative terms that were proposed to describe the multiple characteristics of the future cities are analyzed. A connection between smart city and smart grid is also presented….(More)”


Using Open Data to Combat Corruption


Robert Palmer at Open Data Charter: “…today we’re launching the Open Up Guide: Using Open Data to Combat Corruption. We think that with the right conditions in place, greater transparency can lead to more accountability, less corruption and better outcomes for citizens. This guide builds on the work in this area already done by the G20’s anti-corruption working group, Transparency International and the Web Foundation.

Inside the guide you’ll find a number of tools including:

  • A short overview on how open data can be used to combat corruption.
  • Use cases and methodologies. A series of case studies highlighting existing and future approaches to the use of open data in the anti-corruption field.
  • 30 priority datasets and the key attributes needed so that they can talk to each other. To address corruption networks it is particularly important that connections can be established and followed across data sets, national borders and different sectors.
  • Data standards. Standards describe what should be published, and the technical details of how it should be made available. The report includes some of the relevant standards for anti-corruption work, and highlights the areas where there are currently no standards.
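
The point about shared key attributes is easiest to see in code: when two hypothetical datasets, a company register and a public-contracts ledger, both carry the same company identifier, a single join can surface how much public money flows to each beneficial owner. The dataset and column names below are invented for illustration and are not drawn from the guide's 30 priority datasets.

```python
import pandas as pd

# Hypothetical company register: which beneficial owner sits behind which company
companies = pd.DataFrame({
    "company_id": ["C-001", "C-002", "C-003"],
    "beneficial_owner": ["J. Doe", "A. Roe", "J. Doe"],
})

# Hypothetical public-contracts dataset keyed on the same company identifier
contracts = pd.DataFrame({
    "contract_id": ["T-17", "T-18", "T-19"],
    "company_id": ["C-001", "C-003", "C-002"],
    "value_eur": [1_200_000, 950_000, 40_000],
})

# Because both datasets share company_id they can "talk to each other":
# one join plus a group-by reveals concentrations worth investigating.
merged = contracts.merge(companies, on="company_id", how="left")
by_owner = (merged.groupby("beneficial_owner")["value_eur"]
                  .sum()
                  .sort_values(ascending=False))
print(by_owner)
```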

The guide has been developed by Transparency International-Mexico, Open Contracting Partnership and the Open Data Charter, building on input from government officials, open data experts, civil society and journalists. It’s been designed as a practical tool for governments who want to use open data to fight corruption. However, it’s still a work in progress and we want feedback on how to make it more useful. Please either comment directly on the Google Doc version of the guide, or email us at [email protected]….View the full guide.”