How Twitter Is Being Gamed to Feed Misinformation


The New York Times: “…the biggest problem with Twitter’s place in the news is its role in the production and dissemination of propaganda and misinformation. It keeps pushing conspiracy theories — and because lots of people in the media, not to mention many news consumers, don’t quite understand how it works, the precise mechanism is worth digging into…. Here’s how.

The guts of the news business.

One way to think of today’s disinformation ecosystem is to picture it as a kind of gastrointestinal tract…. Twitter often acts as the small bowel of digital news. It’s where political messaging and disinformation get digested, packaged and widely picked up for mass distribution to cable, Facebook and the rest of the world.

This role for Twitter has seemed to grow more intense during (and since) the 2016 campaign. Twitter now functions as a clubhouse for much of the news. It’s where journalists pick up stories, meet sources, promote their work, criticize competitors’ work and workshop takes. In a more subtle way, Twitter has become a place where many journalists unconsciously build and gut-check a worldview — where they develop a sense of what’s important and merits coverage, and what doesn’t.

This makes Twitter a prime target for manipulators: If you can get something big on Twitter, you’re almost guaranteed coverage everywhere….

Twitter is clogged with fake people.

For determined media manipulators, getting something big on Twitter isn’t all that difficult. Unlike Facebook, which requires people to use their real names, Twitter offers users essentially full anonymity, and it makes many of its functions accessible to outside programmers, allowing people to automate their actions on the service.

As a result, numerous cheap and easy-to-use online tools let people quickly create thousands of Twitter bots — accounts that look real, but that are controlled by a puppet master.

Twitter’s design also promotes a slavish devotion to metrics: Every tweet comes with a counter of Likes and Retweets, and users come to internalize these metrics as proxies for real-world popularity….

They may ruin democracy.

…. the more I spoke to experts, the more convinced I became that propaganda bots on Twitter might be a growing and terrifying scourge on democracy. Research suggests that bots are ubiquitous on Twitter. Emilio Ferrara and Alessandro Bessi, researchers at the University of Southern California, found that about a fifth of the election-related conversation on Twitter last year was generated by bots. Most users were blind to them; they treated the bots the same way they treated other users….

…In a more pernicious way, bots give us an easy way to doubt everything we see online. In the same way that the rise of “fake news” gives the president cover to label everything “fake news,” the rise of bots might soon allow us to dismiss any online enthusiasm as driven by automation. Anyone you don’t like could be a bot; any highly retweeted post could be puffed up by bots….(More)”.

Routledge Handbook on Information Technology in Government


Book edited by Yu-Che Chen and Michael J. Ahn: “The explosive growth in information technology has ushered in unparalleled new opportunities for advancing public service. Featuring 24 chapters from foremost experts in the field of digital government, this Handbook provides an authoritative survey of key emerging technologies, their current state of development and use in government, and insightful discussions on how they are reshaping and influencing the future of public administration. This Handbook explores:

  • Key emerging technologies (e.g., big data, social media, the Internet of Things (IoT), GIS, smartphones and mobile technologies) and their impacts on public administration
  • The impacts of the new technologies on the relationships between citizens and their governments with the focus on collaborative governance
  • Key theories of IT innovations in government on the interplay between technological innovations and public administration
  • The relationship between technology and democratic accountability and the various ways of harnessing the new technologies to advance public value
  • Key strategies and conditions for fostering success in leveraging technological innovations for public service

This Handbook will prove to be an invaluable guide and resource for students, scholars and practitioners interested in this growing field of technological innovations in government….(More)”.

Permanent Campaigning in Canada


Book by Alex Marland, Thierry Giasson and Anna Lennox Esselment: “Election campaigning never stops. That is the new reality of politics and government in Canada, where everyone from staffers in the Prime Minister’s Office to backbench MPs practise political marketing and communication as though the official campaign were still underway.

Permanent Campaigning in Canada examines the growth and democratic implications of political parties’ relentless search for votes and popularity and what a constant state of electioneering means for governance. With the emergence of fixed-date elections and digital media, each day is a battle to win mini-contests: the news cycle, public opinion polls, quarterly fundraising results, by-elections, and more. The contributors’ case studies – on political databases, the strategy behind online political communication, the politicization of government advertising, and the role of the PMO and political staff – reveal how political actors are using all available tools at their disposal to secure electoral advantage, including the use of public resources for partisan gain.

This is the first study of a phenomenon that has become embedded in Canadian politics and government. It reveals the extent to which political parties and political staff have embraced non-stop electioneering, and the consequences for our democratic processes and institutions….(More)”

The Way Ahead


Transcript of lecture delivered by Stephen Fry on 28 May 2017 • Hay Festival, Hay-on-Wye: “Peter Florence, the supremo of this great literary festival, asked me some months ago if I might, as part of Hay’s celebration of the five hundredth anniversary of Martin Luther’s kickstarting of the reformation, suggest a reform of the internet…

You will be relieved to know, that unlike Martin Luther, I do not have a full 95 theses to nail to the door, or in Hay’s case, to the tent flap. It might be worth reminding ourselves perhaps, however, of the great excitements of the early 16th century. I do not think it is a coincidence that Luther grew up as one of the very first generation to have access to printed books, much as some of you may have children who were the first to grow up with access to e-books, to iPads and to the internet….

The next big step for AI is the inevitable achievement of Artificial General Intelligence, or AGI, sometimes called ‘full artificial intelligence’ – the point at which machines really do think like humans. In 2013, hundreds of experts were asked when they thought AGI may arise, and the median prediction was the year 2040. After that the probable – most would say certain – next step is artificial super-intelligence and the possibility of reaching what is called the Technological Singularity – what computer pioneer John von Neumann described as the point “…beyond which human affairs, as we know them, could not continue.” I don’t think I have to worry about that. Plenty of you in this tent have cause to, and your children beyond question will certainly know all about it. Unless of course the climate causes such havoc that we reach a Meteorological Singularity. Or the nuclear codes are penetrated by a self-teaching algorithm whose only purpose is to find a way to launch…

It’s clear that, while it is hard to calculate the cascade upon cascade of new developments and their positive effects, we already know the dire consequences and frightening scenarios that threaten to engulf us. We know them because science fiction writers and dystopians in all media have got there before us and laid the nightmare visions out. Their imaginations have seen it all coming. So whether you believe Ray Bradbury, George Orwell, Aldous Huxley, Isaac Asimov, Margaret Atwood, Ridley Scott, Anthony Burgess, H. G. Wells, Stanley Kubrick, Kazuo Ishiguro, Philip K. Dick, William Gibson, John Wyndham, James Cameron, the Wachowskis or the scores and scores of other authors and film-makers who have painted scenarios of chaos and doom, you can certainly believe that a great transformation of human society is under way, greater than Gutenberg’s revolution – greater I would submit than the Industrial Revolution (though clearly dependent on it) – the greatest change to our ways of living since we moved from hunting and gathering to settling down in farms, villages and seaports and started to trade and form civilisations. Whether it will alter the behaviour, cognition and identity of the individual in the same way it is certain to alter the behaviour, cognition and identity of the group – well, that is a hard question to answer.

But believe me when I say that it is happening. To be frank, it has happened. The unimaginably colossal sums of money that have flowed to the first two generations of Silicon Valley pioneers have filled their coffers, their war chests, and they are all investing in autonomous cars, biotech, the IoT, robotics, Artificial Intelligence and their convergence. None more so than the outlier, the front-runner Mr Elon Musk, whose Neuralink system is well worth your reading about online on the great waitbutwhy.com website. Its author Tim Urban is a paid consultant of Elon Musk’s, so he has the advantage of knowing what he is writing about but the potential disadvantage of being parti pris and lacking in objectivity. Elon Musk made enough money from his part in the founding and running of PayPal to fund his manifold exploits. The Neuralink project joins his Tesla automobile company and subsidiary battery and solar power businesses, his SpaceX reusable spacecraft group, his OpenAI initiative and his Hyperloop transport system. The 1950s and 60s Space Race was funded by sovereign governments; this race is funded by private equity, by the original investors in Google, Apple, Facebook and so on. Nation states and their agencies are not major players in this game, least of all poor old Britain. Even if our politicians were across this issue, and they absolutely are not, our votes would still be an irrelevance….

So one thesis I would have to nail up to the tent is to clamour for government to bring all this deeper into schools and colleges. The subject of the next technological wave, I mean, not pornography and prostitution. Get people working at the leading edge of AI and robotics to come into the classrooms. But more importantly listen to them – even if what they say is unpalatable, our masters must have the intellectual courage and honesty to say if they don’t understand and ask for repetition and clarification. This time, in other words, we mustn’t let the wave engulf us, we must ride its crest. It’s not quite too late to re-gear governmental and educational planning and thinking….

The witlessness of our leaders and of ourselves is indeed a problem. The real danger surely is not technology but technophobic Canute-ism, a belief that we can control, change or stem the technological tide instead of understanding that we need to learn how to harness it. Driving cars is dangerous, but we developed driving lesson requirements, traffic controls, seat-belts, maintenance protocols, proximity sensors, emission standards – all kinds of ways of mitigating the danger so as not to deny ourselves the life-changing benefits of motoring.

We understand why angry Ned Ludd destroyed the weaving machines that were threatening his occupation (the Luddites were prophetic in their way: it was weaving machines that first used the punched cards on which computers relied right up to the 1970s). We understand too why French workers took their clogs, their sabots as they were called, and threw them into the machinery to jam it up, giving us the word sabotage. But we know that they were, in the end, if you’ll pardon the phrase, pissing into the wind. No technology has ever been stopped.

So what is the thesis I am nailing up? Well, there is no authority for me to protest to, no equivalent of Pope Leo X for it to be delivered to, and I am certainly no Martin Luther. The only thesis I can think worth nailing up is absurdly simple. It is a cry as much from the heart as from the head and it is just one word – Prepare. We have an advantage over our hunter gatherer and farming ancestors, for whether it is Winter that is coming, or a new Spring, is entirely in our hands, so long as we prepare….(More)”.

ControCurator: Understanding Controversy Using Collective Intelligence


Paper by Benjamin Timmermans et al: “There are many issues in the world that people do not agree on, such as Global Warming [Cook et al. 2013], Anti-Vaccination [Kata 2010] and Gun Control [Spitzer 2015]. Having opposing opinions on such topics can lead to heated discussions, making them appear controversial. Such opinions are often expressed through news articles and social media. There are increasing calls for methods to detect and monitor these online discussions on different topics. Existing methods focus on using sentiment analysis and Wikipedia for identifying controversy [Dori-Hacohen and Allan 2015]. The problem with this is that it relies on a well-structured and existing debate, which may not always be the case. Take for instance news reporting during large disasters, where the structure of a discussion is not yet clear and may change rapidly. Adding to this, there is currently no agreed-upon definition of what exactly constitutes controversy. It is only agreed that controversy arises when there is a large debate between people with opposing viewpoints, but we do not yet understand what its characteristic aspects are or how they can be measured. In this paper we use the collective intelligence of the crowd in order to gain a better understanding of controversy by evaluating the aspects that have an impact on it….(More)”

See also http://crowdtruth.org/

 

How can we study disguised propaganda on social media? Some methodological reflections


Jannick Schou and Johan Farkas at DataDrivenJournalism: “‘Fake news’ has recently become a seemingly ubiquitous concept among journalists, researchers, and citizens alike. With the rise of platforms such as Facebook and Twitter, it has become possible to spread deliberate forms of misinformation in hitherto unforeseen ways. This has also spilled over into the political domain, where new forms of (disguised) propaganda and false information have recently begun to emerge. These new forms of propaganda have very real effects: they serve to obstruct political decision-making processes, instil false narratives within the general public, and add fuel to already heated sites of political conflict. They represent a genuine democratic problem.

Yet, so far, both critical researchers and journalists have faced a number of issues and challenges when attempting to understand these new forms of political propaganda. Simply put: when it comes to disguised propaganda and social media, we know very little about the actual mechanisms through which such content is produced, disseminated, and negotiated. One of the key explanations for this might be that fake profiles and disguised political agendas are incredibly difficult to study. They present a serious methodological challenge. This is not only due to their highly ephemeral nature, with Facebook pages being able to vanish after only a few days or hours, but also because of the anonymity of its producers. Often, we simply do not know who is disseminating what and with what purpose. This makes it difficult for us to understand and research exactly what is going on.

This post takes its point of departure from a new article published in the international academic journal New Media & Society. Based on the research done for this article, we want to offer some methodological reflections as to how disguised propaganda might be investigated. How can we research fake and disguised political agendas? And what methodological tools do we have at our disposal?…

…two main pieces of methodological advice spring to mind. First of all: collect as much data as you can in as many ways as possible. Make screenshots, take detailed written observations, use data scraping, and (if possible) participate in citizen groups. One of the most valuable resources we had at our disposal was the set of heterogeneous data we collected from each page. Using this allowed us to carefully dissect and retrace the complex set of practices involved in each page long after it was gone. While we certainly tried to be as systematic in our data collection as possible, we also had to use every tool at our disposal. And we had to constantly be on our toes. As soon as a page emerged, we were there: ready to write down notes and collect data.
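The “collect everything, in every way” advice can be sketched in code. This is purely illustrative – the field names and helper functions below are our own assumptions, not part of the authors’ method – but it shows how heterogeneous observations (field notes, screenshot paths, scraped HTML) can share one timestamped log, so that an ephemeral page can be retraced after it vanishes:

```python
import json
import time

def make_observation(page_id, method, content):
    """Bundle one observation with the page it concerns, how it was
    captured (e.g. "screenshot", "scrape", "field_note"), and when."""
    return {
        "page_id": page_id,
        "method": method,
        "content": content,
        "captured_at": time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime()),
    }

def observations_for(log, page_id):
    """Retrace a single page's history from the mixed log."""
    return [o for o in log if o["page_id"] == page_id]

# A toy log mixing the three kinds of evidence for two (invented) pages.
log = [
    make_observation("page-A", "field_note", "Page posts disguised political ads."),
    make_observation("page-A", "screenshot", "archive/page-A-2017-06-01.png"),
    make_observation("page-B", "scrape", "<html>...</html>"),
]
print(json.dumps(observations_for(log, "page-A"), indent=2))
```

The point of the sketch is the discipline, not the code: every capture method feeds the same log, so the record survives even when the page does not.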

Second: be willing to participate and collaborate. Our research showcases the immense potential in researchers (and journalists) actively collaborating with citizen groups and grassroots movements. Using the collective insights and attention of this group allowed us to quickly find and track down pages. It gave us renewed methodological strength. Collaborating across otherwise closed boundaries between research and journalism opens up new avenues for deeper and more detailed insights….(More)”

A framework for analyzing digital volunteer contributions in emergent crisis response efforts


Paper in New Media & Society: “Advances in information, communication, and computational technologies allow digital volunteer networks formed by concerned publics across the globe to contribute to an effective response to disasters and crises. Digital volunteer networks are event-centric and emergent networks. Currently, the literature is growing sharply in the fields of communication, computer science, emergency management, and geography. This article aims to assess the current status of the literature and suggest a comprehensive conceptual framework of digital volunteer networks in response to disasters and crises. This framework is based on a traditional input–process–output model consisting of three dimensions: the disaster and crisis context, a voluntary response process, and outputs and outcomes. We also discuss challenges of digital volunteer networks for crisis response. This article is expected to contribute to the development of related theories and hypotheses and practical strategies for managing digital volunteer networks…(More)”.

Data Collaboratives: exchanging data to create public value across Latin America and the Caribbean


Stefaan Verhulst, Andrew Young and Prianka Srinivasan at IADB’s Abierto al Publico: “Data is playing an ever-increasing role in bolstering businesses across Latin America – and the rest of the world. In Brazil, Mexico and Colombia alone, the revenue from Big Data is calculated at more than US$603.7 million, a market that is only set to increase as more companies across Latin America and the Caribbean embrace data-driven strategies to enhance their bottom line. Brazilian banking giant Itau plans to create six data centers across the country, and already uses data collected from consumers online to improve cross-selling techniques and streamline their investments. Data from web clicks, social media profiles, and telecommunication services is fueling a new generation of entrepreneurs keen to make big dollars from big data.

What if this same data could be used not just to improve business, but to improve the collective well-being of our communities, public spaces, and cities? Analysis of social media data can offer powerful insights to city officials into public trends and movements to better plan infrastructure and policies. Public health officials and humanitarian workers can use mobile phone data to, for instance, map human mobility and better target their interventions. By repurposing the data collected by companies for their business interests, governments, international organizations and NGOs can leverage big data insights for the greater public good.

The key question is thus: how can we unlock useful data collected by corporations in a responsible manner and ensure its vast potential does not go to waste?

“Data Collaboratives” are emerging as a possible answer. Data collaboratives are a new type of public–private partnership aimed at creating public value by exchanging data across sectors.

Research conducted by the GovLab finds that Data Collaboratives offer several potential benefits across a number of sectors, including humanitarian and anti-poverty efforts, urban planning, natural resource stewardship, health, and disaster management. As a greater number of companies in Latin America look to data to spur business interests, our research suggests that some companies are also sharing and collaborating around data to confront some of society’s most pressing problems.

Consider the following Data Collaboratives that seek to enhance…(More)”

Digital platforms and democracy


Ricard Espelt and Monica Garriga at Open Democracy: “The impact of digital platforms in recent years affects all areas and all sorts of organizations: from production to consumption, from political parties to social movements, from business to public administration, trade unions, universities or the mass media. The disruption they generate cuts across sectors and generations. Undoubtedly, their outstanding assets – at least from a discursive point of view – are self-management and disintermediation. Today, through technology, people can participate actively in processes related to any particular activity. This is why we often talk about digital platforms as tools for democratizing participation, overcoming as they do the traditional tyranny of space and time. If we analyze them in detail, however, and look at the organizations that promote them, we realize that the improvement in citizen involvement tends to vary, sometimes considerably, as does the logic behind their approach…..

La Teixidora, a democratic digital platform

Being aware now of the risks of partial evaluation of the impact of technology and the key elements to be considered in analyzing it, let us return to our starting point: democratizing participation. Given the importance of local assessment of global digital tools, let us now see the case of the multimedia platform La Teixidora, which allows us to synthesize the aspects which, in our opinion, shape democratic participation.

Platform cooperativism or open cooperativism, whether it focuses on the social strength of cooperative values or on the need to reappropriate common goods, calls for a detailed critical review of the local activity of its digital platforms.

This initiative, launched in 2016 in Barcelona, organizes in real time a collaborative structure with the aim of mapping distributed knowledge generated in different parts of the city during conferences, meetings, workshops and other offline meeting formats related to technopolitics and the commons. To do this, it appropriates several open source tools (collaborative editor, wiki, content storage spaces) and uses a Creative Commons license which, while recognizing authorship, allows anyone to adapt the contents and even use them commercially. Two significant applications illustrate the value of its functionality in relation to democratizing participation:

  1. In March 2016 La Teixidora covered, with a team of some twenty people, a debate on Collaborative Economy (Economies Col·laboratives Procomuns). The classified data were then transferred to the Decidim Barcelona platform, which has helped to define, through a broad participatory process, the Municipal Action Plan of the Barcelona City Council.
  2. At the same time, the tool has been used to monitor the fifteen teams which have been following the economic development program La Comunificadora, whose aim is the promotion of social transformation projects and the advancement of entrepreneurship. Through La Teixidora, the participants have been able to establish a space for exchanging knowledge among them, with the mentors, with the city service managers and with citizens in general. All its contents are open and reusable.

In short, through this platform, both processes have been able not only to gather proposals but also to form an open learning space. And the mapping of participation makes these processes – both of which are promoted by the Public Administration – transparent and accountable, thus improving their democratic quality. At the same time, the information and the learning from their use are helping to redesign the technological platform itself and adapt it to the needs of the communities involved….(More)”.

Twitter as a data source: An overview of tools for journalists


Wasim Ahmed at Data Driven Journalism: “Journalists may wish to use data from social media platforms in order to provide greater insight and context to a news story. For example, journalists may wish to examine the contagion of hashtags and whether they are capable of achieving political or social change. Moreover, newsrooms may also wish to tap into social media posts during unfolding crisis events. For example, to find out who tweeted about a crisis event first, and to empirically examine the impact of social media.

Furthermore, Twitter users and accounts such as WikiLeaks may operate outside the constraints of traditional journalism, and therefore it becomes important to have tools and mechanisms in place in order to examine these kinds of influential users. For example, it was found that those who were backing Marine Le Pen on Twitter could have been users who had an affinity to Donald Trump.

There remain a number of different methods for analysing social media data. Take text analytics, for example, which can include using sentiment analysis to sort bulk social media posts into categories of feeling, such as positive, negative, or neutral. Or machine learning, which can automatically assign social media posts to a number of different topics.
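To make the sentiment-analysis idea concrete, here is a deliberately minimal lexicon-based sketch. The cue-word lists and the scoring rule are toy assumptions of ours, not the lexicon of any real tool; production systems use far larger dictionaries or trained models:

```python
import re

# Illustrative cue-word lists; a real sentiment lexicon is far larger.
POSITIVE = {"great", "love", "win", "hope"}
NEGATIVE = {"fake", "scandal", "fail", "fear"}

def classify(post):
    """Bucket a post as positive, negative, or neutral by counting cue words."""
    words = set(re.findall(r"[a-z']+", post.lower()))
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

posts = [
    "Great rally tonight, lots of hope",
    "Another fake scandal",
    "Polls open at 8am",
]
print([classify(p) for p in posts])  # → ['positive', 'negative', 'neutral']
```

Even this crude bucketing shows why sentiment analysis scales: each post is scored independently, so millions of tweets can be categorised the same way.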

There are other methods such as social network analysis, which examines online communities and the relationships between them. A number of qualitative methodologies also exist, such as content analysis and thematic analysis, which can be used to manually label social media posts. From a journalistic perspective, network analysis may be of importance initially via tools such as NodeXL. This is because it can quickly provide an overview of influential Twitter users alongside a topic overview.
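The kind of overview a network tool surfaces can be pictured with a toy sketch (the handles and retweet pairs below are invented): rank accounts by their in-degree in the retweet network, that is, by how often others retweet them:

```python
from collections import Counter

# Each pair is (retweeter, original_author) for one retweet.
retweets = [
    ("alice", "newsbot"), ("bob", "newsbot"), ("carol", "newsbot"),
    ("alice", "dave"), ("bob", "erin"),
]

# In-degree in the retweet network: how many times each account was retweeted.
influence = Counter(author for _, author in retweets)
for author, count in influence.most_common():
    print(author, count)
```

In this toy network, "newsbot" immediately stands out as the most-retweeted account – the same kind of at-a-glance influence ranking, alongside a topic overview, that makes network analysis a useful first step for journalists.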

From an industry standpoint, there has been much focus on gaining insight into users’ personalities, through services such as IBM Watson’s Personality Insights service. This uses linguistic analytics to derive intrinsic personality insights, such as emotions like anxiety, self-consciousness, and depression. This information can then be used by marketers to target certain products; for example, anti-anxiety medication to users who are more anxious…(An overview of tools for 2017).”