Digital Media Integration for Participatory Democracy


Book by Rocci Luppicini and Rachel Baarda: “Digital technology has revitalized the landscape of political affairs. As e-government continues to become more prominent in society, conducting further research in this realm is vital to promoting democratic advancements.

Digital Media Integration for Participatory Democracy provides a comprehensive examination of the latest methods and trends used to engage citizens with the political world through new information and communication technologies. Highlighting innovative practices and applications across a variety of areas such as technoethics, civic literacy, virtual reality, and social networking, this book is an ideal reference source for government officials, academicians, students, and researchers interested in the enhancement of citizen engagement in modern democracies….(More)”

Restoring Trust in Expertise


Minouche Shafik at Project Syndicate: “…public confidence in experts is at a crossroads. With news becoming more narrowly targeted to individual interests and preferences, and with people increasingly choosing whom to trust and follow, the traditional channels for sharing expertise are being disrupted. Who needs experts when you have Facebook, Google, Mumsnet, and Twitter?

Actually, we all do. Over the course of human history, the application of expertise has helped tackle disease, reduce poverty, and improve human welfare. If we are to build on this progress, we need reliable experts to whom the public can confidently turn.

Restoring confidence requires, first, that those describing themselves as “experts” embrace uncertainty. Rather than pretending to be certain and risking frequently getting it wrong, commentators should be candid about uncertainty. Over the long term, such an approach will rebuild credibility. A good example is the use of “fan charts” in forecasts produced by the Bank of England’s Monetary Policy Committee (MPC), which show the wide range of possible outcomes for quantities such as inflation, growth, and unemployment.

Yet conveying uncertainty increases the complexity of a message. This is a major challenge. It is easy to tweet “BoE forecasts 2% growth.” The fan chart’s true meaning – “If economic circumstances identical to today were to prevail on 100 occasions, the MPC’s best collective judgment is that the mature estimate of GDP growth would lie above 2% on 50 occasions and below 2% on 50 occasions” – doesn’t even fit within Twitter’s 140-character limit.
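The fan chart’s message can be made concrete with a small numerical sketch. The snippet below is purely illustrative (invented numbers, not the Bank of England’s actual model): it draws simulated growth outcomes and reports the percentile bands a fan chart would shade.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical forecast: 10,000 simulated GDP growth outcomes centered
# on 2% with substantial uncertainty (illustrative only, not the MPC's
# actual forecasting model).
simulated_growth = rng.normal(loc=2.0, scale=1.0, size=10_000)

# A fan chart shades bands between outcome percentiles; the median is
# the "central" projection, but the bands convey the full range.
bands = {p: np.percentile(simulated_growth, p) for p in (5, 25, 50, 75, 95)}
print(f"Median projection: {bands[50]:.1f}%")
print(f"Central 90% band: {bands[5]:.1f}% to {bands[95]:.1f}%")
```

The tweetable number is just the 50th percentile; the bands around it carry exactly the uncertainty that a headline figure discards.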

This underscores the need for sound principles and trustworthy practices to become more widespread as technology changes the way we consume information. Should journalists and bloggers be exposed for reporting or recirculating falsehoods or rumors? Perhaps principles and practices widely used in academia – such as peer review, competitive processes for funding research, transparency about conflicts of interests and financing sources, and requirements to publish underlying data – should be adapted and applied more widely to the world of think tanks, websites, and the media….

Schools and universities will have to do more to educate students to be better consumers of information. Striking research by the Stanford History Education Group, based on tests of thousands of students across the US, described as “bleak” their findings about young people’s ability to evaluate information they encounter online. Fact-checking websites appraising the veracity of claims made by public figures are a step in the right direction, and have some similarities to peer review in academia.

Listening to the other side is crucial. Social media exacerbates the human tendency toward groupthink by filtering out opposing views. We must therefore make an effort to engage with opinions different from our own and resist algorithmic channeling that steers us away from difference. Perhaps technology “experts” could code algorithms that burst such bubbles.
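A bubble-bursting ranker of the kind gestured at here could be as simple as reserving a few feed slots for low-affinity content. The sketch below is entirely hypothetical (no platform’s real algorithm); the item ids and affinity scores are invented.

```python
def diversified_feed(items, user_affinity, k=5, contrarian_slots=2):
    """Rank items by predicted affinity, but reserve a few slots for the
    items the user is least likely to agree with -- a toy sketch of a
    'bubble-bursting' ranking, not any platform's actual algorithm.

    items: list of item ids; user_affinity: dict id -> score in [0, 1].
    """
    ranked = sorted(items, key=lambda i: user_affinity[i], reverse=True)
    familiar = ranked[: k - contrarian_slots]      # what a filter bubble would show
    contrarian = ranked[::-1][:contrarian_slots]   # deliberately opposing content
    return familiar + contrarian

feed = diversified_feed(
    ["a", "b", "c", "d", "e", "f"],
    {"a": 0.9, "b": 0.8, "c": 0.7, "d": 0.4, "e": 0.2, "f": 0.1},
)
print(feed)  # the top-affinity items plus the two least-affine ones
```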

Finally, the boundary between technocracy and democracy needs to be managed more carefully. Not surprisingly, when unelected individuals steer decisions that have huge social consequences, public resentment may not be far behind. Problems often arise when experts try to be politicians or politicians try to be experts. Clarity about roles – and accountability when boundaries are breached – is essential.

We need expertise more than ever to solve the world’s problems. The question is not how to manage without experts, but how to ensure that expertise is trustworthy. Getting this right is vital: if the future is not to be shaped by ignorance and narrow-mindedness, we need knowledge and informed debate more than ever before….(More)”.

Global Patterns of Synchronization in Human Communications


Alfredo J. Morales, Vaibhav Vavilala, Rosa M. Benito, and Yaneer Bar-Yam in the Journal of the Royal Society Interface: “Social media are transforming global communication and coordination and provide unprecedented opportunities for studying socio-technical domains. Here we study global dynamical patterns of communication on Twitter across many scales. Underlying the observed patterns is both the diurnal rotation of the earth, day and night, and the synchrony required for contingency of actions between individuals. We find that urban areas show a cyclic contraction and expansion that resembles heartbeats linked to social rather than natural cycles. Different urban areas have characteristic signatures of daily collective activities. We show that the differences detected are consistent with a new emergent global synchrony that couples behavior in distant regions across the world. Although local synchrony is the major force that shapes the collective behavior in cities, a larger-scale synchronization is beginning to occur….(More)”.
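The idea of a city’s characteristic daily signature can be illustrated with a toy computation: bin message timestamps into 24 hourly buckets, normalize, and compare two cities’ profiles. The timestamps below are invented; the paper’s own analysis works on large-scale Twitter data.

```python
from collections import Counter

def daily_signature(hours):
    """Normalize hourly message counts into a 24-bin activity profile."""
    counts = Counter(hours)
    total = sum(counts.values())
    return [counts.get(h, 0) / total for h in range(24)]

# Invented timestamps (hour of day, local time) for two hypothetical cities.
city_a = daily_signature([8, 9, 9, 12, 18, 19, 19, 22])
city_b = daily_signature([0, 1, 9, 12, 13, 20, 21, 23])

# A simple dissimilarity between two signatures (L1 distance);
# perfectly synchronized cities would score near zero.
distance = sum(abs(a - b) for a, b in zip(city_a, city_b))
print(f"{distance:.2f}")
```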

When the Big Lie Meets Big Data


Peter Bruce in Scientific American: “…The science of predictive modeling has come a long way since 2004. Statisticians now build “personality” models and tie them into other predictor variables. … One such model bears the acronym “OCEAN,” standing for the personality characteristics (and their opposites) of openness, conscientiousness, extroversion, agreeableness, and neuroticism. Using Big Data at the individual level, machine learning methods might classify a person as, for example, “closed, introverted, neurotic, not agreeable, and conscientious.”
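Such a classification step can be sketched as a simple logistic model over behavioral features. Everything here is invented for illustration (the feature names, the weights, the threshold); real systems would learn such weights from survey-labelled training data.

```python
import math

# Hypothetical learned weights mapping behavioral signals to one
# OCEAN-style trait ("extroversion"); purely illustrative values.
TRAIT_WEIGHTS = {
    "posts_per_day": 0.8,
    "events_attended": 0.6,
    "late_night_browsing": -0.4,
}
BIAS = -1.0

def extroversion_probability(features):
    """Linear score squashed through a logistic function to [0, 1]."""
    z = BIAS + sum(TRAIT_WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))

p = extroversion_probability(
    {"posts_per_day": 3.0, "events_attended": 1.0, "late_night_browsing": 2.0}
)
label = "extroverted" if p > 0.5 else "introverted"
print(f"{p:.2f} -> {label}")
```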

Alexander Nix, CEO of Cambridge Analytica (owned by Trump’s chief donor, Robert Mercer), says he has thousands of data points on you, and every other voter: what you buy or borrow, where you live, what you subscribe to, what you post on social media, etc. At a recent Concordia Summit, using the example of gun rights, Nix described how messages will be crafted to appeal specifically to you, based on your personality profile. Are you highly neurotic and conscientious? Nix suggests the image of a sinister gloved hand reaching through a broken window.

In his presentation, Nix noted that the goal is to induce behavior, not communicate ideas. So where does truth fit in? Johan Ugander, Assistant Professor of Management Science at Stanford, suggests that, for Nix and Cambridge Analytica, it doesn’t. In counseling the hypothetical owner of a private beach how to keep people off his property, Nix eschews the merely factual “Private Beach” sign, advocating instead a lie: “Sharks sighted.” Ugander, in his critique, cautions all data scientists against “building tools for unscrupulous targeting.”

The warning is needed, but may be too late. What Nix described in his presentation involved carefully crafted messages aimed at his target personalities. His messages pulled subtly on various psychological strings to manipulate us, and they obeyed no boundary of truth, but they required humans to create them. The next phase will be the gradual replacement of human “craftsmanship” with machine learning algorithms that can supply targeted voters with a steady stream of content (from whatever source, true or false) designed to elicit desired behavior. Cognizant of the Pandora’s box that data scientists have opened, the scholarly journal Big Data has issued a call for papers for a future issue devoted to “Computational Propaganda.”…(More)”

Open Government Data in Africa: A preference elicitation analysis of media practitioners


Eric Afful-Dadzie and Anthony Afful-Dadzie in Government Information Quarterly: “Open Government Data (OGD) continues to receive considerable traction around the world. In particular, there have been a growing number of OGD establishments in the developed world, sparking expectations of similar trends in growing democracies. To understand the readiness of OGD stakeholders in Africa, especially the media, this paper (1) reviews current infrastructure at OGD web portals in Africa and (2) conducts a preference elicitation analysis among media practitioners in 5 out of the 7 OGD country centers in Africa regarding the desired structure of OGD in developing countries. The analysis gives a view of the relative importance media practitioners ascribe to a selected set of OGD attributes in anticipation of a more functional OGD in their respective countries. Using conjoint analysis, the result indicates that media practitioners place a premium on ‘metadata’ and ‘data format’, in that order of importance. Results from the review also reveal that features of current OGD web portals in Africa are not consistent with the desired preferences of users. Overall, the study provides a general insight into media expectations of OGD in Africa, and also serves as foundational knowledge for authorities and practitioners to manage expectations of the media in connection with OGD in Africa….(More)”.
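The paper’s core finding (metadata weighted above data format) can be illustrated with a toy conjoint computation: respondents rate portal profiles that vary binary attributes, and least squares recovers each attribute’s part-worth. The ratings below are invented for illustration, not the paper’s data.

```python
import numpy as np

# Each row is a portal profile: [has rich metadata, has machine-readable format].
profiles = np.array([
    [1, 1],
    [1, 0],
    [0, 1],
    [0, 0],
])
# Hypothetical mean ratings of each profile by respondents.
ratings = np.array([9.0, 7.0, 6.0, 4.0])

# Fit rating = baseline + w_metadata * x1 + w_format * x2 by least squares.
X = np.column_stack([np.ones(len(profiles)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
baseline, w_metadata, w_format = coef
print(f"metadata part-worth: {w_metadata:.1f}, format part-worth: {w_format:.1f}")
```

With these invented ratings, metadata carries the larger part-worth, mirroring the ordering the study reports.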

Crowdsourcing Cybersecurity: Cyber Attack Detection using Social Media


Paper by Rupinder Paul Khandpur, Taoran Ji, Steve Jan, Gang Wang, Chang-Tien Lu, Naren Ramakrishnan: “Social media is often viewed as a sensor into various societal events such as disease outbreaks, protests, and elections. We describe the use of social media as a crowdsourced sensor to gain insight into ongoing cyber-attacks. Our approach detects a broad range of cyber-attacks (e.g., distributed denial of service (DDoS) attacks, data breaches, and account hijacking) in an unsupervised manner using just a limited fixed set of seed event triggers. A new query expansion strategy based on convolutional kernels and dependency parses helps model reporting structure and aids in identifying key event characteristics. Through a large-scale analysis over Twitter, we demonstrate that our approach consistently identifies and encodes events, outperforming existing methods….(More)”
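The unsupervised seed-and-expand idea can be sketched in a few lines. Note that the paper’s actual expansion uses convolutional kernels over dependency parses; the frequency-based stand-in below, with invented posts and seed terms, only conveys the general shape.

```python
from collections import Counter

# Fixed seed triggers for cyber-attack events (illustrative terms).
SEEDS = {"ddos", "breach", "hacked"}

posts = [
    "major ddos attack takes down bank website",
    "customer data breach reported at retailer",
    "account hacked after phishing attack",
    "great weather for a picnic today",
]

def expand_query(posts, seeds, top_n=2):
    """Add to the seed set the terms that co-occur most often with it."""
    cooccur = Counter()
    for post in posts:
        tokens = set(post.split())
        if tokens & seeds:                  # post matches a seed trigger
            cooccur.update(tokens - seeds)  # count its other terms
    return seeds | {w for w, _ in cooccur.most_common(top_n)}

expanded = expand_query(posts, SEEDS)
print(sorted(expanded))
```

Here “attack” co-occurs with seeds in three posts and is picked up, letting the expanded query match future attack reports that omit the original trigger words.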

Social Media for Government


Book by Gohar Feroz Khan: “This book provides practical know-how on understanding, implementing, and managing mainstream social media tools (e.g., blogs and micro-blogs, social network sites, and content communities) from a public sector perspective. Through social media, government organizations can inform citizens, promote their services, seek public views and feedback, and monitor satisfaction with the services they offer so as to improve their quality. Given the exponential growth of social media in contemporary society, it has become an essential tool for communication, content sharing, and collaboration. This growth and these tools also present an unparalleled opportunity to implement a transparent, open, and collaborative government. However, many government organizations, particularly those in the developing world, are still somewhat reluctant to leverage social media, as it requires significant policy and governance changes, as well as specific know-how, skills, and resources to plan, implement, and manage social media tools. As a result, governments around the world ignore or mishandle the opportunities and threats presented by social media. To help policy makers and governments implement a social media-driven government, this book provides guidance in developing an effective social media policy and strategy. It also addresses issues such as those related to security and privacy….(More)”

Why We Make Free, Public Information More Accessible


Gabi Fitz and Lisa Brooks in PhilanTopic: “One of the key roles the nonprofit sector plays in civil society is providing evidence about social problems and their solutions. Given recent changes to policies regarding the sharing of knowledge and evidence by federal agencies, that function is more critical than ever.

Nonprofits deliver more than direct services such as running food banks or providing shelter to people who are homeless. They also collect and share data, evidence, and lessons learned so as to help all of us understand complex and difficult problems.

Those efforts not only serve to illuminate and benchmark our most pressing social problems; they also inform the actions we take, whether at the individual, organizational, community, or policy level. Often, they provide the evidence in “evidence-based” decision making, not to mention the knowledge that social sector organizations and policy makers rely on when shaping their programs and services, and that individual citizens turn to when informing their own engagement.

In January 2017, several U.S. government agencies, including the Environmental Protection Agency and the Departments of Health and Human Services and Agriculture, were ordered by officials of the incoming Trump administration not to share anything that could be construed as controversial through official communication channels such as websites and social media channels. (See “Federal Agencies Told to Halt External Communications.”) Against that backdrop, the nonprofit sector’s interest in generating and sharing evidence has become more urgent than ever…..

Providing access to evidence and lessons learned is always important, but in light of recent events, we believe it’s more necessary than ever. That’s why we are asking for your help in providing — and preserving — access to this critical knowledge base.

Over the next few months, we will be updating and maintaining special collections of non-academic research on the following topics and need lead curators with issue expertise to lend us a hand. IssueLab special collections are an effort to contextualize important segments of the growing evidence base we curate, and are one of the ways we help visitors to the platform learn about nonprofit organizations and resources that may be useful to their work and knowledge-gathering efforts.

Possible special collection topics to be updated or curated:

→ Access to reproductive services (new)
→ Next steps for ACA
→ Race and policing
→ Immigrant detention and deportation
→ Climate change and extractive mining (new)
→ Veterans affairs
→ Gun violence

If you are a researcher, knowledge broker, or service provider in any of these fields of practice, please consider volunteering as a lead curator. …(More)”

Corporate Social Responsibility for a Data Age


Stefaan G. Verhulst in the Stanford Social Innovation Review: “Proprietary data can help improve and save lives, but fully harnessing its potential will require a cultural transformation in the way companies, governments, and other organizations treat and act on data….

We live, as it is now common to point out, in an era of big data. The proliferation of apps, social media, and e-commerce platforms, as well as sensor-rich consumer devices like mobile phones, wearable devices, commercial cameras, and even cars, generates zettabytes of data about the environment and about us.

Yet much of the most valuable data resides with the private sector—for example, in the form of click histories, online purchases, sensor data, and call data records. This limits its potential to benefit the public and to turn data into a social asset. Consider how data held by business could help improve policy interventions (such as better urban planning) or resiliency at a time of climate change, or help design better public services to increase food security.

Data responsibility suggests steps that organizations can take to break down these private barriers and foster so-called data collaboratives, or ways to share their proprietary data for the public good. For the private sector, data responsibility represents a new type of corporate social responsibility for the 21st century.

While Nepal’s Ncell belongs to a relatively small group of corporations that have shared their data, there are a few encouraging signs that the practice is gaining momentum. In Jakarta, for example, Twitter exchanged some of its data with researchers who used it to gather and display real-time information about massive floods. The resulting website, PetaJakarta.org, enabled better flood assessment and management processes. And in Senegal, the Data for Development project has brought together leading cellular operators to share anonymous data to identify patterns that could help improve health, agriculture, urban planning, energy, and national statistics.

Examples like this suggest that proprietary data can help improve and save lives. But to fully harness the potential of data, data holders need to fulfill at least three conditions. I call these the “three pillars of data responsibility.”…

The difficulty of translating insights into results points to some of the larger social, political, and institutional shifts required to achieve the vision of data responsibility in the 21st century. The move from data shielding to data sharing will require a cultural transformation in the way companies, governments, and other organizations treat and act on data. We must incorporate new levels of proactivity, and make often-unfamiliar commitments to transparency and accountability.

By way of conclusion, here are four immediate steps—essential but not exhaustive—we can take to move forward:

  1. Data holders should issue a public commitment to data responsibility so that it becomes the default—an expected, standard behavior within organizations.
  2. Organizations should hire data stewards to determine what and when to share, and how to protect and act on data.
  3. We must develop a data responsibility decision tree to assess the value and risk of corporate data along the data lifecycle.
  4. Above all, we need a data responsibility movement; it is time to demand data responsibility to ensure data improves and safeguards people’s lives…(More)”

Why big data may be having a big effect on how our politics plays out


In The Conversation: “…big data… is an inconceivably vast mass of information, which at first glance would seem a giant mess: just white noise.

Unless you know how to decipher it.

According to a story first published in Zurich-based Das Magazin in December and more recently taken up by Motherboard, events such as Brexit and Trump’s ascendancy may have been made possible through just such deciphering. The argument is that technology combining psychological profiling and data analysis may have played a pivotal part in exploiting unconscious bias at the individual voter level. The theory is that this was used in the recent US election to increase or suppress votes to benefit particular candidates in crucial locations. It is claimed that the company behind this may be active in numerous countries.

The technology at play is based on the integration of a model of psychological profiling known as OCEAN. This uses the details contained within individuals’ digital footprints to create user-specific profiles. These map to the level of the individual, identifiable voter, who can then be manipulated by exploiting beliefs, preferences and biases that they might not even be aware of, but which their data has revealed about them in glorious detail.

As well as enabling the creation of tailored media content, this can also be used to create scripts of relevant talking points for campaign doorknockers to focus on, according to the address and identity of the householder to whom they are speaking.

This goes well beyond the scope and detail of previous campaign strategies. If the theory about the role of these techniques is correct, it signals a new landscape of political strategising. An active researcher in the field, writing about the company behind this technology (whose services Trump paid for during his election campaign), described the potential scale of such technologies:

Marketers have long tailored their placement of advertisements based on their target group, for example by placing ads aimed at conservative consumers in magazines read by conservative audiences. What is new about the psychological targeting methods implemented by Cambridge Analytica, however, is their precision and scale. According to CEO Alexander Nix, the company holds detailed psycho-demographic profiles of more than 220 million US citizens and used over 175,000 different ad messages to meet the unique motivations of their recipients….(More)”