Social media and citizen engagement: A meta-analytic review


Paper by Marko M. Skoric et al.: “This meta-analytic study reviews empirical research published from 2007 to 2013 with an aim of providing robust conclusions about the relationship between social media use and citizen engagement. It includes 22 studies that used self-reported measures of social media use and participation, with a total of 116 relationships/effects. The results suggest that social media use generally has a positive relationship with engagement and its three sub-categories, that is, social capital, civic engagement, and political participation. More specifically, we find small-to-medium size positive relationships between expressive, informational, and relational uses of social media and the above indicators of citizen engagement. For identity- and entertainment-oriented uses of social media, our analyses find little evidence supporting their relationship with citizen engagement….(More)”
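As a rough illustration of how such pooled effect sizes are typically computed, here is a minimal sketch of a standard fixed-effect meta-analysis of correlations using the Fisher-z transform. This shows the general technique only, not necessarily the authors' exact procedure, and the study values are made up:

```python
# Illustrative fixed-effect meta-analysis of correlation effect sizes.
# This is a generic Fisher-z sketch, not Skoric et al.'s actual procedure;
# the (r, n) pairs below are invented.
import math

def meta_correlation(studies):
    """Pool (correlation r, sample size n) pairs into one estimate."""
    num = sum((n - 3) * math.atanh(r) for r, n in studies)  # weighted Fisher z
    den = sum(n - 3 for r, n in studies)                    # inverse-variance weights
    return math.tanh(num / den)                             # back-transform to r

studies = [(0.10, 400), (0.25, 250), (0.18, 600)]
print(round(meta_correlation(studies), 3))  # ~0.169, a small-to-medium effect
```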

For people, by people


Geeta Padmanabhan at The Hindu: “Ippodhu, a mobile app, is all about crowd-sourced civic participation for good governance…Last week, a passer-by noticed how the large hoardings outside Vivekanandar Illam, facing Marina Beach, blocked the view of the iconic building. Enraged, he whipped out his smartphone, logged on to Ippodhu and wrote: “How is this allowed? The banners are in the walking space and we can’t see the historic building!” Ippodhu.com carried the story with pictures.

“On Ippodhu, a community information mobile application, the person complaining has the option to do more,” says Peer Mohamed, the team leader of the app/website. “He could have registered a complaint with the police, the Corporation or a relevant NGO, using the ‘Act’ option. This facility makes Ippodhu a valuable tool for beleaguered citizens to complain and puts it above other social media avenues.”

Users can choose between Tamil and English, and read the latest posts just as they would in a Twitter feed. While posting, your location is geo-tagged automatically; if you find that intrusive, you can post anonymously. There is no word limit and one can enlarge the font, write an essay, a note or a rant and post it under one of 15 categories. I decided to check out the app and created an account. My post went live in less than a minute. Then I moved to Ippodhu’s USP. I clicked ‘Act’, chose ‘civic issue’ as the category, and posted a note about flooding in my locality. “It’s on Apple and Android as just text now, but expect picture and video features soon when the circulation hits the target,” says Peer. “My team of 12 journalists curates the feeds 24/7, allowing no commercials, ads or abusive language. We want to keep it non-controversial and people-friendly.” It’s crowd-sourced citizen journalism and civic participation for good governance….(More)”

Decoding the Future for National Security


George I. Seffers at Signal: “U.S. intelligence agencies are in the business of predicting the future, but no one has systematically evaluated the accuracy of those predictions—until now. The intelligence community’s cutting-edge research and development agency uses a handful of predictive analytics programs to measure and improve the ability to forecast major events, including political upheavals, disease outbreaks, insider threats and cyber attacks.

The Office for Anticipating Surprise at the Intelligence Advanced Research Projects Activity (IARPA) is a place where crystal balls come in the form of software, tournaments and throngs of people. The office sponsors eight programs designed to improve predictive analytics, which uses a variety of data to forecast events. The programs all focus on incidents outside of the United States, and the information is anonymized to protect privacy. The programs are in different stages, some having recently ended as others are preparing to award contracts.

But they all have one more thing in common: They use tournaments to advance the state of the predictive analytic arts. “We decided to run a series of forecasting tournaments in which people from around the world generate forecasts about, now, thousands of real-world events,” says Jason Matheny, IARPA’s new director. “All of our programs on predictive analytics do use this tournament style of funding and evaluating research.” The Open Source Indicators program used a crowdsourcing technique in which people across the globe offered their predictions on such events as political uprisings, disease outbreaks and elections.

The data analyzed included social media trends, Web search queries and even cancelled dinner reservations—an indication that people are sick. “The methods applied to this were all automated. They used machine learning to comb through billions of pieces of data to look for that signal, that leading indicator, that an event was about to happen,” Matheny explains. “And they made amazing progress. They were able to predict disease outbreaks weeks earlier than traditional reporting.” The recently completed Aggregative Contingent Estimation (ACE) program also used a crowdsourcing competition in which people predicted events, including whether weapons would be tested, treaties would be signed or armed conflict would break out along certain borders. Volunteers were asked to provide information about their own background and what sources they used. IARPA also tested participants’ cognitive reasoning abilities. Volunteers provided their forecasts every day, and IARPA personnel kept score. Interestingly, they discovered that the “deep domain” experts were not the best at predicting events. Instead, people with a certain style of thinking came out the winners. “They read a lot, not just from one source, but from multiple sources that come from different viewpoints. They have different sources of data, and they revise their judgments when presented with new information. They don’t stick to their guns,” Matheny reveals. …
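The scoring behind such tournaments is simple to sketch. The article does not say which scoring rule IARPA used; the Brier score below is a common choice in forecasting tournaments and is offered only as an illustration, with invented forecasts:

```python
# Hedged sketch of tournament-style forecast scoring. The Brier score is a
# standard rule for probability forecasts; the article does not confirm that
# IARPA used exactly this, and the numbers below are invented.

def brier_score(forecasts, outcomes):
    """Mean squared error between probabilities and 0/1 outcomes (lower is better)."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)

outcomes = [1, 0, 1]           # what actually happened
revises  = [0.9, 0.2, 0.8]     # updates judgments with new information
stubborn = [0.5, 0.5, 0.5]     # "sticks to their guns"

print(brier_score(revises, outcomes))   # ~0.03, ranks near the top
print(brier_score(stubborn, outcomes))  # 0.25
```

A forecaster who revises toward the truth scores closer to zero, which matches the profile of the winners Matheny describes.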

The ACE research also contributed to a recently released book, Superforecasting: The Art and Science of Prediction, according to the IARPA director. The book was co-authored by Dan Gardner and Philip Tetlock, the Annenberg University Professor of Psychology and Management at the University of Pennsylvania, who also served as a principal investigator for the ACE program. Like ACE, the Crowdsourcing Evidence, Argumentation, Thinking and Evaluation program uses the forecasting tournament format, but it also requires participants to explain and defend their reasoning. The initiative aims to improve analytic thinking by combining structured reasoning techniques with crowdsourcing.

Meanwhile, the Foresight and Understanding from Scientific Exposition (FUSE) program forecasts science and technology breakthroughs….(More)”

Government’s innovative approach to skills sharing


Nicole Blake Johnson at GovLoop: “For both managers and employees, it often seems there aren’t enough hours in the day to tackle every priority project.

But what if there was another option — a way for federal managers to get the skills they need internally and for employees to work on projects they’re interested in but unaware of?

Maybe you’re the employee who is really into data analytics or social media, but that’s not a part of your regular job duties. What if you had the support of your supervisor to help out on an analytics project down the hall or in a field office across the country?

I’m not making up hypothetical scenarios. These types of initiatives are actually taking shape at federal agencies, including the Environmental Protection Agency, the Social Security Administration, and the Departments of Health and Human Services and Commerce.

Many agencies are in the pilot phase of rolling out their programs, which are versions of a governmentwide initiative called GovConnect. The initiative was inspired by an EPA program called Skills Marketplace that dates back to 2011. (Read more about GovConnect here.)

“We felt like we had something really promising at EPA, and we wanted to share it with other government agencies,” said Noha Gaber, EPA’s Director of Internal Communications. “So we actually pitched it to OPM and several other agencies, and that ended up becoming GovConnect.”

“The goal of GovConnect is to develop federal workforce skills through cross-agency collaboration and teamwork, to enable more agile response to mission demands without being unnecessarily limited by organizational silos,” said Melissa Kline Lee, who serves as Program Manager of GovConnect at the Office of Personnel Management. “As part of the President’s Management Agenda, the Office of Personnel Management and Environmental Protection Agency are using the GovConnect pilot to help agencies test and scale new approaches to workforce development.”…

Managers post projects or tasks in the online marketplace, which was developed using the agency’s existing SharePoint environment. Projects include clear tasks that employees can accomplish using no more than 20 percent of their workweek. Projects cannot be open-ended and should not exceed one year.
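Those rules are concrete enough to express as simple validation logic. Here is a minimal sketch with hypothetical field names; the article describes a SharePoint-based marketplace but not its data model, so everything below is an assumption:

```python
# Hypothetical sketch of the Skills Marketplace posting rules described above.
# Field names and the 40-hour baseline are assumptions, not EPA's actual schema.
from dataclasses import dataclass

@dataclass
class ProjectPosting:
    title: str
    weekly_hours: float    # time asked of the employee each week
    duration_weeks: int

    def validation_errors(self):
        errors = []
        if self.weekly_hours > 0.20 * 40:    # no more than 20% of the workweek
            errors.append("exceeds 20 percent of the workweek")
        if self.duration_weeks <= 0:         # cannot be open-ended
            errors.append("must have a defined duration")
        elif self.duration_weeks > 52:       # should not exceed one year
            errors.append("exceeds one year")
        return errors

print(ProjectPosting("GIS support for TRI data", 8, 26).validation_errors())  # []
```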

From there, any employee can view the projects, evaluate what skills or competencies are needed and apply for the position. Managers review the applications and conduct interviews before selecting a candidate. Here are the latest stats for Skills Marketplace as of November 2015:

  • Managers posted 358 projects in the marketplace
  • Employees submitted 577 applications
  • More than 750 people have created profiles for the marketplace

Gaber shared one example involving an employee from the Office of Pesticide Programs and staff from the Office of Environmental Information (OEI), which is the main IT office at EPA. The employee brought technical expertise and geographic information systems skills to the team, supporting OEI’s Toxics Release Inventory Program, which tracks data on toxic chemicals produced by different facilities.

The benefits were twofold: The employee established new connections in a different part of the agency, and his home office benefited from the experiences and knowledge he gleaned while working on the project….(More)

Political Turbulence: How Social Media Shape Collective Action


Book by Helen Margetts, Peter John, Scott Hale, & Taha Yasseri: “As people spend increasing proportions of their daily lives using social media, such as Twitter and Facebook, they are being invited to support myriad political causes by sharing, liking, endorsing, or downloading. Chain reactions caused by these tiny acts of participation form a growing part of collective action today, from neighborhood campaigns to global political movements. Political Turbulence reveals that, in fact, most attempts at collective action online do not succeed, but some give rise to huge mobilizations—even revolutions.

Drawing on large-scale data generated from the Internet and real-world events, this book shows how mobilizations that succeed are unpredictable, unstable, and often unsustainable. To better understand this unruly new force in the political world, the authors use experiments that test how social media influence citizens deciding whether or not to participate. They show how different personality types react to social influences and identify which types of people are willing to participate at an early stage in a mobilization when there are few supporters or signals of viability. The authors argue that pluralism is the model of democracy that is emerging in the social media age—not the ordered, organized vision of early pluralists, but a chaotic, turbulent form of politics.

This book demonstrates how data science and experimentation with social data can provide a methodological toolkit for understanding, shaping, and perhaps even predicting the outcomes of this democratic turbulence….(More)”

Big Data and Privacy: Emerging Issues


Daniel E. O’Leary in IEEE Intelligent Systems: “The goals of big data and privacy are fundamentally opposed to each other. Big data and knowledge discovery are aimed at reducing information asymmetries between organizations and their data sources, whereas privacy is aimed at maintaining the information asymmetries of data sources. A number of different definitions of privacy are used to investigate some of the tensions between different characteristics of big data and potential privacy concerns. Specifically, the author examines the consequences of unevenness in big data, digital data going from local controlled settings to uncontrolled global settings, the privacy effects of reputation monitoring systems, and inferring knowledge from social media. In addition, the author briefly analyzes two other emerging sources of big data: police cameras and stingray devices for location information….(More)”

How Big Data is Helping to Tackle Climate Change


Bernard Marr at DataInformed: “Climate scientists have been gathering a great deal of data for a long time, but analytics technology has only recently caught up. Now that cloud, distributed storage, and massive amounts of processing power are affordable for almost everyone, those data sets are being put to use. On top of that, the growing number of Internet of Things devices we carry around is adding to the amount of data we collect. And the rise of social media means more and more people are reporting environmental data and uploading photos and videos of their environment, which also can be analyzed for clues.

Perhaps one of the most ambitious projects that employ big data to study the environment is Microsoft’s Madingley, which is being developed with the intention of creating a simulation of all life on Earth. The project already provides a working simulation of the global carbon cycle, and it is hoped that, eventually, everything from deforestation to animal migration, pollution, and overfishing will be modeled in a real-time “virtual biosphere.” Just a few years ago, the idea of a simulation of the entire planet’s ecosphere would have seemed like ridiculous, pie-in-the-sky thinking. But today it’s something into which one of the world’s biggest companies is pouring serious money. Microsoft is doing this because it believes that analytical technology has finally caught up with the ability to collect and store data.

Another data giant that is developing tools to facilitate analysis of climate and ecological data is EMC. Working with scientists at Acadia National Park in Maine, the company has developed platforms to pull in crowd-sourced data from citizen science portals such as eBird and iNaturalist. This allows park administrators to monitor the impact of climate change on wildlife populations as well as to plan and implement conservation strategies.

Last year, the United Nations, under its Global Pulse data analytics initiative, launched the Big Data Climate Challenge, a competition aimed at promoting innovative data-driven climate change projects. Among the first to receive recognition under the program is Global Forest Watch, which combines satellite imagery, crowd-sourced witness accounts, and public datasets to track deforestation around the world, which is believed to be a leading man-made cause of climate change. The project has been promoted as a way for ethical businesses to ensure that their supply chains are not complicit in deforestation.

Other initiatives are targeted at a more personal level, for example by analyzing the transit routes available for an individual journey using Google Maps and making recommendations based on the carbon emissions of each route.
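A back-of-the-envelope version of such a recommendation is straightforward: multiply each route's distance by a per-kilometer emission factor for its travel mode and pick the lowest total. The sketch below is illustrative only; the emission factors are placeholders, not official figures, and this is not how any particular service implements it:

```python
# Hedged sketch of per-route carbon comparison. Emission factors
# (kg CO2 per passenger-km) are illustrative placeholders only.
EMISSION_FACTORS = {"car": 0.19, "bus": 0.10, "rail": 0.04}

def route_emissions(mode, distance_km):
    return EMISSION_FACTORS[mode] * distance_km

routes = [("car", 12.0), ("bus", 13.5), ("rail", 15.0)]  # candidate journeys
for mode, km in routes:
    print(f"{mode:>4}: {route_emissions(mode, km):.2f} kg CO2")

best_mode, _ = min(routes, key=lambda r: route_emissions(*r))
print("recommended:", best_mode)  # rail, despite being the longest route
```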

The idea of “smart cities” is central to the concept of the Internet of Things – the idea that everyday objects and tools are becoming increasingly connected, interactive, and intelligent, and capable of communicating with each other independently of humans. Many of the ideas put forward by smart-city pioneers are grounded in climate awareness, such as reducing carbon dioxide emissions and energy waste across urban areas. Smart metering allows utility companies to increase or restrict the flow of electricity, gas, or water to reduce waste and ensure adequate supply at peak periods. Public transport can be efficiently planned to avoid wasted journeys and provide a reliable service that will encourage citizens to leave their cars at home.

These examples raise an important point: It’s apparent that data – big or small – can tell us if, how, and why climate change is happening. But, of course, this is only really valuable to us if it also can tell us what we can do about it. Some projects, such as Weathersafe, which helps coffee growers adapt to changing weather patterns and soil conditions, are designed to help humans deal with climate change. Others are designed to tackle the problem at the root, by highlighting the factors that cause it in the first place and showing us how we can change our behavior to minimize damage….(More)”

Anonymous hackers could be Islamic State’s online nemesis


At The Conversation: “One of the key issues the West has had to face in countering Islamic State (IS) is the jihadi group’s mastery of online propaganda, seen in hundreds of thousands of messages celebrating the atrocities against civilians and spreading the message of radicalisation. It seems clear that efforts to counter IS online are missing the mark.

An internal US State Department assessment noted in June 2015 that the violent narrative of IS had “trumped” the efforts of the world’s richest and most technologically advanced nations. Meanwhile in Europe, Interpol was set to track and take down social media accounts linked to IS, as if that would solve the problem – when in fact doing so meant potentially missing out on intelligence-gathering opportunities.

Into this vacuum has stepped Anonymous, a loose, fragmented network of hacktivists that has for years launched occasional cyberattacks against government, corporate and civil society organisations. The group announced its intention to take on IS and its propaganda online, using its networks to crowd-source the identification of IS-linked accounts. Under the banner of #OpIsis and #OpParis, Anonymous published lists of thousands of Twitter accounts said to belong to IS members or sympathisers, claiming more than 5,500 had been removed.

The group pursued a similar approach following the attacks on Charlie Hebdo magazine in January 2015, with @OpCharlieHebdo taking down more than 200 jihadist Twitter accounts, bringing down the website Ansar-Alhaqq.net and publishing a list of 25,000 accounts alongside a guide on how to locate pro-IS material online….

Members of Anonymous have been prosecuted for cyber attacks in many countries under cybercrime laws, as their activities are not seen as legitimate protest. It is worth mentioning the ethical debate around hacktivism: some see cyber attacks that take down accounts or websites as infringing on others’ freedom of expression, while others argue that hacktivism should instead create technologies to circumvent censorship, enable digital equality and open access to information….(More)”

Crowdsourced phone camera footage maps conflicts


Springwise: “The UN requires accurate proof when investigating possible war crimes, but with different sides of a conflict providing contradictory evidence, and the unsafe nature of the environment, gaining genuine insight can be problematic. A team based at Goldsmiths, University of London are using amateur footage to investigate.

Forensic Architecture makes use of the smartphone footage that is increasingly prevalent on social media networks. By crowdsourcing several viewpoints around a given location on an accurately rendered 3D map, the team are able to determine where explosive devices were used, and of what calibre. A key resource is the smoke plume from an explosion, which has a unique shape at any given moment, allowing the team to match the same plume at the same instant across multiple viewpoints and build a dossier of evidence for a war crimes case.
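The geometry underneath is classical triangulation. As a deliberately simplified 2D sketch (Forensic Architecture works with full 3D reconstructions and many more viewpoints), two known camera positions and the compass bearings to the same plume are enough to locate it:

```python
# Simplified 2D triangulation sketch: intersect two bearing rays from known
# camera positions. A stand-in for the 3D, multi-viewpoint matching the
# Forensic Architecture team actually performs.
import math

def intersect_bearings(p1, bearing1_deg, p2, bearing2_deg):
    """Origins are (x, y); bearings are compass degrees (0 = north)."""
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]   # 2D cross product of directions
    if abs(denom) < 1e-9:
        return None                          # parallel bearings: no unique fix
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two hypothetical phone videos shot 400 m apart, sighting the same plume:
print(intersect_bearings((0, 0), 45.0, (400, 0), 315.0))  # ~(200.0, 200.0)
```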

While Forensic Architecture’s method has been developed to substantiate allegations of war crimes, the potential uses in other areas where satellite data are not available are numerous — forest fire sources could be located based on smoke plumes, and potential crowd crush scenarios may be spotted before they occur….(More)”

The promise and perils of predictive policing based on big data


H. V. Jagadish in the Conversation: “Police departments, like everyone else, would like to be more effective while spending less. Given the tremendous attention to big data in recent years, and the value it has provided in fields ranging from astronomy to medicine, it should be no surprise that police departments are using data analysis to inform deployment of scarce resources. Enter the era of what is called “predictive policing.”

Some form of predictive policing is likely now in force in a city near you. Memphis was an early adopter. Cities from Minneapolis to Miami have embraced predictive policing. Time magazine named predictive policing (with particular reference to the city of Santa Cruz) one of the 50 best inventions of 2011. New York City Police Commissioner William Bratton recently said that predictive policing is “the wave of the future.”

The term “predictive policing” suggests that the police can anticipate a crime and be there to stop it before it happens and/or apprehend the culprits right away. As the Los Angeles Times points out, it depends on “sophisticated computer analysis of information about previous crimes, to predict where and when crimes will occur.”

At a very basic level, it’s easy for anyone to read a crime map and identify neighborhoods with higher crime rates. It’s also easy to recognize that burglars tend to target businesses at night, when they are unoccupied, and to target homes during the day, when residents are away at work. The challenge is to take a combination of dozens of such factors to determine where crimes are more likely to happen and who is more likely to commit them. Predictive policing algorithms are getting increasingly good at such analysis. Indeed, such was the premise of the movie Minority Report, in which the police can arrest and convict murderers before they commit their crime.
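To make “a combination of dozens of such factors” concrete, here is a hedged sketch of the kind of model that could produce a per-location risk score. It is not any vendor's actual algorithm; the features and weights are invented for illustration:

```python
# Hypothetical risk-scoring sketch: a logistic combination of factors for one
# map cell and time window. Not a real predictive-policing model; the
# features and weights are invented.
import math

def risk_score(features, weights):
    z = sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))   # squash into a probability-like score

weights = {"bias": -4.0, "burglaries_last_30d": 0.5,
           "is_night": 0.8, "is_commercial": 0.6}

cell = {"bias": 1.0, "burglaries_last_30d": 3, "is_night": 1, "is_commercial": 1}
print(f"relative risk: {risk_score(cell, weights):.2f}")  # ~0.25
```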

Predicting a crime with certainty is something that science fiction can have a field day with. But as a data scientist, I can assure you that in reality we can come nowhere close to certainty, even with advanced technology. To begin with, predictions can be only as good as the input data, and quite often these input data have errors.

But even with perfect, error-free input data and unbiased processing, ultimately what the algorithms are determining are correlations. Even if we have perfect knowledge of your troubled childhood, your socializing with gang members, your lack of steady employment, your wacko posts on social media and your recent gun purchases, all that the best algorithm can do is to say it is likely, but not certain, that you will commit a violent crime. After all, to take such predictions as guaranteed is to deny free will….

What data can do is give us probabilities, rather than certainty. Good data coupled with good analysis can give us very good estimates of probability. If you sum probabilities over many instances, you can usually get a robust estimate of the total.

For example, data analysis can provide a probability that a particular house will be broken into on a particular day based on historical records for similar houses in that neighborhood on similar days. An insurance company may add this up over all days in a year to decide how much to charge for insuring that house….(More)”
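The arithmetic in that insurance example is easy to make concrete. A minimal sketch, with entirely made-up probabilities and claim sizes:

```python
# Hedged sketch of summing daily probabilities into an expected annual count.
# All numbers are invented; real daily probabilities would come from a model
# of neighborhood, day of week, season, and so on.
weekday_prob = 0.0004   # illustrative: homes are emptier on workdays
weekend_prob = 0.0001

daily_probs = [weekday_prob if day % 7 < 5 else weekend_prob
               for day in range(365)]

# Summing probabilities over many instances yields the expected number of
# break-ins for the year, a robust total even though any single day is uncertain.
expected = sum(daily_probs)
print(f"expected break-ins per year: {expected:.3f}")            # ~0.115

avg_claim = 5000.0      # illustrative average loss per break-in, in dollars
print(f"actuarially fair premium: ${expected * avg_claim:.2f}")  # ~$574
```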