Four things policy-makers need to know about social media data and real time analytics.


Ella McPherson at LSE’s Impact Blog: “I recently gave evidence to the House of Commons Science and Technology Select Committee. This was based on written evidence co-authored with my colleague, Anne Alexander, and submitted to their ongoing inquiry into social media data and real time analytics. Both Anne and I research the use of social media during contested times; Anne looks at its use by political activists and labour movement organisers in the Arab world, and I look at its use in human rights reporting. In both cases, the need to establish facticity is high, as is the potential for the deliberate or inadvertent falsification of information. Similarly to the case that Carruthers makes about war reporting, we believe that the political-economic, methodological, and ethical issues raised by media dynamics in the context of crisis are bellwethers for the dynamics in more peaceful and mundane contexts.

From our work we have learned four crucial lessons that policy-makers considering this issue should understand:

1.  Social media information is vulnerable to a variety of distortions – some typical of all information, and others more specific to the characteristics of social media communications….

2.  If social media information is used to establish events, it must be verified; while technology can hasten this process, it is unlikely ever to occur in real time due to the subjective, human element of judgment required….

 

3.  Verifying social media information may require identifying its source, which has ethical implications related to informed consent and anonymisation….

4.  Another way to think about social media information is as what Hermida calls an ‘awareness system,’ which reduces the need to collect source identities; under this approach, researchers look at volume rather than veracity to recognise information of interest… (More)
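Hermida’s “awareness system” framing lends itself to a simple sketch: rather than verifying individual claims, monitor aggregate volume for unusual bursts. The toy detector below is not from the written evidence; its thresholds and counts are invented for illustration.

```python
# Toy 'awareness system' sketch: flag intervals where tweet volume spikes
# well above recent levels, without inspecting who said what.

def volume_spikes(counts, window=3, factor=3.0):
    """Return indices where volume exceeds `factor` times the mean of the
    preceding `window` intervals."""
    spikes = []
    for i in range(window, len(counts)):
        baseline = sum(counts[i - window:i]) / window
        if baseline > 0 and counts[i] > factor * baseline:
            spikes.append(i)
    return spikes

# Hypothetical tweets-per-minute counts: a quiet baseline, then a burst.
tweets_per_minute = [10, 12, 11, 9, 55, 60, 12, 10]
print(volume_spikes(tweets_per_minute))  # flags the onset of the burst
```

The point of the sketch is that the detector needs only counts, not identities, which is why this approach eases the consent and anonymisation concerns raised in point 3.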

How We’re Changing the Way We Respond to Petitions


Jason Goldman (White House) at Medium: “…In 2011 (years before I arrived at the White House), the team here developed a petitions platform called We the People. It provided a clear and easy way for the American people to petition their government — along with a threshold for action. Namely — once a petition gains 100,000 signatures, it qualifies for an official response.

This was a new system for the United States government, announced as a flagship effort in the first U.S. Open Government National Action Plan. Right now it exists only for the White House (Hey, Congress! We have an open API! Get in touch!). Some other countries, including Germany and the United Kingdom, do online petitions, too. In fact, the European Parliament has even started its own online petitioning platform.

For the most part, we’ve been pretty good about responding — before today, the Obama Administration had responded to 255 petitions that had collectively gathered more than 11 million signatures. That’s more than 91 percent of the petitions that have met our threshold requiring a response. Some responses have taken a little longer than others. But now, I’m happy to say, we have caught up.

Today, the White House is responding to every petition in our We the People backlog — 20 in all.

This means that nearly 2.5 million people who had petitioned us to take action on something heard back today. And it’s our goal to make that response the start of the conversation, not the final page. The White House is made up of offices that research and analyze the kinds of policy issues raised by these petitions, and leaders from those offices will be taking questions today, and in the weeks to come, from petition signers, on topics such as vaccination policy, community policing, and other petition subjects.

Take a look at more We the People stats here.

We’ll start the conversation on Twitter. Follow @WeThePeople, and join the conversation using hashtag #WeThePeople. (I’ll be personally taking your questions on @Goldman44 about how we’re changing the platform specifically at 3:30 p.m. Eastern.)

We the People, Moving Forward

We’re going to be changing a few things about We the People.

  1. First, from now on, if a petition meets the signature goal within a designated period of time, we will aim to respond to it — with an update or policy statement — within 60 days wherever possible. You can read about the details of our policy in the We the People Terms of Participation.
  2. Second, other outside petitions platforms are starting to tap into the We the People platform. We’re excited to announce today that Change.org is choosing to integrate with the We the People platform, meaning the future signatures of its 100 million users will count toward the threshold for getting an official response from the Administration. We’re also opening up the code behind petitions.whitehouse.gov on Drupal.org and GitHub, which empowers other governments and outside organizations to create their own versions of this platform to engage their own citizens and constituencies.
  3. Third, and most importantly, the process of hearing from us about your petition is going to look a little different. We’ve assembled a team of people responsible for taking your questions and requests and bringing them to the right people — whether within the White House or in an agency within the Administration — who may be in a position to say something about your request….(More)

Quantifying Crowd Size with Mobile Phone and Twitter Data


New paper: “Being able to infer the number of people in a specific area is of extreme importance for the avoidance of crowd disasters and to facilitate emergency evacuations. Here, using a football stadium and an airport as case studies, we present evidence of a strong relationship between the number of people in restricted areas and activity recorded by mobile phone providers and the online service Twitter. Our findings suggest that data generated through our interactions with mobile phone networks and the Internet may allow us to gain valuable measurements of the current state of society….(More)”

Transforming public services the right way with impact management


Emily Bazalgette at FutureGov: “…Impact evaluation involves using a range of research methodologies to investigate whether our products and services are having an impact on users’ lives. ….Rigorous academic impact evaluation wasn’t really designed for rapidly iterating products made by a fast-moving digital and design company like FutureGov. Our products can change significantly over short periods of time — for instance, in a single workshop Doc Ready evolved from a feature-rich social media platform to a stripped-down checklist builder — and that can create a tension between our agile process and traditional evaluation methodologies, which tend to require a fixed product to support a long-term evaluation plan.

We’ve decided to embrace this tension by using Theories of Change, a useful evaluation tool recommended to us by our investors and partners Nesta Impact Investments. To give you a flavour (excuse the pun), below we have Casserole Club’s Theory of Change.

Casserole toc

The problem we’re trying to solve (reducing social isolation) doesn’t tend to change, but the way we solve it might (the inputs and short to medium-term outcomes). In future, we may find that we need to adapt to serve new user groups, or operate in different channels, or that there are mediating outcomes for social isolation that Casserole Club produces other than social contact with a Casserole Club cook. Theories of Change allow us to stay focused on big-picture outcomes, while being flexible about how the product delivers on these outcomes.

Another lesson is to make evaluation everyone’s business. Like many young-ish companies, FutureGov is not at the stage where we have the resources to support a full-time, dedicated Head of Impact. But we’ve found that you can get pretty far if you’ve got a flat structure and lots of passionate people (both of which, luckily, we have). Our lack of hierarchy means that anyone can take up a project and run with it, and collaboration across the company is encouraged. Product impact evaluation is owned by the product teams who manage the product over time. This means we can get more done, that research design benefits from the deep knowledge of our product teams, and that evaluation skills (like how to design a decent survey or depth interview) have started to spread across the organisation….(More)”

Transform Government From The Outside In


Review by GCN of a new report by Forrester: “Agencies struggle to match the customer experience available from the private sector, and that causes citizens to become dissatisfied with government. In fact, seven of the 10 worst organizations in Forrester’s U.S. Customer Experience Index are federal agencies, and only a third of Americans say their experience with the government meets expectations.

FINDINGS: To keep up with public expectations, Forrester found governments must embrace mobile, turn big data into actionable insights, improve the customer experience and accelerate digital government.  Among the recommendations:

Agencies must shift their thinking to make mobile the primary platform for connection between citizens and government.  Government staff should also have mobile access to the tools and resources needed to complete tasks in the field. Agencies should learn what mobile methods work best for citizens, ensure all citizen services are mobile-friendly and use the mobile platform for sharing information with the public and gathering incident reports and sentiments. By building mobile-friendly infrastructure and processes, like municipal Wi-Fi hotspots, the government (and its services) can be constantly connected to its citizens and businesses.

Governments must find ways to integrate, share and use the large amounts of data and analytics they collect. By aggregating citizen-driven data from precinct-level or agency-specific databases and data collected by systems already in place, governments can increase responsiveness, target areas in need and make better short-term decisions and long-term plans. Opening data to researchers, the private sector and citizens can also spark innovation across industries.

Better customer experience has a ripple effect through government, improving the efficacy of legislation, compliance, engagement and the effectiveness of government offices. This means making processes such as applying for healthcare, registering a car or paying taxes easier and available through highly functional, user-friendly websites. Such improvements in communication and digital customer service will save citizens’ time, increase the use of government services and reduce agencies’ workloads….(More)”

Smartphones as Locative Media


Book by Jordan Frith: “Smartphone adoption has surpassed 50% of the population in more than 15 countries, and there are now more than one million mobile applications people can download to their phones. Many of these applications take advantage of smartphones as locative media, which is what allows smartphones to be located in physical space. Applications that take advantage of people’s location are called location-based services, and they are the focus of this book.

Smartphones as locative media raise important questions about how we understand the complicated relationship between the Internet and physical space. This book addresses these questions through an interdisciplinary theoretical framework and a detailed analysis of how various popular mobile applications including Google Maps, Facebook, Instagram, Yelp, and Foursquare use people’s location to provide information about their surrounding space….(More)”

This Is What Controversies Look Like in the Twittersphere


Emerging Technology From the arXiv: “A new way of analyzing disagreement on social media reveals that arguments in the Twittersphere look like fireworks.

Many a controversy has raged on social media platforms such as Twitter. Some last for weeks or months; others blow themselves out in an afternoon. And yet most go unnoticed by most people. That would change if there were a reliable way of spotting controversies in the Twitterstream in real time.

That could happen thanks to the work of Kiran Garimella and pals at Aalto University in Finland. These guys have found a way to spot the characteristics of a controversy in a collection of tweets and distinguish this from a noncontroversial conversation.

Various researchers have studied controversies on Twitter, but these studies have all focused on preidentified arguments, whereas Garimella and co want to spot them in the first place. Their key idea is that the structure of conversations that involve controversy is different from that of benign ones.

And they think this structure can be spotted by studying various properties of the conversation, such as the network of connections between those involved in a topic; the structure of endorsements, or who agrees with whom; and the sentiment of the discussion, whether positive or negative.

They test this idea by first studying ten conversations associated with hashtags that are known to be controversial and ten that are known to be benign. Garimella and co map out the structure of these discussions by looking at the networks of retweets, follows, keywords and combinations of these….(More)
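The structural intuition can be sketched in a few lines: in a controversial topic, retweets stay mostly within one of two camps, so few edges cross the divide. The toy example below uses hypothetical data with pre-labelled camps; Garimella and co’s actual method partitions the graph automatically and uses more sophisticated measures than this simple cross-edge count.

```python
# Toy sketch of the structural idea: build a retweet graph and measure
# how cleanly it separates into two camps.

retweets = [  # (retweeter, original author) pairs -- hypothetical data
    ("a1", "a2"), ("a2", "a3"), ("a1", "a3"),   # camp A retweets camp A
    ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),   # camp B retweets camp B
    ("a1", "b1"),                               # one cross-camp retweet
]

# Camp labels are given here for illustration; real work infers them
# by partitioning the graph.
camp = {u: u[0] for u in {x for edge in retweets for x in edge}}

cross = sum(camp[u] != camp[v] for u, v in retweets)
separation = 1 - cross / len(retweets)  # near 1.0 = tightly split camps
print(f"within-camp fraction: {separation:.2f}")
```

A benign conversation would show retweets crossing freely between groups, pushing the within-camp fraction down toward what random mixing would produce.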

More: arxiv.org/abs/1507.05224 : Quantifying Controversy in Social Media

A Visual Introduction to Machine Learning


R2D3 introduction: “In machine learning, computers apply statistical learning techniques to automatically identify patterns in data. These techniques can be used to make highly accurate predictions.

Keep scrolling. Using a data set about homes, we will create a machine learning model to distinguish homes in New York from homes in San Francisco….

 

  1. Machine learning identifies patterns by applying statistical learning techniques to unearth boundaries in data sets. You can use it to make predictions.
  2. One method for making predictions is called a decision trees, which uses a series of if-then statements to identify boundaries and define patterns in the data
  3. Overfitting happens when some boundaries are based on distinctions that don’t make a difference. You can see if a model overfits by having test data flow through the model….(More)”
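The recap above can be made concrete with a toy version of such a model: a hand-written decision tree of if-then rules, checked against held-out test data. The features echo the R2D3 example, but the thresholds and data here are invented for illustration.

```python
# A tiny hand-built decision 'tree' of if-then rules classifying homes
# as New York (NY) or San Francisco (SF). Thresholds are invented.

def classify(home):
    """A two-level decision tree expressed as if-then statements."""
    if home["elevation_m"] > 73:          # high elevation -> San Francisco
        return "SF"
    if home["price_per_sqft"] > 1776:     # pricey low-lying home -> New York
        return "NY"
    return "SF"

# Held-out test data (hypothetical). Checking accuracy here, rather than
# on the data used to pick the thresholds, is how overfitting shows up.
test_homes = [
    ({"elevation_m": 150, "price_per_sqft": 900},  "SF"),
    ({"elevation_m": 10,  "price_per_sqft": 2500}, "NY"),
    ({"elevation_m": 5,   "price_per_sqft": 1200}, "SF"),
    ({"elevation_m": 20,  "price_per_sqft": 2000}, "NY"),
]

correct = sum(classify(home) == label for home, label in test_homes)
accuracy = correct / len(test_homes)
print(f"test accuracy: {accuracy:.0%}")
```

If the rules had memorised quirks of the training data (point 3 above), training accuracy would stay high while this test accuracy dropped.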

Urban Informatics


Special issue of Data Engineering: “Most data related to people and the built world originates in urban settings. There is increasing demand to capture and exploit this data to support efforts in areas such as Smart Cities, City Science and Intelligent Transportation Systems. Urban informatics deals with the collection, organization, dissemination and analysis of urban information used in such applications. However, the dramatic growth in the volume of this urban data creates challenges for existing data-management and analysis techniques. The collected data is also increasingly diverse, with a wide variety of sensor, GIS, imagery and graph data arising in cities. To address these challenges, urban informatics requires development of advanced data-management approaches, analysis methods, and visualization techniques. It also provides an opportunity to confront the “Variety” axis of Big Data head on. The contributions in this issue cross the spectrum of urban information, from its origin, to archiving and retrieval, to analysis and visualization. …

Collaborative Sensing for Urban Transportation (By Sergio Ilarri, et al)

Open Civic Data: Of the People, For the People, By the People (by Arnaud Sahuguet, et al, The GovLab)

Plenario: An Open Data Discovery and Exploration Platform for Urban Science (by Charlie Catlett et al)

Riding from Urban Data to Insight Using New York City Taxis (by Juliana Freire et al)…(More)”

 

Transparency in Social Media


New book on “Tools, Methods and Algorithms for Mediating Online Interactions” edited by Sorin Adam Matei, Martha G. Russell, and Elisa Bertino: “The volume presents, in a synergistic manner, significant theoretical and practical contributions in the area of social media reputation and authorship measurement, visualization, and modeling. The book justifies and proposes contributions to a future agenda for understanding the requirements for making social media authorship more transparent. Building on work presented in a previous volume of this series, Roles, Trust, and Reputation in Social Media Knowledge Markets, this book discusses new tools, applications, services, and algorithms that are needed for authoring content in a real-time publishing world. These insights may help people who interact and create content through social media better assess their potential for knowledge creation. They may also assist in analyzing audience attitudes, perceptions, and behavior in informal social media or in formal organizational structures. In addition, the volume includes several chapters that analyze the higher order ethical, critical thinking, and philosophical principles that may be used to ground social media authorship. Together, the perspectives presented in this volume help us understand how social media content is created and how its impact can be evaluated.

The chapters demonstrate thought leadership through new ways of constructing social media experiences and making traces of social interaction visible. Transparency in Social Media aims to help researchers and practitioners design services, tools, or methods of analysis that encourage a more transparent process of interaction and communication on social media. Knowing who has added what content and with what authority to a specific online social media project can help the user community better understand, evaluate and make decisions and, ultimately, act on the basis of such information …(More)”