New book: “Now more than ever, we need to understand social media – the good as well as the bad. We need critical knowledge that helps us to navigate the controversies and contradictions of this complex digital media landscape. Only then can we make informed judgements about what’s
happening in our media world, and why.
Showing the reader how to ask the right kinds of questions about social media, Christian Fuchs takes us on a journey across social media,
delving deep into case studies on Google, Facebook, Twitter, WikiLeaks and Wikipedia. The result lays bare the structures and power relations
at the heart of our media landscape.
This book is the essential, critical guide for understanding social media and for all students of media studies and sociology. Readers will
never look at social media the same way again.
Sample chapter:
Twitter and Democracy: A New Public Sphere?
Introduction: What is a Critical Introduction to Social Media?”
How Internet surveillance predicts disease outbreak before WHO
Kurzweil News: “Have you ever Googled for an online diagnosis before visiting a doctor? If so, you may have helped provide early warning of an infectious disease epidemic.
In a new study published in The Lancet Infectious Diseases, Internet-based surveillance has been found to detect infectious diseases such as dengue fever and influenza up to two weeks earlier than traditional surveillance methods, according to Queensland University of Technology (QUT) research fellow and senior author of the paper Wenbiao Hu.
Hu, based at the Institute for Health and Biomedical Innovation, said there was often a lag time of two weeks before traditional surveillance methods could detect an emerging infectious disease.
“This is because traditional surveillance relies on the patient recognizing the symptoms and seeking treatment before diagnosis, along with the time taken for health professionals to alert authorities through their health networks. In contrast, digital surveillance can provide real-time detection of epidemics.”
Hu said the study used search engine tools such as Google Trends and Google Insights. It found that the 2005–06 avian influenza (“bird flu”) outbreak could have been detected one to two weeks earlier than official surveillance reports.
“In another example, a digital data collection network was found to be able to detect the SARS outbreak more than two months before the first publications by the World Health Organization (WHO),” Hu said.
According to this week’s CDC FluView report published Jan. 17, 2014, influenza activity in the United States remains high overall, with 3,745 laboratory-confirmed influenza-associated hospitalizations reported since October 1, 2013 (credit: CDC)
“Early detection means early warning and that can help reduce or contain an epidemic, as well as alert public health authorities to ensure risk management strategies such as the provision of adequate medication are implemented.”
Hu said the study found that social media, including Twitter, Facebook and microblogs, could also be effective in detecting disease outbreaks. “The next step would be to combine the approaches currently available such as social media, aggregator websites, and search engines, along with other factors such as climate and temperature, and develop a real-time infectious disease predictor.”
“The international nature of emerging infectious diseases, combined with the globalization of travel and trade, has increased the interconnectedness of all countries, and that means detecting, monitoring and controlling these diseases is a global concern.”
The other authors of the paper were Gabriel Milinovich (first author), Gail Williams and Archie Clements from the University of Queensland School of Population Health.
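The lead-time finding described above can be illustrated with a toy lagged-correlation check: shift a weekly search-volume series against a case-count series and find the lag at which the two correlate most strongly. This is only a minimal sketch with synthetic data, not the method used in the Lancet paper, whose actual models are more sophisticated.

```python
# Illustrative sketch: estimate how many weeks a search-volume signal leads a
# confirmed-case signal by finding the lag that maximizes Pearson correlation.
# The two weekly series below are synthetic, for demonstration only.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def lead_time_weeks(search_volume, cases, max_lag=4):
    """Return the lag (in weeks) at which search volume best predicts cases."""
    best_lag, best_r = 0, float("-inf")
    for lag in range(max_lag + 1):
        # Compare searches in week t with cases in week t + lag.
        x = search_volume[: len(search_volume) - lag] if lag else search_volume
        y = cases[lag:]
        r = pearson(x, y)
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Synthetic weekly data: case counts echo search volume roughly two weeks later.
searches = [3, 5, 9, 20, 45, 60, 48, 30, 15, 8, 4, 3]
cases    = [1, 1, 3,  5,  9, 21, 44, 59, 47, 29, 14, 7]

lag, r = lead_time_weeks(searches, cases)
print(f"search volume leads cases by {lag} week(s), r = {r:.2f}")
```

With these synthetic series the search signal leads the case curve by two weeks, mirroring the lead time the study reports for real surveillance data.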
Supramap
Another powerful tool is Supramap, a web application that synthesizes large, diverse datasets so that researchers can better understand the spread of infectious diseases across hosts and geography by integrating genetic, evolutionary, geospatial, and temporal data. It is now open source, so researchers can create their own maps.
Associate Professor Daniel Janies, Ph.D., an expert in computational genomics at the Wexner Medical Center at The Ohio State University (OSU), worked with software engineers at the Ohio Supercomputer Center (OSC) to allow researchers and public safety officials to develop other front-end applications that draw on the logic and computing resources of Supramap.
It was originally developed in 2007 to track the spread and evolution of pandemic (H1N1) and avian influenza (H5N1).
“Using SUPRAMAP, we initially developed maps that illustrated the spread of drug-resistant influenza and host shifts in H1N1 and H5N1 influenza and in coronaviruses, such as SARS,” said Janies. “SUPRAMAP allows the user to track strains carrying key mutations in a geospatial browser such as Google Earth. Our software allows public health scientists to update and view maps on the evolution and spread of pathogens.”
Grant funding through the U.S. Army Research Laboratory and Office supports this Innovation Group on Global Infectious Disease Research project. Support for the computational requirements of the project comes from the American Museum of Natural History (AMNH) and OSC. Ohio State’s Wexner Medical Center, Department of Biomedical Informatics and offices of Academic Affairs and Research provide additional support.”
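The strain-tracking Janies describes, displaying pathogen observations in a geospatial browser such as Google Earth, ultimately comes down to emitting KML. The sketch below builds a minimal KML placemark for a single strain observation; the strain name, mutation, and coordinates are illustrative, and Supramap's real output is of course far richer.

```python
# Hypothetical sketch (not Supramap's actual export code): generate a minimal
# KML document with one Placemark per pathogen strain observation, viewable in
# Google Earth. Note KML orders coordinates as longitude,latitude,altitude.

def strain_placemark(name, mutation, lat, lon):
    """Return a KML Placemark string for one strain observation."""
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<description>key mutation: {mutation}</description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

# Illustrative record: an H5N1 strain with a commonly discussed mutation.
kml = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
    + strain_placemark("H5N1/Qinghai/2005", "PB2 E627K", "36.62", "100.90")
    + "</Document></kml>"
)
print(kml)
```

Saving the string to a `.kml` file would let any KML-aware browser plot the point; a real pipeline would add timestamps and styles to animate spread over time.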
See also
- Gabriel J Milinovich, Gail M Williams, Archie C A Clements, Wenbiao Hu, Internet-based surveillance systems for monitoring emerging infectious diseases, The Lancet Infectious Diseases, 2013, DOI: 10.1016/S1473-3099(13)70244-5
- Daniel A. Janies et al., The Supramap project: linking pathogen genomes with geography to fight emergent infectious diseases, Cladistics, 2012, DOI: 10.1111/j.1096-0031.2010.00314.x (open access)
How should we analyse our lives?
Gillian Tett in the Financial Times on the challenge of using the new form of data science: “A few years ago, Alex “Sandy” Pentland, a professor of computational social sciences at MIT Media Lab, conducted a curious experiment at a Bank of America call centre in Rhode Island. He fitted 80 employees with biometric devices to track all their movements, physical conversations and email interactions for six weeks, and then used a computer to analyse “some 10 gigabytes of behaviour data”, as he recalls.
The results showed that the workers were isolated from each other, partly because at this call centre, like others of its ilk, the staff took their breaks in rotation so that the phones were constantly manned. In response, Bank of America decided to change its system to enable staff to hang out together over coffee and swap ideas in an unstructured way. Almost immediately there was a dramatic improvement in performance. “The average call-handle time decreased sharply, which means that the employees were much more productive,” Pentland writes in his forthcoming book Social Physics. “[So] the call centre management staff converted the break structure of all their call centres to this new system and forecast a $15m per year productivity increase.”
When I first heard Pentland relate this tale, I was tempted to give a loud cheer on behalf of all long-suffering call centre staff and corporate drones. Pentland’s data essentially give credibility to a point that many people know instinctively: that it is horribly dispiriting – and unproductive – to have to toil in a tiny isolated cubicle by yourself all day. Bank of America deserves credit both for letting Pentland’s team engage in this people-watching – and for changing its coffee-break schedule in response.
But there is a bigger issue at stake here too: namely how academics such as Pentland analyse our lives. We have known for centuries that cultural and social dynamics influence how we behave but until now academics could usually only measure this by looking at micro-level data, which were often subjective. Anthropology (a discipline I know well) is a case in point: anthropologists typically study cultures by painstakingly observing small groups of people and then extrapolating this in a subjective manner.
Pentland and others like him are now convinced that the great academic divide between “hard” and “soft” sciences is set to disappear, since researchers these days can gather massive volumes of data about human behaviour with precision. Sometimes this information is volunteered by individuals, on sites such as Facebook; sometimes it can be gathered from the electronic traces – the “digital breadcrumbs” – that we all deposit (when we use a mobile phone, say) or deliberately collected with biometric devices like the ones used at Bank of America. Either way, it can enable academics to monitor and forecast social interaction in a manner we could never have dreamed of before. “Social physics helps us understand how ideas flow from person to person . . . and ends up shaping the norms, productivity and creative output of our companies, cities and societies,” writes Pentland. “Just as the goal of traditional physics is to understand how the flow of energy translates into change in motion, social physics seeks to understand how the flow of ideas and information translates into changes in behaviour.”…
But perhaps the most important point is this: whether you love or hate this new form of data science, the genie cannot be put back in the bottle. The experiments that Pentland and many others are conducting at call centres, offices and other institutions across America are simply the leading edge of a trend.
The only question now is whether these powerful new tools will be mostly used for good (to predict traffic queues or flu epidemics) or for more malevolent ends (to enable companies to flog needless goods, say, or for government control). Sadly, “social physics” and data crunching don’t offer any prediction on this issue, even though it is one of the dominant questions of our age.”
Mapping the Data Shadows of Hurricane Sandy: Uncovering the Sociospatial Dimensions of ‘Big Data’
New Paper by Shelton, T., Poorthuis, A., Graham, M., and Zook, M.: “Digital social data are now practically ubiquitous, with increasingly large and interconnected databases leading researchers, politicians, and the private sector to focus on how such ‘big data’ can allow potentially unprecedented insights into our world. This paper investigates Twitter activity in the wake of Hurricane Sandy in order to demonstrate the complex relationship between the material world and its digital representations. Through documenting the various spatial patterns of Sandy-related tweeting both within the New York metropolitan region and across the United States, we make a series of broader conceptual and methodological interventions into the nascent geographic literature on big data. Rather than focus on how these massive databases are causing necessary and irreversible shifts in the ways that knowledge is produced, we instead find it more productive to ask how small subsets of big data, especially georeferenced social media information scraped from the internet, can reveal the geographies of a range of social processes and practices. Utilizing both qualitative and quantitative methods, we can uncover broad spatial patterns within this data, as well as understand how this data reflects the lived experiences of the people creating it. We also seek to fill a conceptual lacuna in studies of user-generated geographic information, which have often avoided any explicit theorizing of sociospatial relations, by employing Jessop et al.’s TPSN framework. Through these interventions, we demonstrate that any analysis of user-generated geographic information must take into account the existence of more complex spatialities than the relatively simple spatial ontology implied by latitude and longitude coordinates.”
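A first-pass spatial filter of the kind such a study implies, separating tweets inside the New York metropolitan region from the rest of the national stream, can be sketched with a simple bounding-box test. The box coordinates and sample records below are illustrative, not the paper's actual data or method; the paper's point is precisely that such a flat latitude/longitude ontology is only a starting point.

```python
# Illustrative sketch: keep only tweets whose coordinates fall inside a rough
# bounding box around the New York metropolitan region. Both the box and the
# sample tweet records are made up for demonstration.

NYC_BBOX = (40.40, -74.50, 41.10, -73.40)  # (min_lat, min_lon, max_lat, max_lon), approximate

def in_bbox(lat, lon, bbox=NYC_BBOX):
    """True if the point lies inside the bounding box."""
    min_lat, min_lon, max_lat, max_lon = bbox
    return min_lat <= lat <= max_lat and min_lon <= lon <= max_lon

tweets = [
    {"text": "power out in lower manhattan #sandy", "lat": 40.71, "lon": -74.01},
    {"text": "storm surge photos #sandy",           "lat": 39.95, "lon": -75.17},  # Philadelphia
    {"text": "subway flooded #sandy",               "lat": 40.75, "lon": -73.99},
]

local = [t for t in tweets if in_bbox(t["lat"], t["lon"])]
print(f"{len(local)} of {len(tweets)} sample tweets fall inside the NYC box")
```

Real analyses would go on to normalize by baseline tweeting density and population, since raw counts mostly map where people (and smartphones) are.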
Tech Policy Is Not A Religion
Opinion Piece by Robert Atkinson: “‘Digital libertarians’ and ‘digital technocrats’ want us to believe their way is the truth and the light. It’s not that black and white. Manichaeism, an ancient religion, took a dualistic view of the world. It described the struggle between a good, spiritual world of light, and an evil, material world of darkness. Listening to tech policy debates, especially in America, one would presume that Manichaeism is alive and well.
On one side (light or dark, depending on your view) are the folks who embrace free markets, bottom-up processes, multi-stakeholderism, open-source systems, and crowdsourced innovations. On the other are those who embrace government intervention, top-down processes, additional regulation, proprietary systems, and expert-based innovations.
For the first group, whom I’ll call the digital libertarians, government is the problem, not the solution. Tech enables freedom, and statist actions can only limit it.
According to this camp, tech is moving so fast that government can’t hope to keep up — the only workable governance system is a nimble one based on multi-stakeholder processes, such as ICANN and W3C. With Web 2.0, everyone can be a contributor, and it is through the proliferation of multiple and disparate voices that we discover the truth. And because of the ability of communities of coders to add their contributions, the only viable tech systems are based on open-source models.
For the second group, the digital technocrats, the problem is the anarchic, lawless, corporate-dominated nature of the digital world. Tech is so disruptive, including to long-established norms and laws, that it needs to be limited and shaped, and only the strong hand of the state can do that. Because of the influence of tech on all aspects of society, any legitimate governance process must stem from democratic institutions — not from a select group of insiders — and that can only happen with government oversight such as through the UN’s International Telecommunication Union.
According to this camp, because there are so many uninformed voices on the Internet spreading urban myths like wildfire, we need carefully vetted experts, whether in media or other organizations, to sort through the mass of information and provide expert, unbiased analysis. And because IT systems are so critical to the safety and well-functioning of society, we need companies to build and profit from them through a closed-source model.
Of course, just as religious Manichaeism leads to distorted practices of faith, tech Manichaeism leads to distorted policy practices and views. Take Internet governance. The process of ensuring Internet governance and evolution is complex and rapidly changing. A strong case can be made for the multi-stakeholder process as the driving force.
But this situation doesn’t mean, as digital libertarians would assert, that governments should stay out of the Internet altogether. Governments are not, as digital libertarian John Perry Barlow arrogantly asserts, “weary giants of flesh and steel.” Governments can and do play legitimate roles in many Internet policy issues, from establishing cybersecurity guidelines to setting online sales tax policy to combatting spam and digital piracy to setting rules governing unfair and deceptive online marketing practices.
This assertion doesn’t mean governments always get things right. They don’t. But as the Information Technology and Innovation Foundation writes in its recent response to Barlow’s manifesto, to deny people the right to regulate Internet activity through their government officials ignores the significant contribution the government can play in promoting the continued development of the Internet and digital economy.
At the same time, the digital technocrats must understand that the digital world is different from the analog one, and that old rules, regulations, and governing structures simply don’t apply. When ITU Secretary General Hamadoun Toure argues that “at the behest of all the world’s nations, the UN must lead this effort” to manage the global Internet, and that “for big commercial interests, it’s about maximizing the bottom line,” he’s ignoring the critical role that tech companies and other non-government stakeholders play in the Internet ecosystem.
Because digital technology is such a vastly complex system, digital libertarians claim that their “light” approach is superior to the “dark,” controlling, technocratic approach. In fact, this very complexity requires that we base Internet policy on pragmatism, not religion.
Conversely, because technology is so important to opportunity and the functioning of societies, digital technocrats assert that only governments can maximize these benefits. In fact, its importance requires us to respect its complexity and the role of private sector innovators in driving digital progress.
In short, the belief that one or the other of these approaches is sufficient in itself to maximize tech innovation is misleading at best and damaging at worst.”
New Book: Open Data Now
New book by Joel Gurin (The GovLab): “Open Data is the world’s greatest free resource–unprecedented access to thousands of databases–and it is one of the most revolutionary developments since the Information Age began. Combining two major trends–the exponential growth of digital data and the emerging culture of disclosure and transparency–Open Data gives you and your business full access to information that has never been available to the average person until now. Unlike most Big Data, Open Data is transparent, accessible, and reusable in ways that give it the power to transform business, government, and society.
Open Data Now is an essential guide to understanding all kinds of open databases–business, government, science, technology, retail, social media, and more–and using those resources to your best advantage. You’ll learn how to tap crowds for fast innovation, conduct research through open collaboration, and manage and market your business in a transparent marketplace.
Open Data is open for business–and the opportunities are as big and boundless as the Internet itself. This powerful, practical book shows you how to harness the power of Open Data in a variety of applications:
- HOT STARTUPS: turn government data into profitable ventures
- SAVVY MARKETING: understand how reputational data drives your brand
- DATA-DRIVEN INVESTING: apply new tools for business analysis
- CONSUMER INFORMATION: connect with your customers using smart disclosure
- GREEN BUSINESS: use data to bet on sustainable companies
- FAST R&D: turn the online world into your research lab
- NEW OPPORTUNITIES: explore open fields for new businesses
Whether you’re a marketing professional who wants to stay on top of what’s trending, a budding entrepreneur with a billion-dollar idea and limited resources, or a struggling business owner trying to stay competitive in a changing global market–or if you just want to understand the cutting edge of information technology–Open Data Now offers a wealth of big ideas, strategies, and techniques that wouldn’t have been possible before Open Data leveled the playing field.
The revolution is here and it’s now. It’s Open Data Now.”
Social media in crisis events: Open networks and collaboration supporting disaster response and recovery
Paper for the IEEE International Conference on Technologies for Homeland Security (HST): “Large-scale crises challenge the ability of public safety and security organisations to respond efficiently and effectively. Meanwhile, citizens’ adoption of mobile technology and rich social media services is dramatically changing the way crisis responses develop. Empowered by new communication media (smartphones, text messaging, internet-based applications and social media), citizens are the first in situ sensors. However, this entire social media arena is uncharted territory to most public safety and security organisations. In this paper, we analyse crisis events to draw narratives on social media relevance and describe how public safety and security organisations are increasingly aware of social media’s added value proposition in times of crisis. A set of critical success indicators to address the process of adopting social media is identified, so that social media information is rapidly transformed into actionable intelligence, thus enhancing the effectiveness of public safety and security organisations — saving time, money and lives.”
How Big Should Your Network Be?
Michael Simmons at Forbes: “There is a debate happening between software developers and scientists: How large can and should our networks be in this evolving world of social media? The answer to this question has dramatic implications for how we look at our own relationship building…
To better understand our limits, I connected with the famous British anthropologist and evolutionary psychologist Robin Dunbar, creator of his namesake, Dunbar’s number.
Dunbar’s number, 150, is the suggested cognitive limit to the number of relationships we can maintain where both parties are willing to do favors for each other.
Dunbar’s discovery was a very high correlation between the size of a species’ neocortex and its average social group size. The theory predicted 150 for humans, and this number is found throughout human communities over time….
Does Dunbar’s Number Still Apply In Today’s Connected World?
There are two camps when it comes to Dunbar’s number. The first camp is embodied by David Morin, the founder of Path, who built a whole social network predicated on the idea that you cannot have more than 150 friends. Robin Dunbar falls into this camp and even did an academic study on social media’s impact on Dunbar’s number. When I asked for his opinion, he replied:
The 150 limit applies to internet social networking sites just as it does in face-to-face life. Facebook’s own data shows that the average number of friends is 150-250 (within the range of variation in the face-to-face world). Remember that the 150 figure is just the average for the population as a whole. However, those who have more seem to have weaker friendships, suggesting that the amount of social capital is fixed and you can choose to spread it thickly or thinly.
Zvi Band, the founder of Contactually, a rapidly growing, venture-backed, relationship management tool, disagrees with both Morin and Dunbar, “We have the ability as a society to bust through Dunbar’s number. Current software can extend Dunbar’s number by at least 2-3 times.” To understand the power of Contactually and tools like it, we must understand the two paradigms people currently use when keeping in touch: broadcast & one-on-one.
While broadcast email makes it extremely easy to reach lots of people who want to hear from us, it is missing personalization. Personalization is what transforms information diffusion into personal relationship building. To make matters worse, email broadcast open rates have halved over the last decade.
On the other end of the spectrum is one-on-one outreach. Research performed by Facebook data scientists shows that one-on-one outreach is extremely effective and explains why:
Both the offering and the receiving of intimate information increases relationship strength. Providing a partner with personal information expresses trust, encourages reciprocal self-disclosure, and engages the partner in at least some of the details of one’s daily life. Directed communication evokes norms of reciprocity, so may obligate the partner to reply. The mere presence of the communication, which is relatively effortful compared to broadcast messages, also signals the importance of the relationship….”
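The core mechanic behind tools like Contactually, surfacing the relationships most in need of a one-on-one touch so personalized outreach scales past Dunbar's limit, can be sketched in a few lines. This is a hypothetical simplification, not Contactually's actual algorithm, and the contact data are made up.

```python
# Minimal sketch (not Contactually's actual algorithm): rank contacts by how
# long it has been since the last interaction, so the most neglected
# relationships get a personalized one-on-one message first.

from datetime import date

def overdue_contacts(contacts, today, max_gap_days=30):
    """Return (name, days_since_touch) for contacts past the gap, most overdue first."""
    overdue = [
        (name, (today - last_touch).days)
        for name, last_touch in contacts.items()
        if (today - last_touch).days > max_gap_days
    ]
    return sorted(overdue, key=lambda pair: pair[1], reverse=True)

contacts = {  # illustrative data: name -> date of last interaction
    "Ada":   date(2014, 1, 2),
    "Grace": date(2013, 11, 20),
    "Alan":  date(2014, 1, 15),
}

for name, days in overdue_contacts(contacts, today=date(2014, 1, 20)):
    print(f"{name}: {days} days since last touch")
```

The software does the remembering; the human still writes the personal message, which is what the Facebook research above suggests actually builds relationship strength.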
Can a Better Taxonomy Help Behavioral Energy Efficiency?
Article at GreenTechEfficiency: “Hundreds of behavioral energy efficiency programs have sprung up across the U.S. in the past five years, but the effectiveness of the programs — both in terms of cost savings and reduced energy use — can be difficult to gauge.
Of the nearly 300 programs reviewed in a new report from the American Council for an Energy-Efficient Economy, the cost of saved energy could be accurately calculated for only ten….
To help utilities and regulators better define and measure behavioral programs, ACEEE offers a new taxonomy of utility-run behavior programs that breaks them into three major categories:
Cognition: Programs that focus on delivering information to consumers. (This includes general communication efforts, enhanced billing and bill inserts, social media and classroom-based education.)
Calculus: Programs that rely on consumers making economically rational decisions. (This includes real-time and asynchronous feedback, dynamic pricing, games, incentives and rebates and home energy audits.)
Social interaction: Programs whose key drivers are social interaction and belonging. (This includes community-based social marketing, peer champions, online forums and incentive-based gifts.)
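The three-way taxonomy above is essentially a lookup table from program tactics to categories, which can be sketched directly. The tactic names below are paraphrased from the article's parentheticals, not an official ACEEE vocabulary.

```python
# Sketch of the ACEEE behavior-program taxonomy as a lookup table. Tactic
# strings are paraphrased from the article and purely illustrative.

TAXONOMY = {
    "cognition": {"general communication", "enhanced billing", "bill inserts",
                  "social media", "classroom education"},
    "calculus": {"real-time feedback", "asynchronous feedback", "dynamic pricing",
                 "games", "incentives and rebates", "home energy audits"},
    "social interaction": {"community-based social marketing", "peer champions",
                           "online forums", "incentive-based gifts"},
}

def classify(tactic):
    """Return the ACEEE category for a tactic, or None if it is unlisted."""
    for category, tactics in TAXONOMY.items():
        if tactic in tactics:
            return category
    return None

print(classify("dynamic pricing"))   # a "calculus" tactic
print(classify("peer champions"))    # a "social interaction" tactic
```

A "stacked" program in the report's sense would simply draw tactics from more than one of these category sets.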
….
While the report was mostly preliminary, it also offered four steps forward for utilities that want to make the most of behavioral programs.
Stack. The types of programs might fit into three broad categories, but judiciously blending cues based on emotion, reason and social interaction into programs is key, according to ACEEE. Even though the report recommends stacked programs that have a multi-modal approach, the authors acknowledge, “This hypothesis will remain untested until we see more stacked programs in the marketplace.”
Track. Just like other areas of grid modernization, utilities need to rethink how they collect, analyze and report the data coming out of behavioral programs. This should include metrics that go beyond just energy savings.
Share. As with other utility programs, behavior-based energy efficiency programs can be improved upon if utilities share results and if reporting is standardized across the country instead of varying by state.
Coordinate. Sharing is only the first step. Programs that merge water, gas and electricity efficiency can often gain better results than siloed programs. That approach, however, requires a coordinated effort by regional utilities and a change to how programs are funded and evaluated by regulators.”