Minna Ruckenstein and Mika Pantzar in New Media and Society: “This article investigates the metaphor of the Quantified Self (QS) as it is presented in the magazine Wired (2008–2012). Four interrelated themes—transparency, optimization, feedback loop, and biohacking—are identified as formative in defining a new numerical self and promoting a dataist paradigm. Wired captures certain interests and desires with the QS metaphor, while ignoring and downplaying others, suggesting that the QS positions self-tracking devices and applications as interfaces that energize technological engagements, thereby pushing us to rethink life in a data-driven manner. The thematic analysis of the QS is treated as a schematic aid for raising critical questions about self-quantification, for instance, detecting the merging of epistemological claims, technological devices, and market-making efforts. From this perspective, another definition of the QS emerges: a knowledge system that remains flexible in its aims and can be used as a resource for epistemological inquiry and in the formation of alternative paradigms….(More)”
The deception that lurks in our data-driven world
Alexis C. Madrigal at Fusion: “…There’s this amazing book called Seeing Like a State, which shows how governments and other big institutions try to reduce the vast complexity of the world into a series of statistics that their leaders use to try to comprehend what’s happening.
The author, James C. Scott, opens the book with an extended anecdote about the Normalbaum. In the second half of the 18th century, Prussian rulers wanted to know how many “natural resources” they had in the tangled woods of the country. So, they started counting. And they came up with these huge tables that would let them calculate how many board-feet of wood they could pull from a given plot of forest. All the rest of the forest, everything it did for the people, the animals, and the general ecology of the place, was discarded from the analysis.
But the world proved too unruly. Their data wasn’t perfect. So they started creating new forests, the Normalbaum, planting all the trees at the same time, and monoculturing them so that there were no trees in the forest that couldn’t be monetized for wood. “The fact is that forest science and geometry, backed by state power, had the capacity to transform the real, diverse, and chaotic old-growth forest into a new, more uniform forest that closely resembled the administrative grid of its techniques,” Scott wrote.
The spreadsheet became the world! They even planted the trees in rows, like a grid.
German foresters got very scientific with their fertilizer applications and management practices. And the scheme really worked—at least for a hundred years. Pretty much everyone across the world adopted their methods.
“In the German case, the negative biological and ultimately commercial consequences of the stripped-down forest became painfully obvious only after the second rotation of conifers had been planted,” Scott wrote.
The complex ecosystem that underpinned the growth of these trees through generations—all the microbial and inter-species relationships—were torn apart by the rigor of the Normalbaum. The nutrient cycles were broken. Resilience was lost. The hidden underpinnings of the world were revealed only when they were gone. The Germans, like they do, came up with a new word for what happened: Waldsterben, or forest death.
Sometimes, when I look out at our world—at the highest level—in which thin data have come to stand in for huge complex systems of human and biological relationships, I wonder if we’re currently deep in the Normalbaum phase of things, awaiting the moment when Waldsterben sets in.
Take the ad-supported digital media ecosystem. The idea is brilliant: capture data on people all over the web and then use what you know to show them relevant ads, ads they want to see. Not only that, but because it’s all tracked, unlike broadcast or print media, an advertiser can measure what they’re getting more precisely. And certainly the digital advertising market has grown, taking share from most other forms of media. The spreadsheet makes a ton of sense—which is one reason for the growth predictions that underpin the massive valuations of new media companies.
But scratch the surface, like Businessweek recently did, and the problems are obvious. A large percentage of the traffic to many stories and videos consists of software pretending to be human.
“The art is making the fake traffic look real, often by sprucing up websites with just enough content to make them appear authentic,” Businessweek says. “Programmatic ad-buying systems don’t necessarily differentiate between real users and bots, or between websites with fresh, original work, and Potemkin sites camouflaged with stock photos and cut-and-paste articles.”
Of course, that’s not what high-end media players are doing. But the cheap programmatic ads, fueled by fake traffic, drive down the prices across the digital media industry, making it harder to support good journalism. Meanwhile, users of many sites are rebelling against the business model by installing ad blockers.
The advertisers and ad-tech firms just wanted to capture user data to show them relevant ads. They just wanted to measure their ads more effectively. But placed into the real world, the system that grew up around these desires has reshaped the media landscape in unpredictable ways.
We’ve deceived ourselves into thinking data is a camera, but it’s really an engine. Capturing data about something changes the way that something works. Even the mere collection of stats is not a neutral act, but a way of reshaping the thing itself….(More)”
Governments’ Self-Disruption Challenge
Mohamed A. El-Erian at Project Syndicate: “One of the most difficult challenges facing Western governments today is to enable and channel the transformative – and, for individuals and companies, self-empowering – forces of technological innovation. They will not succeed unless they become more open to creative destruction, allowing not only tools and procedures, but also mindsets, to be revamped and upgraded. The longer it takes them to meet this challenge, the bigger the lost opportunities for current and future generations.
Self-empowering technological innovation is all around us, affecting a growing number of people, sectors, and activities worldwide. Through an ever-increasing number of platforms, it is now easier than ever for households and corporations to access and engage in an expanding range of activities – from urban transportation to accommodation, entertainment, and media. Even the regulation-reinforced, fortress-like walls that have traditionally surrounded finance and medicine are being eroded.
…In fact, Western political and economic structures are, in some ways, specifically designed to resist deep and rapid change, if only to prevent temporary and reversible fluctuations from having an undue influence on underlying systems. This works well when politics and economies are operating in cyclical mode, as they usually have been in the West. But when major structural and secular challenges arise, as is the case today, the advanced countries’ institutional architecture acts as a major obstacle to effective action….Against this background, a rapid and comprehensive transformation is clearly not feasible. (In fact, it may not even be desirable, given the possibility of collateral damage and unintended consequences.) The best option for Western governments is thus to pursue gradual change, propelled by a variety of adaptive instruments, which would reach a critical mass over time.
Such tools include well-designed public-private partnerships, especially when it comes to modernizing infrastructure; disruptive outside advisers – selected not for what they think, but for how they think – in the government decision-making process; mechanisms to strengthen inter-agency coordination so that it enhances, rather than retards, policy responsiveness; and broader cross-border private-sector linkages to enhance multilateral coordination.
How economies function is changing, as relative power shifts from established, centralized forces toward those that respond to the unprecedented empowerment of individuals. If governments are to overcome the challenges they face and maximize the benefits of this shift for their societies, they need to be a lot more open to self-disruption. Otherwise, the transformative forces will leave them and their citizens behind….(More)”
Big Data and Mass Shootings
Holman W. Jenkins in the Wall Street Journal: “As always, the dots are connected after the fact, when the connecting is easy. …The day may be coming, sooner than we think, when such incidents can be stopped before they get started. A software program alerts police to a social-media posting by an individual of interest in their jurisdiction. An algorithm reminds them why the individual had become a person of interest—a history of mental illness, an episode involving a neighbor. Months earlier, discreet inquiries by police had revealed an unhealthy obsession with weapons—key word, unhealthy. There’s no reason why gun owners, range operators and firearms dealers shouldn’t be a source of information for local police seeking to learn who might merit special attention.
Sound scary? Big data exists to find the signal among the noise. Your data is the noise. It’s what computerized systems seek to disregard in their quest for information that actually would be useful to act on. Big data is interested in needles, not hay.
Still don’t trust the government? You’re barking up an outdated tree. Consider the absurdly ancillary debate last year on whether the government should be allowed to hold telephone “metadata” when the government already holds vastly more sensitive data on all of us in the form of tax, medical, legal and census records.
All this seems doubly silly given the copious information about each of us contained in private databases, freely bought and sold by marketers. Bizarre is the idea that Facebook should be able to use our voluntary Facebook postings to decide what we might like to buy, but police shouldn’t use the same information to prevent crime.
Hitachi, the big Japanese company, began testing its crime-prediction software in several unnamed American cities this month. The project, called Hitachi Visualization Predictive Crime Analytics, culls crime records, map and transit data, weather reports, social media and other sources for patterns that might otherwise go unnoticed by police.
Colorado-based Intrado, working with LexisNexis and Motorola Solutions, already sells police a service that instantly scans legal, business and social-media records for information about persons and circumstances that officers may encounter when responding to a 911 call at a specific address. Hundreds of public safety agencies find the system invaluable, though that didn’t stop the city of Bellingham, Wash., from rejecting it last year on the odd grounds that such software must be guilty of racial profiling.
Big data is changing how police allocate resources and go about fighting crime. …It once was freely asserted that police weren’t supposed to prevent crime, only solve it. But recent research shows investment in policing actually does reduce crime rates—and produces a large positive return measured in dollars and cents. A day will come when failing to connect the dots in advance of a mass shooting won’t be a matter for upturned hands. It will be a matter for serious recrimination…(More)”
We Need Both Networks and Communities
Henry Mintzberg at HBR: “If you want to understand the difference between a network and a community, ask your Facebook friends to help paint your house.
Social media certainly connects us to whoever is on the other end of the line, and so extends our social networks in amazing ways. But this can come at the expense of deeper personal relationships. When it feels like we’re up-to-date on our friends’ lives through Facebook or Instagram, we may become less likely to call them, much less meet up. Networks connect; communities care.
….A century or two ago, the word community “seemed to connote a specific group of people, from a particular patch of earth, who knew and judged and kept an eye on one another, who shared habits and history and memories, and could at times be persuaded to act as a whole on behalf of a part.” In contrast, the word has now become fashionable to describe what are really networks, as in the “business community”—“people with common interests [but] not common values, history, or memory.”
Does this matter for managing in the digital age, even for dealing with our global problems? It sure does. In a 2012 New York Times column, Thomas Friedman reported asking an Egyptian friend about the protest movements in that country: “Facebook really helped people to communicate, but not to collaborate,” he replied. Friedman added that “at their worst, [social media sites] can become addictive substitutes for real action.” That is why, while the larger social movements, as in Cairo’s Tahrir Square or on Wall Street, may raise consciousness about the need for renewal in society, it is the smaller social initiatives, usually developed by small groups in communities, that do much of the renewing….
We tend to make a great fuss about leadership these days, but communityship is more important. The great leaders create, enhance, and support a sense of community in their organizations, and that requires hands-on management. Hence managers have to get beyond their individual leadership, to recognize the collective nature of effective enterprise.
Especially for operating around the globe, electronic communication has become essential. But the heart of enterprise remains rooted in personal collaborative relationships, albeit networked by the new information technologies. Thus, in localities and organizations, across societies and around the globe, beware of “networked individualism,” where people communicate readily while struggling to collaborate.
The new digital technologies, wonderful as they are in enhancing communication, can have a negative effect on collaboration unless they are carefully managed. An electronic device puts us in touch with a keyboard, that’s all….(More)”
A new model to explore non-profit social media use for advocacy and civic engagement
David Chapman, Katrina Miller-Stevens, John C Morris, and Brendan O’Hallarn in First Monday: “In an age when electronic communication is ubiquitous, non-profit organizations are actively using social media platforms as a way to deliver information to end users. In spite of the broad use of these platforms, little scholarship has focused on the internal processes these organizations employ to implement these tools. A limited number of studies offer models to help explain an organization’s use of social media from initiation to outcomes, yet few studies address a non-profit organization’s mission as the driver to employ social media strategies and tactics. Furthermore, the effectiveness of social media use is difficult for non-profit organizations to measure. Studies that attempt to address this question have done so by viewing social media platform analytics (e.g., Facebook analytics) or analyzing written content by users of social media (Nah and Saxton, 2013; Auger, 2013; Uzunoğlu and Misci Kip, 2014; or Guo and Saxton, 2014). The value added of this study is to present a model for practice (Weil, 1997) that explores social media use and its challenges from a non-profit organization’s mission through its desired outcome, in this case an outcome of advocacy and civic engagement.
We focus on one non-profit organization, Blue Star Families, that actively engages in advocacy and civic engagement. Blue Star Families was formed in 2009 to “raise the awareness of the challenges of military family life with our civilian communities and leaders” (Blue Star Families, 2010). Blue Star Families is a virtual organization with no physical office location. Thus, the organization relies on its Web presence and social media tools to advocate for military families and engage service members and their families, communities, and citizens in civic engagement activities (Blue Star Families, 2010).
The study aims to provide organizational-level insights into the successes and challenges of working in the social media environment. Specifically, the study asks: What are the processes non-profit organizations follow to link organizational mission to outcomes when using social media platforms? What are the successes and challenges of using social media platforms for advocacy and civic engagement purposes? In our effort to answer these questions, we present a new model to explore non-profit organizations’ use of social media platforms by building on previous models and frameworks developed to explore the use of social media in the public, private, and non-profit sectors.
This research is important for three reasons. First, most previous studies of social media tend to employ models that focus on the satisfaction of the social media tools for organizational members, rather than the utility of social media as a tool to meet organizational goals. Our research offers a means to explore the utility of social media from an organization perspective. Second, the exemplar case for our research, Blue Star Families, Inc., is a non-profit organization whose mission is to create and nurture a virtual community spread over a large geographical — if not global — area. Because Blue Star Families was founded as an online organization that could not exist without social media, it provides a case for which social media is a critical component of the organization’s activity. Finally, we offer some “lessons learned” from our case to identify issues for other organizations seeking to create a significant social media presence.
This paper is organized as follows: first, the growth of social media is briefly addressed to provide background context. Second, previous models and frameworks exploring social media are discussed. This is followed by a presentation of a new model exploring the use of social media from an organizational perspective, starting with the driver of a non-profit organization’s mission, to its desired outcomes of advocacy and civic engagement. Third, the case study methodology is explained. Next, we present an analysis and discussion applying the new model to Blue Star Families’ use of social media platforms. We conclude by discussing the challenges of social media revealed in the case study analysis, and we offer recommendations to address these challenges….(More)”
The Quantified Community and Neighborhood Labs: A Framework for Computational Urban Planning and Civic Technology Innovation
Constantine E. Kontokosta: “This paper presents the conceptual framework and justification for a “Quantified Community” (QC) and a networked experimental environment of neighborhood labs. The QC is a fully instrumented urban neighborhood that uses an integrated, expandable, and participatory sensor network to support the measurement, integration, and analysis of neighborhood conditions, social interactions and behavior, and sustainability metrics to support public decision-making. Through a diverse range of sensor and automation technologies — combined with existing data generated through administrative records, surveys, social media, and mobile sensors — information on human, physical, and environmental elements can be processed in real-time to better understand the interaction and effects of the built environment on human well-being and outcomes. The goal is to create an “informatics overlay” that can be incorporated into future urban development and planning that supports the benchmarking and evaluation of neighborhood conditions, provides a test-bed for measuring the impact of new technologies and policies, and responds to the changing needs and preferences of the local community….(More)”
Understanding democracy as a product of citizen performances reduces the need for a defined ‘people’
Liron Lavi at Democratic Audit: “Dēmokratía, literally ‘the rule of the people’, is the basis for democracy as a political regime. However, ‘the people’ is a heterogeneous, open, and dynamic entity. So, how can we think about democracy without the people as a coherent entity, yet as the source of democracy? I employ a performative theorisation of democracy in order to answer this question. Democracy, I suggest, is an effect produced by repetitive performative acts and ‘the people’ is produced as the source of democratic sovereignty.
A quick search on ‘democratic performance’ will usually yield results (and concerns) regarding voter competence, government accountability, liberal values, and legitimacy. However, from the perspective of performative theory, the term gains a rather different meaning (as has been discussed at length by Judith Butler). It suggests that democracy is not a pre-given structure but rather needs to be constructed repeatedly. Thus, for a democracy to be recognised and maintained as such it needs to be performed by citizens, institutions, office-holders, the media, etc. Acts made by these players – voting, demonstrating, decision- and law-making, etc. – give form to the abstract concept of democracy, thus producing it as their (imagined) source. There is, therefore, no finite set of actions that can determine once and for all that a social structure is indeed a democracy, for the regime is not a stable and pre-given structure, but rather produced and imagined through a multitude of acts and procedures.
Elections, for example, are a democratic performance insofar as they are perceived as an effective tool for expressing the public’s preferences and choosing its representatives and desired policies. Polling stations are therefore the site in which democracy is constituted insofar as all eligible members (can) participate in the act of voting, and therefore are constructed as the source of sovereignty. By this, elections produce democracy as their effect, as their source, and hold together the political imagination of democracy. And they do this periodically, thus opening options for new variations (and failures) in the democratic effect they produce. Elections are therefore not only an opportunity to replace representatives and incumbents, but also an opportunity to perform democracy, shape it, alter it, and load it with various meanings….(More)”
Researchers wrestle with a privacy problem
Erika Check Hayden at Nature: “The data contained in tax returns, health and welfare records could be a gold mine for scientists — but only if they can protect people’s identities….In 2011, six US economists tackled a question at the heart of education policy: how much does great teaching help children in the long run?
They started with the records of more than 11,500 Tennessee schoolchildren who, as part of an experiment in the 1980s, had been randomly assigned to high- and average-quality teachers between the ages of five and eight. Then they gauged the children’s earnings as adults from federal tax returns filed in the 2000s. The analysis showed that the benefits of a good early education last for decades: each year of better teaching in childhood boosted an individual’s annual earnings by some 3.5% on average. Other data showed the same individuals besting their peers on measures such as university attendance, retirement savings, marriage rates and home ownership.
The economists’ work was widely hailed in education-policy circles, and US President Barack Obama cited it in his 2012 State of the Union address when he called for more investment in teacher training.
But for many social scientists, the most impressive thing was that the authors had been able to examine US federal tax returns: a closely guarded data set that was then available to researchers only with tight restrictions. This has made the study an emblem for both the challenges and the enormous potential power of ‘administrative data’ — information collected during routine provision of services, including tax returns, records of welfare benefits, data on visits to doctors and hospitals, and criminal records. Unlike Internet searches, social-media posts and the rest of the digital trails that people establish in their daily lives, administrative data cover entire populations with minimal self-selection effects: in the US census, for example, everyone sampled is required by law to respond and tell the truth.
This puts administrative data sets at the frontier of social science, says John Friedman, an economist at Brown University in Providence, Rhode Island, and one of the lead authors of the education study. “They allow researchers to not just get at old questions in a new way,” he says, “but to come at problems that were completely impossible before.”….
But there is also concern that the rush to use these data could pose new threats to citizens’ privacy. “The types of protections that we’re used to thinking about have been based on the twin pillars of anonymity and informed consent, and neither of those hold in this new world,” says Julia Lane, an economist at New York University. In 2013, for instance, researchers showed that they could uncover the identities of supposedly anonymous participants in a genetic study simply by cross-referencing their data with publicly available genealogical information.
Many people are looking for ways to address these concerns without inhibiting research. Suggested solutions include policy measures, such as an international code of conduct for data privacy, and technical methods that allow the use of the data while protecting privacy. Crucially, notes Lane, although preserving privacy sometimes complicates researchers’ lives, it is necessary to uphold the public trust that makes the work possible.
“Difficulty in access is a feature, not a bug,” she says. “It should be hard to get access to data, but it’s very important that such access be made possible.” Many nations collect administrative data on a massive scale, but only a few, notably in northern Europe, have so far made it easy for researchers to use those data.
In Denmark, for instance, every newborn child is assigned a unique identification number that tracks his or her lifelong interactions with the country’s free health-care system and almost every other government service. In 2002, researchers used data gathered through this identification system to retrospectively analyse the vaccination and health status of almost every child born in the country from 1991 to 1998 — 537,000 in all. At the time, it was the largest study ever to disprove the now-debunked link between measles vaccination and autism.
Other countries have begun to catch up. In 2012, for instance, Britain launched the unified UK Data Service to facilitate research access to data from the country’s census and other surveys. A year later, the service added a new Administrative Data Research Network, which has centres in England, Scotland, Northern Ireland and Wales to provide secure environments for researchers to access anonymized administrative data.
In the United States, the Census Bureau has been expanding its network of Research Data Centers, which currently includes 19 sites around the country at which researchers with the appropriate permissions can access confidential data from the bureau itself, as well as from other agencies. “We’re trying to explore all the available ways that we can expand access to these rich data sets,” says Ron Jarmin, the bureau’s assistant director for research and methodology.
In January, a group of federal agencies, foundations and universities created the Institute for Research on Innovation and Science at the University of Michigan in Ann Arbor to combine university and government data and measure the impact of research spending on economic outcomes. And in July, the US House of Representatives passed a bipartisan bill to study whether the federal government should provide a central clearing house of statistical administrative data.
Yet vast swathes of administrative data are still inaccessible, says George Alter, director of the Inter-university Consortium for Political and Social Research based at the University of Michigan, which serves as a data repository for approximately 760 institutions. “Health systems, social-welfare systems, financial transactions, business records — those things are just not available in most cases because of privacy concerns,” says Alter. “This is a big drag on research.”…
Many researchers argue, however, that there are legitimate scientific uses for such data. Jarmin says that the Census Bureau is exploring the use of data from credit-card companies to monitor economic activity. And researchers funded by the US National Science Foundation are studying how to use public Twitter posts to keep track of trends in phenomena such as unemployment.
….Computer scientists and cryptographers are experimenting with technological solutions. One, called differential privacy, adds a small amount of distortion to a data set, so that querying the data gives a roughly accurate result without revealing the identity of the individuals involved. The US Census Bureau uses this approach for its OnTheMap project, which tracks workers’ daily commutes. …In any case, although synthetic data potentially solve the privacy problem, there are some research applications that cannot tolerate any noise in the data. A good example is the work showing the effect of neighbourhood on earning potential, which was carried out by Raj Chetty, an economist at Harvard University in Cambridge, Massachusetts. Chetty needed to track specific individuals to show that the areas in which children live their early lives correlate with their ability to earn more or less than their parents. In subsequent studies, Chetty and his colleagues showed that moving children from resource-poor to resource-rich neighbourhoods can boost their earnings in adulthood, proving a causal link.
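To make the differential-privacy idea above concrete, here is a minimal sketch of the Laplace mechanism that underlies it, assuming a simple counting query. The function, data set, and parameter values are illustrative only; they are not drawn from the Census Bureau's OnTheMap system or any tool mentioned in the article.

```python
import numpy as np

def laplace_count(data, predicate, epsilon=0.1, rng=None):
    """Answer a counting query with epsilon-differential privacy.

    Adding or removing one individual changes the true count by at
    most 1 (the query's sensitivity), so Laplace noise with scale
    1/epsilon masks any single person's contribution.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for row in data if predicate(row))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: how many commuters travel more than 30 km?
commutes_km = [5, 12, 42, 31, 8, 55, 3, 29]
noisy = laplace_count(commutes_km, lambda km: km > 30, epsilon=0.5)
print(f"noisy count: {noisy:.1f}")  # close to the true count of 3
```

Smaller values of epsilon add more noise and give a stronger privacy guarantee; that accuracy-versus-privacy trade-off is exactly why, as the passage notes, studies like Chetty's that must track specific individuals cannot tolerate such distortion.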
Secure multiparty computation is a technique that attempts to address this issue by allowing multiple data holders to analyse parts of the total data set, without revealing the underlying data to each other. Only the results of the analyses are shared….(More)”
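As a concrete illustration of the technique named above, here is a minimal sketch of additive secret sharing, one of the simplest building blocks of secure multiparty computation. The two-hospital scenario and all names are hypothetical, not taken from any system in the article.

```python
import random

PRIME = 2**61 - 1  # arithmetic modulo a large prime hides the shares

def share(value, n_parties=3):
    """Split a value into n additive shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares to recover the (aggregate) value."""
    return sum(shares) % PRIME

# Two hospitals each hold a patient count they cannot disclose.
hospital_a, hospital_b = 1043, 2718

# Each splits its count and hands one share to every party; any
# single share is a uniformly random number that reveals nothing.
shares_a = share(hospital_a)
shares_b = share(hospital_b)

# Each party adds the shares it holds, locally, without seeing inputs.
summed_shares = [(a + b) % PRIME for a, b in zip(shares_a, shares_b)]

# Only the agreed-upon aggregate is ever revealed.
print(reconstruct(summed_shares))  # -> 3761, the joint total
```

Real protocols add defenses against dishonest parties and support far richer computations, but the core idea is the same: each data holder sees only random-looking shares of the others' data, and only the final result of the analysis is disclosed.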
Personalising data for development
Wolfgang Fengler and Homi Kharas in the Financial Times: “When world leaders meet this week for the UN’s general assembly to adopt the Sustainable Development Goals (SDGs), they will also call for a “data revolution”. In a world where almost everyone will soon have access to a mobile phone, where satellites will take high-definition pictures of the whole planet every three days, and where inputs from sensors and social media make up two thirds of the world’s new data, the opportunities to leverage this power for poverty reduction and sustainable development are enormous. We are also on the verge of major improvements in government administrative data and data gleaned from the activities of private companies and citizens, in big and small data sets.
But these opportunities are yet to materialize at any scale. In fact, despite the exponential growth in connectivity and the emergence of big data, policy making is rarely based on good data. Almost every report from development institutions starts with a disclaimer highlighting “severe data limitations”. Like castaways on an island, surrounded by water they cannot drink unless the salt is removed, today’s policy makers are in a sea of data that need to be refined and treated (simplified and aggregated) to make them “consumable”.
To make sense of big data, we used to depend on data scientists, computer engineers and mathematicians who would process requests one by one. But today, new programs and analytical solutions are putting big data at anyone’s fingertips. Tomorrow, it won’t be technical experts driving the data revolution but anyone operating a smartphone. Big data will become personal. We will be able to monitor and model social and economic developments faster, more reliably, more cheaply and on a far more granular scale. The data revolution will affect both the harvesting of data through new collection methods, and the processing of data through new aggregation and communication tools.
In practice, this means that data will become more actionable by becoming more personal, more timely and more understandable. Today, producing a poverty assessment and poverty map takes at least a year: it involves hundreds of enumerators, lengthy interviews and laborious data entry. In the future, thanks to hand-held connected devices, data collection and aggregation will happen in just a few weeks. Many more instances come to mind where new and higher-frequency data could generate development breakthroughs: monitoring teacher attendance, stocks and quality of pharmaceuticals, or environmental damage, for example…..
Despite vast opportunities, there are very few examples that have generated sufficient traction and scale to change policy and behaviour and create the feedback loops to further improve data quality. Two tools have personalised the abstract subjects of environmental degradation and demography (see table):
- Monitoring forest fires. The World Resources Institute has launched Global Forest Watch, which enables users to monitor forest fires in near real time, and to overlay relevant spatial information, such as property boundaries and ownership data, which can be developed into a model to anticipate the impact on air quality in affected areas in Indonesia, Singapore and Malaysia.
- Predicting your own life expectancy. The World Population Program developed a predictive tool – www.population.io – showing each person’s place in the distribution of world population and corresponding statistical life expectancy. In just a few months, this prototype attracted some 2m users who shared their results more than 25,000 times on social media. The traction of the tool resulted from making demography personal and converting an abstract subject matter into a question of individual ranking and life expectancy.
A new Global Partnership for Sustainable Development Data will be launched at the time of the UN General Assembly….(More)”