Daedalus Issue on “The Internet”


Press release: “Thirty years ago, the Internet was a network that primarily delivered email among academic and government employees. Today, it is rapidly evolving into a control system for our physical environment through the Internet of Things, as mobile and wearable technology more tightly integrate the Internet into our everyday lives.

How will the future Internet be shaped by the design choices that we are making today? Could the Internet evolve into a fundamentally different platform than the one to which we have grown accustomed? As an alternative to big data, what would it mean to make ubiquitously collected data safely available to individuals as small data? How could we attain both security and privacy in the face of trends that seem to offer neither? And what role do public institutions, such as libraries, have in an environment that becomes more privatized by the day?

These are some of the questions addressed in the Winter 2016 issue of Daedalus on “The Internet.”  As guest editors David D. Clark (Senior Research Scientist at the MIT Computer Science and Artificial Intelligence Laboratory) and Yochai Benkler (Berkman Professor of Entrepreneurial Legal Studies at Harvard Law School and Faculty Co-Director of the Berkman Center for Internet and Society at Harvard University) have observed, the Internet “has become increasingly privately owned, commercial, productive, creative, and dangerous.”

Some of the themes explored in the issue include:

  • The conflicts that emerge among governments, corporate stakeholders, and Internet users through choices that are made in the design of the Internet
  • The challenges—including those of privacy and security—that materialize in the evolution from fixed terminals to ubiquitous computing
  • The role of public institutions in shaping the Internet’s privately owned open spaces
  • The ownership and security of data used for automatic control of connected devices, and
  • Consumer demand for “free” services—developed and supported through the sale of user data to advertisers….

Essays in the Winter 2016 issue of Daedalus include:

  • The Contingent Internet by David D. Clark (MIT)
  • Degrees of Freedom, Dimensions of Power by Yochai Benkler (Harvard Law School)
  • Edge Networks and Devices for the Internet of Things by Peter T. Kirstein (University College London)
  • Reassembling Our Digital Selves by Deborah Estrin (Cornell Tech and Weill Cornell Medical College) and Ari Juels (Cornell Tech)
  • Choices: Privacy and Surveillance in a Once and Future Internet by Susan Landau (Worcester Polytechnic Institute)
  • As Pirates Become CEOs: The Closing of the Open Internet by Zeynep Tufekci (University of North Carolina at Chapel Hill)
  • Design Choices for Libraries in the Digital-Plus Era by John Palfrey (Phillips Academy)…(More)

See also: Introduction

Developing Global Norms for Sharing Data and Results during Public Health Emergencies


Paper by Kayvon Modjarrad et al. in PLOS Med: “…When a new or re-emergent pathogen causes a major outbreak, rapid access to both raw and analysed data or other pertinent research findings becomes critical to developing a rapid and effective public health response. Without the timely exchange of information on clinical, epidemiologic, and molecular features of an infectious disease, informed decisions about appropriate responses cannot be made, particularly those that relate to fielding new interventions or adapting existing ones. Failure to share information in a timely manner can have disastrous public health consequences, leading to unnecessary suffering and death. The 2014–2015 Ebola epidemic in West Africa revealed both successful practices and important deficiencies within existing mechanisms for information sharing. For example, trials of two Ebola vaccine candidates (ChAd3-ZEBOV and rVSV-ZEBOV) benefited greatly from an open collaboration between investigators and institutions in Africa, Europe, and North America. These teams, coordinated by the WHO, were able to generate and exchange critical data for the development of urgently needed, novel vaccines along faster timelines than have ever before been achieved. Similarly, some members of the genome sequencing community made viral sequence data publicly available within days of accessing samples, thus adhering to their profession’s long-established principles of rapid, public release of sequence data in any setting. In contrast, the dissemination of surveillance data early in the epidemic was comparatively slow, and in some cases, the criteria for sharing were unclear.

In recognition of the need to streamline mechanisms of data dissemination—globally and in as close to real-time as possible—the WHO held a consultation in Geneva, Switzerland, on 1–2 September 2015 to advance the development of data sharing norms, specifically in the context of public health emergencies….

…preservation of global health requires prioritization of and support for international collaboration. These and other principles were affirmed at the consultation (Table 1) and codified into a consensus statement that was published on the WHO website immediately following the meeting (http://www.who.int/medicines/ebola-treatment/data-sharing_phe/en/). A more comprehensive set of principles and action items was made available in November 2015, including the consensus statement made by the editorial staff of journals that attended the meeting (http://www.who.int/medicines/ebola-treatment/blueprint_phe_data-share-results/en/). The success of prior initiatives to accelerate timelines for reporting clinical trial results has helped build momentum for a broader data sharing agenda. As the quick and transparent dissemination of information is the bedrock of good science and public health practice, it is important that the current trends in data sharing carry over to all matters of acute public health need. Such a global norm would advance the spirit of open collaboration, simplify current mechanisms of information sharing, and potentially save many lives in subsequent outbreaks….(More)”

 

The Power of the Nudge to Change Our Energy Future


Sebastian Berger in Scientific American: “More than ever, psychology has become influential not only in explaining human behavior, but also as a resource for policy makers to achieve goals related to health, well-being, or sustainability. For example, President Obama signed an executive order directing the government to systematically use behavioral science insights to “better serve the American people.” Not alone in this endeavor, many governments – including the UK, Germany, Denmark, and Australia – are turning to insights that most frequently stem from psychological research, but also from behavioral economics, sociology, and anthropology.

Particularly relevant are the analysis and the setting of “default options.” A default is the option that a decision maker receives if he or she does not specifically state otherwise. Are we automatically enrolled in a 401(k), are we organ donors by default, or is the flu shot a standard routinely given to all citizens? Research has given us many examples of how and when defaults can promote public safety or wealth.

One of the most important questions facing the planet, however, is how to manage the transition to a carbon-free economy. In a recent paper, Felix Ebeling of the University of Cologne and I tested whether defaults could nudge consumers into choosing a green energy contract over one that relies on conventional energy. The results were striking: setting the default to green energy increased participation nearly tenfold. This is an important result because it tells us that subtle, non-coercive changes in the decision-making environment are enough to produce substantial differences in consumers’ preferences in the domain of clean energy. It changes green energy participation from “hardly anyone” to “almost everyone.” Within the domain of energy behavior alone, one can think of many settings where this finding could be applied: for instance, the default engine mode of new cars could be set to hybrid, and customers would need to actively switch to the standard option. Standard temperatures of washing machines could be low, etc….(More)”

This Is How Visualizing Open Data Can Help Save Lives


Alexander Howard at the Huffington Post: “Cities are increasingly releasing online the data they use to make life better for their residents — enabling journalists and researchers to better inform the public.

Los Angeles, for example, has analyzed data about injuries and deaths on its streets and published it online. Now people can check its conclusions and understand why the city prioritizes certain intersections.

The impact from these kinds of investments can lead directly to saving lives and preventing injuries. The work is part of a broader effort around the world to make cities safer.

Like New York City, San Francisco and Portland, Oregon, Los Angeles has adopted Sweden’s “Vision Zero” program as part of its strategy for eliminating traffic deaths. California led the nation in bicycle deaths in 2014.

At visionzero.lacity.org, you can see that the City of Los Angeles is using data visualization to identify the locations of “high injury networks,” or the 6 percent of intersections that account for 65 percent of the severe injuries in the area.
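
The underlying analysis is straightforward to reproduce on open collision data: rank intersections by severe-injury counts, then take the smallest top-ranked set that covers the target share of injuries. Below is a minimal Python/pandas sketch of that ranking; the intersection names, counts, and column names are hypothetical, not LA’s actual schema or figures.

```python
import pandas as pd

# Hypothetical severe-injury counts per intersection (illustrative only).
collisions = pd.DataFrame({
    "intersection": ["Main & 1st", "Broadway & 5th", "Vermont & 6th", "Hope & 9th"],
    "severe_injuries": [42, 35, 8, 3],
})

# Rank intersections from most to fewest severe injuries.
ranked = collisions.sort_values("severe_injuries", ascending=False).reset_index(drop=True)

# Cumulative share of all severe injuries covered by the top-ranked intersections.
share = ranked["severe_injuries"].cumsum() / ranked["severe_injuries"].sum()

# Smallest prefix of the ranking that covers at least 65 percent of severe injuries.
cutoff = int((share >= 0.65).idxmax()) + 1
high_injury_network = ranked.iloc[:cutoff]

print(f"{cutoff / len(ranked):.0%} of intersections account for "
      f"{share.iloc[cutoff - 1]:.0%} of severe injuries")
```

Run over the city’s full collision dataset, this kind of ranking is what yields the 6 percent / 65 percent concentration cited above.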


The work is the result of LA’s partnership with University of Southern California graduate students. As a result of these analyses, the Los Angeles Police Department has been cracking down on jaywalking near the university.

Abhi Nemani, the former chief data officer for LA, explained why the city needed to “go back to school” for help.

“In resource-constrained environments — the environment most cities find themselves in these days — you often have to beg, borrow, and steal innovation; particularly so, when it comes to in-demand resources such as data science expertise,” he told the Huffington Post.

“That’s why in Los Angeles, we opted to lean on the community for support: both the growing local tech sector and the expansive academic base. The academic community, in particular, was eager to collaborate with the city. In fact, most — if not all — local institutions reached out to me at some point asking to partner on a data science project with their graduate students.”

The City of Los Angeles is now working with another member of its tech sector to eliminate traffic deaths. DataScience, based in Culver City, California, received $22 million in funding in December to generate predictive insights for customers.

“The City of Los Angeles is very data-driven,” DataScience CEO Ian Swanson told HuffPost. “I commend Mayor Eric Garcetti and the City of Los Angeles on the openness, transparency, and availability of city data. Initiatives like Vision Zero put the City of Los Angeles’ data into action and improve life in this great city.”

DataScience created an interactive online map showing the locations of collisions involving bicycles across the city….(More)”

Big Data Analysis: New Algorithms for a New Society


Book edited by Nathalie Japkowicz and Jerzy Stefanowski: “This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area.

It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued concerning the potential dangers of Big Data Analysis along with its pitfalls and challenges….(More)”

Smarter as the New Urban Agenda


New book edited by Gil-Garcia, J. Ramon, Pardo, Theresa A., Nam, Taewoo: “This book will provide one of the first comprehensive approaches to the study of smart city governments, with theories and concepts for understanding and researching 21st-century city governments and innovative methodologies for the analysis and evaluation of smart city initiatives. The term “smart city” is now generally used to represent efforts that in different ways describe a comprehensive vision of a city for the present and future. A smarter city infuses information into its physical infrastructure to improve conveniences, facilitate mobility, add efficiencies, conserve energy, improve the quality of air and water, identify problems and fix them quickly, recover rapidly from disasters, collect data to make better decisions, deploy resources effectively and share data to enable collaboration across entities and domains. These and other similar efforts are expected to make cities more intelligent in terms of efficiency, effectiveness, productivity, transparency, and sustainability, among other important aspects. Given this changing social, institutional and technology environment, it seems both feasible and likely that we can attain smarter cities and, by extension, smarter governments: virtually integrated, networked, interconnected, responsive, and efficient. This book will help build the bridge between sound research and practical expertise in the area of smarter cities, and will be of interest to researchers and students in e-government, public administration, political science, communication, information science, administrative sciences and management, sociology, computer science, and information technology, as well as to government officials and public managers, who will find practical recommendations based on rigorous studies that contain insights and guidance for the development, management, and evaluation of complex smart city and smart government initiatives….(More)”

The Moral Failure of Computer Scientists


Kaveh Waddell at the Atlantic: “Computer scientists and cryptographers occupy some of the ivory tower’s highest floors. Among academics, their work is prestigious and celebrated. To the average observer, much of it is too technical to comprehend. The field’s problems can sometimes seem remote from reality.

But computer science has quite a bit to do with reality. Its practitioners devise the surveillance systems that watch over nearly every space, public or otherwise—and they design the tools that allow for privacy in the digital realm. Computer science is political, by its very nature.

That’s at least according to Phillip Rogaway, a professor of computer science at the University of California, Davis, who has helped create some of the most important tools that secure the Internet today. Last week, Rogaway took his case directly to a roomful of cryptographers at a conference in Auckland, New Zealand. He accused them of a moral failure: By allowing the government to construct a massive surveillance apparatus, the field had abused the public trust. Rogaway said the scientists had a duty to pursue social good in their work.

He likened the danger posed by modern governments’ growing surveillance capabilities to the threat of nuclear warfare in the 1950s, and called upon scientists to step up and speak out today, as they did then.

I spoke to Rogaway about why cryptographers fail to see their work in moral terms, and the emerging link between encryption and terrorism in the national conversation. A transcript of our conversation appears below, lightly edited for concision and clarity….(More)”

Stretching science: why emotional intelligence is key to tackling climate change


Faith Kearns at the Conversation: “…some environmental challenges are increasingly taking on characteristics of intractable conflicts, which may remain unresolved despite good faith efforts.

In the case of climate change, conflicts ranging from debates over how to lower emissions to denialism are obvious and ongoing – the science community has often approached them as something to be defeated or ignored.

While some people love it and others hate it, conflict is often an indicator that something important is happening; we generally don’t fight about things we don’t care about.

Working with conflict is a challenging proposition, in part because while it manifests in interactions with others, much of the real effort comes in dealing with our own internal conflicts.

However, beginning to accept and even value conflict as a necessary part of large-scale societal transformation has the potential to generate new approaches to climate change engagement. For example, understanding that in some cases denial by another person is protective may lead to new approaches to engagement.

As we connect more deeply with conflict, we may come to see it not as a flame to be fanned or put out, but as a resource.

A relational approach to climate change

Indeed, because of the emotion and conflict involved, the concept of a relational approach is one that offers a great deal of promise in the climate change arena. It is, however, vastly underexplored.

Relationship-centered approaches have been taken up in law, medicine, and psychology.

A common thread among these fields is a shift from expert-driven to more collaborative modes of working together. Navigating the personal and emotional elements of this kind of work asks quite a bit more of practitioners than subject-matter expertise.

In medicine, for example, relationship-centered care is a framework examining how relationships – between patients and clinicians, among clinicians, and even with broader communities – impact health care. It recognizes that care may go well beyond technical competency.

This kind of framework can demonstrate how a relational approach is different from more colloquial understandings of relationships; it can be a way to intentionally and transparently attend to conflict and power dynamics as they arise.

Although this is a simplified view of relational work, many would argue that an emphasis on emergent and transformative properties of relationships has been revolutionary. And one of the key challenges, and opportunities, of a relationship-centered approach to climate work is that we truly have no idea what the outcomes will be.

We have long tried to motivate action around climate change by decreasing scientific uncertainty, so introducing social uncertainty feels risky. At the same time it can be a relief because, in working together, nobody has to have the answer.

Learning to be comfortable with discomfort

A relational approach to climate change may sound basic to some, and complicated to others. In either case, it can be useful to know there is evidence that skillful relational capacity can be taught and learned.

The medical and legal communities have been developing relationship-centered training for years.

It is clear that relational skills and capacities like conflict resolution, empathy, and compassion can be enhanced through practices including active listening and self-reflection. Although it may seem an odd fit, climate change invites us to work together in new ways that include acknowledging and working with the strong emotions involved.

With a relationship-centered approach, climate change issues become less about particular solutions, and more about transforming how we work together. It is both risky and revolutionary in that it asks us to take a giant leap into trusting not just scientific information, but each other….(More)”

China’s Biggest Polluters Face Wrath of Data-Wielding Citizens


Bloomberg News: “Besides facing hefty fines, criminal punishments and the possibility of closure, the worst emitters in China risk additional public anger as new smartphone applications and lower-cost monitoring devices widen access to data on pollution sources.

The Blue Map app, developed by the Institute of Public & Environmental Affairs with support from the SEE Foundation and the Alibaba Foundation, provides pollution data from more than 3,000 large coal-power, steel, cement and petrochemical production plants. Origins Technology Ltd. in July began selling the Laser Egg, a palm-sized monitor used to track indoor and outdoor air quality by measuring fine particulate matter in the air.

“Letting people know the sources of regional pollution will help the push for control over emissions of every chimney,” said Ma Jun, the founder and director of the Beijing-based IPE.

The phone map and Laser Egg are the latest levers in prying control over information on air quality from the hands of the few to the many, and they’re beginning to weigh on how officials respond to the issue. Numerous smartphone applications, including those developed by SINA Corp. and Moji Fengyun (Beijing) Software Technology Development Co., now provide people in China with real-time access to air quality readings, essentially democratizing what was once an information pipeline available only to the government.

“China’s continuing struggle to control and reduce air pollution exemplifies the government’s fear that lifestyle issues will mutate into demands for political change,” said Mary Gallagher, an associate professor of political science at the University of Michigan.

Even the government is getting in on the act. The Ministry of Environmental Protection rolled out a smartphone application called “Nationwide Air Quality” with the help of Wuhan Juzheng Environmental Science & Technology Co. at the end of 2013.

“As citizens know more about air pollution, more pressure will be put on the government,” said Xu Qinxiang, a technology manager at Wuhan Juzheng. “This will urge the government to control pollutant sources and upgrade heavy industries.”


Sources of air quality data come from the China National Environment Monitoring Center, local environmental protection bureaus and non-Chinese sources such as the U.S. Embassy’s website in Beijing, Xu said.

Air quality is a controversial subject in China. Since 2012, the public has pushed the government to move more quickly than planned to begin releasing data measuring pollution levels — especially of PM2.5, the particulates most harmful to human health.

The reading was 267 micrograms per cubic meter at 10 a.m. Monday near Tiananmen Square, according to the Beijing Municipal Environmental Monitoring Center. The World Health Organization cautions against 24-hour exposure to concentrations higher than 25.
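
That comparison is easy to make concrete. Here is a minimal sketch using only the two figures quoted above; the constant and function names are ours, for illustration.

```python
# WHO 24-hour PM2.5 guideline cited above, in micrograms per cubic meter.
WHO_24H_PM25_GUIDELINE = 25.0

def who_exceedance(reading_ug_m3: float) -> float:
    """Return how many times a PM2.5 reading exceeds the WHO 24-hour guideline."""
    return reading_ug_m3 / WHO_24H_PM25_GUIDELINE

# The 10 a.m. reading near Tiananmen Square quoted above.
print(f"{who_exceedance(267):.1f}x the WHO 24-hour guideline")  # -> 10.7x
```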

The availability of data appears to be filling a need, especially with the arrival of colder temperatures and the associated smog that blanketed Beijing and northern China recently….

“With more disclosure of the data, everyone becomes more sensitive, hoping the government can do something,” Li Yajuan, a 27-year-old office secretary, said in an interview in Beijing’s Fuchengmen area. “It’s our own living environment after all.”

Efforts to make products linked to air data continue. IBM has been developing artificial intelligence to help fight Beijing’s toxic air pollution, and plans to work with other municipalities in China and India on similar projects to manage air quality….(More)”

Big Data Before the Web


Evan Hepler-Smith in the Wall Street Journal: “Sometime in the early 1950s, on a reservation in Wisconsin, a Menominee Indian man looked at an ink blot. An anthropologist recorded the man’s reaction according to a standard Rorschach-test protocol. The researcher submitted a copy of these notes to an enormous cache of records collected over the course of decades by American social scientists working among various “societies ‘other than our own.’ ” This entire collection of social-scientific data was photographed and printed in arrays of microscopic images on 3-by-5-inch cards. Sets of these cards were shipped to research libraries around the world. They gathered dust.

In the results of this Rorschach test, the anthropologist saw evidence of a culture eroded by modernity. Sixty years later, these documents also testify to the aspirations and fate of the social-scientific project for which they were generated. Deep within this forgotten Ozymandian card file sits the Menominee man’s reaction to Rorschach card VI: “It is like a dead planet. It seems to tell the story of a people once great who have lost . . . like something happened. All that’s left is the symbol.”

In “Database of Dreams: The Lost Quest to Catalog Humanity,” Rebecca Lemov delves into the ambitious efforts of mid-20th-century social scientists to build a “capacious and reliable science of the varieties of the human being” by generating an archive of human experience through interviews and tests and by storing the information on the high-tech media of the day.

For these psychologists and anthropologists, the key to a universal human science lay in studying members of cultures in transition between traditional and modern ways of life and in rendering their individuality as data. Interweaving stories of social scientists, Native American research subjects and information technologies, Ms. Lemov presents a compelling account of “what ‘humanness’ came to mean in an age of rapid change in technological and social conditions.” Ms. Lemov, an associate professor of the history of science at Harvard University, follows two contrasting threads through a story that she calls “a parable for our time.” She shows, first, how collecting data about human experience shapes human experience and, second, how a high-tech data repository of the 1950s became, as she puts it, a “data ruin.”…(More) – See also: Database of Dreams: The Lost Quest to Catalog Humanity