The Power of the Nudge to Change Our Energy Future


Sebastian Berger in Scientific American: “More than ever, psychology has become influential not only in explaining human behavior, but also as a resource for policy makers to achieve goals related to health, well-being, or sustainability. For example, President Obama signed an executive order directing the government to systematically use behavioral science insights to “better serve the American people.” Not alone in this endeavor, many governments – including the UK, Germany, Denmark, and Australia – are turning to insights that most frequently stem from psychological research, but also include insights from behavioral economics, sociology, and anthropology.

Particularly relevant is the analysis and setting of “default options.” A default is the option that a decision maker receives if he or she does not specifically state otherwise. Are we automatically enrolled in a 401(k)? Are we organ donors by default? Is the flu shot routinely given to all citizens? Research has given us many examples of how and when defaults can promote public safety or wealth.

One of the most important questions facing the planet, however, is how to manage the transition to a carbon-free economy. In a recent paper, Felix Ebeling of the University of Cologne and I tested whether defaults could nudge consumers into choosing a green energy contract over one that relies on conventional energy. The results were striking: setting the default to green energy increased participation nearly tenfold. This is an important result because it tells us that subtle, non-coercive changes in the decision-making environment are enough to produce substantial differences in consumers’ preferences in the domain of clean energy. It changes green energy participation from “hardly anyone” to “almost everyone”. Within the domain of energy behavior alone, one can think of many applications for this finding: for instance, default engines of new cars could be set to hybrid, and customers would need to actively switch to standard options. Standard temperatures of washing machines could be set low, etc….(More)”

This Is How Visualizing Open Data Can Help Save Lives


Alexander Howard at the Huffington Post: “Cities are increasingly releasing data online that they can use to make life better for their residents, enabling journalists and researchers to better inform the public.

Los Angeles, for example, has analyzed data about injuries and deaths on its streets and published it online. Now people can check its conclusions and understand why LA’s public department prioritizes certain intersections.

The impact from these kinds of investments can lead directly to saving lives and preventing injuries. The work is part of a broader effort around the world to make cities safer.

Like New York City, San Francisco and Portland, Oregon, Los Angeles has adopted Sweden’s “Vision Zero” program as part of its strategy for eliminating traffic deaths. California led the nation in bicycle deaths in 2014.

At visionzero.lacity.org, you can see that the City of Los Angeles is using data visualization to identify the locations of “high injury networks,” or the 6 percent of intersections that account for 65 percent of the severe injuries in the area.
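The “high injury network” idea – a small share of locations accounting for most severe injuries – can be illustrated with a short sketch. The intersection names and counts below are hypothetical, not LA’s actual data; the approach simply sorts locations by severe-injury count and takes the smallest set covering a target share of the total:

```python
# Illustrative sketch: identify a "high injury network" from per-intersection
# severe-injury counts. All names and numbers are hypothetical.

def high_injury_network(counts, target_share=0.65):
    """Return the smallest set of locations whose injuries cover at least
    `target_share` of the total, plus the share actually covered."""
    total = sum(counts.values())
    selected, covered = [], 0
    # Greedily take the worst intersections first.
    for name, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        selected.append(name)
        covered += n
        if covered / total >= target_share:
            break
    return selected, covered / total

counts = {
    "Main & 1st": 40, "Broadway & 5th": 35, "Sunset & Vine": 30,
    "Pico & Western": 8, "Elm & Oak": 5, "Quiet & Side": 2,
}
network, share = high_injury_network(counts)
# Here, 3 of 6 intersections account for 87.5% of the injuries.
print(network, share)
```

A real analysis would of course work from geocoded collision records rather than a toy dictionary, but the concentration pattern it surfaces is the same.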


The work is the result of LA’s partnership with graduate students at the University of Southern California. As a result of these analyses, the Los Angeles Police Department has been cracking down on jaywalking near the campus.

Abhi Nemani, the former chief data officer for LA, explained why the city needed to “go back to school” for help.

“In resource-constrained environments — the environment most cities find themselves in these days — you often have to beg, borrow, and steal innovation; particularly so, when it comes to in-demand resources such as data science expertise,” he told the Huffington Post.

“That’s why in Los Angeles, we opted to lean on the community for support: both the growing local tech sector and the expansive academic base. The academic community, in particular, was eager to collaborate with the city. In fact, most — if not all — local institutions reached out to me at some point asking to partner on a data science project with their graduate students.”

The City of Los Angeles is now working with another member of its tech sector to eliminate traffic deaths. DataScience, based in Culver City, California, received $22 million in funding in December to deliver predictive insights for customers.

“The City of Los Angeles is very data-driven,” DataScience CEO Ian Swanson told HuffPost. “I commend Mayor Eric Garcetti and the City of Los Angeles on the openness, transparency, and availability of city data. Initiatives like Vision Zero put the City of Los Angeles‘ data into action and improve life in this great city.”

DataScience created an interactive online map showing the locations of collisions involving bicycles across the city….(More)”

Big Data Analysis: New Algorithms for a New Society


Book edited by Nathalie Japkowicz and Jerzy Stefanowski: “This edited volume is devoted to Big Data Analysis from a Machine Learning standpoint as presented by some of the most eminent researchers in this area.

It demonstrates that Big Data Analysis opens up new research problems which were either never considered before, or were only considered within a limited range. In addition to providing methodological discussions on the principles of mining Big Data and the difference between traditional statistical data analysis and newer computing frameworks, this book presents recently developed algorithms affecting such areas as business, financial forecasting, human mobility, the Internet of Things, information networks, bioinformatics, medical systems and life science. It explores, through a number of specific examples, how the study of Big Data Analysis has evolved and how it has started and will most likely continue to affect society. While the benefits brought about by Big Data Analysis are underlined, the book also discusses some of the warnings that have been issued concerning the potential dangers of Big Data Analysis along with its pitfalls and challenges….(More)”

Smarter as the New Urban Agenda


New book edited by J. Ramon Gil-Garcia, Theresa A. Pardo, and Taewoo Nam: “This book provides one of the first comprehensive approaches to the study of smart city governments, offering theories and concepts for understanding and researching 21st-century city governments as well as innovative methodologies for the analysis and evaluation of smart city initiatives. The term “smart city” is now generally used to represent efforts that in different ways describe a comprehensive vision of a city for the present and future. A smarter city infuses information into its physical infrastructure to improve conveniences, facilitate mobility, add efficiencies, conserve energy, improve the quality of air and water, identify problems and fix them quickly, recover rapidly from disasters, collect data to make better decisions, deploy resources effectively, and share data to enable collaboration across entities and domains. These and other similar efforts are expected to make cities more intelligent in terms of efficiency, effectiveness, productivity, transparency, and sustainability, among other important aspects. Given this changing social, institutional, and technological environment, it seems feasible, and desirable, to attain smarter cities and, by extension, smarter governments: virtually integrated, networked, interconnected, responsive, and efficient. This book helps build the bridge between sound research and practical expertise in the area of smarter cities. It will be of interest to researchers and students in e-government, public administration, political science, communication, information science, administrative sciences and management, sociology, computer science, and information technology, as well as to government officials and public managers, who will find practical recommendations based on rigorous studies containing insights and guidance for the development, management, and evaluation of complex smart city and smart government initiatives….(More)”

The Moral Failure of Computer Scientists


Kaveh Waddell at the Atlantic: “Computer scientists and cryptographers occupy some of the ivory tower’s highest floors. Among academics, their work is prestigious and celebrated. To the average observer, much of it is too technical to comprehend. The field’s problems can sometimes seem remote from reality.

But computer science has quite a bit to do with reality. Its practitioners devise the surveillance systems that watch over nearly every space, public or otherwise—and they design the tools that allow for privacy in the digital realm. Computer science is political, by its very nature.

That’s at least according to Phillip Rogaway, a professor of computer science at the University of California, Davis, who has helped create some of the most important tools that secure the Internet today. Last week, Rogaway took his case directly to a roomful of cryptographers at a conference in Auckland, New Zealand. He accused them of a moral failure: By allowing the government to construct a massive surveillance apparatus, the field had abused the public trust. Rogaway said the scientists had a duty to pursue social good in their work.
He likened the danger posed by modern governments’ growing surveillance capabilities to the threat of nuclear warfare in the 1950s, and called upon scientists to step up and speak out today, as they did then.

I spoke to Rogaway about why cryptographers fail to see their work in moral terms, and the emerging link between encryption and terrorism in the national conversation. A transcript of our conversation appears below, lightly edited for concision and clarity….(More)”

Stretching science: why emotional intelligence is key to tackling climate change


Faith Kearns at the Conversation: “…some environmental challenges are increasingly taking on characteristics of intractable conflicts, which may remain unresolved despite good faith efforts.

In the case of climate change, conflicts ranging from debates over how to lower emissions to denialism are obvious and ongoing – the science community has often approached them as something to be defeated or ignored.

While some people love it and others hate it, conflict is often an indicator that something important is happening; we generally don’t fight about things we don’t care about.

Working with conflict is a challenging proposition, in part because while it manifests in interactions with others, much of the real effort comes in dealing with our own internal conflicts.

However, beginning to accept and even value conflict as a necessary part of large-scale societal transformation has the potential to generate new approaches to climate change engagement. For example, understanding that denial is, in some cases, protective may lead to different ways of engaging.

As we connect more deeply with conflict, we may come to see it not as a flame to be fanned or put out, but as a resource.

A relational approach to climate change

Indeed, because of the emotion and conflict involved, the concept of a relational approach is one that offers a great deal of promise in the climate change arena. It is, however, vastly underexplored.

Relationship-centered approaches have been taken up in law, medicine, and psychology.

A common thread among these fields is a shift from expert-driven to more collaborative modes of working together. Navigating the personal and emotional elements of this kind of work asks quite a bit more of practitioners than subject-matter expertise.

In medicine, for example, relationship-centered care is a framework examining how relationships – between patients and clinicians, among clinicians, and even with broader communities – impact health care. It recognizes that care may go well beyond technical competency.

This kind of framework can demonstrate how a relational approach is different from more colloquial understandings of relationships; it can be a way to intentionally and transparently attend to conflict and power dynamics as they arise.

Although this is a simplified view of relational work, many would argue that an emphasis on emergent and transformative properties of relationships has been revolutionary. And one of the key challenges, and opportunities, of a relationship-centered approach to climate work is that we truly have no idea what the outcomes will be.

We have long tried to motivate action around climate change by decreasing scientific uncertainty, so introducing social uncertainty feels risky. At the same time it can be a relief because, in working together, nobody has to have the answer.

Learning to be comfortable with discomfort

A relational approach to climate change may sound basic to some, and complicated to others. In either case, it can be useful to know there is evidence that skillful relational capacity can be taught and learned.

The medical and legal communities have been developing relationship-centered training for years.

It is clear that relational skills and capacities like conflict resolution, empathy, and compassion can be enhanced through practices including active listening and self-reflection. Although it may seem an odd fit, climate change invites us to work together in new ways that include acknowledging and working with the strong emotions involved.

With a relationship-centered approach, climate change issues become less about particular solutions, and more about transforming how we work together. It is both risky and revolutionary in that it asks us to take a giant leap into trusting not just scientific information, but each other….(More)”

China’s Biggest Polluters Face Wrath of Data-Wielding Citizens


Bloomberg News: “Besides facing hefty fines, criminal punishments and the possibility of closing, the worst emitters in China risk additional public anger as new smartphone applications and lower-cost monitoring devices widen access to data on pollution sources.

The Blue Map app, developed by the Institute of Public & Environmental Affairs with support from the SEE Foundation and the Alibaba Foundation, provides pollution data from more than 3,000 large coal power, steel, cement, and petrochemical production plants. In July, Origins Technology Ltd. began selling the Laser Egg, a palm-sized air quality monitor used to track indoor and outdoor air quality by measuring fine particulate matter in the air.

“Letting people know the sources of regional pollution will help the push for control over emissions of every chimney,” said Ma Jun, the founder and director of the Beijing-based IPE.

The phone map and Laser Egg are the latest levers in prying control over information on air quality from the hands of the few to the many, and they’re beginning to weigh on how officials respond to the issue. Numerous smartphone applications, including those developed by SINA Corp. and Moji Fengyun (Beijing) Software Technology Development Co., now provide people in China with real-time access to air quality readings, essentially democratizing what was once an information pipeline available only to the government.

“China’s continuing struggle to control and reduce air pollution exemplifies the government’s fear that lifestyle issues will mutate into demands for political change,” said Mary Gallagher, an associate professor of political science at the University of Michigan.

Even the government is getting in on the act. The Ministry of Environmental Protection rolled out a smartphone application called “Nationwide Air Quality” with the help of Wuhan Juzheng Environmental Science & Technology Co. at the end of 2013.

“As citizens know more about air pollution, more pressure will be put on the government,” said Xu Qinxiang, a technology manager at Wuhan Juzheng. “This will urge the government to control pollutant sources and upgrade heavy industries.”

Laser Egg

Sources of air quality data come from the China National Environment Monitoring Center, local environmental protection bureaus and non-Chinese sources such as the U.S. Embassy’s website in Beijing, Xu said.

Air quality is a controversial subject in China. Since 2012, the public has pushed the government to move more quickly than planned to begin releasing data measuring pollution levels — especially of PM2.5, the particulates most harmful to human health.

The reading was 267 micrograms per cubic meter at 10 a.m. Monday near Tiananmen Square, according to the Beijing Municipal Environmental Monitoring Center. The World Health Organization cautions against 24-hour exposure to concentrations higher than 25.

The availability of data appears to be filling a need, especially with the arrival of colder temperatures and the associated smog that blanketed Beijing and northern China recently….

“With more disclosure of the data, everyone becomes more sensitive, hoping the government can do something,” Li Yajuan, a 27-year-old office secretary, said in an interview in Beijing’s Fuchengmen area. “It’s our own living environment after all.”

Efforts to make products linked to air data continue. IBM has been developing artificial intelligence to help fight Beijing’s toxic air pollution, and plans to work with other municipalities in China and India on similar projects to manage air quality….(More)”

Big Data Before the Web


Evan Hepler-Smith in the Wall Street Journal: “Sometime in the early 1950s, on a reservation in Wisconsin, a Menominee Indian man looked at an ink blot. An anthropologist recorded the man’s reaction according to a standard Rorschach-test protocol. The researcher submitted a copy of these notes to an enormous cache of records collected over the course of decades by American social scientists working among various “societies ‘other than our own.’” This entire collection of social-scientific data was photographed and printed in arrays of microscopic images on 3-by-5-inch cards. Sets of these cards were shipped to research libraries around the world. They gathered dust.

In the results of this Rorschach test, the anthropologist saw evidence of a culture eroded by modernity. Sixty years later, these documents also testify to the aspirations and fate of the social-scientific project for which they were generated. Deep within this forgotten Ozymandian card file sits the Menominee man’s reaction to Rorschach card VI: “It is like a dead planet. It seems to tell the story of a people once great who have lost . . . like something happened. All that’s left is the symbol.”

In “Database of Dreams: The Lost Quest to Catalog Humanity,” Rebecca Lemov delves into the ambitious efforts of mid-20th-century social scientists to build a “capacious and reliable science of the varieties of the human being” by generating an archive of human experience through interviews and tests and by storing the information on the high-tech media of the day.

For these psychologists and anthropologists, the key to a universal human science lay in studying members of cultures in transition between traditional and modern ways of life and in rendering their individuality as data. Interweaving stories of social scientists, Native American research subjects and information technologies, Ms. Lemov presents a compelling account of “what ‘humanness’ came to mean in an age of rapid change in technological and social conditions.” Ms. Lemov, an associate professor of the history of science at Harvard University, follows two contrasting threads through a story that she calls “a parable for our time.” She shows, first, how collecting data about human experience shapes human experience and, second, how a high-tech data repository of the 1950s became, as she puts it, a “data ruin.”…(More) – See also: Database of Dreams: The Lost Quest to Catalog Humanity

Data Science ethics


Gov.uk blog: “If Tesco knows day-to-day how poorly the nation is, how can Government access similar insights so it can better plan health services? If Airbnb can give you a tailored service depending on your tastes, how can Government provide people with the right support to help them back into work in a way that is right for them? If companies are routinely using social media data to get feedback from their customers to improve their services, how can Government also use publicly available data to do the same?

Data science allows us to use new types of data and powerful tools to analyse it more quickly and more objectively than any human could. It can put us in the vanguard of policymaking – revealing new insights that lead to better and more tailored interventions. And it can help reduce costs, freeing up resources to spend on more serious cases.

But some of these data uses and machine-learning techniques are new and still relatively untested in Government. Of course, we operate within legal frameworks such as the Data Protection Act and intellectual property law. These are flexible but don’t always talk explicitly about the new challenges data science throws up. For example, how are you to explain the decision-making process of a deep-learning black-box algorithm? And if you were able to, how would you do so in plain English and not a row of 0s and 1s?

We want data scientists to feel confident to innovate with data, alongside the policy makers and operational staff who make daily decisions on the data that the analysts provide. That’s why we are creating an ethical framework which brings together the relevant parts of the law and ethical considerations into a simple document that helps Government officials decide what it can do and what it should do. We have a moral responsibility to maximise the use of data – which is never more apparent than after incidents of abuse or crime are left undetected – as well as to pay heed to the potential risks of these new tools. The guidelines are draft and not formal government policy, but we want to share them more widely in order to help iterate and improve them further….

So what’s in the framework? There is more detail in the fuller document, but it is based around six key principles:

  1. Start with a clear user need and public benefit: this will help you justify the level of data sensitivity and method you use
  2. Use the minimum level of data necessary to fulfill the public benefit: there are many techniques for doing so, such as de-identification, aggregation or querying against data
  3. Build robust data science models: the model is only as good as the data it contains, and while machines are less biased than humans, they can get it wrong. It’s critical to be clear about the confidence of the model and to think through unintended consequences and biases contained within the data
  4. Be alert to public perceptions: put simply, what would a normal person on the street think about the project?
  5. Be as open and accountable as possible: transparency is the antiseptic for unethical behavior. Aim to be as open as possible (with explanations in plain English), although in certain public protection cases the ability to be transparent will be constrained.
  6. Keep data safe and secure: this is not restricted to data science projects but we know that the public are most concerned about losing control of their data….(More)”
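Principle 2’s data-minimization techniques – de-identification and aggregation – can be sketched in a few lines. The records, field names, and threshold below are hypothetical illustrations, not part of the framework itself; the idea is to release only group-level counts and to suppress groups so small that individuals could be singled out:

```python
# Illustrative sketch of principle 2: aggregate individual-level records into
# group counts, suppressing any group below a minimum size so individuals
# cannot be re-identified. Records and field names are hypothetical.
from collections import Counter

def aggregate(records, key, min_group_size=5):
    """Count records per value of `key`, dropping groups smaller than
    `min_group_size` before anything is released."""
    counts = Counter(r[key] for r in records)
    return {k: n for k, n in counts.items() if n >= min_group_size}

records = (
    [{"region": "North", "claimant": True}] * 7
    + [{"region": "South", "claimant": True}] * 6
    + [{"region": "East", "claimant": True}] * 2  # too small: suppressed
)
summary = aggregate(records, "region")
print(summary)  # only the North and South groups survive the threshold
```

Real de-identification involves more than a size threshold (quasi-identifiers, linkage risk, and so on), but a minimum-group-size rule of this kind is a common first line of defence when publishing aggregates.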

Big Data in the Policy Cycle: Policy Decision Making in the Digital Era


Paper by Johann Höchtl et al in the Journal of Organizational Computing and Electronic Commerce: “Although of high relevance to political science, the interaction between technological change and political change in the era of Big Data remains a somewhat neglected topic. Most studies focus on the concepts of e-government and e-governance, and on how already existing government activities performed through the bureaucratic body of public administration could be improved by technology. This paper attempts to build a bridge between the field of e-governance and theories of public administration that goes beyond the service delivery approach that dominates a large part of e-government research. Using the policy cycle as a generic model for policy processes and policy development, a new look at how policy decision making could be conducted on the basis of ICT and Big Data is presented in this paper….(More)”