Stretching science: why emotional intelligence is key to tackling climate change


Faith Kearns at The Conversation: “…some environmental challenges are increasingly taking on characteristics of intractable conflicts, which may remain unresolved despite good faith efforts.

In the case of climate change, conflicts ranging from debates over how to lower emissions to denialism are obvious and ongoing – the science community has often approached them as something to be defeated or ignored.

While some people love it and others hate it, conflict is often an indicator that something important is happening; we generally don’t fight about things we don’t care about.

Working with conflict is a challenging proposition, in part because while it manifests in interactions with others, much of the real effort comes in dealing with our own internal conflicts.

However, beginning to accept and even value conflict as a necessary part of large-scale societal transformation has the potential to generate new approaches to climate change engagement. For example, understanding that in some cases another person’s denial is protective may suggest very different ways of engaging.

As we connect more deeply with conflict, we may come to see it not as a flame to be fanned or put out, but as a resource.

A relational approach to climate change

Indeed, because of the emotion and conflict involved, the concept of a relational approach offers a great deal of promise in the climate change arena. It is, however, vastly underexplored.

Relationship-centered approaches have been taken up in law, medicine, and psychology.

A common thread among these fields is a shift from expert-driven to more collaborative modes of working together. Navigating the personal and emotional elements of this kind of work asks quite a bit more of practitioners than subject-matter expertise.

In medicine, for example, relationship-centered care is a framework examining how relationships – between patients and clinicians, among clinicians, and even with broader communities – impact health care. It recognizes that care may go well beyond technical competency.

This kind of framework can demonstrate how a relational approach is different from more colloquial understandings of relationships; it can be a way to intentionally and transparently attend to conflict and power dynamics as they arise.

Although this is a simplified view of relational work, many would argue that an emphasis on emergent and transformative properties of relationships has been revolutionary. And one of the key challenges, and opportunities, of a relationship-centered approach to climate work is that we truly have no idea what the outcomes will be.

We have long tried to motivate action around climate change by decreasing scientific uncertainty, so introducing social uncertainty feels risky. At the same time it can be a relief because, in working together, nobody has to have the answer.

Learning to be comfortable with discomfort

A relational approach to climate change may sound basic to some, and complicated to others. In either case, it can be useful to know there is evidence that skillful relational capacity can be taught and learned.

The medical and legal communities have been developing relationship-centered training for years.

It is clear that relational skills and capacities like conflict resolution, empathy, and compassion can be enhanced through practices including active listening and self-reflection. Although it may seem an odd fit, climate change invites us to work together in new ways, including acknowledging and working with the strong emotions involved.

With a relationship-centered approach, climate change issues become less about particular solutions, and more about transforming how we work together. It is both risky and revolutionary in that it asks us to take a giant leap into trusting not just scientific information, but each other….(More)”

China’s Biggest Polluters Face Wrath of Data-Wielding Citizens


Bloomberg News: “Besides facing hefty fines, criminal punishments and the possibility of closing, the worst emitters in China risk additional public anger as new smartphone applications and lower-cost monitoring devices widen access to data on pollution sources.

The Blue Map app, developed by the Institute of Public & Environmental Affairs with support from the SEE Foundation and the Alibaba Foundation, provides pollution data from more than 3,000 large coal-power, steel, cement and petrochemical production plants. Origins Technology Ltd. in July began sale of the Laser Egg, a palm-sized air quality monitor used to track indoor and outdoor air quality by measuring fine particulate matter in the air.

“Letting people know the sources of regional pollution will help the push for control over emissions of every chimney,” said Ma Jun, the founder and director of the Beijing-based IPE.

The phone map and Laser Egg are the latest levers in prying control over information on air quality from the hands of the few to the many, and they’re beginning to weigh on how officials respond to the issue. Numerous smartphone applications, including those developed by SINA Corp. and Moji Fengyun (Beijing) Software Technology Development Co., now provide people in China with real-time access to air quality readings, essentially democratizing what was once an information pipeline available only to the government.

“China’s continuing struggle to control and reduce air pollution exemplifies the government’s fear that lifestyle issues will mutate into demands for political change,” said Mary Gallagher, an associate professor of political science at the University of Michigan.

Even the government is getting in on the act. The Ministry of Environmental Protection rolled out a smartphone application called “Nationwide Air Quality” with the help of Wuhan Juzheng Environmental Science & Technology Co. at the end of 2013.

“As citizens know more about air pollution, more pressure will be put on the government,” said Xu Qinxiang, a technology manager at Wuhan Juzheng. “This will urge the government to control pollutant sources and upgrade heavy industries.”


Sources of air quality data come from the China National Environment Monitoring Center, local environmental protection bureaus and non-Chinese sources such as the U.S. Embassy’s website in Beijing, Xu said.

Air quality is a controversial subject in China. Since 2012, the public has pushed the government to move more quickly than planned to begin releasing data measuring pollution levels — especially of PM2.5, the particulates most harmful to human health.

The reading was 267 micrograms per cubic meter at 10 a.m. Monday near Tiananmen Square, according to the Beijing Municipal Environmental Monitoring Center. The World Health Organization cautions against 24-hour exposure to concentrations higher than 25 micrograms per cubic meter.

The availability of data appears to be filling a need, especially with the arrival of colder temperatures and the associated smog that blanketed Beijing and northern China recently….

“With more disclosure of the data, everyone becomes more sensitive, hoping the government can do something,” Li Yajuan, a 27-year-old office secretary, said in an interview in Beijing’s Fuchengmen area. “It’s our own living environment after all.”

Efforts to make products linked to air data continue. IBM has been developing artificial intelligence to help fight Beijing’s toxic air pollution, and plans to work with other municipalities in China and India on similar projects to manage air quality….(More)”

Opening up government data for public benefit


Keiran Hardy at The Mandarin (Australia): “…This post explains the open data movement and considers the benefits and risks of releasing government data as open data. It then outlines the steps taken by the Labor and Liberal governments in accordance with this trend. It argues that the Prime Minister’s task, while admirably intentioned, is likely to prove difficult due to ongoing challenges surrounding the requirements of privacy law and a public service culture that remains reluctant to release government data into the public domain….

A key purpose of releasing government data is to improve the effectiveness and efficiency of services delivered by the government. For example, data on crops, weather and geography might be analysed to improve current approaches to farming and industry, or data on hospital admissions might be analysed alongside demographic and census data to improve the efficiency of health services in areas of need. It has been estimated that such innovation based on open data could benefit the Australian economy by up to $16 billion per year.

Another core benefit is that the open data movement is making gains in transparency and accountability, as a greater proportion of government decisions and operations are being shared with the public. These democratic values are made clear in the OGP’s Open Government Declaration, which aims to make governments ‘more open, accountable, and responsive to citizens’.

Open data can also improve democratic participation by allowing citizens to contribute to policy innovation. Events like GovHack, an annual Australian competition in which government, industry and the general public collaborate to find new uses for open government data, epitomise a growing trend towards service delivery informed by user input. The winner of the “Best Policy Insights Hack” at GovHack 2015 developed a software program for analysing which suburbs are best placed for rooftop solar investment.

At the same time, the release of government data poses significant risks to the privacy of Australian citizens. Much of the open data currently available is spatial (geographic or satellite) data, which is relatively unproblematic to post online as it poses minimal privacy risks. However, for the full benefits of open data to be gained, these kinds of data need to be supplemented with information on welfare payments, hospital admission rates and other potentially sensitive areas which could drive policy innovation.

Policy data in these areas would be de-identified — that is, all names, addresses and other obvious identifying information would be removed so that only aggregate or statistical data remains. However, debates continue as to the reliability of de-identification techniques, as there have been prominent examples of individuals being re-identified by cross-referencing datasets….
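To make the re-identification concern concrete, here is a minimal sketch in Python with invented toy data (the datasets, values and quasi-identifiers are all hypothetical): joining a de-identified dataset to a public auxiliary dataset on shared quasi-identifiers re-attaches names wherever the combination is unique.

```python
# Toy illustration of re-identification by cross-referencing datasets.
# All data here is invented; real attacks have used sources such as
# voter rolls in place of the "public" table below.
import pandas as pd

# A "de-identified" health dataset: names removed, quasi-identifiers kept.
health = pd.DataFrame({
    "birth_date": ["1980-03-14", "1975-11-02", "1983-07-21"],
    "postcode":   ["2600", "3000", "2615"],
    "diagnosis":  ["diabetes", "asthma", "hypertension"],
})

# A public dataset that carries names alongside the same quasi-identifiers.
public = pd.DataFrame({
    "name":       ["A. Citizen", "B. Voter", "C. Resident"],
    "birth_date": ["1980-03-14", "1975-11-02", "1983-07-21"],
    "postcode":   ["2600", "3000", "2615"],
})

# Where the (birth_date, postcode) pair is unique, the join restores identity.
reidentified = health.merge(public, on=["birth_date", "postcode"])
print(reidentified[["name", "diagnosis"]])
```

Every row in this toy example is re-identified because each quasi-identifier pair is unique, which is precisely the property that aggregation and coarsening of fields are meant to destroy.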

With regard to open data, a culture resistant to releasing government information appears to be driven by several similar factors, including:

  • A generational preference amongst public service management for maintaining secrecy of information, whereas younger generations expect that data should be made freely available;
  • Concerns about the quality or accuracy of information being released;
  • Fear that mistakes or misconduct on behalf of government employees might be exposed;
  • Limited understanding of the benefits that can be gained from open data; and
  • A lack of leadership to help drive the open data movement.

If open data policies have a similar effect on public service culture as FOI legislation, it may be that open data policies in fact hinder transparency by having a chilling effect on government decision-making for fear of what might be exposed….

These legal and cultural hurdles will pose ongoing challenges for the Turnbull government in seeking to release greater amounts of government data as open data….(More)

OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data


Paper by Taha A Kass-Hout et al in JAMIA: “The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs).

Materials and Methods: Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges.

Results: Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event.
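As a hedged illustration of the kind of query such a case study rests on, the sketch below counts the most frequently reported reactions for one drug using the public openFDA adverse event endpoint (endpoint and field names as documented at open.fda.gov; the choice of aspirin is an arbitrary example, not drawn from the paper).

```python
# Minimal sketch: ask openFDA which adverse reactions are most often
# reported alongside a given drug. No API key is needed at low volumes.
import requests

BASE = "https://api.fda.gov/drug/event.json"

params = {
    "search": 'patient.drug.medicinalproduct:"aspirin"',   # example drug
    "count": "patient.reaction.reactionmeddrapt.exact",    # tally reactions
    "limit": 10,                                           # top 10 terms
}

resp = requests.get(BASE, params=params, timeout=30)
resp.raise_for_status()

for item in resp.json()["results"]:
    print(f'{item["term"]}: {item["count"]} reports')
```

Counts like these reflect raw report volumes, not incidence rates, so they suggest rather than establish an association.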

Conclusion: With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products…(More)”

Data Science ethics


Gov.uk blog: “If Tesco knows day-to-day how poorly the nation is, how can Government access similar insights so it can better plan health services? If Airbnb can give you a tailored service depending on your tastes, how can Government provide people with the right support to help them back into work in a way that is right for them? If companies are routinely using social media data to get feedback from their customers to improve their services, how can Government also use publicly available data to do the same?

Data science allows us to use new types of data and powerful tools to analyse this more quickly and more objectively than any human could. It can put us in the vanguard of policymaking – revealing new insights that lead to better and more tailored interventions. And it can help reduce costs, freeing up resources to spend on more serious cases.

But some of these data uses and machine-learning techniques are new and still relatively untested in Government. Of course, we operate within legal frameworks such as the Data Protection Act and Intellectual Property law. These are flexible but don’t always talk explicitly about the new challenges data science throws up. For example, how are you to explain the decision making process of a deep learning black box algorithm? And if you were able to, how would you do so in plain English and not a row of 0s and 1s?
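There is no settled answer, but partial techniques exist. As one hedged illustration (synthetic data, and permutation importance is a generic technique rather than anything mandated by the framework), the sketch below ranks which inputs a black-box model actually relies on by shuffling each input in turn and measuring the drop in accuracy:

```python
# One partial, plain-English answer to the black-box question:
# permutation importance shuffles each feature and measures how much
# the model's accuracy falls, ranking what the model depends on.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance

# Synthetic stand-in for a real dataset.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for i, drop in enumerate(result.importances_mean):
    print(f"feature_{i}: shuffling it costs {drop:.3f} in accuracy")
```

Techniques like this do not fully open the box, but they offer an account of a model’s behaviour that can be given in plain English rather than a row of 0s and 1s.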

We want data scientists to feel confident to innovate with data, alongside the policy makers and operational staff who make daily decisions on the data that the analysts provide. That’s why we are creating an ethical framework which brings together the relevant parts of the law and ethical considerations into a simple document that helps Government officials decide what it can do and what it should do. We have a moral responsibility to maximise the use of data – which is never more apparent than after incidents of abuse or crime are left undetected – as well as to pay heed to the potential risks of these new tools. The guidelines are draft and not formal government policy, but we want to share them more widely in order to help iterate and improve them further….

So what’s in the framework? There is more detail in the fuller document, but it is based around six key principles:

  1. Start with a clear user need and public benefit: this will help you justify the level of data sensitivity and method you use
  2. Use the minimum level of data necessary to fulfill the public benefit: there are many techniques for doing so, such as de-identification, aggregation or querying against data (see the sketch after this list)
  3. Build robust data science models: the model is only as good as the data it contains and while machines are less biased than humans they can get it wrong. It’s critical to be clear about the confidence of the model and think through unintended consequences and biases contained within the data
  4. Be alert to public perceptions: put simply, what would a normal person on the street think about the project?
  5. Be as open and accountable as possible: Transparency is the antiseptic for unethical behavior. Aim to be as open as possible (with explanations in plain English), although in certain public protection cases the ability to be transparent will be constrained.
  6. Keep data safe and secure: this is not restricted to data science projects but we know that the public are most concerned about losing control of their data….(More)”
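As a hedged sketch of principle 2 (toy data; the cell-size threshold of 5 is an arbitrary example, not a prescribed value), the code below releases aggregate counts instead of row-level records and suppresses any group small enough to risk identifying individuals:

```python
# Data minimisation by aggregation: publish group counts, not raw rows,
# and suppress cells below a minimum size. Toy data throughout.
import pandas as pd

records = pd.DataFrame({
    "region": ["North"] * 7 + ["South"] * 6 + ["East"] * 2,
})

MIN_CELL = 5  # counts below this could single out individuals

counts = records.groupby("region").size().reset_index(name="claimants")
small = counts["claimants"] < MIN_CELL
counts["claimants"] = counts["claimants"].astype(object)
counts.loc[small, "claimants"] = "<5"   # suppress, don't publish, small cells
print(counts)
```

The same instinct extends to the other techniques the principle names: answer aggregate queries against the data rather than shipping the raw records out.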

Five Studies: How Behavioral Science Can Help in International Development


In Pacific Standard: “In 2012, there were 896 million people around the world—12.7 percent of the global population—living on less than two dollars a day. The World Food Program estimates that 795 million people worldwide don’t have enough food to “lead a healthy life”; 25 percent of people living in Sub-Saharan Africa are undernourished. Over three million children die every year thanks to poor nutrition, and hunger is the leading cause of death worldwide. In 2012, just three preventable diseases (pneumonia, diarrhea, and malaria) killed 4,600 children every day.

Last month, the World Bank announced the launch of the Global Insights Initiative (GINI). The initiative, which follows in the footsteps of so-called “nudge units” in the United Kingdom and United States, is the Bank’s effort to incorporate insights from the field of behavioral science into the design of international development programs; too often, those programs failed to account for how people behave in the real world. Development policy, according to the Bank’s 2015 World Development Report, is overdue for a “redesign based on careful consideration of human factors.” Researchers have applauded the announcement, but it raises an interesting question: What can nudges really accomplish in the face of the developing world’s overwhelming poverty and health-care deficits?

In fact, researchers have found that instituting small program changes, informed by a better understanding of people’s motivations and limitations, can have big effects on everything from savings rates to vaccination rates to risky sexual behavior. Here are five studies that demonstrate the benefits of bringing empirical social science into the developing world….(More)”

Smart Urbanism: Utopian vision or false dawn?


Book edited by Simon Marvin, Andrés Luque-Ayala, and Colin McFarlane: “Smart Urbanism (SU) – the rebuilding of cities through the integration of digital technologies with buildings, neighbourhoods, networked infrastructures and people – is being represented as a unique emerging ‘solution’ to the majority of problems faced by cities today. SU discourses, enacted by technology companies, national governments and supranational agencies alike, claim a supremacy of urban digital technologies for managing and controlling infrastructures, achieving greater effectiveness in managing service demand and reducing carbon emissions, developing greater social interaction and community networks, and providing new services around health and social care. Smart urbanism is being represented as the response to almost every facet of the contemporary urban question.

This book explores this common conception of the problematic of smart urbanism and critically addresses what new capabilities are being created by whom and with what exclusions; how these are being developed – and contested; where this is happening both within and between cities; and with what sorts of social and material consequences. The aim of the book is to identify and convene a currently fragmented and disconnected group of researchers, commentators, developers and users from both within and outside the mainstream SU discourse, including several of those that adopt a more critical perspective, to assess ‘what’ problems of the city smartness can address.

The volume provides the first internationally comparative assessment of SU in cities of the global north and south, critically evaluates whether current visions of SU are able to achieve their potential; and then identifies alternative trajectories for SU that hold radical promise for reshaping cities….(More)”

New frontiers in social innovation research


Geoff Mulgan: “Nesta has published a new book with Palgrave which contains an introduction by me and many important chapters from leading academics around the world. I hope that many people will read it, and think about it, because it challenges, in a highly constructive way, many of the rather tired assumptions of the London media/political elite of both left and right.

The essay is by Roberto Mangabeira Unger, perhaps the world’s most creative and important contemporary intellectual. He is Professor of Law at Harvard (where he taught Obama); a philosopher and political theorist; author of one of the most interesting recent books on religion; co-author of an equally ground-breaking recent book on theoretical physics; and serves as strategy minister in the Brazilian government.

His argument is that a radically different way of thinking about politics, government and social change is emerging, which has either not been noticed by many political leaders, or misinterpreted. The essence of the argument is that practice is moving faster than theory; that systematic experimentation is a faster way to solve problems than clever authorship of pamphlets, white papers and plans; and that societies have the potential to be far more active agents of their own future than we assume.

The argument has implications for many fields. One is think-tanks. Twenty years ago I set up a think-tank, Demos. At that time the dominant model for policy making was to bring together some clever people in a capital city to write pamphlets, white papers and then laws. In the 1950s to 1970s a primary role was played by professors in universities, or royal commissions. Then it shifted to think-tanks. Sometimes teams within governments played a similar role – and I oversaw several of these, including the Strategy Unit in government. All saw policy as an essentially paper-based process, involving a linear transmission from abstract theories and analyses to practical implementation.

There’s still an important role to be played by think-tanks. But an opposite approach has now become common, and is promoted by Unger. In this approach, practice precedes theory. Experiment in the real world drives the development of new ideas – in business, civil society, and on the edges of the public sector. Learning by doing complements, and often leads, analysis. The role of the academics and think-tanks shifts from inventing ideas to making sense of what’s emerging, and generalising it. Policies don’t try to specify every detail but rather set out broad directions and then enable a process of experiment and discovery.

As Unger shows, this approach has profound philosophical roots (reaching back to the 19th century pragmatists and beyond), and profound political implications (it’s almost opposite to the classic Marxist view, later adopted by the neoliberal right, in which intellectuals define solutions in theory which are then translated into practice). It also has profound implications for civil society – which he argues should adopt a maximalist rather than a minimalist view of social innovation.

The Unger approach doesn’t work for everything – for example, constitutional reform. But it is a superior method for improving most of the fields where governments have power – from welfare and health, to education and economic policy, and it has worked well for Nesta – evolving new models of healthcare, working with dozens of governments to redesign business policy, testing out new approaches to education.

The several hundred public sector labs and innovation teams around the world – from Chile to China, South Africa to Denmark – share this ethos too, as do many political leaders. Michael Bloomberg has been an exemplar, confident enough to innovate and experiment constantly in his time as New York Mayor. Won Soon Park in Korea is another…..

Unger’s chapter should be required reading for anyone aspiring to play a role in 21st century politics. You don’t have to agree with what he says. But you do need to work out where you disagree and why….(New Frontiers in Social Innovation Research)

Creating Value through Open Data


Press Release: “Capgemini Consulting, the global strategy and transformation consulting arm of the Capgemini Group, today published two new reports on the state of play of Open Data in Europe, to mark the launch of the European Open Data Portal. The first report addresses “Open Data Maturity in Europe 2015: Insights into the European state of play” and the second focuses on “Creating Value through Open Data: Study on the Impact of Re-use of Public Data Resources.” The countries covered by these assessments include the EU28 countries plus Iceland, Liechtenstein, Norway, and Switzerland – commonly referred to as the EU28+ countries. The reports were requested by the European Commission within the framework of the Connecting Europe Facility program, supporting the deployment of European Open Data infrastructure.

Open Data refers to the information collected, produced or paid for by public bodies and can be freely used, modified and shared by anyone. For the period 2016-2020, the direct market size for Open Data is estimated at EUR 325 billion for Europe. Capgemini’s study “Creating Value through Open Data” illustrates how Open Data can create economic value in multiple ways, from increased market transactions and job creation from producing services and products based on Open Data, to cost savings and efficiency gains. For instance, effective use of Open Data could help save 629 million hours of unnecessary waiting time on the roads in the EU; and help reduce energy consumption by 16%. The accumulated cost savings for public administrations making use of Open Data across the EU28+ in 2020 are predicted to equal EUR 1.7 billion. Reaping these benefits requires reaching a high level of Open Data maturity.

In order to address the accessibility and the value of Open Data across European countries, the European Union has launched the Beta version of the European Data Portal. The Portal addresses the whole Data Value Chain, from data publishing to data re-use. Over 240,000 data sets from 34 European countries are referenced on the Portal. It offers seamless access to public data across Europe, with over 13 content categories used to classify data, ranging from health and education to transport, science and justice. Anyone – citizens, businesses, journalists or administrations – can search, access and re-use the full data collection. A wide range of data is available, from crime records in Helsinki, labor mobility in the Netherlands, and forestry maps in France to the impact of digitization in Poland…..

The study, “Open Data Maturity in Europe 2015: Insights into the European state of play”, uses two key indicators: Open Data Readiness and Portal Maturity. These indicators cover both the maturity of national policies supporting Open Data as well as an assessment of the features made available on national data portals. The study shows that the EU28+ have completed just 44% of the journey towards achieving full Open Data Maturity and there are large discrepancies across countries. A third of European countries (32%), recognized globally, are leading the way with solid policies, licensing norms, good portal traffic and many local initiatives and events to promote Open Data and its re-use….(More)”

Government’s innovative approach to skills sharing


Nicole Blake Johnson at GovLoop: “For both managers and employees, it often seems there aren’t enough hours in the day to tackle every priority project.

But what if there was another option — a way for federal managers to get the skills they need internally and for employees to work on projects they’re interested in but unaware of?

Maybe you’re the employee who is really into data analytics or social media, but that’s not a part of your regular job duties. What if you had the support of your supervisor to help out on an analytics project down the hall or in a field office across the country?

I’m not making up hypothetical scenarios. These types of initiatives are actually taking shape at federal agencies, including the Environmental Protection Agency, Social Security Administration, Health and Human Services and Commerce departments.

Many agencies are in the pilot phase of rolling out their programs, which are versions of a governmentwide initiative called GovConnect. The initiative was inspired by an EPA program called Skills Marketplace that dates back to 2011. (Read more about GovConnect here.)

“We felt like we had something really promising at EPA, and we wanted to share it with other government agencies,” said Noha Gaber, EPA’s Director of Internal Communications. “So we actually pitched it to OPM and several other agencies, and that ended up becoming GovConnect.”

“The goal of GovConnect is to develop federal workforce skills through cross-agency collaboration and teamwork, to enable more agile response to mission demands without being unnecessarily limited by organizational silos,” said Melissa Kline Lee, who serves as Program Manager of GovConnect at the Office of Personnel Management. “As part of the President’s Management Agenda, the Office of Personnel Management and Environmental Protection Agency are using the GovConnect pilot to help agencies test and scale new approaches to workforce development.”…

Managers post projects or tasks in the online marketplace, which was developed using the agency’s existing SharePoint environment. Projects include clear tasks that employees can accomplish using up to 20 percent of their workweek or less. Projects cannot be open-ended and should not exceed one year.

From there, any employee can view the projects, evaluate what skills or competencies are needed and apply for the position. Managers review the applications and conduct interviews before selecting a candidate. Here are the latest stats for Skills Marketplace as of November 2015:

  • Managers posted 358 projects in the marketplace
  • Employees submitted 577 applications
  • More than 750 people have created profiles for the marketplace

Gaber shared one example involving an employee from the Office of Pesticide Programs and staff from the Office of Environmental Information (OEI), which is the main IT office at EPA. The employee brought to the team technical expertise and skills in geographic information systems to support OEI’s Toxic Release Inventory Program, which tracks data on toxic chemicals being produced by different facilities.

The benefits were twofold: The employee established new connections in a different part of the agency, and his home office benefited from the experiences and knowledge he gleaned while working on the project….(More)