Oliver Wearn, Robin Freeman and David Jacoby in Nature: “Machine learning (ML) is revolutionizing efforts to conserve nature. ML algorithms are being applied to predict the extinction risk of thousands of species, assess the global footprint of fisheries, and identify animals and humans in wildlife sensor data recorded in the field. These efforts have recently been given a huge boost with support from the commercial sector. New initiatives, such as Microsoft’s AI for Earth and Google’s AI for Social Good, are bringing new resources and new ML tools to bear on some of the biggest challenges in conservation. In parallel to this, the open data revolution means that global-scale, conservation-relevant datasets can be fed directly to ML algorithms from open data repositories, such as Google Earth Engine for satellite data or Movebank for animal tracking data. Added to these will be Wildlife Insights, a Google-supported platform for hosting and
Weather Service prepares to launch prediction model many forecasters don’t trust
Jason Samenow in the Washington Post: “In a month, the National Weather Service plans to launch its “next generation” weather prediction model with the aim of “better, more timely forecasts.” But many meteorologists familiar with the model fear it is unreliable.
The introduction of a model in which forecasters lack confidence matters, considering the enormous impact weather has on the economy, valued at around $485 billion annually.
The Weather Service announced Wednesday that the model, known as the GFS-FV3 (FV3 stands for Finite-Volume Cubed-Sphere dynamical core), is “tentatively” set to become the United States’ primary forecast model on March 20, pending tests. It is an update to the current version of the GFS (Global Forecast System), popularly known as the American model, which has existed in various forms for more than 30 years.
A concern is that if forecasters cannot rely on the FV3, they will be left to rely only on the European model for their predictions without a credible alternative for comparisons. And they’ll also have to pay large fees for the European model data. Whereas model data from the Weather Service is free, the European Center for Medium-Range Weather Forecasts, which produces the European model, charges for access.
But there is an alternative perspective, which is that forecasters will just need to adjust to the new model and learn to account for its biases. That is, a little short-term pain is worth the long-term potential benefits as the model improves.
The Weather Service’s parent agency, the National Oceanic and Atmospheric Administration, recently entered an agreement with the National Center for Atmospheric Research to increase collaboration between forecasters and researchers in improving forecast modeling.
In addition, President Trump recently signed into law the Weather Research and Forecast Innovation Act Reauthorization, which establishes the NOAA Earth Prediction Innovation Center, aimed at further enhancing prediction capabilities. But even while NOAA develops relationships and infrastructure to improve the Weather Service’s modeling, the question remains whether the FV3 can meet the forecasting needs of the moment. Until the problems identified are addressed, its introduction could represent a step back in U.S. weather prediction despite a well-intended effort to leap forward….(More).
Not so gameful: A critical review of gamification in mobile energy applications
Many of these interventions and products have related apps that use gamification in some capacity in order to improve the user experience, offer motivation, and encourage behavior change. We identified 57 apps from nearly 2400 screened apps that both target direct energy use and employ at least one element of gamification.
We evaluated these apps with
New mathematical model can help save endangered species
The risk of extinction varies from species to species depending on how individuals in its populations reproduce and how long each animal survives. Understanding the dynamics of survival and reproduction can support management actions to improve a species’ chances of surviving.
Mathematical and statistical models have become powerful tools to help explain these dynamics. However, the quality of the information we use to construct such models is crucial to improve our chances of accurately predicting the fate of populations in nature.
Colchero’s research focuses on mathematically recreating the population dynamics by better understanding the species’ demography. He works on constructing and exploring stochastic population models that predict how a certain population (for example, an endangered species) will change over time.
These models include mathematical factors to describe how the species’ environment, survival rates and reproduction determine the population’s size and growth. For practical reasons some assumptions are necessary.
Two commonly accepted assumptions are that survival and reproduction are constant with age, and that high survival in the species goes hand in hand with reproduction across all age groups within a species. Colchero challenged these assumptions by accounting for age-specific survival and reproduction, and for trade-offs between survival and reproduction. That is, conditions that favor survival will sometimes be unfavorable for reproduction, and vice versa.

For his work Colchero used statistics, mathematical derivations, and computer simulations with data from wild populations of 24 species of vertebrates. The outcome was a significantly improved model that had more accurate predictions for a species’ population growth.
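As a rough illustration of what such a model involves (a minimal sketch with invented vital rates, not Colchero's actual formulation), the following projects a small age-structured population forward in time using age-specific survival and reproduction:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Illustrative age-specific vital rates for a hypothetical vertebrate (not fitted
# to any real species): survival and reproduction both vary with age.
survival = np.array([0.50, 0.75, 0.85, 0.80, 0.60])    # P(survive the year) at ages 0-4
fecundity = np.array([0.00, 0.40, 0.90, 0.70, 0.30])   # expected offspring per individual

def project(population, years=50):
    """Stochastically project an age-structured population forward in time."""
    trajectory = [population.sum()]
    for _ in range(years):
        # Reproduction: Poisson-distributed offspring produced by each age class.
        births = rng.poisson(fecundity * population).sum()
        # Survival: binomial thinning of each age class.
        survivors = rng.binomial(population, survival)
        # Survivors move up one age class; the last class absorbs its own survivors.
        population = np.zeros_like(population)
        population[0] = births
        population[1:-1] = survivors[:-2]
        population[-1] = survivors[-2] + survivors[-1]
        trajectory.append(population.sum())
    return trajectory

# Start with 20 individuals in each of five age classes and project 50 years.
sizes = project(np.full(5, 20))
print("Population after 50 years:", sizes[-1])
```

A model like Colchero's would additionally encode trade-offs between survival and reproduction and fit the vital rates to field data, rather than fixing them by hand as done here.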
Despite the technical nature of Fernando’s work, this type of model can have very practical implications, as it provides qualified explanations of the underlying reasons for extinction. This can be used to take management actions and may help prevent
IBM aims to use crowdsourced sensor data to improve local weather forecasting globally
Larry Dignan at ZDNet: “IBM is hoping that mobile barometric sensors from individuals opting in,
Big Blue, which owns The Weather Company, will outline the IBM Global High-Resolution Atmospheric Forecasting System (GRAF). GRAF incorporates IoT data in its weather models via crowdsourcing.
While
Mary Glackin, senior vice president of The Weather Company, said the company is “trying to fill in the blanks.” She added, “In a place like India, weather stations are kilometers away. We think this can be as significant as bringing satellite data into models.”
For instance, the developing world gets forecasts based on global data that are updated every 6 hours and resolutions at 10km to 15km. By using GRAF, IBM said it can offer forecasts for the day ahead that are updated hourly on average and have a 3km resolution….(More)”.
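As a purely illustrative sketch of the idea of turning scattered, crowdsourced pressure readings into a high-resolution gridded field (this is not IBM's method; an operational system like GRAF would apply quality control, bias correction and full data assimilation), one could interpolate synthetic phone barometer readings onto a roughly 3 km grid with inverse-distance weighting:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy crowdsourced observations: (latitude, longitude, surface pressure in hPa).
# Values are synthetic; real phone barometers would also need quality control
# and bias correction before being used in a forecast model.
obs = np.column_stack([
    rng.uniform(12.0, 13.0, 500),   # latitude
    rng.uniform(77.0, 78.0, 500),   # longitude
    rng.normal(1010.0, 2.0, 500),   # pressure (hPa)
])

def grid_pressure(observations, lat_grid, lon_grid, power=2.0):
    """Interpolate scattered pressure readings onto a regular grid
    with simple inverse-distance weighting."""
    field = np.empty((lat_grid.size, lon_grid.size))
    for i, lat in enumerate(lat_grid):
        for j, lon in enumerate(lon_grid):
            d = np.hypot(observations[:, 0] - lat, observations[:, 1] - lon) + 1e-6
            w = 1.0 / d**power
            field[i, j] = np.sum(w * observations[:, 2]) / np.sum(w)
    return field

# Roughly 3 km spacing (about 0.03 degrees of latitude) over a 1-degree box.
lats = np.arange(12.0, 13.0, 0.03)
lons = np.arange(77.0, 78.0, 0.03)
pressure_field = grid_pressure(obs, lats, lons)
print(pressure_field.shape)
```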
Los Angeles Accuses Weather Channel App of Covertly Mining User Data
Jennifer Valentino-DeVries and Natasha Singer in The New York Times: “The Weather Channel app deceptively collected, shared and profited from the location information of millions of American consumers, the city attorney of Los Angeles said in a lawsuit filed on Thursday.
One of the most popular online weather services in the United States, the Weather Channel app has been downloaded more than 100 million times and has 45 million active users monthly.
The government said the Weather Company, the business behind the app, unfairly manipulated users into turning on location tracking by implying that the information would be used only to localize weather reports. Yet the company, which is owned by IBM, also used the data for unrelated commercial purposes, like targeted marketing and analysis for hedge funds, according to the lawsuit…
In the complaint, the city attorney excoriated the Weather Company, saying it unfairly took advantage of its app’s popularity and the fact that consumers were likely to give their location data to get local weather alerts. The city said that the company failed to sufficiently disclose its data practices when it got users’ permission to track their location and that it obscured other tracking details in its privacy policy.
“These issues certainly aren’t limited to our state,” Mr. Feuer said. “Ideally this litigation will be the catalyst for other action — either litigation or legislative activity — to protect consumers’ ability to assure their private information remains just that, unless they speak clearly in advance.”…(More)”.
Index: Open Data
By Alexandra Shaw, Michelle Winowatan, Andrew Young, and Stefaan Verhulst
The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on open data and was originally published in 2018.
Value and Impact
- Direct market value of open data in EU from 2016 to 2020: estimated EUR 325 billion
- Predicted number of Open Data jobs in Europe by 2020: 100,000 (35% increase)
- The projected year at which all 28+ EU member countries will have a fully operating open data portal: 2020
- Between 2016 and 2020, the market size of open data in Europe is expected to increase by 36.9%, and reach this value by 2020: EUR 75.7 billion
- Estimated cost savings for public administration in the EU by 2020: EUR 1.7 billion
- 2013 estimates of potential value of global open data, as estimated by McKinsey: $3 trillion annually
  - Potential yearly value for the United States: $1.1 trillion
  - Europe: $900 billion
  - Rest of the world: $1.7 trillion
- Potential yearly value of open data in Australia: AUD 25 billion
- Value of Transport for London open data projects: £115 million per year
- Value that open data can help unlock in economic value annually across seven sectors in the United States: $3-5 trillion
Public Views on and Use of Open Government Data
- Number of Americans who do not trust the federal government or social media sites to protect their data: Approximately 50%
- Key findings from The Economist Intelligence Unit report on Open Government Data Demand:
  - Percentage of respondents who say the key reason why governments open up their data is to create greater trust between the government and citizens: 70%
  - Percentage of respondents who say OGD plays an important role in improving lives of citizens: 78%
  - Percentage of respondents who say OGD helps with daily decision making especially for transportation, education, environment: 53%
  - Percentage of respondents who cite lack of awareness about OGD and its potential use and benefits as the greatest barrier to usage: 50%
  - Percentage of respondents who say they lack access to usable and relevant data: 31%
  - Percentage of respondents who think they don’t have sufficient technical skills to use open government data: 25%
  - Percentage of respondents who feel the number of OGD apps available is insufficient, indicating an opportunity for app developers: 20%
  - Percentage of respondents who say OGD has the potential to generate economic value and new business opportunity: 61%
  - Percentage of respondents who say they don’t trust governments to keep data safe, protected, and anonymized: 19%
Efforts and Involvement
- Time that’s passed since open government advocates convened to create a set of principles for open government data – the instance that started the open data government movement: 10 years
- Countries participating in the Open Government Partnership today: 79 OGP participating countries and 20 subnational governments
- Percentage of “open data readiness” in Europe according to European Data Portal: 72%
  - Open data readiness consists of four indicators which are presence of policy, national coordination, licensing norms, and use of data.
- Number of U.S. cities with Open Data portals: 27
- Number of governments who have adopted the International Open Data Charter: 62
- Number of non-state organizations endorsing the International Open Data Charter: 57
- Number of countries analyzed by the Open Data Index: 94
- Number of Latin American countries that do not have open data portals as of 2017: 4 total – Belize, Guatemala, Honduras and Nicaragua
- Number of cities participating in the Open Data Census: 39
Demand for Open Data
- Open data demand measured by frequency of open government data use (percentage of respondents), according to The Economist Intelligence Unit report:
  - Australia: monthly 15%, quarterly 22%, annually 10%
  - Finland: monthly 28%, quarterly 18%, annually 20%
  - France: monthly 27%, quarterly 17%, annually 19%
  - India: monthly 29%, quarterly 20%, annually 10%
  - Singapore: monthly 28%, quarterly 15%, annually 17%
  - UK: monthly 23%, quarterly 21%, annually 15%
  - US: monthly 16%, quarterly 15%, annually 20%
- Number of FOIA requests received in the US for fiscal year 2017: 818,271
- Number of FOIA requests processed in the US for fiscal year 2017: 823,222
- Distribution of FOIA requests in 2017 among the top 5 agencies with the highest number of requests:
  - DHS: 45%
  - DOJ: 10%
  - NARA: 7%
  - DOD: 7%
  - HHS: 4%
Examining Datasets
- Country with highest index score according to ODB Leaders Edition: Canada (76 out of 100)
- Country with lowest index score according to ODB Leaders Edition: Sierra Leone (22 out of 100)
- Number of datasets open in the top 30 governments according to ODB Leaders Edition: Fewer than 1 in 5
- Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition: 19%
- Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition, by sector/subject:
  - Budget: 30%
  - Companies: 13%
  - Contracts: 27%
  - Crime: 17%
  - Education: 13%
  - Elections: 17%
  - Environment: 20%
  - Health: 17%
  - Land: 7%
  - Legislation: 13%
  - Maps: 20%
  - Spending: 13%
  - Statistics: 27%
  - Trade: 23%
  - Transport: 30%
- Percentage of countries that release data on government spending according to ODB Leaders Edition: 13%
- Percentage of government data that is updated at regular intervals according to ODB Leaders Edition: 74%
- Number of datasets available through:
- Number of datasets classed as “open” in 94 places worldwide analyzed by the Open Data Index: 11%
- Percentage of open datasets in the Caribbean, according to Open Data Census: 7%
- Number of companies whose data is available through OpenCorporates: 158,589,950
City Open Data
- New York City
  - Number of datasets available through NYC Open Data: 2170
  - New datasets published in New York City for fiscal year 2018: 629 (2,000+ in total)
  - Number of automated datasets in New York City for fiscal year 2018: 246 (38 new datasets added)
  - Open data demand in New York City in fiscal year 2018 measured by:
    - Unique users: 1,000,000+
    - Requests by application: 550+ million
- Singapore
- Barcelona
- London
- Bandung
  - Number of datasets published in Bandung: 1,417
- Buenos Aires
  - Number of datasets published in Buenos Aires: 216
- Dubai
  - Number of datasets published in Dubai: 267
- Melbourne
  - Number of datasets published in Melbourne: 199
Sources
- About OGP, Open Government Partnership, 2018.
- Analytical Report no. 9: The Economic Benefits of Open Data, European Data Portal, 2017.
- Creating Value through Open Data: Study on the Impact of Re-use of Public Data Resources, European Data Portal, 2015.
- European Data Portal Datasets, European Data Portal.
- Find Open Data, UK Government Data.
- Global Open Data Index: Dataset Overview, Open Knowledge International.
- Global Open Data Index: Place Overview, Open Knowledge International.
- Local Data Catalog, DATA.GOV.
- Many Americans do not trust modern institutions to protect their personal data – even as they frequently neglect cybersecurity best practices in their own personal lives, Pew Research Center – Internet and Technology, 2017.
- NYC Open Data, City of New York, 2018.
- Open Data Barometer, World Wide Web Foundation, 2017.
- Open Data Barometer 4th Edition, World Wide Web Foundation, 2017.
- Open Data Barometer Leaders Edition: From Promise to Progress, World Wide Web Foundation, 2018.
- Open Data for All Report, NYC DoITT, 2018.
- Open data service of the Barcelona City Council, City of Barcelona.
- Open Data in Europe, European Data Portal, 2018.
- Open Government Data: Assessing demand around the world, The Economist Intelligence Unit, 2017.
- “Policy in the Data Age: Data Enablement for the Common Good,” Karim Tadjeddine and Martin Lundqvist, McKinsey and Company, August 2016.
- Search Data, Australian Government.
- Singapore Open Data Portal, Singaporean Government.
- Starting an Open Data Initiative, The World Bank.
- Summary of Annual FOIA Reports for Fiscal Year 2017, US Department of Justice, 2017.
- The home of the U.S. Government’s Open Data, DATA.GOV, 2018.
- The Open Database of the Corporate World, OpenCorporates, 2018.
- The State of Open Data Portals in Latin America, Center for Data Innovation, 2017.
- Tracking the State of Open Government Data, Open Knowledge International.
- U.S. ranks 4th in open data, with leadership by cities and states helping support the numbers, Statescoop, 2017.
- What is the Open Data Survey?, Open Data Census.
Citizen science for environmental policy: Development of an EU-wide inventory and analysis of selected practices
EU Science Hub: “Citizen science is the non-professional involvement of volunteers in the scientific process, whether in the data collection phase or in other phases of the research.
It can be a powerful tool for environmental management that has the potential to inform an increasingly complex environmental policy landscape and to meet the growing demands from society for more participatory decision-making.
While there is growing interest from international bodies and national governments in citizen science, the evidence that it can successfully contribute to environmental policy development, implementation, evaluation or compliance remains scant.
Central to elucidating this question is a better understanding of the benefits delivered by citizen science, that is, to determine to what extent these benefits can contribute to environmental policy, and to establish whether projects that provide policy support also co-benefit science and encourage meaningful citizen engagement.
EU-wide inventory
In order to get an evidence base of citizen science activities that can support environmental policies in the European Union (EU), the European Commission (DG ENV, with the support of DG JRC) contracted Bio Innovation Service (FR), in association with Fundacion Ibercivis (ES) and The Natural History Museum (UK), to perform a “Study on an inventory of citizen science activities for environmental policies”.
The first objective was to develop an inventory of citizen science projects relevant for environmental policy and assess how these projects contribute to the Sustainable Development Goals (SDGs) set by the United Nations (UN) General Assembly.
To this end, desk research and an EU-wide survey were used to identify 503 citizen science projects of relevance to environmental policy.

The study demonstrates the breadth of citizen science that can be of relevance to environmental policy:
- Government support, not only in funding but also through active participation in the design and implementation of a project, appears to be a key factor for the successful uptake of citizen science in environmental policy.
- An easy engagement process for citizens – that is, projects requiring limited effort and few a priori skills – facilitates policy uptake.
- Scientific aspects on the other hand did not appear to affect the policy uptake of the analysed projects, but they were a strong determinant of how well the project could serve policy: projects with high scientific standards and endorsed by scientists served more phases of the environmental policy cycle.
In conclusion, this study demonstrates that citizen science has the potential to be a cost-effective way to contribute to policy and highlights the importance of fostering a diversity of citizen science activities and their innovativeness
New methods help identify what drives sensitive or socially unacceptable behaviors
Mary Guiden at Physorg: “Conservation scientists and statisticians at Colorado State University have teamed up to solve a key problem for the study of sensitive behaviors like poaching, harassment, bribery, and drug use.
Sensitive behaviors—defined as socially unacceptable or not compliant with rules and regulations—are notoriously hard to study, researchers say, because people often do not want to answer direct questions about them.
To overcome this challenge, scientists have developed indirect questioning approaches that protect responders’ identities. However, these methods also make it difficult to predict which sectors of a population are more likely to participate in sensitive behaviors, and which factors, such as knowledge of laws, education, or income, influence the probability that an individual will engage in a sensitive behavior.
Assistant Professor Jennifer Solomon and Associate Professor Michael Gavin of the Department of Human Dimensions of Natural Resources at CSU, and Abu Conteh from MacEwan University in Alberta, Canada, have teamed up with Professor Jay Breidt and doctoral student Meng Cao in the CSU Department of Statistics to develop a new method to solve the problem.
The study, “Understanding the drivers of sensitive behavior using Poisson regression from quantitative randomized response technique data,” was published recently in PLOS One.
Conteh, who, as a doctoral student, worked with Gavin in New Zealand, used a specific technique, known as quantitative randomized response, to elicit confidential answers to questions on behaviors related to non-compliance with natural resource regulations from a protected area in Sierra Leone.
In this technique, the researcher conducting interviews has a large container containing
Armed with the new computer program, the scientists found that people from rural communities with less access to jobs in urban centers were more likely to hunt in the reserve. People in communities with a greater
The researchers said that collaborating across disciplines was and is key to addressing complex problems like this one. It is commonplace for people to be noncompliant with rules and regulations and equally important for social scientists to analyze these behaviors
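To make the randomized response idea concrete, here is a simplified sketch (a forced-response style variant with invented numbers, not the exact container design or the Poisson regression analysis used in the paper): each respondent either answers truthfully or reports a decoy value drawn from a known distribution, so the researcher can estimate the average level of the sensitive behavior without learning any individual's true answer.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# --- Simulate a quantitative randomized response survey (simplified design) ---
# With probability p_truth a respondent reports their true count of the sensitive
# behavior; otherwise they report a value drawn from a known decoy distribution.
# Both the true counts and the decoy values are invented for illustration only.
n_respondents = 2000
p_truth = 0.7
true_counts = rng.poisson(lam=1.5, size=n_respondents)     # e.g. hunting trips per year
decoy_values = rng.integers(0, 6, size=n_respondents)      # known distribution, mean 2.5
answered_truthfully = rng.random(n_respondents) < p_truth
reported = np.where(answered_truthfully, true_counts, decoy_values)

# --- Moment-based estimate of the true mean ---
# E[reported] = p_truth * mu_true + (1 - p_truth) * mu_decoy, so
# mu_true = (mean(reported) - (1 - p_truth) * mu_decoy) / p_truth
mu_decoy = 2.5
mu_true_hat = (reported.mean() - (1 - p_truth) * mu_decoy) / p_truth
print(f"True mean: {true_counts.mean():.2f}, estimated mean: {mu_true_hat:.2f}")
```

The published study extends this identification idea to a Poisson regression, which is what allows covariates such as access to urban jobs to be linked to the expected level of the sensitive behavior.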
Nudging compliance in government: A human-centered approach to public sector program design
Article by Michelle Cho, Joshua Schoop, Timothy Murphy: “What are the biggest challenges facing government? Bureaucracy? Gridlock? A shrinking pool of resources?
Chances are compliance—when people act in accordance with preset rules, policies, and/or expectations—doesn’t top the list for many. Yet maybe it should. Compliance touches nearly every aspect of public policy implementation. Over the past 10 years, US government spending on compliance reached US$7.5 billion.
Even the most sophisticated and well-planned policies often require cooperation and input from real humans to be successful. From voluntary tax filing at the Internal Revenue Service (IRS) to reducing greenhouse emissions at the Environmental Protection Agency (EPA), to achieving the public policy outcomes decision-makers intend, compliance is fundamental.
Consider these examples of noncompliance and their costs:
- Taxes. By law, the IRS requires all income-earning, eligible constituents to file and pay their owed taxes. Tax evasion—the illegal nonpayment or underpayment of tax—cost the federal government an average of US$458 billion per year between 2008 and 2010. The IRS believes it will recover just 11 percent of the amount lost in that time frame.
- The environment. The incorrect disposal of recyclable materials has cost more than US$744 million in the state of Washington since 2009. A city audit in San Diego found that 76 percent of materials disposed of citywide are recyclable and estimates that those recyclables could power 181,000 households for a year or conserve 3.4 million barrels of oil.
Those who fail to comply with these rules could face direct and indirect consequences, including penalties and even jail time. Yet a significant subset of the population still behaves in a noncompliant manner. Why?
Behavioral sciences offer some clues. Through the combination of psychology, economics, and neuroscience, behavioral sciences demonstrate that people do not always do what is asked of them, even when it seems in their best interest to do so. Often, people choose a noncompliant path because of one of these reasons: They are unaware of their improper behavior, they find the “right” choice is too complex to decipher, or they simply are not intrinsically motivated to make the compliant choice.
For any of these reasons, when a cognitive hurdle emerges, some people resort to noncompliant behavior. But these hurdles can be overcome. Policymakers can use these same behavioral insights to understand why noncompliance occurs and alternatively, employ behavioral-inspired tools to encourage compliant behavior in a more agile and resource-efficient fashion.
In this spirit, leaders can take a more human-centered approach to program design by using behavioral science lessons to develop policies and programs in a manner that can make compliance easier and more appealing. In our article, we discuss three common reasons behind noncompliance and how better, more human-centered design can help policymakers achieve more positive results….(More)”.