Study finds that a GPS outage would cost $1 billion per day


Eric Berger at Ars Technica: “….one of the most comprehensive studies on the subject has assessed the value of this GPS technology to the US economy and examined what effect a 30-day outage would have—whether it’s due to a severe space weather event or “nefarious activity by a bad actor.” The study was sponsored by the US government’s National Institute of Standards and Technology and performed by a North Carolina-based research organization named RTI International.

Economic effect

As part of the analysis, researchers spoke to more than 200 experts in the use of GPS technology for various services, from agriculture to the positioning of offshore drilling rigs to location services for delivery drivers. (If they’d spoken to me, I’d have said the value of using GPS to navigate Los Angeles freeways and side streets was incalculable). The study covered a period from 1984, when the nascent GPS network was first opened to commercial use, through 2017. It found that GPS has generated an estimated $1.4 trillion in economic benefits during that time period.

The researchers found that the largest benefit, valued at $685.9 billion, came in the “telecommunications” category, including improved reliability and bandwidth utilization for wireless networks. Telematics (efficiency gains, cost reductions, and environmental benefits through improved vehicle dispatch and navigation) ranked as the second most valuable category at $325 billion. Location-based services on smartphones ranked third, valued at $215 billion.

Notably, the value of GPS technology to the US economy is growing. According to the study, 90 percent of the technology’s financial impact has come since 2010 alone, roughly the final 20 percent of the study period. Some sectors of the economy are only beginning to realize the value of GPS technology, or are identifying new uses for it, the report says, indicating that its value as a platform for innovation will continue to grow.

Outage impact

In the case of some adverse event leading to a widespread outage, the study estimates that the loss of GPS service would have a $1 billion per-day impact, although the authors acknowledge this is at best a rough estimate. It would likely be higher during the planting season of April and May, when farmers are highly reliant on GPS technology for information about their fields.

To assess the effect of an outage, the study looked at several different variables. Among them was “precision timing” that enables a number of wireless services, including the synchronization of traffic between carrier networks, wireless handoff between base stations, and billing management. Moreover, higher levels of precision timing enable higher bandwidth and provide access to more devices. (For example, the implementation of 4G LTE technology would have been impossible without GPS technology)….(More)”
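
The timing dependence described above is easier to appreciate with a rough calculation. The sketch below is illustrative only; the ±1.5 microsecond phase budget (a commonly cited figure for LTE TDD cell synchronisation) and the oscillator stabilities are assumed values, not figures from the study. It estimates how long a base station’s local oscillator could hold network timing once the GPS reference disappears.

```python
# Illustrative sketch only: the timing budget and oscillator stabilities below
# are assumed figures, not taken from the RTI/NIST study.

def holdover_seconds(freq_stability: float, budget_us: float) -> float:
    """Seconds until accumulated phase error exceeds the timing budget,
    assuming a constant fractional frequency offset (a simplification)."""
    return (budget_us * 1e-6) / freq_stability

# Assume a +/-1.5 microsecond phase budget and two oscillator classes.
for name, stability in [("OCXO ~1e-10", 1e-10), ("TCXO ~1e-8", 1e-8)]:
    hours = holdover_seconds(stability, budget_us=1.5) / 3600
    print(f"{name}: roughly {hours:.2f} hours of holdover without GPS")
```

Under these assumptions, a high-quality oven-controlled oscillator buys a few hours of holdover, while a cheaper oscillator drifts out of spec within minutes, which is one reason a multi-day outage would be so disruptive for wireless networks.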

Can we nudge farmers into saving water? Evidence from a randomised experiment


Paper by Sylvain Chabé-Ferret, Philippe Le Coent, Arnaud Reynaud, Julie Subervie and Daniel Lepercq: “We test whether social comparison nudges can promote water-saving behaviour among farmers as a complement to traditional CAP measures. We conducted a randomised controlled trial among 200 farmers equipped with irrigation smart meters in South-West France. Treated farmers received weekly information on individual and group water consumption over four months. Our results rule out medium to large effect-sizes of the nudge. Moreover, they suggest that the nudge was effective at reducing the consumption of those who irrigate the most, although it appears to have reduced the proportion of those who do not consume water at all….(More)”.
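
For readers unfamiliar with how a trial can “rule out” medium to large effect sizes, here is a minimal sketch of the kind of two-arm comparison involved, using entirely hypothetical data and variable names rather than the authors’ dataset.

```python
# Minimal sketch with entirely hypothetical data: a two-arm comparison of
# seasonal water use and the confidence interval used to bound the effect size.
import numpy as np

rng = np.random.default_rng(0)
control = rng.gamma(shape=2.0, scale=500.0, size=100)  # m3 used, control farmers
treated = rng.gamma(shape=2.0, scale=480.0, size=100)  # m3 used, nudged farmers

diff = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / len(treated) + control.var(ddof=1) / len(control))
ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

# Put the estimate on the standardised (Cohen's d) scale the abstract refers to.
pooled_sd = np.sqrt((treated.var(ddof=1) + control.var(ddof=1)) / 2)
print(f"difference in means: {diff:.1f} m3, 95% CI [{ci_low:.1f}, {ci_high:.1f}]")
print(f"Cohen's d: {diff / pooled_sd:.2f}, approx. 95% CI "
      f"[{ci_low / pooled_sd:.2f}, {ci_high / pooled_sd:.2f}]")
# A 'medium' effect (|d| around 0.5) is ruled out if the whole CI sits closer to zero.
```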

IBM aims to use crowdsourced sensor data to improve local weather forecasting globally


Larry Dignan at ZDNet: “IBM is hoping that mobile barometric sensors from individuals opting in, supercomputing, and the Internet of Things can make weather forecasting more local globally.

Big Blue, which owns The Weather Company, will outline the IBM Global High-Resolution Atmospheric Forecasting System (GRAF). GRAF incorporates IoT data in its weather models via crowdsourcing.

While hyperlocal weather forecasts are available in the US, Japan, and some parts of Western Europe, many regions in the world lack an accurate picture of weather.

Mary Glackin, senior vice president of The Weather Company, said the company is “trying to fill in the blanks.” She added, “In a place like India, weather stations are kilometers away. We think this can be as significant as bringing satellite data into models.”

For instance, the developing world gets forecasts based on global data that are updated every 6 hours and have resolutions of 10km to 15km. By using GRAF, IBM said it can offer forecasts for the day ahead that are updated hourly on average and have a 3km resolution….(More)”.
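
The excerpt does not describe GRAF’s internals, but the general idea of crowdsourced barometric input can be sketched simply: average opted-in phone pressure readings onto roughly 3 km grid cells so a model can assimilate them. The grid size, quality-control bounds and field names below are assumptions for illustration, not IBM’s API.

```python
# Illustrative sketch of crowdsourced pressure input: average opted-in phone
# barometer readings onto ~3 km grid cells. Grid size, bounds and field names
# are assumptions, not GRAF's actual interface.
from collections import defaultdict

CELL_DEG = 0.03  # roughly 3 km in latitude

def cell(lat: float, lon: float) -> tuple:
    return (round(lat / CELL_DEG), round(lon / CELL_DEG))

def grid_pressure(readings):
    """readings: iterable of (lat, lon, pressure_hPa) from opted-in devices."""
    sums, counts = defaultdict(float), defaultdict(int)
    for lat, lon, p in readings:
        if 870.0 <= p <= 1085.0:          # crude quality control on plausible values
            sums[cell(lat, lon)] += p
            counts[cell(lat, lon)] += 1
    return {c: sums[c] / counts[c] for c in sums}

sample = [(28.61, 77.20, 1006.2), (28.612, 77.201, 1005.8), (19.07, 72.87, 1009.5)]
print(grid_pressure(sample))
```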

Seven design principles for using blockchain for social impact


Stefaan Verhulst at Apolitical: “2018 will probably be remembered as the bust of the blockchain hype. Yet even as cryptocurrencies continue to sink in value and popular interest, the potential of using blockchain technologies to achieve social ends remains important to consider but poorly understood.

In 2019, business will continue to explore blockchain for sectors as disparate as finance, agriculture, logistics and healthcare. Policymakers and social innovators should also leverage 2019 to become more sophisticated about blockchain’s real promise, limitations and current practice.

In a recent report I prepared with Andrew Young, with the support of the Rockefeller Foundation, we looked at the potential, risks and challenges of using blockchain for social change — or “Blockchan.ge.” A number of implementations and platforms are already demonstrating potential social impact.

The technology is now being used to address issues as varied as homelessness in New York City, the Rohingya crisis in Myanmar and government corruption around the world.

In an illustration of the breadth of current experimentation, Stanford’s Center for Social Innovation recently analysed and mapped nearly 200 organisations and projects trying to create positive social change using blockchain. Likewise, the GovLab is developing a mapping of blockchange implementations across regions and topic areas; it currently contains 60 entries.

All these examples provide impressive — and hopeful — proof of concept. Yet despite the very clear potential of blockchain, there has been little systematic analysis. For what types of social impact is it best suited? Under what conditions is it most likely to lead to real social change? What challenges does blockchain face, what risks does it pose and how should these be confronted and mitigated?

These are just some of the questions our report, which builds its analysis on 10 case studies assembled through original research, seeks to address.

While the report is focused on identity management, it contains a number of lessons and insights that are applicable more generally to the subject of blockchange.

In particular, it contains seven design principles that can guide individuals or organisations considering the use of blockchain for social impact. We call these the Genesis principles, and they are outlined at the end of this article…(More)”.

Blockchain systems are tracking food safety and origins


Nir Kshetri at The Conversation: “When a Chinese consumer buys a package labeled “Australian beef,” there’s only a 50-50 chance the meat inside is, in fact, Australian beef. It could just as easily contain rat, dog, horse or camel meat – or a mixture of them all. It’s gross and dangerous, but also costly.

Fraud in the global food industry is a multi-billion-dollar problem that has lingered for years, duping consumers and even making them ill. Food manufacturers around the world are concerned – as many as 39 percent of them are worried that their products could be easily counterfeited, and 40 percent say food fraud is hard to detect.

In researching blockchain for more than three years, I have become convinced that this technology’s potential to prevent fraud and strengthen security could fight agricultural fraud and improve food safety. Many companies agree, and are already running various tests, including tracking wine from grape to bottle and even following individual coffee beans through international trade.

Tracing food items

An early trial of a blockchain system to track food from farm to consumer was in 2016, when Walmart collected information about pork being raised in China, where consumers are rightly skeptical about sellers’ claims of what their food is and where it’s from. Employees at a pork farm scanned images of farm inspection reports and livestock health certificates, storing them in a secure online database where the records could not be deleted or modified – only added to.

As the animals moved from farm to slaughter to processing, packaging and then to stores, the drivers of the freight trucks played a key role. At each step, they would collect documents detailing the shipment, storage temperature and other inspections and safety reports, and official stamps as authorities reviewed them – just as they did normally. In Walmart’s test, however, the drivers would photograph those documents and upload them to the blockchain-based database. The company controlled the computers running the database, but government agencies’ systems could also be involved, to further ensure data integrity.

As the pork was packaged for sale, a sticker was put on each container, displaying a smartphone-readable code that would link to that meat’s record on the blockchain. Consumers could scan the code right in the store and assure themselves that they were buying exactly what they thought they were. More recent advances in the technology of the stickers themselves have made them more secure and counterfeit-resistant.
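
The append-only, tamper-evident record keeping described above can be sketched in a few lines of code. This is not Walmart’s or IBM Food Trust’s actual implementation; the class and field names are hypothetical, and the point is simply that records can be added and looked up by batch, but never edited.

```python
# Minimal sketch of an append-only, hash-chained ledger for shipment documents.
# Not Walmart's or IBM Food Trust's implementation; names are hypothetical.
import hashlib, json, time

class ShipmentLedger:
    def __init__(self):
        self.chain = []                      # records can be added, never edited

    def add_record(self, batch_id: str, document: dict) -> str:
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        record = {"batch_id": batch_id, "document": document,
                  "timestamp": time.time(), "prev_hash": prev_hash}
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.chain.append(record)
        return record["hash"]                # could be encoded in the package sticker

    def history(self, batch_id: str):
        """What a shopper scanning the sticker's code would retrieve."""
        return [r for r in self.chain if r["batch_id"] == batch_id]

ledger = ShipmentLedger()
ledger.add_record("pork-batch-42", {"step": "farm inspection", "temp_c": 3.5})
ledger.add_record("pork-batch-42", {"step": "cold storage", "temp_c": 2.1})
print(ledger.history("pork-batch-42"))
```

Because each record embeds the hash of the previous one, altering any earlier document would break the chain, which is what makes the audit trail tamper-evident.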

Walmart did similar tests on mangoes imported to the U.S. from Latin America. The company found that it took only 2.2 seconds for consumers to find out an individual fruit’s weight, variety, growing location, time it was harvested, date it passed through U.S. customs, when and where it was sliced, which cold-storage facility the sliced mango was held in and for how long it waited before being delivered to a store….(More)”.

Better “nowcasting” can reveal what weather is about to hit within 500 meters


MIT Technology Review: “Weather forecasting is impressively accurate given how changeable and chaotic Earth’s climate can be. It’s not unusual to get 10-day forecasts with a reasonable level of accuracy.

But there is still much to be done. One challenge for meteorologists is to improve their “nowcasting,” the ability to forecast weather in the next six hours or so at a spatial resolution of a square kilometer or less.

In areas where the weather can change rapidly, that is difficult. And there is much at stake. Agricultural activity is increasingly dependent on nowcasting, and the safety of many sporting events depends on it too. Then there is the risk that sudden rainfall could lead to flash flooding, a growing problem in many areas because of climate change and urbanization. That has implications for infrastructure, such as sewage management, and for safety, since this kind of flooding can kill.

So meteorologists would dearly love to have a better way to make their nowcasts.

Enter Blandine Bianchi from EPFL in Lausanne, Switzerland, and a few colleagues, who have developed a method for combining meteorological data from several sources to produce nowcasts with improved accuracy. Their work has the potential to change the utility of this kind of forecasting for everyone from farmers and gardeners to emergency services and sewage engineers.

Current forecasting is limited by the data and the scale on which it is gathered and processed. For example, satellite data has a spatial resolution of 50 to 100 km and allows the tracking and forecasting of large cloud cells over a time scale of six to nine hours. By contrast, radar data is updated every five minutes, with a spatial resolution of about a kilometer, and leads to predictions on the time scale of one to three hours. Another source of data is the microwave links used by telecommunications companies, whose signals are attenuated by rainfall….(More)”
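
The excerpt does not detail Bianchi and colleagues’ actual method, but the basic multi-source idea can be illustrated with a toy inverse-variance blend: weight each rain-rate estimate (radar, satellite, microwave-link attenuation) by an assumed error variance for the lead time in question. All numbers below are made up for illustration.

```python
# Toy sketch of multi-source nowcasting: blend rain-rate estimates from radar,
# satellite, and microwave-link attenuation by inverse error variance.
# Not the method of Bianchi et al.; all numbers are illustrative.
def blend(estimates):
    """estimates: list of (rain_rate_mm_h, error_variance). Inverse-variance mean."""
    weights = [1.0 / var for _, var in estimates]
    blended = sum(w * r for (r, _), w in zip(estimates, weights)) / sum(weights)
    return blended

# Assumed numbers: radar is most precise at short lead times, satellite the least.
sources = [(4.2, 0.5),   # radar-derived rain rate
           (3.1, 4.0),   # satellite-derived rain rate
           (4.8, 1.5)]   # rain rate inferred from microwave-link attenuation
print(f"blended nowcast: {blend(sources):.1f} mm/h")
```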

Governments fail to capitalise on swaths of open data


Valentina Romei in the Financial Times: “…Behind the push for open data is a desire to make governments more transparent, accountable and efficient — but also to allow businesses to create products and services that spark economic development. The global annual opportunity cost of failing to do this effectively is about $5tn, according to one estimate from McKinsey, the consultancy.

The UK is not the only country falling short, says the Open Data Barometer, which monitors the status of government data across the world. Among the 30 leading governments — those that have championed the open data movement and have made progress over five years — “less than a quarter of the data with the biggest potential for social and economic impact” is truly open. This goal of transparency, it seems, has not proved sufficient for “creating value” — the movement’s latest focus. In 2015, nearly a decade after advocates first discussed the principles of open government data, 62 countries adopted the six Open Data Charter principles — which called for data to be open by default, usable and comparable….

The use of open data has already borne fruit for some countries. In 2015, Japan’s ministry of land, infrastructure and transport set up an open data site aimed at disabled and elderly people. The 7,000 data points published are downloadable and the service can be used to generate a map that shows which passenger terminals on train, bus and ferry networks provide barrier-free access.

In the US, The Climate Corporation, a digital agriculture company, combined 30 years of weather data and 60 years of crop yield data to help farmers increase their productivity. And in the UK, subscription service Land Insight merges different sources of land data to help individuals and developers compare property information, forecast selling prices, contact land owners and track planning applications…
Open Data 500, an international network of organisations that studies the use and impact of open data, reveals that private companies in South Korea are using government agency data, with technology, advertising and business services among the biggest users. It shows, for example, that Archidraw, a four-year-old Seoul-based company that provides 3D visualisation tools for interior design and property remodelling, has used mapping data from the Ministry of Land, Infrastructure and Transport…(More)”.

Positive deviance, big data, and development: A systematic literature review


Paper by Basma Albanna and Richard Heeks: “Positive deviance is a growing approach in international development that identifies those within a population who are outperforming their peers in some way, eg, children in low‐income families who are well nourished when those around them are not. Analysing and then disseminating the behaviours and other factors underpinning positive deviance are demonstrably effective in delivering development results.

However, positive deviance faces a number of challenges that are restricting its diffusion. In this paper, using a systematic literature review, we analyse the current state of positive deviance and the potential for big data to address the challenges facing positive deviance. From this, we evaluate the promise of “big data‐based positive deviance”: This would analyse typical sources of big data in developing countries—mobile phone records, social media, remote sensing data, etc—to identify both positive deviants and the factors underpinning their superior performance.

While big data cannot solve all the challenges facing positive deviance as a development tool, they could reduce time, cost, and effort; identify positive deviants in new or better ways; and enable positive deviance to break out of its current preoccupation with public health into domains such as agriculture, education, and urban planning. In turn, positive deviance could provide a new and systematic basis for extracting real‐world development impacts from big data…(More)”.
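
The identification step the authors describe can be sketched simply: flag the units whose outcome sits well above the mean of peers facing similar conditions. The dataset, field names and z-score threshold below are hypothetical, a minimal sketch of the idea rather than the paper’s method.

```python
# Minimal sketch of the identification step in "big data-based positive deviance":
# flag units that outperform peers facing similar conditions. Data, fields and
# the z-score threshold are hypothetical.
import pandas as pd

def positive_deviants(df: pd.DataFrame, outcome: str, peer_group: str, z_min: float = 2.0):
    """Return rows whose outcome is at least z_min standard deviations above their peer-group mean."""
    grouped = df.groupby(peer_group)[outcome]
    z = (df[outcome] - grouped.transform("mean")) / grouped.transform("std")
    return df[z >= z_min]

# Hypothetical example: crop yield estimated from remote sensing, grouped by agro-zone.
df = pd.DataFrame({
    "village": ["A", "B", "C", "D", "E", "F"],
    "agro_zone": [1, 1, 1, 2, 2, 2],
    "yield_t_ha": [1.1, 1.2, 2.9, 0.8, 0.9, 0.85],
})
print(positive_deviants(df, "yield_t_ha", "agro_zone", z_min=1.1))
```

In practice the outcome would come from big data sources such as satellite imagery or mobile phone records, and the flagged deviants would then be studied to understand the behaviours behind their performance.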

Study: Crowdsourced Hospital Ratings May Not Be Fair


Samantha Horton at WFYI: “Though many websites offer non-scientific ratings on a number of services, two Indiana University scientists say judging hospitals that way likely isn’t fair.

Their recently released study compares the federal government’s Hospital Compare with crowdsourced sites such as Facebook, Yelp and Google. The research finds it’s difficult for people to accurately understand everything a hospital does, and that leads to biased ratings.

Patient experiences with food, amenities and bedside manner often align with federal government ratings. But IU professor Victoria Perez says judging quality of care and safety is much more nuanced and people often get it wrong.

“About 20 percent of the hospitals rated best within a local market on social media were rated worst in that market by Hospital Compare in terms of patient health outcomes,” she says.

For the crowdsourced ratings to be more useful, Perez says people would have to know how to cross-reference them with a more reliable data source, such as Hospital Compare. But even that site can be challenging to navigate depending on what the consumer is looking for.

“If you have a condition-specific concern and you can see the clinical measure for a hospital that may be helpful,” says Perez. “But if your particular medical concern is not listed there, it might be hard to extrapolate from the ones that are listed or to know which ones you should be looking at.”

She says consumers would need more information about patient outcomes and other quality metrics to be able to reliably crowdsource a hospital on a site such as Google…(More)”.
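
The cross-check Perez describes, comparing crowdsourced stars against Hospital Compare within the same local market, can be sketched with hypothetical data: flag hospitals rated best on social media but worst on outcomes among their local peers. The hospitals, markets and scores below are invented for illustration.

```python
# Illustrative sketch (hypothetical data) of cross-referencing crowdsourced
# ratings with Hospital Compare outcome scores within local markets.
import pandas as pd

hospitals = pd.DataFrame({
    "hospital": ["A", "B", "C", "D"],
    "market": ["Indy", "Indy", "Evansville", "Evansville"],
    "social_stars": [4.8, 3.1, 4.5, 3.9],        # e.g. average Google/Yelp rating
    "hc_outcome_score": [2.0, 4.0, 5.0, 3.0],    # e.g. Hospital Compare outcome stars
})

def disagreements(df: pd.DataFrame) -> pd.DataFrame:
    """Hospitals rated best on social media but worst on outcomes in their market."""
    best_social = df.loc[df.groupby("market")["social_stars"].idxmax(), "hospital"]
    worst_outcome = df.loc[df.groupby("market")["hc_outcome_score"].idxmin(), "hospital"]
    flagged = set(best_social) & set(worst_outcome)
    return df[df["hospital"].isin(flagged)]

print(disagreements(hospitals))
```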