Inside the fascinating, bizarre world of ‘Prepper Pinterest’


Caitlin Dewey in the Washington Post: “Pinterest, the aspirational candyland of women everywhere, has long been beloved by homebuyers, wedding-planners, moms, narcissists, and people who spend too much time on their hair.

Now you can add another, odder demographic to the list: “doomsday” preppers, whose rabid interest in all things DIY actually makes for a pretty comfortable cultural fit.

Prepper Pinterest has exploded in the past year, according to the site itself: The total volume of prepper pins is up 87 percent, and repins of prepping posts have nearly tripled. Leading preppers on the platform, like Angela Paskett, Damian Brindle and Glenn Levy, have racked up tens of thousands of followers.

It’s the conclusive sign, perhaps, that the much-maligned prepper movement has finally gone mainstream — or that a particularly precious branch of it has, at least. One popular infographic, currently circulating among Pinterest’s prepper ranks, depicts a “luxury bomb shelter” complete with self-filtering bathtubs and scented oxygen tanks….

It may help that mainstream culture has, in the past 10 years, become more hospitable to the prepper ethic — thanks, in large part, to a trend that Jessica Grose once dubbed “the Pinterest effect.” Young women have revitalized the $29 billion craft industry, prodded along by ideas on Etsy, Pinterest and lifestyle blogs. Concerns about the origins of our food gave us farmers’ markets, first — followed by urban farms and “Modern Farmers” and backyard chicken coops….(More)”

 

On the Farm: Startups Put Data in Farmers’ Hands


Jacob Bunge at the Wall Street Journal: “Farmers and entrepreneurs are starting to compete with agribusiness giants over the newest commodity being harvested on U.S. farms—one measured in bytes, not bushels.

Startups including Farmobile LLC, Granular Inc. and Grower Information Services Cooperative are developing computer systems that will enable farmers to capture data streaming from their tractors and combines, store it in digital silos and market it to agriculture companies or futures traders. Such platforms could allow farmers to reap larger profits from a technology revolution sweeping the U.S. Farm Belt and give them more control over the information generated on their fields.

The efforts in some cases would challenge a wave of data-analysis tools from big agricultural companies such as Monsanto Co., DuPont Co., Deere & Co. and Cargill Inc. Those systems harness modern planters, combines and other machinery outfitted with sensors to track planting, spraying and harvesting, then crunch that data to provide farm-management guidance that these firms say can help farmers curb costs and grow larger crops. The companies say farmers own their data, and it won’t be sold to third parties.

Some farmers and entrepreneurs say crop producers can get the most from their data by compiling and analyzing it themselves—for instance, to determine the best time to apply fertilizer to their soil and how much. Then, farmers could profit further by selling data to seed, pesticide and equipment makers seeking a glimpse into how and when farmers use machinery and crop supplies.

The new ventures come as farmers weigh the potential benefits of sharing their data with large agricultural firms against privacy concerns and fears that agribusinesses could leverage farm-level information to charge higher rates for seeds, pesticides and other supplies.

“We need to get farmers involved in this because it’s their information,” said Dewey Hukill, board president of Grower Information Services Cooperative, or GISC, a farmer-owned cooperative that is building a platform to collect its members’ data. The cooperative has signed up about 1,500 members across 37 states….

Companies developing markets for farm data say it’s not their intention to displace big seed and machinery suppliers but to give farmers a platform that would enable them to manage their own information. Storing and selling their own data wouldn’t necessarily bar a farmer from sharing information with a seed company to get a planting recommendation, they say….(More)”

 

The Art of Managing Complex Collaborations


Eric Knight, Joel Cutcher-Gershenfeld, and Barbara Mittleman at MIT Sloan Management Review: “It’s not easy for stakeholders with widely varying interests to collaborate effectively in a consortium. The experience of the Biomarkers Consortium offers five lessons on how to successfully navigate the challenges that arise….

Society’s biggest challenges are also its most complex. From shared economic growth to personalized medicine to global climate change, few of our most pressing problems are likely to have simple solutions. Perhaps the only way to make progress on these and other challenges is by bringing together the important stakeholders on a given issue to pursue common interests and resolve points of conflict.

However, it is not easy to assemble such groups or to keep them together. Many initiatives have stumbled and disbanded. The Biomarkers Consortium might have been one of them, but this consortium beat the odds, in large part due to the founding parties’ determination to make it work. Nine years after it was founded, this public-private partnership, which is managed by the Foundation for the National Institutes of Health and based in Bethesda, Maryland, is still working to advance the availability of biomarkers (biological indicators for disease states) as tools for drug development, including applications at the frontiers of personalized medicine.

The Biomarkers Consortium’s mandate — to bring together, in the group’s words, “the expertise and resources of various partners to rapidly identify, develop, and qualify potential high-impact biomarkers particularly to enable improvements in drug development, clinical care, and regulatory decision-making” — may look simple. However, the reality has been quite complex. The negotiations that led to the consortium’s formation in 2006 were complicated, and the subsequent balancing of common and competing interests remains challenging….

Many in the biomedical sector had seen the need to tackle drug discovery costs for a long time, with multiple companies concurrently spending millions, sometimes billions, of dollars only to hit common dead ends in the drug development process. In 2004 and 2005, then National Institutes of Health director Elias Zerhouni convened key people from the U.S. Food and Drug Administration, the NIH, and the Pharmaceutical Research and Manufacturers of America to create a multistakeholder forum.

Every member knew from the outset that their fellow stakeholders represented many divergent and sometimes opposing interests: large pharmaceutical companies, smaller entrepreneurial biotechnology companies, FDA regulators, NIH science and policy experts, university researchers and nonprofit patient advocacy organizations….(More)”

A New Kind of Media Using Government Data


Eric Newburger at the Department of Commerce: “MSNBC has published a data-heavy story collection that takes advantage of the internet’s power to communicate not only faster, but in different and meaningful ways.  “The Geography of Poverty” combines narrative, data graphics, and photo-essay content through an interface so seamless as to be almost invisible.

So far they have released three of what will eventually be five parts, but already they have tapped datasets from BLS, Census, the Department of Agriculture, and EPA.  They combined these federal sources with private data: factory data from Randy Peterson and Chemplants.com; displacement information from news sources; Mary Sternberg’s “Along the River Road”; and Steve Lerner’s Diamond and Kate Orff’s research in “Petrochemical America.”

These layers of data feed visualizations which provide a deeper understanding of the highly personal stories the photos tell; the text weaves the elements into a cohesive whole.  Today’s web tools make this kind of reporting not only possible, but fairly simple to assemble.

The result is a new kind of media that mixes the personal and the societal, the social and the environmental, fitting small scale stories of individuals and local communities into the broader context of our whole nation….(More)”

The Trouble With Disclosure: It Doesn’t Work


Jesse Eisinger at ProPublica: “Louis Brandeis was wrong. The lawyer and Supreme Court justice famously declared that sunlight is the best disinfectant, and we have unquestioningly embraced that advice ever since.

 Over the last century, disclosure and transparency have become our regulatory crutch, the answer to every vexing problem. We require corporations and government to release reams of information on food, medicine, household products, consumer financial tools, campaign finance and crime statistics. We have a booming “report card” industry for a range of services, including hospitals, public schools and restaurants.

All this sunlight is blinding. As new scholarship is demonstrating, the value of all this information is unproved. Paradoxically, disclosure can be useless — and sometimes actually harmful or counterproductive.

“We are doing disclosure as a regulatory move all over the board,” says Adam J. Levitin, a law professor at Georgetown. “The funny thing is, we are doing this despite very little evidence of its efficacy.”

Let’s start with something everyone knows about — the “terms of service” agreements for the likes of iTunes. Like everybody else, I click the “I agree” box, feeling a flash of resentment. I’m certain that in Paragraph 184 is a clause signing away my firstborn to a life of indentured servitude to Timothy D. Cook as his chief caviar spoon keeper.

Our legal theoreticians have determined these opaque monstrosities work because someone, somewhere reads the fine print in these contracts and keeps corporations honest. It turns out what we laymen intuit is true: No one reads them, according to research by a New York University law professor, Florencia Marotta-Wurgler.

In real life, there is no critical mass of readers policing the agreements. And if there were an eagle-eyed crew of legal experts combing through these agreements, what recourse would they have? Most people don’t even know that the Supreme Court has gutted their rights to sue in court, and they instead have to go into arbitration, which usually favors corporations.

The disclosure bonanza is easy to explain. Nobody is against it. It’s politically expedient. Companies prefer such rules, especially in lieu of actual regulations that would curtail bad products or behavior. The opacity lobby — the remora fish class of lawyers, lobbyists and consultants in New York and Washington — knows that disclosure requirements are no bar to dodgy practices. You just have to explain what you’re doing in sufficiently incomprehensible language, a task that earns those lawyers a hefty fee.

Of course, some disclosure works. Professor Levitin cites two examples. The first is an olfactory disclosure: methane doesn’t have any scent, but a foul smell is added to alert people to a gas leak. The second is ATM fees. A study in Australia showed that once fees were disclosed, people avoided the high-fee machines and took out more when they had to go to them.

But to Omri Ben-Shahar, co-author of a recent book, “More Than You Wanted to Know: The Failure of Mandated Disclosure,” these are cherry-picked examples in a world awash in useless disclosures. Of course, information is valuable. But disclosure as a regulatory mechanism doesn’t work nearly well enough, he argues….(More)”

We are data: the future of machine intelligence


Douglas Coupland in the Financial Times: “…But what if the rise of Artificial Intuition instead blossoms under the aegis of theology or political ideology? With politics we can see an interesting scenario developing in Europe, where Google is by far the dominant search engine. What is interesting there is that people are perfectly free to use Yahoo or Bing yet they choose to stick with Google and then they get worried about Google having too much power — which is an unusual relationship dynamic, like an old married couple. Maybe Google could be carved up into baby Googles? But no. How do you break apart a search engine? AT&T was broken into seven more or less regional entities in 1982 but you can’t really do that with a search engine. Germany gets gaming? France gets porn? Holland gets commerce? It’s not a pie that can be sliced.

The time to fix this data search inequity isn’t right now, either. The time to fix this problem was 20 years ago, and the only country that got it right was China, which now has its own search engine and social networking systems. But were the British or Spanish governments — or any other government — to say, “OK, we’re making our own proprietary national search engine”, that would somehow be far scarier than having a private company running things. (If you want paranoia, let your government control what you can and can’t access — which is what you basically have in China. Irony!)

The tendency in theocracies would almost invariably be one of intense censorship, extreme limitations of access, as well as machine intelligence endlessly scouring its system in search of apostasy and dissent. The Americans, on the other hand, are desperately trying to implement a two-tiered system to monetise information in the same way they’ve monetised medicine, agriculture, food and criminality. One almost gets misty-eyed looking at North Koreans who, if nothing else, have yet to have their neurons reconfigured, thus turning them into a nation of click junkies. But even if they did have an internet, it would have only one site to visit, and its name would be gloriousleader.nk.

. . .

To summarise. Everyone, basically, wants access to and control over what you will become, both as a physical and metadata entity. We are also on our way to a world of concrete walls surrounding any number of niche beliefs. On our journey, we get to watch machine intelligence become profoundly more intelligent while, as a society, we get to watch one labour category after another be systematically burped out of the labour pool. (Doug’s Law: An app is only successful if it puts a lot of people out of work.)…(More)”

Setting High and Compatible Standards


Laura Bacon at Omidyar Network:  “…Standards enable interoperability, replicability, and efficiency. Airplane travel would be chaotic at best and deadly at worst if flights and air traffic control did not use common codes for call signs, flight numbers, location, date, and time. Trains that cross national borders need tracks built to a standard gauge as evidenced by Spain’s experience in making its trains interoperable with the rest of the continent’s.

Standards matter in data collection and publication as well.  This is especially true for those datasets that matter most to people’s lives, such as health, education, agriculture, and water. Disparate standards for basic category definitions like geography and organizations mean that data sources cannot be easily or cost-effectively analyzed for cross-comparison and decision making.

Compatible data standards that enable data to be ‘joined up’ would make it possible to log and use immunization records more effectively, control the spread of infectious disease, help educators prioritize spending based on the greatest needs, and identify the beneficial owners of companies to help ensure transparent and legal business transactions.
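The value of shared identifiers can be sketched in a few lines: once two publishers key their datasets to the same code, joining them is a trivial lookup rather than an error-prone name-matching exercise. The geographic codes, field names, and figures below are invented purely for illustration.

```python
# Hypothetical sketch: two datasets published against a shared
# geographic identifier (an invented "geo_code") can be joined
# directly. Without a common standard, fuzzy name-matching would
# be needed first.
immunization = {
    "GH-AA": {"coverage_pct": 84},
    "GH-AH": {"coverage_pct": 71},
}
spending = {
    "GH-AA": {"health_budget_usd": 1_200_000},
    "GH-AH": {"health_budget_usd": 950_000},
}

def join_on_geo(a, b):
    """Inner-join two record sets keyed by the shared identifier."""
    return {code: {**a[code], **b[code]} for code in a.keys() & b.keys()}

joined = join_on_geo(immunization, spending)
# joined["GH-AA"] now holds both the coverage and the budget figures.
```

The point of the sketch is the join key itself: everything downstream (cross-comparison, analysis, decision making) depends on both publishers having agreed on that identifier up front.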

Data: More Valuable When Joined Up

Lots of effort, time, and money is poured into the generation and publication of open data. And while open data is important in itself, the biggest return on investment potentially comes from the inter-linkages among datasets. However, it is very difficult to realize this return because the standards and building blocks (e.g., geodata, organizational identifiers, project identifiers) that would enable the joining up of data are still missing.

Omidyar Network currently supports open data standards for contracting, extractives, budgets, and others. If “joining up” work is not considered and executed at early stages, these standards 1) could evolve in silos and 2) may not reach their full capacity.

Interoperability will not happen automatically; specific investments and efforts must be made to develop the public good infrastructure for the joining up of key datasets….The two organizations leading this project have an impressive track record working in this area. Development Initiatives is a global organization working to empower people to make more effective use of information. In 2013, it commissioned Open Knowledge Foundation to publish a cross-initiative scoping study, Joined-Up Data: Building Blocks for Common Standards, which recommended focus areas, shared learning, and the adoption of joined-up data and common standards for all publishers. Partnering with Development Initiatives is Publish What You Fund,…(More)”

Exploring Open Energy Data in Urban Areas


The World Bank: “…Energy efficiency – using less energy input to deliver the same level of service – has been described by many as the ‘first fuel’ of our societies. However, lack of adequate data to accurately predict and measure energy efficiency savings, particularly at the city level, has limited the realization of its promise over the past two decades.
Why Open Energy Data?
Open Data can be a powerful tool to reduce information asymmetry in markets, increase transparency and help achieve local economic development goals. Several sectors like transport, public sector management and agriculture have started to benefit from Open Data practices. Energy markets are often characterized by less-than-optimal conditions with high system inefficiencies, misaligned incentives and low levels of transparency. As such, the sector has a lot to potentially gain from embracing Open Data principles.
The United States is a leader in this field with its ‘Energy Data’ initiative. This initiative makes data easy to find, understand and apply, helping to fuel a clean energy economy. For example, the Energy Information Administration’s (EIA) open application programming interface (API) has more than 1.2 million time series of data and is frequently visited by users from the private sector, civil society and media. In addition, the Green Button initiative is empowering American citizens to have access to their own energy usage data, and OpenEI.org is an Open Energy Information platform to help people find energy information, share their knowledge and connect to other energy stakeholders.
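As a concrete illustration of the kind of access the EIA’s open API provides, the sketch below builds a request URL and parses a response-shaped payload into a usable time series. The endpoint and JSON layout reflect the v1 series API and should be treated as assumptions — consult the EIA’s current API documentation before relying on them — and the sample values are invented.

```python
from urllib.parse import urlencode

# Assumed v1 series endpoint of the EIA open API (an assumption;
# verify against the current EIA documentation).
BASE = "https://api.eia.gov/series/"

def build_series_url(series_id, api_key):
    """Build a query URL for one EIA time series."""
    return BASE + "?" + urlencode({"series_id": series_id, "api_key": api_key})

def parse_series(payload):
    """Turn the payload's [period, value] pairs into a {period: value} dict."""
    return dict(payload["series"][0]["data"])

# A payload shaped like the API's response (values invented):
sample = {"series": [{"data": [["2014", 4093.6], ["2013", 4065.9]]}]}
series = parse_series(sample)
```

A real call would fetch `build_series_url(...)` over HTTPS and pass the decoded JSON to `parse_series`.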
Introducing the Open Energy Data Assessment
To address this data gap in emerging and developing countries, the World Bank is conducting a series of Open Energy Data Assessments in urban areas. The objective is to identify important energy-related data, raise awareness of the benefits of Open Data principles and improve the flow of data between traditional energy stakeholders and others interested in the sector.
The first cities we assessed were Accra, Ghana and Nairobi, Kenya. Both are among the fastest-growing cities in the world, with dynamic entrepreneurial and technology sectors, and both are capitals of countries with an ongoing National Open Data Initiative. The two cities have also been selected to be part of the Negawatt Challenge, a World Bank international competition supporting technology innovation to solve local energy challenges.
The ecosystem approach
The starting point for the exercise was to consider the urban energy sector as an ecosystem, comprised of data suppliers, data users, key datasets, a legal framework, funding mechanisms, and ICT infrastructure. The methodology that we used adapted the established World Bank Open Data Readiness Assessment (ODRA), which highlights valuable connections between data suppliers and data demand.  The assessment showcases how to match pressing urban challenges with the opportunity to release and use data to address them, creating a longer-term commitment to the process. Mobilizing key stakeholders to provide quick, tangible results is also key to this approach….(More) …See also World Bank Open Government Data Toolkit.”

5 cool ways connected data is being used


At Wareable: “The real news behind the rise of wearable tech isn’t so much the gadgetry as the gigantic amount of personal data that it harnesses.

Concerns have already been raised over what companies may choose to do with such valuable information, with one US life insurance company already using Fitbits to track customers’ exercise and offer them discounts when they hit their activity goals.

Despite a mildly worrying potential dystopia in which our own data could be used against us, there are plenty of positive ways in which companies are using vast amounts of connected data to make the world a better place…

Parkinson’s disease research

Apple Health ResearchKit was recently unveiled as a platform for collecting collaborative data for medical studies, but Apple isn’t the first company to rely on crowdsourced data for medical research.

The Michael J. Fox Foundation for Parkinson’s Research recently unveiled a partnership with Intel to improve research and treatment for the neurodegenerative brain disease. Wearables are being used to unobtrusively gather real-time data from sufferers, which is then analysed by medical experts….

Saving the rhino

Connected data and wearable tech isn’t just limited to humans. In South Africa, the Madikwe Conservation Project is using wearable-based data to protect endangered rhinos from callous poachers.

A combination of ultra-strong Kevlar ankle collars powered by an Intel Galileo chip, along with an RFID chip implanted in each rhino’s horn allows the animals to be monitored. Any break in proximity between the anklet and horn results in anti-poaching teams being deployed to catch the bad guys….
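The proximity scheme described above reduces to a simple rule: if a collar stops seeing its paired horn chip, raise an alert. A minimal sketch of that rule, with IDs, field names, and the threshold all invented for illustration:

```python
# Hypothetical sketch of the collar/horn proximity check. Each ankle
# collar reports how many consecutive polls have failed to read the
# RFID chip in the paired horn; too many misses triggers a dispatch.
def check_readings(missed_by_rhino, max_missed=3):
    """Return IDs of rhinos whose horn chip has gone unseen too long."""
    return sorted(
        rhino_id
        for rhino_id, missed in missed_by_rhino.items()
        if missed >= max_missed
    )
```

In the field, an alert from a check like this would presumably feed straight into the anti-poaching teams’ dispatch system.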

Making public transport smart

A company called Snips is collecting huge amounts of urban data in order to improve infrastructure. In partnership with French national rail operator SNCF, Snips produced an app called Tranquilien to utilise location data from commuters’ phones and smartwatches to track which parts of the rail network were busy at which times.

Combining big data with crowdsourcing, the information helps passengers to pick a train where they can find a seat during peak times, while the data can also be useful to local businesses when serving the needs of commuters who are passing through.

Improving the sports fan experience

We’ve already written about how wearable tech is changing the NFL, but the collection of personal data is also set to benefit the fans.

Levi’s Stadium – the new home of the San Francisco 49ers – opened in 2014 and is one of the most technically advanced sports venues in the world. As well as a strong Wi-Fi signal throughout the stadium, fans also benefit from a dedicated app. This not only offers instant replays and real-time game information, but it also helps them find a parking space, order food and drinks directly to their seat and even check the lines at the toilets. As fans use the app, all of the data is collated to enhance the fan experience in future….

Creating interactive art

Don’t be put off by the words ‘interactive installation’. On Broadway is a cool work of art that “represents life in the 21st Century city through a compilation of images and data collected along the 13 miles of Broadway that span Manhattan”….(More)”

WFP And OCHA Join Forces To Make Data More Accessible


World Food Programme Press Release: “The United Nations World Food Programme (WFP) and the United Nations Office for the Coordination of Humanitarian Affairs (OCHA) have teamed up to provide access to global data on hunger and food insecurity. The data can be used to understand the type of food available in certain markets, how families cope in the face of food insecurity and how WFP provides food assistance in emergencies to those in need.

The data is being made available through OCHA’s Humanitarian Data Exchange (HDX), an open platform for sharing crisis data. The collaboration between WFP, the world’s largest humanitarian organization fighting hunger worldwide, and OCHA began at the height of the Ebola crisis when WFP shared its data on food market prices in affected countries in West Africa.

With funding from the UK’s Department for International Development (DFID) and the Bill & Melinda Gates Foundation, WFP has since been able to make large amounts of its data available dynamically, making it easier to integrate with other systems, including HDX.

From there, HDX built an interactive visualization for Food Prices data that allows a range of users, from the general public to a data scientist, to explore the data in insightful ways. The same visualization is also available on the WFP VAM Shop….(More)”