Innovation procurement


European Commission: “Innovation Procurement enables the public sector to modernize its services while saving costs and creating market opportunities for companies in Europe. This workshop was organised on 7 October 2014 during the Open Days 2014 under the title ‘Make use of the enabling button for Innovation Procurement (PCP/PPI) to tackle societal challenges in Europe’….
Ms Lieve Bos (European Commission DG CONNECT) presented the importance and potential of pre-commercial procurement (PCP) and public procurement of innovative solutions (PPI) to modernize public services in Europe while creating market opportunities for companies. She also outlined the H2020 funding schemes that co-finance the preparation, coordination and execution of PCP and PPI procurements. €130 million of EU funding is currently available (deadlines for proposals in 2015) to support the implementation of innovation procurements in many domains of public interest. …
Mr Peter Asché (Uniklinik RWTH Aachen, Germany) presented the Thalea pre-commercial procurement (PCP) project, which challenges providers to develop innovative solutions for remote decision support to intensive care units through an interoperable telemedicine platform. Mr Asché stressed that the project attracted considerable market interest, with 23 companies from 5 different Member States participating in the open market consultation that preceded the publication of the Thalea PCP call for tender.
Mr van Berlo (Smart Homes, The Netherlands) presented the Stop and Go public procurement of innovative solutions (PPI) project, which aims at deploying cost-effective, sustainable and innovative telecare solutions for the elderly. A transnational procurement in four Member States will enable the participating organizations to purchase innovative solutions with clear clinical and social outcomes, creating economies of scale that will benefit both the procurers and the market while contributing to standardization. …”

How Paperbacks Helped the U.S. Win World War II


The books were Armed Services Editions, printed by a coalition of publishers with funding from the government and shipped by the Army and Navy. The largest of them were only three-quarters of an inch thick—thin enough to fit in the pocket of a soldier’s pants. Soldiers read them on transport ships, in camps and in foxholes. Wounded and waiting for medics, men turned to them on Omaha Beach, propped against the base of the cliffs. Others were buried with a book tucked in a pocket.
“When Books Went to War: The Stories That Helped Us Win World War II” by Molly Guptill Manning tells the story of the Armed Services Editions. To be published by Houghton Mifflin Harcourt on Dec. 2, the book reveals how the special editions sparked correspondence between soldiers and authors, lifted “The Great Gatsby” from obscurity, and created a new audience of readers back home.
The program was conceived by a group of publishers, including Doubleday, Random House and W. W. Norton. In 1942 they formed the Council on Books in Wartime to explore how books could serve the nation during the war. Ultimately, the program transformed the publishing industry. “It basically provided the foundation for the mass-market paperback,” said Michael Hackenberg, a bookseller and historian. It also turned a generation of young men into lifelong readers….

USDA Opens VIVO Research Networking Tool to Public


Sharon Durham at the USDA: “VIVO, a Web application used internally by U.S. Department of Agriculture (USDA) scientists since 2012 to allow better national networking across disciplines and locations, is now available to the public. USDA VIVO will be a “one-stop shop” for Federal agriculture expertise and research outcomes. “USDA employs over 5,000 researchers to ensure our programs are based on sound public policy and the best available science,” said USDA Chief Scientist and Undersecretary for Research, Education, and Economics Dr. Catherine Woteki. “USDA VIVO provides a powerful Web search tool for connecting interdisciplinary researchers, research projects and outcomes with others who might bring a different approach or scope to a research project. Inviting private citizens to use the system will increase the potential for collaboration to solve food- and agriculture-related problems.”
The idea behind USDA VIVO is to link researchers with peers and potential collaborators to ignite synergy among our nation’s best scientific minds and to spark unique approaches to some of our toughest agricultural problems. This efficient networking tool enables scientists to easily locate others with a particular expertise. VIVO also makes it possible to quickly identify scientific expertise and respond to emerging agricultural issues, such as specific plant and animal diseases or pests.
USDA’s Agricultural Research Service (ARS), Economic Research Service, National Institute of Food and Agriculture, National Agricultural Statistics Service and Forest Service are the first five USDA agencies to participate in VIVO. The National Agricultural Library, which is part of ARS, will host the Web application. USDA hopes to add other agencies in the future.
VIVO was developed in part under a $12.2 million grant from the National Center for Research Resources, part of the National Institutes of Health (NIH). The grant, made under the 2009 American Recovery and Reinvestment Act, was provided to the University of Florida and collaborators at Cornell University, Indiana University, Weill Cornell Medical College, Washington University in St. Louis, the Scripps Research Institute and the Ponce School of Medicine.
VIVO’s underlying database draws information about research being conducted by USDA scientists from official public systems of record and then makes it uniformly available for searching. The data can then be easily leveraged in other applications. In this way, USDA is also making its research projects and related impacts available to the Federal RePORTER tool, released by NIH on September 22, 2014. Federal RePORTER is part of a collaborative effort between Federal entities and other research institutions to create a repository that will be useful to assess the impact of Federal research and development investments.”
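VIVO is built on linked-data standards, so the same records that power the website can also be queried directly with SPARQL. Below is a minimal sketch, assuming a hypothetical endpoint URL (the real USDA address is not given here) and standard VIVO-core ontology terms, of how a researcher-by-expertise lookup might be scripted:

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical endpoint URL: VIVO instances expose a SPARQL interface, but the
# exact USDA VIVO address is an assumption made for this illustration.
sparql = SPARQLWrapper("https://vivo.usda.gov/api/sparqlQuery")
sparql.setReturnFormat(JSON)

# VIVO models people with FOAF and research areas with its core ontology;
# this query lists researchers whose research area mentions "plant disease".
sparql.setQuery("""
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX vivo: <http://vivoweb.org/ontology/core#>

SELECT ?person ?name WHERE {
  ?person a foaf:Person ;
          rdfs:label ?name ;
          vivo:hasResearchArea ?area .
  ?area rdfs:label ?areaLabel .
  FILTER(CONTAINS(LCASE(STR(?areaLabel)), "plant disease"))
}
LIMIT 10
""")

for row in sparql.queryAndConvert()["results"]["bindings"]:
    print(row["name"]["value"], row["person"]["value"])
```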

IMF: Statistics for Policymaking


Christine Lagarde, Managing Director, IMF: “So if you are wondering why the IMF cares so much about statistics and hosting such forums—I would say the reason is obvious.
The quest for understanding and making sense of the real world—by recording tasks and counting objects—has anchored economic development and social behavior over the past several millennia. Data has gained prominence as a vital building block for making sound policy. Without reliable and timely economic data, we would be wandering in the dark, making decisions on the basis of anecdotes, gut feelings, or worse.
However, the world of economic and financial statistics is not “static.” Markets evolve, and policy needs adapt. There needs to be continuous dialogue between the users and suppliers of data on relevant economic and financial issues.
This is precisely the objective of our forum today. It provides a unique setting for discussing cutting-edge statistics among a broad range of stakeholders: academics, private sector analysts, data compilers, and decision makers.
The theme for this year’s forum is identifying macroeconomic and financial vulnerabilities. To do this, we need to touch upon a broad range of topics, including cross-border linkages, key market indicators, and survey data, and even “Big Data.”
We need to bring all relevant information to the service of macroeconomic policymaking.
I would like to use this opportunity to offer a few thoughts on three key activities under way at the Fund:
(i) the IMF/FSB G-20 Data Gaps Initiative;
(ii) the IMF Data Standards Initiatives; and
(iii) our Data Publication Initiative.

And I have an important announcement to make—starting January 1, 2015, we will provide all our online data free of charge to everyone.
This will help all those who draw on our data make better use of this vital statistical resource—from budget numbers to balance of payments data, debt statistics to critical global indicators.”

Colombia’s Data-Driven Fight Against Crime


One Monday in 1988, El Mundo newspaper of Medellín, Colombia, reported, as it did every Monday, on the violent deaths in the city of two million people over the weekend. An article giving an hour-by-hour description of the deaths from Saturday night to Sunday night was remarkable for, among other things, the journalist’s skill in finding different ways to report a murder. “Someone took the life of Luís Alberto López at knife point … Luís Alberto Patiño ceased to exist with a bullet in his head … Mario Restrepo turned up dead … An unidentified person killed Néstor Alvarez with three shots.” In reporting 27 different murders, the author repeated his phrasing only once.

….What Guerrero did to make Cali safer was remarkable because it worked, and because of the novelty of his strategy. Before becoming mayor, Guerrero was not a politician, but a Harvard-trained epidemiologist who was president of the Universidad del Valle in Cali. He set out to prevent murder the way a doctor prevents disease. What public health workers are doing now to stop the spread of Ebola, Guerrero did in Cali to stop the spread of violence.
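In practice, Guerrero’s epidemiological approach meant treating homicide records as surveillance data: tabulating when, where and under what circumstances killings occurred, then targeting the peaks (in Cali this led to restrictions on alcohol sales and gun carrying at high-risk times). A minimal sketch of that kind of tabulation, assuming a hypothetical incident log with invented column names:

```python
import pandas as pd

# Hypothetical incident log: one row per homicide, with a timestamp and a
# district. File and column names are assumptions for illustration only.
df = pd.read_csv("homicides.csv", parse_dates=["occurred_at"])

# Treat violence like a disease: find when and where it clusters.
df["hour"] = df["occurred_at"].dt.hour
df["weekday"] = df["occurred_at"].dt.day_name()

by_time = df.groupby(["weekday", "hour"]).size().sort_values(ascending=False)
by_place = df["district"].value_counts()

print(by_time.head(10))  # weekend nights may dominate, as Guerrero found
print(by_place.head(5))  # a few districts often account for most deaths
```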

Although his ideas have now been used in dozens of cities throughout Latin America, they are worth revisiting because they are not employed in the places that need them most. The most violent places in Latin America are Honduras, El Salvador and Guatemala — indeed, they are among the most violent countries in the world not at war. The wave of youth migration to the United States is from these countries, and the refugees are largely fleeing violence.

One small municipality in El Salvador, Santa Tecla, has employed Cali’s strategies for about a decade, and its homicide rate has dropped. But Santa Tecla is an anomaly. Most of the region’s cities have not tried to do what Guerrero did — and they are failing to protect their citizens….

Guerrero went on to spread his ideas. Working with the Pan-American Health Organization and the Inter-American Development Bank, he took his epidemiological methods to 18 other countries.

“The approach was very low-cost and pragmatic,” said Joan Serra Hoffman, a senior specialist in crime and violence prevention in Latin America and the Caribbean at the World Bank. “You could see it was conceived by someone who was an academic and a policy maker. It can be fully operational for between $50,000 and $80,000.”…

Personalised Health and Care 2020: Using Data and Technology to Transform Outcomes for Patients and Citizens


Report and Framework of Action by the UK National Information Board: “One of the greatest opportunities of the 21st century is the potential to safely harness the power of the technology revolution, which has transformed our society, to meet the challenges of improving health and providing better, safer, sustainable care for all. To date the health and care system has only begun to exploit the potential of using data and technology at a national or local level. Our ambition is for a health and care system that enables people to make healthier choices, to be more resilient, to deal more effectively with illness and disability when it arises, and to have happier, longer lives in old age; a health and care system where technology can help tackle inequalities and improve access to services for the vulnerable.
The purpose of this paper is to consider what progress the health and care system has already made and what can be learnt from other industries and the wider economy…”

Hungry Planet: Can Big Data Help Feed 9 Billion Humans?


From NBC News: “With a population set to hit 9 billion human beings by 2050, the world needs to grow more food—without cutting down forests and jungles, which are the climate’s huge lungs.

The solution, according to one soil management scientist, is Big Data.

Kenneth Cassman, an agronomist at the University of Nebraska, Lincoln, recently unveiled a new interactive mapping tool that shows in fine-grain detail where higher crop yields are possible on current arable land.

“By some estimates, 20 to 30 percent of greenhouse gas emissions are associated with agriculture and of that a large portion is due to conversion of natural systems like rainforests or grassland savannahs to crop production, agriculture,” Cassman told NBC News at a conference in suburban Seattle.

The only practical way to stop the conversion of wild lands to farmland is to grow more food on land already dedicated to agriculture, he said. Currently, the amount of farmland used to produce rice, wheat, maize and soybean, he noted, is expanding at a rate of about 20 million acres a year.

Cassman and colleagues unveiled the Global Yield Gap and Water Productivity Atlas in October at the Water for Food conference. The atlas was six years and $6 million in the making and contains site-specific data on soil, climate and cropping systems to determine potential yield versus actual yield farm by farm in nearly 20 countries around the world. Projects are ongoing to secure data for 30 more countries….
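The atlas’s central quantity is simple arithmetic: a site’s yield gap is its climate- and soil-limited potential yield minus the yield farmers actually achieve. A toy illustration, with invented numbers standing in for the atlas’s site-specific estimates:

```python
# Yield gap = potential yield (climate/soil-limited) - actual farm yield.
# The figures below are invented for illustration; the real atlas supplies
# site-specific values per crop and location.
sites = [
    {"site": "A", "crop": "maize", "potential_t_ha": 12.5, "actual_t_ha": 6.1},
    {"site": "B", "crop": "maize", "potential_t_ha": 10.8, "actual_t_ha": 8.9},
]

for s in sites:
    gap = s["potential_t_ha"] - s["actual_t_ha"]
    closure = s["actual_t_ha"] / s["potential_t_ha"] * 100
    # Site A has a large gap: more food could be grown on that existing land.
    print(f"{s['site']}: gap {gap:.1f} t/ha, {closure:.0f}% of potential achieved")
```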

A key initiative going forward is to teach smallholder farmers how to use the atlas, Cassman said. Until now, the tool has largely rested with agricultural researchers who have validated its promise of delivering information that can help grow more food on existing farmland….

New Tool in Fighting Corruption: Open Data


Martin Tisne at Omidyar Network: “Yesterday in Brisbane, the G20 threw its weight behind open data by featuring it prominently in the G20 Anti-Corruption Working Group action plan. Specifically, the action plan calls for efforts in three related areas:

(1) Prepare a G20 compendium of good practices and lessons learned on open data and its application in the fight against corruption
(2) Prepare G20 Open Data Principles, including identifying areas or sectors where their application is particularly useful
(3) Complete self-assessments of G20 country open data frameworks and initiatives

Open data describes information that is not simply public, but that has been published in a manner that makes it easy to access and easy to compare and connect with other information.
This matters for anti-corruption: if you are a journalist or a civil society activist investigating bribery and corruption, those connections are everything. They tell you that an anonymous person (e.g. ‘Mr Smith’) who owns an obscure company registered in a tax haven is linked to another company that has been illegally exporting timber from a neighboring country; that the same Mr Smith is also the son-in-law of the mining minister of yet another country, who herself has been accused of embezzling mining revenues. As we have written elsewhere on this blog, investigative journalists, prosecution authorities, and civil society groups all need access to this linked data for their work.
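When the underlying datasets share machine-readable identifiers, tracing a chain like that becomes a mechanical join rather than weeks of manual cross-checking. A hedged sketch, with hypothetical file and column names standing in for real registries:

```python
import pandas as pd

# Hypothetical open datasets; the file names and columns are assumptions.
companies = pd.read_csv("company_register.csv")      # company_id, name, jurisdiction
owners = pd.read_csv("beneficial_owners.csv")        # company_id, owner_name
exports = pd.read_csv("timber_export_licences.csv")  # company_id, licence_status

# Because each dataset publishes the same company identifier, linking them
# is a one-line join; without a shared ID, this is slow investigative work.
linked = (companies
          .merge(owners, on="company_id")
          .merge(exports, on="company_id"))

suspects = linked[linked["licence_status"] == "revoked"]
print(suspects[["name", "owner_name", "jurisdiction"]])
```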
The action plan also links open data to the wider G20 agenda, citing its impact on the ability of businesses to make better investment decisions. You can find the full detail here….”

Design for Policy


New book edited by Christian Bason: “Design for Policy is the first publication to chart the emergence of collaborative design approaches to innovation in public policy. Drawing on contributions from a range of the world’s leading academics, design practitioners and public managers, it provides a rich, detailed analysis of design as a tool for addressing public problems and capturing opportunities for achieving better and more efficient societal outcomes.
In his introduction, Christian Bason suggests that design may offer a fundamental reinvention of the art and craft of policy making for the twenty-first century. From challenging current problem spaces to driving the creative quest for new solutions and shaping the physical and virtual artefacts of policy implementation, design holds a significant yet largely unexplored potential.
The book is structured in three main sections, covering the global context of the rise of design for policy, in-depth case studies of the application of design to policy making, and a guide to concrete design tools for policy intent, insight, ideation and implementation. The summary chapter lays out a future agenda for design in government, suggesting how to position design more firmly on the public policy stage.
Design for Policy is intended as a resource for leaders and scholars in government departments, public service organizations and institutions, schools of design and public management, think tanks and consultancies that wish to understand and use design as a tool for public sector reform and innovation…. More: Full contents list; Introduction – The Design for Policy Nexus.”

Building a complete Tweet index


Yi Zhuang (@yz) at Twitter: “Since that first simple Tweet over eight years ago, hundreds of billions of Tweets have captured everyday human experiences and major historical events. Our search engine excelled at surfacing breaking news and events in real time, and our search index infrastructure reflected this strong emphasis on recency. But our long-standing goal has been to let people search through every Tweet ever published.
This new infrastructure enables many use cases, providing comprehensive results for entire TV and sports seasons, conferences (#TEDGlobal), industry discussions (#MobilePayments), places, businesses and long-lived hashtag conversations across topics, such as #JapanEarthquake, #Election2012, #ScotlandDecides, #HongKong, #Ferguson and many more. This change will be rolling out to users over the next few days.
In this post, we describe how we built a search service that efficiently indexes roughly half a trillion documents and serves queries with an average latency of under 100ms….”
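The post goes on to detail Twitter’s actual multi-tier architecture; purely as a conceptual sketch (not Twitter’s implementation), the basic structure behind such a service, an inverted index with recency-ordered posting lists, can be illustrated in a few lines of Python:

```python
from collections import defaultdict

# A toy inverted index -- not Twitter's design, which shards posting lists
# across many machines and time ranges. Posting lists are kept newest-first
# so recency-biased queries can stop early.
class TweetIndex:
    def __init__(self):
        self.postings = defaultdict(list)  # term -> [tweet_id, ...], newest first

    def index(self, tweet_id, text):
        for term in set(text.lower().split()):
            self.postings[term].insert(0, tweet_id)

    def search(self, query, limit=10):
        terms = query.lower().split()
        if not terms:
            return []
        # Intersect the posting lists of all query terms.
        ids = set(self.postings[terms[0]])
        for term in terms[1:]:
            ids &= set(self.postings[term])
        # In this toy model a higher tweet_id means a more recent Tweet.
        return sorted(ids, reverse=True)[:limit]

idx = TweetIndex()
idx.index(1, "Japan earthquake news")
idx.index(2, "earthquake relief efforts in Japan")
print(idx.search("japan earthquake"))  # -> [2, 1]
```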