USDA Opens VIVO Research Networking Tool to Public


Sharon Durham at the USDA: “VIVO, a Web application used internally by U.S. Department of Agriculture (USDA) scientists since 2012 to allow better national networking across disciplines and locations, is now available to the public. USDA VIVO will be a “one-stop shop” for Federal agriculture expertise and research outcomes. “USDA employs over 5,000 researchers to ensure our programs are based on sound public policy and the best available science,” said USDA Chief Scientist and Undersecretary for Research, Education, and Economics Dr. Catherine Woteki. “USDA VIVO provides a powerful Web search tool for connecting interdisciplinary researchers, research projects and outcomes with others who might bring a different approach or scope to a research project. Inviting private citizens to use the system will increase the potential for collaboration to solve food- and agriculture-related problems.”
The idea behind USDA VIVO is to link researchers with peers and potential collaborators to ignite synergy among our nation’s best scientific minds and to spark unique approaches to some of our toughest agricultural problems. This efficient networking tool enables scientists to easily locate others with a particular expertise. VIVO also makes it possible to quickly identify scientific expertise and respond to emerging agricultural issues, like specific plant and animal diseases or pests.
USDA’s Agricultural Research Service (ARS), Economic Research Service, National Institute of Food and Agriculture, National Agricultural Statistics Service and Forest Service are the first five USDA agencies to participate in VIVO. The National Agricultural Library, which is part of ARS, will host the Web application. USDA hopes to add other agencies in the future.
VIVO was in part developed under a $12.2 million grant from the National Center for Research Resources, part of the National Institutes of Health (NIH). The grant, made under the 2009 American Recovery and Reinvestment Act, was provided to the University of Florida and collaborators at Cornell University, Indiana University, Weill Cornell Medical College, Washington University in St. Louis, the Scripps Research Institute and the Ponce School of Medicine.
VIVO’s underlying database draws information about research being conducted by USDA scientists from official public systems of record and then makes it uniformly available for searching. The data can then be easily leveraged in other applications. In this way, USDA is also making its research projects and related impacts available to the Federal RePORTER tool, released by NIH on September 22, 2014. Federal RePORTER is part of a collaborative effort between Federal entities and other research institutions to create a repository that will be useful to assess the impact of Federal research and development investments.”
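VIVO stores researcher profiles as linked open data, which is what makes the “uniformly available for searching” claim work in practice. Below is a minimal sketch of querying such a store over the standard SPARQL protocol; the endpoint URL is hypothetical, and the property used, while drawn from the VIVO core ontology, should be treated as illustrative.

```python
# Minimal sketch: querying a VIVO-style linked-data store over the
# standard SPARQL protocol. The endpoint URL is hypothetical, and the
# VIVO ontology property used here is illustrative.
import requests

ENDPOINT = "https://vivo.example.usda.gov/sparql"  # hypothetical URL

# Find researchers whose free-text keywords mention plant disease.
QUERY = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX vivo: <http://vivoweb.org/ontology/core#>
SELECT ?person ?name WHERE {
  ?person vivo:freetextKeyword ?kw ;
          rdfs:label ?name .
  FILTER(CONTAINS(LCASE(?kw), "plant disease"))
}
LIMIT 25
"""

resp = requests.post(
    ENDPOINT,
    data={"query": QUERY},                                 # URL-encoded POST
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["name"]["value"], "->", row["person"]["value"])
```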

IMF: Statistics for Policymaking


Christine Lagarde, Managing Director, IMF: “So if you are wondering why the IMF cares so much about statistics and hosting such forums—I would say the reason is obvious.
The quest for understanding and making sense of the real world—by recording tasks and counting objects—has anchored economic development and social behavior over the past several millennia. Data has gained prominence as a vital building block for making sound policy. Without reliable and timely economic data, we would be wandering in the dark, making decisions on the basis of anecdotes, gut feelings, or worse.
However, the world of economic and financial statistics is not “static.” Markets evolve, and policy needs adapt. There needs to be continuous dialogue between the users and suppliers of data on relevant economic and financial issues.
This is precisely the objective of our forum today. It provides a unique setting for discussing cutting-edge statistics among a broad range of stakeholders: academics, private sector analysts, data compilers, and decision makers.
The theme for this year’s forum is identifying macroeconomic and financial vulnerabilities. To do this, we need to touch upon a broad range of topics, including cross-border linkages, key market indicators, and survey data, and even “Big Data.”
We need to bring all relevant information to the service of macroeconomic policymaking.
I would like to use this opportunity to offer a few thoughts on three key activities under way at the Fund:
(i) the IMF/FSB G-20 Data Gaps Initiative;
(ii) the IMF Data Standards Initiatives; and
(iii) our Data Publication Initiative.

And I have an important announcement to make—starting January 1, 2015, we will provide all our online data free of charge to everyone.
This will help all those who draw on our data make better use of this vital statistical resource—from budget numbers to balance of payments data, debt statistics to critical global indicators.”

Colombia’s Data-Driven Fight Against Crime


One Monday in 1988, El Mundo newspaper of Medellín, Colombia, reported, as it did every Monday, on the violent deaths in the city of two million people over the weekend. An article giving an hour-by-hour description of the deaths from Saturday night to Sunday night was remarkable for, among other things, the journalist’s skill in finding different ways to report a murder. “Someone took the life of Luís Alberto López at knife point … Luís Alberto Patiño ceased to exist with a bullet in his head … Mario Restrepo turned up dead … An unidentified person killed Néstor Alvarez with three shots.” In reporting 27 different murders, the author repeated his phrasing only once.

….What Guerrero did to make Cali safer was remarkable because it worked, and because of the novelty of his strategy. Before becoming mayor, Guerrero was not a politician, but a Harvard-trained epidemiologist who was president of the Universidad del Valle in Cali. He set out to prevent murder the way a doctor prevents disease. What public health workers are doing now to stop the spread of Ebola, Guerrero did in Cali to stop the spread of violence.

Although his ideas have now been used in dozens of cities throughout Latin America, they are worth revisiting because they are not employed in the places that need them most. The most violent places in Latin America are Honduras, El Salvador and Guatemala — indeed, they are among the most violent countries in the world not at war. The wave of youth migration to the United States is from these countries, and the refugees are largely fleeing violence.

One small municipality in El Salvador, Santa Tecla, has employed Cali’s strategies for about a decade, and the homicide rate there has dropped. But Santa Tecla is an anomaly. Most of the region’s cities have not tried to do what Guerrero did — and they are failing to protect their citizens….

Guerrero went on to spread his ideas. Working with the Pan-American Health Organization and the Inter-American Development Bank, he took his epidemiological methods to 18 other countries.

“The approach was very low-cost and pragmatic,” said Joan Serra Hoffman, a senior specialist in crime and violence prevention in Latin America and the Caribbean at the World Bank. “You could see it was conceived by someone who was an academic and a policy maker. It can be fully operational for between $50,000 and $80,000.”…

Personalised Health and Care 2020: Using Data and Technology to Transform Outcomes for Patients and Citizens


Report and Framework of Action by the UK National Information Board: “One of the greatest opportunities of the 21st century is the potential to safely harness the power of the technology revolution, which has transformed our society, to meet the challenges of improving health and providing better, safer, sustainable care for all. To date the health and care system has only begun to exploit the potential of using data and technology at a national or local level. Our ambition is for a health and care system that enables people to make healthier choices, to be more resilient, to deal more effectively with illness and disability when it arises, and to have happier, longer lives in old age; a health and care system where technology can help tackle inequalities and improve access to services for the vulnerable.
The purpose of this paper is to consider what progress the health and care system has already made and what can be learnt from other industries and the wider economy…”

Hungry Planet: Can Big Data Help Feed 9 Billion Humans?


At NBC News: “With a population set to hit 9 billion human beings by 2050, the world needs to grow more food — without cutting down forests and jungles, which are the climate’s huge lungs.

The solution, according to one soil management scientist, is Big Data.

Kenneth Cassman, an agronomist at the University of Nebraska, Lincoln, recently unveiled a new interactive mapping tool that shows in fine-grain detail where higher crop yields are possible on current arable land.

“By some estimates, 20 to 30 percent of greenhouse gas emissions are associated with agriculture and of that a large portion is due to conversion of natural systems like rainforests or grassland savannahs to crop production, agriculture,” Cassman told NBC News at a conference in suburban Seattle.

The only practical way to stop the conversion of wild lands to farmland is to grow more food on land already dedicated to agriculture, he said. Currently, the amount of farmland used to produce rice, wheat, maize and soybean, he noted, is expanding at a rate of about 20 million acres a year.

Cassman and colleagues unveiled the Global Yield Gap and Water Productivity Atlas in October at the Water for Food conference. The atlas was six years and $6 million in the making and contains site-specific data on soil, climate and cropping systems to determine potential yield versus actual yield farm by farm in nearly 20 countries around the world. Projects are ongoing to secure data for 30 more countries….
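The atlas’s core comparison is simple arithmetic applied site by site: the gap between what a field could yield and what it does yield. A toy sketch of that calculation, with made-up sites and numbers:

```python
# Illustrative arithmetic only: site names and numbers are made up.
# Yield gap = potential yield minus actual farm yield, per site.

sites = [
    # (site, crop, potential yield t/ha, actual yield t/ha)
    ("Site A", "maize", 12.5, 7.8),
    ("Site B", "maize", 10.9, 9.1),
    ("Site C", "wheat", 8.2, 4.0),
]

for site, crop, potential, actual in sites:
    gap = potential - actual
    pct = 100 * actual / potential
    print(f"{site} ({crop}): gap {gap:.1f} t/ha; "
          f"actual yield is {pct:.0f}% of potential")
```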

A key initiative going forward is to teach smallholder farmers how to use the atlas, Cassman said. Until now, the tool has largely rested with agricultural researchers who have validated its promise of delivering information that can help grow more food on existing farmland….

New Tool in Fighting Corruption: Open Data


Martin Tisne at Omidyar Network: “Yesterday in Brisbane, the G20 threw its weight behind open data by featuring it prominently in the G20 Anti-Corruption Working Group action plan. Specifically, the action plan calls for effort in three related areas:

(1)   Prepare a G20 compendium of good practices and lessons learned on open data and its application in the fight against corruption
(2)   Prepare G20 Open Data Principles, including identifying areas or sectors where their application is particularly useful
(3)   Complete self‑assessments of G20 country open data frameworks and initiatives

Open data describes information that is not simply public, but that has been published in a manner that makes it easy to access and easy to compare and connect with other information.
This matters for anti-corruption: if you are a journalist or a civil society activist investigating bribery and corruption, those connections are everything. They tell you that an anonymous person (e.g. ‘Mr. Smith’) who owns an obscure company registered in a tax haven is linked to another company that has been illegally exporting timber from a neighboring country. That the same Mr. Smith is also the son-in-law of the mining minister of yet another country, who herself has been accused of embezzling mining revenues. As we have written elsewhere on this blog, investigative journalists, prosecution authorities, and civil society groups all need access to this linked data for their work.
The action plan also links open data to the wider G20 agenda, citing its impact on the ability of businesses to make better investment decisions. You can find the full detail here….”
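The “easy to compare and connect” point lends itself to a concrete sketch: once two datasets are published in machine-readable form with a shared identifier, linking them is a one-line join. The filenames and column names below are hypothetical.

```python
# Sketch: joining two independently published open datasets on a shared
# company identifier. Filenames and column names are hypothetical.
import pandas as pd

companies = pd.read_csv("company_register.csv")   # company_id, name, jurisdiction
owners = pd.read_csv("beneficial_owners.csv")     # company_id, owner_name

linked = companies.merge(owners, on="company_id")

# Flag owners who sit behind companies in more than one jurisdiction,
# the kind of cross-border pattern an investigator would chase.
multi = (
    linked.groupby("owner_name")["jurisdiction"]
          .nunique()
          .loc[lambda s: s > 1]
)
print(multi)
```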

Design for Policy


New book edited by Christian Bason: “Design for Policy is the first publication to chart the emergence of collaborative design approaches to innovation in public policy. Drawing on contributions from a range of the world’s leading academics, design practitioners and public managers, it provides a rich, detailed analysis of design as a tool for addressing public problems and capturing opportunities for achieving better and more efficient societal outcomes.
In his introduction, Christian Bason suggests that design may offer a fundamental reinvention of the art and craft of policy making for the twenty-first century. From challenging current problem spaces to driving the creative quest for new solutions and shaping the physical and virtual artefacts of policy implementation, design holds a significant yet largely unexplored potential.
The book is structured in three main sections, covering the global context of the rise of design for policy, in-depth case studies of the application of design to policy making, and a guide to concrete design tools for policy intent, insight, ideation and implementation. The summary chapter lays out a future agenda for design in government, suggesting how to position design more firmly on the public policy stage.
Design for Policy is intended as a resource for leaders and scholars in government departments, public service organizations and institutions, schools of design and public management, think tanks and consultancies that wish to understand and use design as a tool for public sector reform and innovation…. More: Full contents list; Introduction – The Design for Policy Nexus.”
 

Building a complete Tweet index


Yi Zhuang (@yz) at Twitter: “Since that first simple Tweet over eight years ago, hundreds of billions of Tweets have captured everyday human experiences and major historical events. Our search engine excelled at surfacing breaking news and events in real time, and our search index infrastructure reflected this strong emphasis on recency. But our long-standing goal has been to let people search through every Tweet ever published.
This new infrastructure enables many use cases, providing comprehensive results for entire TV and sports seasons, conferences (#TEDGlobal), industry discussions (#MobilePayments), places, businesses and long-lived hashtag conversations across topics, such as #JapanEarthquake, #Election2012, #ScotlandDecides, #HongKong, #Ferguson and many more. This change will be rolling out to users over the next few days.
In this post, we describe how we built a search service that efficiently indexes roughly half a trillion documents and serves queries with an average latency of under 100ms….”
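The post reports scale (roughly half a trillion documents) and latency (under 100ms) rather than internals, but the data structure at the heart of any full-text search service is the inverted index: a map from each term to the documents containing it, intersected at query time. A toy sketch of that idea, emphatically not Twitter’s production system:

```python
# Toy inverted index: a map from term to the set of documents containing
# it, intersected at query time. Conceptual only; not the sharded,
# latency-optimized infrastructure the post describes.
from collections import defaultdict

class TweetIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # term -> ids of tweets containing it
        self.tweets = {}                  # tweet id -> original text

    def add(self, tweet_id, text):
        self.tweets[tweet_id] = text
        for term in text.lower().split():
            self.postings[term].add(tweet_id)

    def search(self, query):
        terms = query.lower().split()
        if not terms:
            return []
        # Intersect posting lists, smallest first, to keep work low.
        lists = sorted((self.postings[t] for t in terms), key=len)
        ids = set.intersection(*lists)
        return [self.tweets[i] for i in ids]

index = TweetIndex()
index.add(1, "#JapanEarthquake relief efforts underway")
index.add(2, "polls open as #ScotlandDecides")
print(index.search("#scotlanddecides"))  # ['polls open as #ScotlandDecides']
```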

Innovating Practice in a Culture of Expertise


Aleem Walji at SSI Review: “When I joined the World Bank five years ago to lead a new innovation practice, the organization asked me to help expand the space for experimentation and learning with an emphasis on emergent technologies. But that mandate was intimidating and counter-intuitive in an “expert-driven” culture. Experts want detailed plans, budgets, clear success indicators, and minimal risk. But innovation is about managing risk and navigating uncertainty intelligently. You fail fast and fail forward. It has been a step-by-step process, and the journey is far from over, but the World Bank today sees innovation as essential to achieving its mission.
It’s taught me a lot about seeding innovation in a culture of expertise, including phasing change across approaches to technology, teaming, problem solving, and ultimately leadership.
Innovating technologies: As a newcomer, my goal was not to try to change the World Bank’s culture. I was content to carve out a space where my team could try new things we couldn’t do elsewhere in the institution, learn fast, and create impact. Our initial focus was leveraging technologies with approaches that, if they took root, could be very powerful.
Over the first 18 to 24 months, we served as an incubator for ideas and had a number of successes that built on senior management’s support for increased access to information. The Open Data Initiative, for example, made our trove of information on countries, people, projects, and programs widely available and searchable. To our surprise, people came in droves to access it. We also launched the Mapping for Results initiative, which mapped project results and poverty data to show the relationship between where we lend and where the poor live, and the results of our work. These programs are now mainstream at the World Bank and have penetrated other development institutions….
Innovating teams: The lab idea—phase two—would require collaboration and experimentation in an unprecedented way. For example, we worked with other parts of the World Bank and a number of outside organizations to incubate the Open Development Technology Alliance, now part of the digital engagement unit of the World Bank. It worked to enhance accountability, and improve the delivery and quality of public services through technology-enabled citizen engagement such as using mobile phones, interactive mapping, and social media to draw citizens into collective problem mapping and problem solving….
Innovating problem solving: At the same time, we recognized that we face some really complex problems that the World Bank’s traditional approach of lending to governments and supervising development projects is not solving. For this, we needed another type of lab that innovated the very way we solve problems. We needed a deliberate process for experimenting, learning, iterating, and adapting. But that’s easier said than done. At our core, we are an expert-driven organization with know-how in disciplines ranging from agricultural economics and civil engineering to maternal health and early childhood development. Our problem-solving architecture is rooted in designing technical solutions to complicated problems. Yet the hardest problems in the world defy technical fixes. We work in contexts where political environments shift, leaders change, and conditions on the ground constantly evolve. Problems like climate change, financial inclusion, food security, and youth unemployment demand new ways of solving old problems.
The innovation we most needed was innovation in the leadership architecture of how we confront complex challenges. We share knowledge and expertise on the “what” of reform, but the “how” is what we need most. We need to marry know-how with do-how. We need multiyear, multi-stakeholder, and systems approaches to solving problems. We need to get better at framing and reframing problems, integrative thinking, and testing a range of solutions. We need to iterate and course-correct as we learn what works and doesn’t work in which context. That’s where we are right now with what we call “integrated leadership learning innovation”—phase four. It’s all about shaping an innovative process to address complex problems….”

Can Government Mine Tweets to Assess Public Opinion?


At Government Technology: “What if instead of going to a city meeting, you could go on Twitter, tweet your opinion, and still be heard by those in government? New research suggests this is a possibility.
The Urban Attitudes Lab at Tufts University has conducted research on accessing “big data” on social networking sites for civic purposes, according to Justin Hollander, associate professor in the Department of Urban and Environmental Policy and Planning at Tufts.
About six months ago, Hollander began researching new ways of assessing how people think about the places they live, work and play. “We’re looking to see how tapping into social media data to understand attitudes and opinions can benefit both urban planning and public policy,” he said.
Harnessing natural comments — there are about one billion tweets per day — could help governments learn what people are saying and feeling, said Hollander. And while formal types of data can be used as proxies for how happy people are, people openly share their sentiments on social networking sites.
Twitter and other social media sites can also provide information in an unobtrusive way. “The idea is that we can capture a potentially more valid and reliable view [of people’s] opinions about the world,” he said. As an inexact science, social science relies on a wide range of data sources to inform research, including surveys, interviews and focus groups; but people respond to being the subject of study, possibly affecting outcomes, Hollander said.
Hollander is also interested in extracting data from social sites because it can be done on a 24/7 basis, which means not having to wait for government to administer surveys, like the Decennial Census. Information from Twitter can also be connected to place; Hollander estimates that about 10 percent of all tweets are geotagged to a location.
In its first study earlier this year, the lab looked at using big data to learn about people’s sentiments and civic interests in New Bedford, Mass., comparing Twitter messages with the city’s published meeting minutes.
To extract tweets over a six-week period from February to April, researchers used the lab’s own software to capture 122,186 tweets geotagged within the city that also had words pertaining to the New Bedford area. Hollander said anyone can get API information from Twitter to also mine data from an area as small as a neighborhood containing a couple hundred houses.
Researchers used IBM’s SPSS Modeler software, comparing this to custom-designed software, to leverage a sentiment dictionary of nearly 3,000 words, assigning a sentiment score to each phrase — ranging from -5 for awful feelings to +5 for feelings of elation. The lab did this for the Twitter messages, and found that about 7 percent were positive versus 5.5 percent negative; correspondingly, in the minutes, 1.7 percent were positive and 0.7 percent negative. In total, about 11,000 messages contained sentiments.
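As a concrete sketch of the pipeline described here: keep tweets geotagged within a city bounding box that mention area keywords, then sum word scores from a sentiment dictionary on the same -5 to +5 scale. Everything below (coordinates, keywords, lexicon entries, sample tweets) is invented; the lab used its own capture software and a dictionary of nearly 3,000 words.

```python
# Hypothetical sketch of the described pipeline: (1) keep tweets geotagged
# inside a city bounding box that mention area keywords, then (2) score
# each with a word-level dictionary on the -5 (awful) to +5 (elation)
# scale. All coordinates, keywords, and lexicon entries are made up.

CITY_BBOX = (41.58, -71.00, 41.70, -70.85)   # (south, west, north, east)
AREA_KEYWORDS = {"newbedford", "#newbedford"}
LEXICON = {"love": 3, "great": 3, "safe": 1,
           "awful": -4, "unsafe": -2, "broken": -2}

def in_city(lat, lon):
    s, w, n, e = CITY_BBOX
    return s <= lat <= n and w <= lon <= e

def mentions_area(text):
    return any(k in text.lower() for k in AREA_KEYWORDS)

def sentiment(text):
    return sum(LEXICON.get(w, 0) for w in text.lower().split())

tweets = [
    # (lat, lon, text): toy records standing in for captured tweets
    (41.63, -70.93, "Love the new park #NewBedford"),
    (41.64, -70.92, "The crosswalk signal is broken #NewBedford"),
    (42.36, -71.06, "Boston traffic is awful"),  # outside the box, dropped
]

kept = [(t, sentiment(t)) for lat, lon, t in tweets
        if in_city(lat, lon) and mentions_area(t)]
for text, s in kept:
    label = "positive" if s > 0 else "negative" if s < 0 else "neutral"
    print(f"{label:8} ({s:+d}) {text}")
```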
The lab also used NVivo qualitative software to analyze 24 key words in a one-year sample of the city’s meeting minutes. By searching for the same words in Twitter posts, the researchers found that “school,” “health,” “safety,” “parks,” “field” and “children” were used frequently across both mediums.
….
Next up for the lab is a new study contrasting Twitter posts from four Massachusetts cities with the recent election results.