Frontiers in Massive Data Analysis


New report from the National Research Council: “Data mining of massive data sets is transforming the way we think about crisis response, marketing, entertainment, cybersecurity and national intelligence. Collections of documents, images, videos, and networks are being thought of not merely as bit strings to be stored, indexed, and retrieved, but as potential sources of discovery and knowledge, requiring sophisticated analysis techniques that go far beyond classical indexing and keyword counting, aiming to find relational and semantic interpretations of the phenomena underlying the data.
Frontiers in Massive Data Analysis examines the frontier of analyzing massive amounts of data, whether in a static database or streaming through a system. Data at that scale–terabytes and petabytes–is increasingly common in science (e.g., particle physics, remote sensing, genomics), Internet commerce, business analytics, national security, communications, and elsewhere. The tools that work to infer knowledge from data at smaller scales do not necessarily work, or work well, at such massive scale. New tools, skills, and approaches are necessary, and this report identifies many of them, plus promising research directions to explore. Frontiers in Massive Data Analysis discusses pitfalls in trying to infer knowledge from massive data, and it characterizes seven major classes of computation that are common in the analysis of massive data. Overall, this report illustrates the cross-disciplinary knowledge–from computer science, statistics, machine learning, and application disciplines–that must be brought to bear to make useful inferences from massive data.”

Connecting Grassroots to Government for Disaster Management


New Report by the Commons Lab (Wilson Center): “The growing use of social media and other mass collaboration technologies is opening up new opportunities in disaster management efforts, but is also creating new challenges for policymakers looking to incorporate these tools into existing frameworks, according to our latest report.
The Commons Lab, part of the Wilson Center’s Science & Technology Innovation Program, hosted a September 2012 workshop bringing together emergency responders, crisis mappers, researchers, and software programmers to discuss issues surrounding the adoption of these new technologies.
We are now proud to unveil “Connecting Grassroots to Government for Disaster Management: Workshop Summary,” a report discussing the key findings, policy suggestions, and success stories that emerged during the workshop. The report’s release coincides with the tenth annual National Preparedness Month, sponsored by the Federal Emergency Management Agency in the Department of Homeland Security to help educate the public about preparing for emergencies. The report can be downloaded here.”

Coase’s theories predicted Internet’s impact on how business is done


Don Tapscott in The Globe and Mail: “Renowned economist Ronald Coase died last week at the age of 102. Among his many achievements, Mr. Coase was awarded the 1991 Nobel Prize in Economics, largely for his inspiring 1937 paper The Nature of the Firm. The Nobel committee applauded the academic for his “discovery and clarification of the significance of transaction costs … for the institutional structure and functioning of the economy.”
Mr. Coase’s enduring legacy may well be that 60 years later, his paper and theories help us understand the Internet’s impact on business, the economy and all our institutions… Mr. Coase wondered why there was no market within the firm. Why is it unprofitable to have each worker, each step in the production process, become an independent buyer and seller? Why doesn’t the draftsperson auction their services to the engineer? Why is it that the engineer does not sell designs to the highest bidder? Mr. Coase argued that what prevented this from happening was marketplace friction.
Mr. Coase argued that this friction gave rise to transaction costs – or to put it more broadly, collaboration or relationship costs. There are three types of these relationship costs. First are search costs, such as the hunt for appropriate suppliers. Second are contractual costs, including price and contract negotiations. Third are the co-ordination costs of meshing the different products and processes.
The upshot is that most vertically integrated corporations found it cheaper and simpler to perform most functions in-house, rather than incurring the cost, hassle and risk of constant transactions with outside partners…. This is no longer the case. Many behemoths have lost market share to more supple competitors. Digital technologies slash transaction and collaboration costs. Smart companies are making their boundaries porous, using the Internet to harness knowledge, resources and capabilities outside the company. Everywhere, leading firms set a context for innovation and then invite their customers, partners and other third parties to co-create their products and services.
Today’s economic engines are Internet-based clusters of businesses. While each company retains its identity, companies function together, creating more wealth than they could ever hope to create individually. Where corporations were once gigantic, new business ecosystems tend toward the amorphous.
Procter & Gamble now gets 60 per cent of its innovation from outside corporate walls. Boeing has built a massive ecosystem to design and manufacture jumbo jets. China’s motorcycle industry, which consists of dozens of companies collaborating with no single company pulling the strings, now comprises 40 per cent of global motorcycle production.
Looked at one way, Amazon.com is a website with many employees that ships books. Looked at another way, however, Amazon is a vast ecosystem that includes authors, publishers, customers who write reviews for the site, delivery companies like UPS, and tens of thousands of affiliates that market products and arrange fulfilment through the Amazon network. Hundreds of thousands of people are involved in Amazon’s viral marketing network.
This is leading to the biggest change to the corporation in a century and altering how we orchestrate capability to innovate, create goods and services and engage with the world. From now on, the ecosystem itself, not the corporation per se, should serve as the point of departure for every business strategist seeking to understand the new economy – and for every manager, entrepreneur and investor seeking to prosper in it.
Nor does the Internet tonic apply only to corporations. The Web is dropping transaction costs everywhere – enabling networked approaches to almost every institution in society, from government, media, science and health care to our energy grid, transportation systems and institutions for global problem solving.
Governments can change from being vertically integrated, industrial-age bureaucracies to become networks. By releasing their treasures of raw data, governments can now become platforms upon which companies, NGOs, academics, foundations, individuals and other government agencies can collaborate to create public value…”

Political Scientists Acknowledge Need to Make Stronger Case for Their Field


Beth McMurtrie in The Chronicle of Higher Education: “Back in March, Congress limited federal support for political-science research by the National Science Foundation to projects that promote national security or American economic interests. That decision was a victory for Sen. Tom Coburn, a Republican from Oklahoma who has long aimed to eliminate all NSF grants for political science, arguing that, unlike the hard sciences, it rarely produces concrete benefits to society.
Congress’s action has led to soul searching within the discipline about how effective academics have been in conveying the value of their work to the public. It has also revived a longstanding debate among political scientists about the shift toward more statistically sophisticated, mathematically esoteric research, and its usefulness outside of academe. Those discussions were out front at the annual conference of the American Political Science Association, held here last week.
Rogers M. Smith, a political-science professor at the University of Pennsylvania, was one of 13 members of a panel that discussed the controversy over NSF money for political-science studies. He put the problem bluntly: “We need to make a better case for ourselves.”
Few on the panel, in fact, seemed to think that political science had done a good job on that front. The association has created a task force—led by Arthur Lupia, a political-science professor at the University of Michigan at Ann Arbor—to improve public perceptions of political science’s value. He said his colleagues could learn from organizations like the American Association for the Advancement of Science, which holds special sessions for the news media at its annual conference to explain the work of its members to the public.”

Fighting for Reliable Evidence


New book by Judy Gueron and Howard Rolston: “Once primarily used in medical clinical trials, random assignment experimentation is now accepted among social scientists across a broad range of disciplines. The technique has been used in social experiments to evaluate a variety of programs, from microfinance and welfare reform to housing vouchers and teaching methods. How did randomized experiments move beyond medicine and into the social sciences, and can they be used effectively to evaluate complex social problems? Fighting for Reliable Evidence provides an absorbing historical account of the characters and controversies that have propelled the wider use of random assignment in social policy research over the past forty years.
Drawing from their extensive experience evaluating welfare reform programs, noted scholar-practitioners Judith M. Gueron and Howard Rolston portray randomized experiments as a vital research tool to assess the impact of social policy. In a random assignment experiment, participants are sorted into either a treatment group that participates in a particular program, or a control group that does not. Because the groups are randomly selected, they do not differ from one another systematically. Therefore any subsequent differences between the groups can be attributed to the influence of the program or policy. The theory is elegant and persuasive, but many scholars worry that such an experiment is too difficult or expensive to implement in the real world. Can a control group be truly insulated from the treatment policy? Would staffers comply with the random allocation of participants? Would the findings matter?”
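The logic of random assignment is simple enough to sketch in a few lines of code. The following is a minimal illustration, not anything from the book; the outcome model, effect size, and noise levels are invented assumptions.

```python
import random
import statistics

def run_experiment(participants, true_effect=5.0):
    """Randomly assign participants to treatment or control, then
    estimate the program's impact as a difference in mean outcomes."""
    random.shuffle(participants)                    # the random assignment step
    half = len(participants) // 2
    treatment, control = participants[:half], participants[half:]

    def outcome(person, treated):
        # Invented outcome model: baseline plus noise, plus the program
        # effect for treated participants only.
        return person["baseline"] + random.gauss(0, 2) + (true_effect if treated else 0.0)

    treated_mean = statistics.mean(outcome(p, True) for p in treatment)
    control_mean = statistics.mean(outcome(p, False) for p in control)

    # Randomization means the groups differ only by chance, so this
    # difference is an unbiased estimate of the program's effect.
    return treated_mean - control_mean

participants = [{"id": i, "baseline": random.gauss(50, 10)} for i in range(1000)]
print(f"Estimated impact: {run_experiment(participants):.2f}")  # close to 5.0
```

With 1,000 participants the estimate lands near the true effect of 5.0; the hard part, as Gueron and Rolston's questions suggest, is everything this idealized sketch leaves out: insulating the control group and keeping staff faithful to the random allocation.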

Can The "GitHub For Science" Convince Researchers To Open-Source Their Data?


Interview at Co.Labs: “Science has a problem: Researchers don’t share their data. A new startup wants to change that by melding GitHub and Google Docs…Nathan Jenkins is a condensed matter physicist and programmer who has worked at CERN, the European Organization for Nuclear Research. He recently left his post-doc program at New York University to cofound Authorea, a platform that helps scientists draft, collaborate on, share, and publish academic articles. We talked with him about the idea behind Authorea, the open science movement, and the future of scientific publishing.”

Money and trust among strangers


New paper by Gabriele Camera, Marco Casari and Maria Bigoni in PNAS: “What makes money essential for the functioning of modern society? Through an experiment, we present evidence for the existence of a relevant behavioral dimension in addition to the standard theoretical arguments. Subjects faced repeated opportunities to help an anonymous counterpart who changed over time. Cooperation required trusting that help given to a stranger today would be returned by a stranger in the future. Cooperation levels declined when going from small to large groups of strangers, even if monitoring and payoffs from cooperation were invariant to group size. We then introduced intrinsically worthless tokens. Tokens endogenously became money: subjects took to reward help with a token and to demand a token in exchange for help. Subjects trusted that strangers would return help for a token. Cooperation levels remained stable as the groups grew larger. In all conditions, full cooperation was possible through a social norm of decentralized enforcement, without using tokens. This turned out to be especially demanding in large groups. Lack of trust among strangers thus made money behaviorally essential. To explain these results, we developed an evolutionary model. When behavior in society is heterogeneous, cooperation collapses without tokens. In contrast, the use of tokens makes cooperation evolutionarily stable.”
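The mechanism lends itself to a toy simulation. The sketch below is our own illustration, not the authors' code: strangers are paired at random each round, and a producer helps only when the consumer hands over an intrinsically worthless token, so tokens circulate as money. The group sizes, token endowment, and all-or-nothing strategy are invented assumptions.

```python
import random

def simulate(group_size=32, rounds=200, token_share=0.5):
    """Toy helping game: each round two strangers are paired and the
    producer gives help only in exchange for a token (invented rules)."""
    # Endow a random half of the group with one intrinsically worthless token.
    tokens = {i: 0 for i in range(group_size)}
    for i in random.sample(range(group_size), int(group_size * token_share)):
        tokens[i] = 1

    cooperations = 0
    for _ in range(rounds):
        # Anonymous random matching: counterparts change every round.
        consumer, producer = random.sample(range(group_size), 2)
        if tokens[consumer] > 0:
            tokens[consumer] -= 1   # the consumer pays a token...
            tokens[producer] += 1   # ...which the producer trusts it can re-spend later
            cooperations += 1
        # With no token on offer, the stranger refuses to help.
    return cooperations / rounds

for size in (4, 8, 32):
    print(f"group of {size:2d}: cooperation rate {simulate(group_size=size):.2f}")
```

Because the token supply per head is held fixed, the cooperation rate stays roughly constant as the group grows, echoing the paper's finding that monetary exchange, unlike pure trust among strangers, scales with group size.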

Project Anticipation


New site for the UNESCO Chair in Anticipatory Systems: “The purpose of the Chair in Anticipatory Systems is to both develop and promote the Discipline of Anticipation, thereby bringing a critical idea to life. To this end, we have a two-pronged strategy consisting of knowledge development and communication. The two are equally important. While many academic projects naturally emphasize knowledge development, we must also reach a large and disparate audience, and open minds locked within the longstanding legacy of reactive science. Thus, from a practical standpoint, how we conceptualize and communicate the Discipline of Anticipation is as important as the Discipline of Anticipation itself….
The project’s main objective is the development of the Discipline of Anticipation, including the development of a system of anticipatory strategies and techniques. The more the culture of anticipation spreads, the easier it will be to develop socially acceptable anticipatory strategies. It will then be possible to accumulate relevant experience on how to think about the future and to use anticipatory methods. It will also be possible to try to develop a language and a body of practices better adapted to thinking about the future and to developing new ways to address threats and opportunities.
The following outcomes are envisaged:

  • Futures Literacy: Development of a set of protocols for the appropriate implementation on the ground of the different kinds of anticipation (under the rubric of futures literacy), together with syllabi and teaching materials on the Discipline of Anticipation.
  • Anticipatory Capability Profile: Development of an Anticipatory Capability Profile for communities and institutions, together with a set of recommendations on how a community, organization or institution may raise its anticipatory performance.
  • Resilience Profile: Setting of a resilience index and analysis of the resilience level of selected communities and regions, including a set of recommendations on how to raise their resilience level.”

Berks, wankers and wonks: how to pitch science policy advice


Stian Westlake in the Guardian: “If you think about the kinds of people whom policymakers generally hear from when they cast about for advice, the distinction between berks and wankers is rather useful.
The berks of the policy world are easiest to recognise. They’re oversimplifiers, charlatans and blowhards. Berks can be trusted to take a complicated issue and deliver a simplistic and superficially plausible answer. In their search for a convenient message, they misrepresent research or ignore it entirely. They happily range far from their field of expertise and offer opinions on subjects about which they know little – while pretending to be on their expert home turf. And they are very good at soundbites.
Policymakers who consult berkish experts will get clear, actionable advice. But it could very well be wrong.
Most researchers, especially those with an academic background, will find avoiding berkhood comes naturally. After all, graduate school teaches rigour and caution. Academia reserves an especially withering contempt for professors who use their intellectual authority to advance controversial positions outside their area of expertise, from Linus Pauling’s speculations on vitamin C to Niall Ferguson’s opinions on US economic policy. No one wants to be a dodgy dossier merchant.
The risk of becoming a wanker is far more subtle. If the berks of the policy world are too ready to give an opinion, the wankers never give an opinion on anything, except to say how complicated it is.
In some ways, wankers are more harmless than berks, in the sense that being overconfident about what you know is often more dangerous than being too modest. Much bad policy is based on bad evidence, and rigorous research can expose that. Sometimes policymakers are asking the wrong questions entirely, and need to be told as much.
But policymakers who get all their advice from wankers are likely to be as ill-served as those who rely on berks. As anyone who’s ever advised a friend will know, good advice is not just a matter of providing information, or summarising research. It also involves making a judgment about the balance of facts, helping frame the issue, and communicating in a way that the person you’re counselling will understand and act on…
Neither glibness nor prolixity makes for useful advice. There are lots more tips on this from initiatives like the Alliance for Useful Evidence (of which Nesta, my employer, is a funder) and WonkComms.”

Innovating to Improve Disaster Response and Recovery


Todd Park at OSTP blog: “Last week, the White House Office of Science and Technology Policy (OSTP) and the Federal Emergency Management Agency (FEMA) jointly challenged a group of over 80 top innovators from around the country to come up with ways to improve disaster response and recovery efforts.  This diverse group of stakeholders, consisting of representatives from Zappos, Airbnb, Marriott International, the Parsons School of Design, AOL/Huffington Post’s Social Impact, The Weather Channel, Twitter, Topix.com, Twilio, New York City, Google and the Red Cross, to name a few, spent an entire day at the White House collaborating on ideas for tools, products, services, programs, and apps that can assist disaster survivors and communities…
During the “Data Jam/Think Tank,” we discussed response and recovery challenges…Below are some of the ideas that were developed throughout the day. In the case of the first two ideas, participants wrote code and created actual working prototypes.

  • A real-time communications platform that allows survivors dependent on electricity-powered medical devices to text or call in their needs—such as batteries, medication, or a power generator—and connect those needs with a collaborative transportation network to make real-time deliveries.
  • A technical schema that tags all disaster-related information from social media and news sites – enabling municipalities and first responders to better understand all of the invaluable information generated during a disaster and help identify where they can help.
  • A Disaster Relief Innovation Vendor Engine (DRIVE), which aggregates pre-approved vendors for disaster-related needs, including transportation, power, housing, and medical supplies, to make it as easy as possible to find scarce local resources.
  • A crowdfunding platform for small businesses and others to receive access to capital to help rebuild after a disaster, including a rating system that encourages rebuilding efforts that improve the community.
  • Promoting preparedness through talk shows, working closely with celebrities, musicians, and children to raise awareness.
  • A “community power-go-round” that, like a merry-go-round, can be pushed to generate electricity and additional power for battery-charged devices including cell phones or a Wi-Fi network to provide community internet access.
  • Aggregating crowdsourced imagery taken and shared through social media sites to help identify where trees have fallen, electrical lines have been toppled, and streets have been obstructed.
  • A kid-run local radio station used to educate youth about preparedness for a disaster and activated to support relief efforts during a disaster that allows youth to share their experiences.”