Prizes and Productivity: How Winning the Fields Medal Affects Scientific Output


New NBER working paper by George J. Borjas and Kirk B. Doran: “Knowledge generation is key to economic growth, and scientific prizes are designed to encourage it. But how does winning a prestigious prize affect future output? We compare the productivity of Fields medalists (winners of the top mathematics prize) to that of similarly brilliant contenders. The two groups have similar publication rates until the award year, after which the winners’ productivity declines. The medalists begin to “play the field,” studying unfamiliar topics at the expense of writing papers. It appears that tournaments can have large post-prize effects on the effort allocation of knowledge producers.”

The Contours of Crowd Capability


New paper by Prashant Shukla and John Prpić: “The existence of dispersed knowledge has been a subject of inquiry for more than six decades. Despite the longevity of this rich research tradition, the “knowledge problem” has remained largely unresolved both in research and practice, and remains “the central theoretical problem of all social science”. However, in the 21st century, organizations are presented with opportunities through technology to address the dispersed knowledge problem to some extent. One such opportunity is represented by the recent emergence of a variety of crowd-engaging information systems (IS).
In this vein, Crowdsourcing is being widely studied in numerous contexts, and the knowledge generated from these IS phenomena is well-documented. At the same time, other organizations are leveraging dispersed knowledge by putting in place IS-applications such as Prediction Markets to gather large sample-size forecasts from within and without the organization. Similarly, we are also observing many organizations using IS-tools such as “Wikis” to access the knowledge of dispersed populations within the boundaries of the organization. Further still, other organizations are applying gamification techniques to accumulate Citizen Science knowledge from the public at large through IS.
Among these seemingly disparate phenomena, a complex ecology of crowd-engaging IS has emerged, involving millions of people all around the world generating knowledge for organizations through IS. However, despite the obvious scale and reach of this emerging crowd-engagement paradigm, there are no examples of research (as far as we know) that systematically compare and contrast a large variety of these existing crowd-engaging IS-tools in one work. Understanding this current state of affairs, we seek to address this significant research void by comparing and contrasting a number of the crowd-engaging forms of IS currently available for organizational use.

To achieve this goal, we employ the Theory of Crowd Capital as a lens to systematically structure our investigation of crowd-engaging IS. Employing this parsimonious lens, we first explain how Crowd Capital is generated through Crowd Capability in organizations. Taking this conceptual platform as a point of departure, in Section 3, we offer an array of examples of IS currently in use in modern practice to generate Crowd Capital. We compare and contrast these emerging IS techniques using the Crowd Capability construct, therein highlighting some important choices that organizations face when entering the crowd-engagement fray. This comparison, which we term “The Contours of Crowd Capability”, can be used by decision-makers and researchers alike, to differentiate among the many extant methods of Crowd Capital generation. At the same time, our comparison also illustrates some important differences to be found in the internal organizational processes that accompany each form of crowd-engaging IS. In Section 4, we conclude with a discussion of the limitations of our work.”

Smarter Than You Think: How Technology is Changing Our Minds for the Better


New book by Clive Thompson: “It’s undeniable—technology is changing the way we think. But is it for the better? Amid a chorus of doomsayers, Clive Thompson delivers a resounding “yes.” The Internet age has produced a radical new style of human intelligence, worthy of both celebration and analysis. We learn more and retain it longer, write and think with global audiences, and even gain an ESP-like awareness of the world around us. Modern technology is making us smarter, better connected, and often deeper—both as individuals and as a society.
In Smarter Than You Think, Thompson shows that every technological innovation—from the written word to the printing press to the telegraph—has provoked the very same anxieties that plague us today. We panic that life will never be the same, that our attentions are eroding, that culture is being trivialized. But as in the past, we adapt—learning to use the new and retaining what’s good of the old.”

From Networked Publics to Issue Publics: Reconsidering the Public/Private Distinction in Web Science


New paper by Andreas Birkbak: “As an increasing part of everyday life becomes connected with the web in many areas of the globe, the question of how the web mediates political processes becomes still more urgent. Several scholars have started to address this question by thinking about the web in terms of a public space. In this paper, we aim to make a twofold contribution towards the development of the concept of publics in web science. First, we propose that although the notion of publics raises a variety of issues, two major concerns continue to be user privacy and democratic citizenship on the web. Well-known arguments hold that the complex connectivity of the web puts user privacy at risk and enables the enclosure of public debate in virtual echo chambers. Our first argument is that these concerns are united by a set of assumptions coming from liberal political philosophy that are rarely made explicit. As a second contribution, this paper points towards an alternative way to think about publics by proposing a pragmatist reorientation of the public/private distinction in web science, away from seeing two spheres that need to be kept separate, towards seeing the public and the private as something that is continuously connected. The theoretical argument is illustrated by reference to a recently published case study of Facebook groups, and future research agendas for the study of web-mediated publics are proposed.”

Frontiers in Massive Data Analysis


New report from the National Academy of Sciences: “Data mining of massive data sets is transforming the way we think about crisis response, marketing, entertainment, cybersecurity and national intelligence. Collections of documents, images, videos, and networks are being thought of not merely as bit strings to be stored, indexed, and retrieved, but as potential sources of discovery and knowledge, requiring sophisticated analysis techniques that go far beyond classical indexing and keyword counting, aiming to find relational and semantic interpretations of the phenomena underlying the data.
Frontiers in Massive Data Analysis examines the frontier of analyzing massive amounts of data, whether in a static database or streaming through a system. Data at that scale–terabytes and petabytes–is increasingly common in science (e.g., particle physics, remote sensing, genomics), Internet commerce, business analytics, national security, communications, and elsewhere. The tools that work to infer knowledge from data at smaller scales do not necessarily work, or work well, at such massive scale. New tools, skills, and approaches are necessary, and this report identifies many of them, plus promising research directions to explore. Frontiers in Massive Data Analysis discusses pitfalls in trying to infer knowledge from massive data, and it characterizes seven major classes of computation that are common in the analysis of massive data. Overall, this report illustrates the cross-disciplinary knowledge–from computer science, statistics, machine learning, and application disciplines–that must be brought to bear to make useful inferences from massive data.”

Connecting Grassroots to Government for Disaster Management


New Report by the Commons Lab (Wilson Center): “The growing use of social media and other mass collaboration technologies is opening up new opportunities in disaster management efforts, but is also creating new challenges for policymakers looking to incorporate these tools into existing frameworks, according to our latest report.
The Commons Lab, part of the Wilson Center’s Science & Technology Innovation Program, hosted a September 2012 workshop bringing together emergency responders, crisis mappers, researchers, and software programmers to discuss issues surrounding the adoption of these new technologies.
We are now proud to unveil “Connecting Grassroots to Government for Disaster Management: Workshop Summary,” a report discussing the key findings, policy suggestions, and success stories that emerged during the workshop. The report’s release coincides with the tenth annual National Preparedness Month, sponsored by the Federal Emergency Management Agency in the Department of Homeland Security to help educate the public about preparing for emergencies. The report can be downloaded here.”

Coase’s theories predicted Internet’s impact on how business is done


Don Tapscott in The Globe and Mail: “Renowned economist Ronald Coase died last week at the age of 102. Among his many achievements, Mr. Coase was awarded the 1991 Nobel Prize in Economics, largely for his inspiring 1937 paper The Nature of the Firm. The Nobel committee applauded the academic for his “discovery and clarification of the significance of transaction costs … for the institutional structure and functioning of the economy.”
Mr. Coase’s enduring legacy may well be that 60 years later, his paper and theories help us understand the Internet’s impact on business, the economy and all our institutions… Mr. Coase wondered why there was no market within the firm. Why is it unprofitable to have each worker, each step in the production process, become an independent buyer and seller? Why doesn’t the draftsperson auction their services to the engineer? Why is it that the engineer does not sell designs to the highest bidder? Mr. Coase argued that marketplace friction prevented this from happening.
Mr. Coase argued that this friction gave rise to transaction costs – or to put it more broadly, collaboration or relationship costs. There are three types of these relationship costs. First are search costs, such as the hunt for appropriate suppliers. Second are contractual costs, including price and contract negotiations. Third are the co-ordination costs of meshing the different products and processes.
The upshot is that most vertically integrated corporations found it cheaper and simpler to perform most functions in-house, rather than incurring the cost, hassle and risk of constant transactions with outside partners… This is no longer the case. Many behemoths have lost market share to more supple competitors. Digital technologies slash transaction and collaboration costs. Smart companies are making their boundaries porous, using the Internet to harness knowledge, resources and capabilities outside the company. Everywhere, leading firms set a context for innovation and then invite their customers, partners and other third parties to co-create their products and services.
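To make the Coasean make-or-buy logic concrete, here is a toy sketch; all figures are invented and nothing in it comes from Mr. Coase or the article. A function stays in-house as long as the outside price plus the three relationship costs exceeds the internal cost, and it moves to the market once technology slashes those costs.

```python
# Toy illustration (invented numbers) of the make-or-buy comparison described above:
# outsource a function only when the outside price plus the transaction
# (relationship) costs of dealing with partners is lower than the in-house cost.

def total_market_cost(outside_price, search_cost, contracting_cost, coordination_cost):
    """Full cost of buying a function on the market, including the three
    relationship costs: search, contracting, and co-ordination."""
    return outside_price + search_cost + contracting_cost + coordination_cost

def make_or_buy(in_house_cost, outside_price, transaction_costs):
    market_cost = total_market_cost(outside_price, *transaction_costs)
    return "buy" if market_cost < in_house_cost else "make"

# Heavy pre-Internet friction keeps the function inside the firm.
print(make_or_buy(in_house_cost=100, outside_price=80, transaction_costs=(10, 15, 20)))  # -> make

# With digital technologies cutting those costs, the firm's boundary shifts outward.
print(make_or_buy(in_house_cost=100, outside_price=80, transaction_costs=(2, 3, 5)))     # -> buy
```

The point of the sketch is only the comparison itself: lower transaction costs change the answer without any change to the underlying production costs.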
Today’s economic engines are Internet-based clusters of businesses. While each company retains its identity, companies function together, creating more wealth than they could ever hope to create individually. Where corporations were once gigantic, new business ecosystems tend toward the amorphous.
Procter & Gamble now gets 60 per cent of its innovation from outside corporate walls. Boeing has built a massive ecosystem to design and manufacture jumbo jets. China’s motorcycle industry, which consists of dozens of companies collaborating with no single company pulling the strings, now comprises 40 per cent of global motorcycle production.
Looked at one way, Amazon.com is a website with many employees that ships books. Looked at another way, however, Amazon is a vast ecosystem that includes authors, publishers, customers who write reviews for the site, delivery companies like UPS, and tens of thousands of affiliates that market products and arrange fulfilment through the Amazon network. Hundreds of thousands of people are involved in Amazon’s viral marketing network.
This is leading to the biggest change to the corporation in a century and altering how we orchestrate capability to innovate, create goods and services and engage with the world. From now on, the ecosystem itself, not the corporation per se, should serve as the point of departure for every business strategist seeking to understand the new economy – and for every manager, entrepreneur and investor seeking to prosper in it.
Nor does the Internet tonic apply only to corporations. The Web is dropping transaction costs everywhere – enabling networked approaches to almost every institution in society, from government, media, science and health care to our energy grid, transportation systems and institutions for global problem solving.
Governments can change from being vertically integrated, industrial-age bureaucracies to become networks. By releasing their treasures of raw data, governments can now become platforms upon which companies, NGOs, academics, foundations, individuals and other government agencies can collaborate to create public value…”

Political Scientists Acknowledge Need to Make Stronger Case for Their Field


Beth McMurtrie in The Chronicle of Higher Education: “Back in March, Congress limited federal support for political-science research by the National Science Foundation to projects that promote national security or American economic interests. That decision was a victory for Sen. Tom Coburn, a Republican from Oklahoma who has long aimed to eliminate all NSF grants for political science, arguing that unlike the hard sciences it rarely produces concrete benefits to society.
Congress’s action has led to soul searching within the discipline about how effective academics have been in conveying the value of their work to the public. It has also revived a longstanding debate among political scientists about the shift toward more statistically sophisticated, mathematically esoteric research, and its usefulness outside of academe. Those discussions were out front at the annual conference of the American Political Science Association, held here last week.
Rogers M. Smith, a political-science professor at the University of Pennsylvania, was one of 13 members of a panel that discussed the controversy over NSF money for political-science studies. He put the problem bluntly: “We need to make a better case for ourselves.”
Few on the panel, in fact, seemed to think that political science had done a good job on that front. The association has created a task force—led by Arthur Lupia, a political-science professor at the University of Michigan at Ann Arbor—to improve public perceptions of political science’s value. He said his colleagues could learn from organizations like the American Association for the Advancement of Science, which holds special sessions for the news media at its annual conference to explain the work of its members to the public.”

Fighting for Reliable Evidence


New book by Judy Gueron and Howard Rolston: “Once primarily used in medical clinical trials, random assignment experimentation is now accepted among social scientists across a broad range of disciplines. The technique has been used in social experiments to evaluate a variety of programs, from microfinance and welfare reform to housing vouchers and teaching methods. How did randomized experiments move beyond medicine and into the social sciences, and can they be used effectively to evaluate complex social problems? Fighting for Reliable Evidence provides an absorbing historical account of the characters and controversies that have propelled the wider use of random assignment in social policy research over the past forty years.
Drawing from their extensive experience evaluating welfare reform programs, noted scholar practitioners Judith M. Gueron and Howard Rolston portray randomized experiments as a vital research tool to assess the impact of social policy. In a random assignment experiment, participants are sorted into either a treatment group that participates in a particular program, or a control group that does not. Because the groups are randomly selected, they do not differ from one another systematically. Therefore any subsequent differences between the groups can be attributed to the influence of the program or policy. The theory is elegant and persuasive, but many scholars worry that such an experiment is too difficult or expensive to implement in the real world. Can a control group be truly insulated from the treatment policy? Would staffers comply with the random allocation of participants? Would the findings matter?”
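The logic of random assignment described above is simple enough to sketch in a few lines of code. The snippet below is a minimal illustration, not taken from the book; the outcome model, effect size, and sample size are invented for the example. Participants are assigned to treatment or control by coin flip, and the program’s impact is estimated as the difference in mean outcomes between the two groups.

```python
# Minimal sketch of a random assignment experiment (all numbers invented):
# coin-flip assignment, then difference in group means as the impact estimate.
import random
import statistics

random.seed(42)

def run_experiment(n_participants=1000, true_effect=2.0):
    """Randomly assign participants to treatment or control and
    compare average outcomes between the two groups."""
    treatment_outcomes, control_outcomes = [], []
    for _ in range(n_participants):
        baseline = random.gauss(10.0, 3.0)   # hypothetical outcome without the program
        if random.random() < 0.5:            # coin-flip assignment
            treatment_outcomes.append(baseline + true_effect)
        else:
            control_outcomes.append(baseline)
    # Because assignment is random, the two groups are comparable on average,
    # so the difference in means estimates the program's impact.
    return statistics.mean(treatment_outcomes) - statistics.mean(control_outcomes)

if __name__ == "__main__":
    print(f"Estimated impact: {run_experiment():.2f} (true effect = 2.0)")
```

With a sample this size the estimate lands close to the true effect; without randomization, the same difference in means would also pick up any pre-existing differences between the groups, which is exactly what random assignment is designed to rule out.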

Can The "GitHub For Science" Convince Researchers To Open-Source Their Data?


Interview at Co.Labs: “Science has a problem: Researchers don’t share their data. A new startup wants to change that by melding GitHub and Google Docs…Nathan Jenkins is a condensed matter physicist and programmer who has worked at CERN, the European Organization for Nuclear Research. He recently left his post-doc program at New York University to cofound Authorea, a platform that helps scientists draft, collaborate on, share, and publish academic articles. We talked with him about the idea behind Authorea, the open science movement, and the future of scientific publishing.”