From Networked Publics to Issue Publics: Reconsidering the Public/Private Distinction in Web Science


New paper by Andreas Birkbak: “As an increasing part of everyday life becomes connected with the web in many areas of the globe, the question of how the web mediates political processes becomes still more urgent. Several scholars have started to address this question by thinking about the web in terms of a public space. In this paper, we aim to make a twofold contribution towards the development of the concept of publics in web science. First, we propose that although the notion of publics raises a variety of issues, two major concerns continue to be user privacy and democratic citizenship on the web. Well-known arguments hold that the complex connectivity of the web puts user privacy at risk and enables the enclosure of public debate in virtual echo chambers. Our first argument is that these concerns are united by a set of assumptions coming from liberal political philosophy that are rarely made explicit. As a second contribution, this paper points towards an alternative way to think about publics by proposing a pragmatist reorientation of the public/private distinction in web science, away from seeing two spheres that need to be kept separate, towards seeing the public and the private as something that is continuously connected. The theoretical argument is illustrated by reference to a recently published case study of Facebook groups, and future research agendas for the study of web-mediated publics are proposed.”

Confronting Wicked Problems in the Metropolis


An APSA 2013 Annual Meeting Paper by Jered Carr and Brent Never: “The problems facing many metropolitan regions in the U.S. are complex, open-ended and seemingly intractable. The obstacles to regional governance created by these “wicked” problems are the root of the criticisms of the consensus-based “self-organizing” strategies described by frameworks such as New Regionalism and Institutional Collective Action. The self-organized solutions described by these frameworks require that substantial consensus exist among the participants, and this creates a bias toward solving low-conflict problems where consensus already exists. We discuss the limitations of these two influential research programs in the context of wicked problems and draw on the concept of nested institutional action situations to suggest a research agenda for studying intergovernmental collaboration on problems requiring the development of consensus about the nature of the problem and acceptable solutions. The Advocacy Coalitions and Institutional Analysis and Development frameworks have been effectively used to explain regional collaboration on wicked environmental problems and likely have insights for confronting the wicked fiscal and social problems of regional governance. The implications are that wicked problems are tamed through iterated games and that institution-making at the collective-choice level can then be scaled up to achieve agreement at the constitutional level of analysis.”

Project Anticipation


New site for the UNESCO Chair in Anticipatory Systems: “The purpose of the Chair in Anticipatory Systems is to both develop and promote the Discipline of Anticipation, thereby bringing a critical idea to life. To this end, we have a two-pronged strategy consisting of knowledge development and communication. The two are equally important. While many academic projects naturally emphasize knowledge development, we must also reach a large and disparate audience, and open minds locked within the longstanding legacy of reactive science. Thus, from a practical standpoint, how we conceptualize and communicate the Discipline of Anticipation is as important as the Discipline of Anticipation itself….
The project’s main objective is the development of the Discipline of Anticipation, including the development of a system of anticipatory strategies and techniques. The more the culture of anticipation spreads, the easier it will be to develop socially acceptable anticipatory strategies. It will then be possible to accumulate relevant experience on how to think about the future and to use anticipatory methods. It will also be possible to try to develop a language and a body of practices that are better adapted to thinking about the future and to developing new ways to address threats and opportunities.
The following outcomes are envisaged:

  • Futures Literacy: Development of a set of protocols for the appropriate implementation on the ground of the different kinds of anticipation (under the rubric of futures literacy), together with syllabi and teaching materials on the Discipline of Anticipation.
  • Anticipatory Capability Profile: Development of an Anticipatory Capability Profile for communities and institutions, together with a set of recommendations on how a community, organization or institution may raise its anticipatory performance.
  • Resilience Profile: Setting of a resilience index and analysis of the resilience level of selected communities and regions, including a set of recommendations on how to raise their resilience level.”

New! Humanitarian Computing Library


Patrick Meier at iRevolution: “The field of “Humanitarian Computing” applies Human Computing and Machine Computing to address major information-based challenges in the humanitarian space. Human Computing refers to crowdsourcing and microtasking, which is also referred to as crowd computing. In contrast, Machine Computing draws on natural language processing and machine learning, amongst other disciplines. The Next Generation Humanitarian Technologies we are prototyping at QCRI are powered by Humanitarian Computing research and development (R&D).
My QCRI colleagues and I just launched the first ever Humanitarian Computing Library, which is publicly available here. The purpose of this library, or wiki, is to consolidate existing and future research that relates to Humanitarian Computing in order to support the development of next generation humanitarian tech. The repository currently holds over 500 publications that span topics such as Crisis Management, Trust and Security, Software and Tools, Geographical Analysis and Crowdsourcing. These publications are largely drawn from (but not limited to) peer-reviewed papers presented at leading conferences around the world. We invite you to add your own research on humanitarian computing to this growing collection of resources.”

Linux Foundation Collaboration Gets Biological


eWeek: “The Linux Foundation is growing its roster of collaboration projects by expanding from the physical into the biological realm with OpenBEL (the Biological Expression Language). The Linux Foundation, best known as the organization that helps bring Linux vendors and developers together, is also growing its expertise as a facilitator for collaborative development projects…
OpenBEL got its start in June 2012 after being open-sourced by biotech firm Selventa. The effort now includes the participation of Foundation Medicine, AstraZeneca, the Fraunhofer Institute, Harvard Medical School, Novartis, Pfizer and the University of California at San Diego.
BEL offers researchers a language to clearly express scientific findings from the life sciences in a format that can be understood by computing infrastructure…
The Linux Foundation currently hosts a number of different collaboration projects, including the Xen virtualization project, the OpenDaylight software-defined networking effort, Tizen for mobile phone development, and OpenMAMA for financial services information, among others.
The OpenBEL project will be similar to existing collaboration projects in that the contributors to the project want to accelerate their work through collaborative development, McPherson explained.”

Government Is a Good Venture Capitalist


Wall Street Journal: “In a knowledge-intensive economy, innovation drives growth. But what drives innovation? In the U.S., most conservatives believe that economically significant new ideas originate in the private sector, through either the research-and-development investments of large firms with deep pockets or the inspiration of obsessive inventors haunting shabby garages. In this view, the role of government is to secure the basic conditions for honest and efficient commerce—and then get out of the way. Anything more is bound to be “wasteful” and “burdensome.”
The real story is more complex and surprising. For more than four decades, R&D magazine has recognized the top innovations—100 each year—that have moved past the conceptual stage into commercial production and sales. Economic sociologists Fred Block and Matthew Keller decided to ask a simple question: Where did these award-winning innovations come from?
The data indicated seven kinds of originating entities: Fortune 500 companies; small and medium enterprises (including startups); collaborations among private entities; government laboratories; universities; spinoffs started by researchers at government labs or universities; and a grab bag of other public and nonprofit agencies.
Messrs. Block and Keller randomly selected three years in each of the past four decades and analyzed the resulting 1,200 innovations. About 10% originated in foreign entities; the sociologists focused on the domestic innovations, more than 1,050.
Two of their findings stand out. First, the number of award winners originating in Fortune 500 companies—either working alone or in collaboration with others—has declined steadily and sharply, from an annual average of 44 in the 1970s to only nine in the first decade of this century.
Second, the number of top innovations originating in federal laboratories, universities or firms formed by former researchers in those entities rose dramatically, from 18 in the 1970s to 37 in the 1980s and 55 in the 1990s before falling slightly to 49 in the 2000s. Without the research conducted in federal labs and universities (much of it federally funded), commercial innovation would have been far less robust…”
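The tallying Block and Keller describe — classifying each winner by originating entity, dropping foreign-origin winners, and counting by decade — can be sketched as below. The category labels and sample records here are illustrative assumptions, not the study’s actual data:

```python
from collections import Counter

# Seven origin categories, loosely following the study's typology
# (the exact labels are hypothetical).
CATEGORIES = {
    "fortune500", "sme", "private_collab", "gov_lab",
    "university", "spinoff", "other_public",
}

def tally_by_decade(records):
    """Count domestic innovations per (decade, category).

    `records` is an iterable of (year, category, is_domestic) tuples.
    Foreign-origin winners are excluded, mirroring the study's focus
    on the 1,050-plus domestic innovations.
    """
    counts = Counter()
    for year, category, is_domestic in records:
        if not is_domestic:
            continue  # drop the roughly 10% of foreign-origin winners
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        decade = (year // 10) * 10  # e.g. 1975 -> 1970
        counts[(decade, category)] += 1
    return counts

# Toy data for illustration only:
sample = [
    (1975, "fortune500", True),
    (1975, "gov_lab", True),
    (1983, "university", True),
    (1983, "fortune500", False),  # foreign-origin, excluded
]
result = tally_by_decade(sample)
```

Decade averages such as the 44-per-year Fortune 500 figure for the 1970s would then follow by dividing each decade’s count by the number of sampled years.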

Radical Abundance: How a Revolution in Nanotechnology Will Change Civilization


Book review by José Luis Cordeiro: Eric Drexler, popularly known as “the founding father of nanotechnology,” introduced the concept in his seminal 1981 paper in Proceedings of the National Academy of Sciences.
This paper established fundamental principles of molecular engineering and outlined development paths to advanced nanotechnologies.
He popularized the idea of nanotechnology in his 1986 book, Engines of Creation: The Coming Era of Nanotechnology, where he introduced a broad audience to a fundamental technology objective: using machines that work at the molecular scale to structure matter from the bottom up.
He went on to complete his PhD thesis at MIT, under the guidance of AI pioneer Marvin Minsky, and published it in modified form in 1992 as Nanosystems: Molecular Machinery, Manufacturing, and Computation.

Drexler’s new book, Radical Abundance: How a Revolution in Nanotechnology Will Change Civilization, tells the story of nanotechnology from its small beginnings, then moves quickly towards a big future, explaining what it is and what it is not, and enlightening about what we can do with it for the benefit of humanity.
In his pioneering 1986 book, Engines of Creation, he defined nanotechnology as a potential technology with these features: “manufacturing using machinery based on nanoscale devices, and products built with atomic precision.”
In his 2013 sequel, Radical Abundance, Drexler expands on his prior thinking, corrects many of the misconceptions about nanotechnology, and dismisses fears of dystopian futures replete with malevolent nanobots and gray goo…
His new book clearly identifies nanotechnology with atomically precise manufacturing (APM)…Drexler makes many comparisons between the information revolution and what he now calls the “APM revolution.” What the first did with bits, the second will do with atoms: “Image files today will be joined by product files tomorrow. Today one can produce an image of the Mona Lisa without being able to draw a good circle; tomorrow one will be able to produce a display screen without knowing how to manufacture a wire.”
Civilization, he says, is advancing from a world of scarcity toward a world of abundance — indeed, radical abundance.

Is Connectivity A Human Right?


Mark Zuckerberg (Facebook): For almost ten years, Facebook has been on a mission to make the world more open and connected. Today we connect more than 1.15 billion people each month, but as we started thinking about connecting the next 5 billion, we realized something important: the vast majority of people in the world don’t have access to the internet.
Today, only 2.7 billion people are online — a little more than one third of the world — and that number is growing by less than 9% each year, which is slow considering how early we are in the internet’s development. Even though projections show most people will get smartphones in the next decade, most people still won’t have data access because the cost of data remains much higher than the price of a smartphone.
Below, I’ll share a rough proposal for how we can connect the next 5 billion people, and a rough plan to work together as an industry to get there. We’ll discuss how we can make internet access more affordable by making it more efficient to deliver data, how we can use less data by improving the efficiency of the apps we build and how we can help businesses drive internet access by developing a new model to get people online.
I call this a “rough plan” because, like many long-term technology projects, we expect the details to evolve. It may be possible to achieve more than we lay out here, but it may also be more challenging than we predict. The specific technical work will evolve as people contribute better ideas, and we welcome all feedback on how to improve this.
Connecting the world is one of the greatest challenges of our generation. This is just one small step toward achieving that goal. I’m excited to work together to make this a reality.
For the full version, click here.

Strengthening Local Capacity for Data-Driven Decisionmaking


A report by the National Neighborhood Indicators Partnership (NNIP): “A large share of public decisions that shape the fundamental character of American life are made at the local level; for example, decisions about controlling crime, maintaining housing quality, targeting social services, revitalizing low-income neighborhoods, allocating health care, and deploying early childhood programs. Enormous benefits would be gained if a much larger share of these decisions were based on sound data and analysis.
In the mid-1990s, a movement began to address the need for data for local decisionmaking. Civic leaders in several cities funded local groups to start assembling neighborhood and address-level data from multiple local agencies. For the first time, it became possible to track changing neighborhood conditions, using a variety of indicators, year by year between censuses. These new data intermediaries pledged to use their data in practical ways to support policymaking and community building and give priority to the interests of distressed neighborhoods. Their theme was “democratizing data,” which in practice meant making the data accessible to residents and community groups (Sawicki and Craig 1996).

The initial groups that took on this work formed the National Neighborhood Indicators Partnership (NNIP) to further develop these capacities and spread them to other cities. By 2012, NNIP partners were established in 37 cities, and similar capacities were in development in a number of others. The Urban Institute (UI) serves as the secretariat for the network. This report documents a strategic planning process undertaken by NNIP in 2012 and early 2013. The network’s leadership and funders re-examined the NNIP model in the context of 15 years of local partner experiences and the dramatic changes in technology and policy approaches that have occurred over that period. The first three sections explain NNIP functions and institutional structures and examine the potential role for NNIP in advancing the community information field in today’s environment.”

OpenCounter


Code for America: “OpenCounter’s mission is to empower entrepreneurs and foster local economic development by simplifying the process of registering a business.
Economic development happens in many forms, from projects like the revitalization of the Brooklyn Navy Yard or Hudson Rail Yards in New York City, to campaigns to encourage residents to shop at local merchants. While the majority of headlines will focus on a city’s effort to secure a major new employer (think Apple’s 1,000,000-square-foot expansion in Austin, Texas), most economic development and job creation happens on a much smaller scale, as individuals stake their financial futures on creating a new product, store, service or firm.
But these new businesses aren’t in a position to accept tax breaks on capital equipment or enter into complex development and disposition agreements to build new offices or stores. Many new businesses can’t even meet the underwriting criteria of SBA-backed revolving-loan programs. Competition for local grants for facade improvements or signage assistance can be fierce…
Despite many cities’ genuine efforts to be “business-friendly,” their default user interface consists of fluorescent-lit Formica, waiting lines, and stacks of forms. Online resources often remind one of a phone book, with little interactivity or specialization based on either a business’s function or its location within a jurisdiction.
That’s why we built OpenCounter… See what we’re up to at opencounter.us or visit a live version of our software at http://opencounter.cityofsantacruz.com.”