The Tech Intellectuals


New Essay by Henry Farrell in Democracy: “A quarter of a century ago, Russell Jacoby lamented the demise of the public intellectual. The cause of death was an improvement in material conditions. Public intellectuals—Dwight Macdonald, I.F. Stone, and their like—once had little choice but to be independent. They had difficulty getting permanent well-paying jobs. However, as universities began to expand, they offered new opportunities to erstwhile unemployables. The academy demanded a high price. Intellectuals had to turn away from the public and toward the practiced obscurities of academic research and prose. In Jacoby’s description, these intellectuals “no longer need[ed] or want[ed] a larger public…. Campuses [were] their homes; colleagues their audience; monographs and specialized journals their media.”
Over the last decade, conditions have changed again. New possibilities are opening up for public intellectuals. Internet-fueled media such as blogs have made it much easier for aspiring intellectuals to publish their opinions. They have fostered the creation of new intellectual outlets (Jacobin, The New Inquiry, The Los Angeles Review of Books), and helped revitalize some old ones too (The Baffler, Dissent). Finally, and not least, they have provided the meat for a new set of arguments about how communications technology is reshaping society.
These debates have created opportunities for an emergent breed of professional argument-crafters: technology intellectuals. Like their predecessors of the 1950s and ’60s, they often make a living without having to work for a university. Indeed, the professoriate is being left behind. Traditional academic disciplines (except for law, which has a magpie-like fascination with new and shiny things) have had a hard time keeping up. New technologies, to traditionalists, are suspect: They are difficult to pin down within traditional academic boundaries, and they look a little too fashionable to senior academics, who are often nervous that their fields might somehow become publicly relevant.
Many of these new public intellectuals are more or less self-made. Others are scholars (often with uncomfortable relationships with the academy, such as Clay Shirky, an unorthodox professor who is skeptical that the traditional university model can survive). Others still are entrepreneurs, like technology and media writer and podcaster Jeff Jarvis, working the angles between public argument and emerging business models….
Different incentives would lead to different debates. In a better world, technology intellectuals might think more seriously about the relationship between technological change and economic inequality. Many technology intellectuals think of the culture of Silicon Valley as inherently egalitarian, yet economist James Galbraith argues that income inequality in the United States “has been driven by capital gains and stock options, mostly in the tech sector.”
They might think more seriously about how technology is changing politics. Current debates are still dominated by pointless arguments between enthusiasts who believe the Internet is a model for a radically better democracy, and skeptics who claim it is the dictator’s best friend.
Finally, they might pay more attention to the burgeoning relationship between technology companies and the U.S. government. Technology intellectuals like to think that a powerful technology sector can enhance personal freedom and constrain the excesses of government. Instead, we are now seeing how a powerful technology sector may enable government excesses. Without big semi-monopolies like Facebook, Google, and Microsoft to hoover up personal information, surveillance would be far more difficult for the U.S. government.
Debating these issues would require a more diverse group of technology intellectuals. The current crop are not diverse in some immediately obvious ways—there are few women, few nonwhites, and few non-English speakers who have ascended to the peak of attention. Yet there is also far less intellectual diversity than there ought to be. The core assumptions of public debates over technology get less attention than they need and deserve.”

Design for Multiple Motivations


In the Huffington Post: “Community is king for any collaboration platform or social network: Outcomes are primarily a function of the participating users and any software is strictly in their service.
With this in mind, when creating new platforms we define success criteria and ask ourselves who we might attract and engage. We ask ourselves where they congregate (online or offline) and ask what might motivate them to participate.
It’s often claimed that everyone’s incentives need to be the same in any successful system. However, no communities or individuals are identical, and their needs and interests will also differ. Instead, our experience has taught us that a site’s users might have wildly differing incentives and motivations, and if you value diverse input, that’s healthy…
I was fortunate to go to a lecture by Karim Lakhani of Harvard Business School when we began imagining OpenIDEO. He did a wonderful job of showing the range of potential incentives and motivations of different community members in this framework:

[Image: motivations framework]

Given that we agreed inclusivity was a design principle for OpenIDEO, it followed that we should design as many of these intrinsic and extrinsic motivations into the platform as possible and that our primary job was to design a system in which they were all aligned toward the same common goal….

More recently, as we have begun applying the OI Engine platform to enterprises, we have been pleasantly surprised at the power of these non-financial motivations in driving contributions from employees. In fact, one client’s toughest challenge when selling the platform into his international bank was in convincing the managers that prizes weren’t required to drive employee contribution and might instead compromise collaboration. He was vindicated when thousands of employees actively engaged.”
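The design exercise described above lends itself to a concrete checklist. Below is a minimal sketch in Python (the motivation categories are illustrative, not Lakhani’s exact framework, and the feature names are invented) of how a team might enumerate intrinsic and extrinsic motivations and verify that each maps to at least one platform feature:

```python
# Hedged sketch: categories and features are illustrative, not OpenIDEO's
# actual design documents. The point is the completeness check at the end.

MOTIVATION_FEATURES = {
    # intrinsic motivations
    "fun and enjoyment":     ["playful challenge briefs"],
    "learning new skills":   ["feedback on submissions", "design toolkits"],
    "sense of community":    ["commenting", "team formation"],
    # extrinsic motivations
    "reputation and status": ["public profiles", "collaboration badges"],
    "career advancement":    ["shareable portfolios"],
    "prizes and rewards":    [],  # deliberately empty: the article suggests prizes are optional
}

# A design review reduces to checking that no motivation is left unsupported
# (or, as with prizes, that leaving it unsupported is a conscious choice):
uncovered = [m for m, feats in MOTIVATION_FEATURES.items() if not feats]
print("Motivations without a supporting feature:", uncovered)
```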

Frontiers in Massive Data Analysis


New report from the National Academy of Sciences: “Data mining of massive data sets is transforming the way we think about crisis response, marketing, entertainment, cybersecurity and national intelligence. Collections of documents, images, videos, and networks are being thought of not merely as bit strings to be stored, indexed, and retrieved, but as potential sources of discovery and knowledge, requiring sophisticated analysis techniques that go far beyond classical indexing and keyword counting, aiming to find relational and semantic interpretations of the phenomena underlying the data.
Frontiers in Massive Data Analysis examines the frontier of analyzing massive amounts of data, whether in a static database or streaming through a system. Data at that scale—terabytes and petabytes—is increasingly common in science (e.g., particle physics, remote sensing, genomics), Internet commerce, business analytics, national security, communications, and elsewhere. The tools that work to infer knowledge from data at smaller scales do not necessarily work, or work well, at such massive scale. New tools, skills, and approaches are necessary, and this report identifies many of them, plus promising research directions to explore. Frontiers in Massive Data Analysis discusses pitfalls in trying to infer knowledge from massive data, and it characterizes seven major classes of computation that are common in the analysis of massive data. Overall, this report illustrates the cross-disciplinary knowledge—from computer science, statistics, machine learning, and application disciplines—that must be brought to bear to make useful inferences from massive data.”
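The report’s observation that small-scale tools do not survive contact with massive scale is easiest to see with streaming data. The sketch below (ours, not the report’s) uses Welford’s online algorithm to compute summary statistics in a single pass and constant memory, which is what an algorithm must do when data streams through a system rather than resting in a database:

```python
# Single-pass summary statistics via Welford's online algorithm.
# A batch approach (load everything, then compute) fails once the
# stream no longer fits in memory; this version uses O(1) memory.

def streaming_mean_variance(stream):
    """Consume an iterable of numbers exactly once."""
    n = 0
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the mean
    for x in stream:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    variance = m2 / (n - 1) if n > 1 else 0.0
    return n, mean, variance

# Works the same on a list or on a generator yielding readings one at a time:
n, mean, var = streaming_mean_variance(x * 0.5 for x in range(1_000_000))
print(n, mean, var)
```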

How to make a city great


New video and report by McKinsey: “What makes a great city? It is a pressing question because by 2030, 5 billion people—60 percent of the world’s population—will live in cities, compared with 3.6 billion today, turbocharging the world’s economic growth. Leaders in developing nations must cope with urbanization on an unprecedented scale, while those in developed ones wrestle with aging infrastructures and stretched budgets. All are fighting to secure or maintain the competitiveness of their cities and the livelihoods of the people who live in them. And all are aware of the environmental legacy they will leave if they fail to find more sustainable, resource-efficient ways of managing these cities.

To understand the core processes and benchmarks that can transform cities into superior places to live and work, McKinsey developed and analyzed a comprehensive database of urban economic, social, and environmental performance indicators. The research included interviewing 30 mayors and other leaders in city governments on four continents and synthesizing the findings from more than 80 case studies that sought to understand what city leaders did to improve processes and services from urban planning to financial management and social housing.
The result is How to make a city great (PDF–2.1MB), a new report arguing that leaders who make important strides in improving their cities do three things really well:

  • They achieve smart growth.
  • They do more with less. Great cities secure all revenues due, explore investment partnerships, embrace technology, make organizational changes that eliminate overlapping roles, and manage expenses. Successful city leaders have also learned that, if designed and executed well, private–public partnerships can be an essential element of smart growth, delivering lower-cost, higher-quality infrastructure and services.
  • They win support for change. Change is not easy, and its momentum can even attract opposition. Successful city leaders build a high-performing team of civil servants, create a working environment where all employees are accountable for their actions, and take every opportunity to forge a stakeholder consensus with the local population and business community. They take steps to recruit and retain top talent, emphasize collaboration, and train civil servants in the use of technology.”

Coase’s theories predicted Internet’s impact on how business is done


Don Tapscott in The Globe and Mail: “Renowned economist Ronald Coase died last week at the age of 102. Among his many achievements, Mr. Coase was awarded the 1991 Nobel Prize in Economics, largely for his inspiring 1937 paper The Nature of the Firm. The Nobel committee applauded the academic for his “discovery and clarification of the significance of transaction costs … for the institutional structure and functioning of the economy.”
Mr. Coase’s enduring legacy may well be that 60 years later, his paper and theories help us understand the Internet’s impact on business, the economy and all our institutions… Mr. Coase wondered why there was no market within the firm. Why is it unprofitable to have each worker, each step in the production process, become an independent buyer and seller? Why doesn’t the draftsperson auction their services to the engineer? Why is it that the engineer does not sell designs to the highest bidder? Mr. Coase’s answer was that marketplace friction prevents this from happening.
Mr. Coase argued that this friction gave rise to transaction costs – or to put it more broadly, collaboration or relationship costs. There are three types of these relationship costs. First are search costs, such as the hunt for appropriate suppliers. Second are contractual costs, including price and contract negotiations. Third are the co-ordination costs of meshing the different products and processes.
The upshot is that most vertically integrated corporations found it cheaper and simpler to perform most functions in-house, rather than incurring the cost, hassle and risk of constant transactions with outside partners…. This is no longer the case. Many behemoths have lost market share to more supple competitors. Digital technologies slash transaction and collaboration costs. Smart companies are making their boundaries porous, using the Internet to harness knowledge, resources and capabilities outside the company. Everywhere, leading firms set a context for innovation and then invite their customers, partners and other third parties to co-create their products and services.
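Coase’s make-or-buy logic, and Tapscott’s point about what the Internet changes, can be captured in a toy comparison. The numbers below are invented purely for illustration:

```python
# Toy Coasean make-or-buy decision (hypothetical numbers, not from the
# article): outsource only when the market price plus the three
# relationship costs undercuts the cost of doing the work in-house.

def outsource(in_house_cost, market_price, search, contracting, coordination):
    return market_price + search + contracting + coordination < in_house_cost

# Pre-Internet: heavy search, contracting and coordination costs keep
# the function inside the firm.
print(outsource(100, 80, search=15, contracting=10, coordination=10))  # False

# Digital technologies slash those costs, and the same transaction
# moves outside the corporate boundary.
print(outsource(100, 80, search=3, contracting=2, coordination=2))     # True
```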
Today’s economic engines are Internet-based clusters of businesses. While each company retains its identity, companies function together, creating more wealth than they could ever hope to create individually. Where corporations were once gigantic, new business ecosystems tend toward the amorphous.
Procter & Gamble now gets 60 per cent of its innovation from outside corporate walls. Boeing has built a massive ecosystem to design and manufacture jumbo jets. China’s motorcycle industry, which consists of dozens of companies collaborating with no single company pulling the strings, now comprises 40 per cent of global motorcycle production.
Looked at one way, Amazon.com is a website with many employees that ships books. Looked at another way, however, Amazon is a vast ecosystem that includes authors, publishers, customers who write reviews for the site, delivery companies like UPS, and tens of thousands of affiliates that market products and arrange fulfilment through the Amazon network. Hundreds of thousands of people are involved in Amazon’s viral marketing network.
This is leading to the biggest change to the corporation in a century and altering how we orchestrate capability to innovate, create goods and services and engage with the world. From now on, the ecosystem itself, not the corporation per se, should serve as the point of departure for every business strategist seeking to understand the new economy – and for every manager, entrepreneur and investor seeking to prosper in it.
Nor does the Internet tonic apply only to corporations. The Web is dropping transaction costs everywhere – enabling networked approaches to almost every institution in society, from government, media, science and health care to our energy grid, transportation systems and institutions for global problem solving.
Governments can change from being vertically integrated, industrial-age bureaucracies to become networks. By releasing their treasures of raw data, governments can now become platforms upon which companies, NGOs, academics, foundations, individuals and other government agencies can collaborate to create public value…”

White House: "We Want Your Input on Building a More Open Government"


Nick Sinai at the White House Blog: “…We are proud of this progress, but recognize that there is always more we can do to build a more efficient, effective, and accountable government.  In that spirit, the Obama Administration has committed to develop a second National Action Plan on Open Government: “NAP 2.0.”
In order to develop a Plan with the most creative and ambitious solutions, we need all-hands-on-deck. That’s why we are asking for your input on what should be in the NAP 2.0:

  1. How can we better encourage and enable the public to participate in government and increase public integrity? For example, in the first National Action Plan, we required Federal enforcement agencies to make publicly available compliance information easily accessible, downloadable and searchable online – helping the public to hold the government and regulated entities accountable.
  • What other kinds of government information should be made more available to help inform decisions in your communities or in your lives?
  • How would you like to be able to interact with Federal agencies making decisions which impact where you live?
  • How can the Federal government better ensure broad feedback and public participation when considering a new policy?
  2. The American people must be able to trust that their Government is doing everything in its power to stop wasteful practices and earn a high return on every tax dollar that is spent.  How can the government better manage public resources?
  • What suggestions do you have to help the government achieve savings while also improving the way that government operates?
  • What suggestions do you have to improve transparency in government spending?
  3. The American people deserve a Government that is responsive to their needs, makes information readily accessible, and leverages Federal resources to help foster innovation in both the public and private sectors.   How can the government more effectively work in collaboration with the public to improve services?
  • What are your suggestions for ways the government can better serve you when you are seeking information or help in trying to receive benefits?
  • In the past few years, the government has promoted the use of “grand challenges,” ambitious yet achievable goals to solve problems of national priority, and incentive prizes, where the government identifies challenging problems and provides prizes and awards to the best solutions submitted by the public.  Are there areas of public services that you think could be especially benefited by a grand challenge or incentive prize?
  • What information or data could the government make more accessible to help you start or improve your business?

Please think about these questions and send your thoughts to opengov@ostp.gov by September 23. We will post a summary of your submissions online in the future.”

Citizen science versus NIMBY?


Ethan Zuckerman’s latest blog: “Safecast is a remarkable project born out of a desire to understand the health and safety implications of the release of radiation from the Fukushima Daiichi nuclear power plant in the wake of the March 11, 2011 earthquake and tsunami. Unsatisfied with limited and questionable information about radiation released by the Japanese government, Joi Ito, Peter, Sean and others worked to design, build and deploy GPS-enabled geiger counters which could be used by concerned citizens throughout Japan to monitor alpha, beta and gamma radiation and understand what parts of Japan have been most affected by the Fukushima disaster.

[Screenshot: Safecast radiation map]
The Safecast project has produced an elegant map that shows how complicated the Fukushima disaster will be for the Japanese government to recover from. While there are predictably elevated levels of radiation immediately around the Fukushima plant and in the 18 mile exclusion zones, there is a “plume” of increased radiation south and west of the reactors. The map is produced from millions of radiation readings collected by volunteers, who generally take readings while driving – Safecast’s bGeigie meter automatically takes readings every few seconds and stores them along with associated GPS coordinates for later upload to the server.
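For readers curious about the mechanics, here is a hypothetical sketch of the kind of logging loop the bGeigie performs: pair each geiger reading with a timestamp and GPS fix, and store it for later upload. The function names and record format are our illustration, not Safecast’s actual firmware or API:

```python
# Hypothetical bGeigie-style logger: the sensor interfaces below are
# stand-ins; a real device reads from hardware rather than constants.
import json
import time

def read_cpm():
    """Stand-in for the geiger counter interface (counts per minute)."""
    return 42

def read_gps():
    """Stand-in for the GPS module: returns (latitude, longitude)."""
    return 35.3606, 140.3044

def log_reading(path="readings.jsonl"):
    lat, lon = read_gps()
    record = {"timestamp": time.time(), "cpm": read_cpm(), "lat": lat, "lon": lon}
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")  # one record per line, uploaded later

# Take a reading every few seconds, as the bGeigie does while driving:
for _ in range(3):
    log_reading()
    time.sleep(5)
```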
This long and thoughtful blog post about the progress of government decontamination efforts, the cost-benefit of those efforts, and the government’s transparency or opacity around cleanup gives a sense for what Safecast is trying to do: provide ways for citizens to check and verify government efforts and understand the complexity of decisions about radiation exposure. This is especially important in Japan, as there’s been widespread frustration over the failures of TEPCO to make progress on cleaning up the reactor site, leading to anger and suspicion about the larger cleanup process.
For me, Safecast raises two interesting questions:
– If you’re not getting trustworthy or sufficient information from your government, can you use crowdsourcing, citizen science or other techniques to generate that data?
– How does collecting data relate to civic engagement? Is it a path towards increased participation as an engaged and effective citizen?
To have some time to reflect on these questions, I decided I wanted to try some of my own radiation monitoring. I borrowed Joi Ito’s bGeigie and set off for my local Spent Nuclear Fuel and Greater-Than-Class C Low Level Radioactive Waste dry cask storage facility…

Projects like Safecast – and the projects I’m exploring this coming year under the heading of citizen infrastructure monitoring – have a challenge. Most participants aren’t going to uncover Ed Snowden-calibre information by driving around with a geiger counter or mapping wells in their communities. Lots of data collected is going to reveal that governments and corporations are doing their jobs, as my data suggests. It’s easy to trace a path between collecting groundbreaking data and getting involved with deeper civic and political issues – will collecting data showing that the local nuclear plant is apparently safe get me more involved with issues of nuclear waste disposal?
It just might. One of the great potentials of citizen science and citizen infrastructure monitoring is the possibility of reducing the exotic to the routine….”

Assessing Zuckerberg’s Idea That Facebook Could Help Citizens Re-Make Their Government


Gregory Ferenstein in TechCrunch: “Mark Zuckerberg has a grand vision that Facebook will help citizens in developing countries decide their own governments. It’s a lofty and partially attainable goal. While Egypt probably won’t let citizens vote for their next president with a Like, it is theoretically possible to use Facebook to crowdsource expertise. Governments around the world are experimenting with radical online direct democracy, but it doesn’t always work out.

Very briefly, Zuckerberg laid out his broad vision for e-government to Wired’s Steven Levy, while defending Internet.org, a new consortium to bring broadband to the developing world.

“People often talk about how big a change social media has been for our culture here in the U.S. But imagine how much bigger a change it will be when a developing country comes online for the first time ever. We use things like Facebook to share news and keep in touch with our friends, but in those countries, they’ll use this for deciding what kind of government they want to have. Getting access to health care information for the first time ever.”

When he references “deciding … government,” Zuckerberg could be talking about voting, sharing ideas, or crafting a constitution. We decided to assess the possibilities of them all….
For citizens in the exciting/terrifying position to construct a brand-new government, American-style democracy is one of many options. Britain, for instance, has a parliamentary system and no written constitution. In other cases, a government may want to heed political scientists’ advice and develop a “consensus democracy,” where more than two political parties are incentivized to work collaboratively with citizens, business, and different branches of government to craft laws.
At least once, choosing a new style of democracy has been attempted through the Internet. After the global financial meltdown wrecked Iceland’s economy, the happy citizens of the grass-covered country decided to redo their government and solicit suggestions from the public (950 Icelanders chosen by lottery and general calls for ideas through social networks). After much press about Iceland’s “crowdsourced” constitution, the effort collapsed when most of the elected leaders rejected it.
Crafting law, especially a constitution, is legally complex; unless there is a systematic way to translate haphazard citizen suggestions into legalese, the results are disastrous.
“Collaborative drafting, at large scale, at low costs, and that is inclusive, is something that we still don’t know how to do,” says Tiago Peixoto, a World Bank Consultant on participatory democracy (and one of our Most Innovative People In Democracy).
Peixoto, who helps the Brazilian government conduct some of the world’s only online policymaking, says he’s optimistic that Facebook could be helpful, but he wouldn’t use it to draft laws just yet.
While technically it is possible for social networks to craft a new government, we just don’t know how to do it very well, and, therefore, leaders are likely to reject the idea. In other words, don’t expect Egypt to decide their future through Facebook likes.”

Weather Channel Now Also Forecasts What You'll Buy


Katie Rosman in the Wall Street Journal: “The Weather Channel knows the chance for rain in St. Louis on Friday, what the heat index could reach in Santa Fe on Saturday and how humid Baltimore may get on Sunday.
It also knows when you’re most likely to buy bug spray.
The enterprise is transforming from a cable network viewers flip to during hurricane season into an operation that forecasts consumer behavior by analyzing when, where and how often people check the weather. Last fall the Weather Channel Cos. renamed itself the Weather Co. to reflect the growth of its digital-data business.

The Atlanta-based company has amassed more than 75 years’ worth of information: temperatures, dew points, cloud-cover percentages and much more, across North America and elsewhere.
The company supplies information for many major smartphone weather apps and has invested in data-crunching algorithms. It uses this analysis to appeal to advertisers who want to fine-tune their pitches to consumers….
Weather Co. researchers are now diving into weather-sentiment analysis—how local weather makes people feel, and then act—in different regions of the country. To cull this data, Mr. Walsh’s weather-analytics team directly polls visitors to the Weather.com website, asking them about their moods and purchases on specific days.
In a series of polls conducted between June 3 and Nov. 4 last year, residents of the Northeast region responded to the question, ‘Yesterday, what was your mood for most of the day?’”
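Stripped to its core, the weather-sentiment analysis the article describes is a correlation exercise: pair a local weather variable with a behavioral signal and measure how strongly they move together. A minimal sketch with invented data (the Weather Co.’s actual models are proprietary):

```python
# Pearson correlation between a weather variable and a purchase signal.
# The data is invented for illustration; a real analysis would use poll
# responses and sales figures matched by region and date.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

humidity = [40, 55, 60, 72, 80, 85]          # daily humidity index
bug_spray_sales = [10, 14, 15, 22, 27, 30]   # units sold on the same days

print(pearson(humidity, bug_spray_sales))    # close to 1.0: strong positive link
```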

Index: The Data Universe


The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on the data universe and was originally published in 2013.

  • How much data exists in the digital universe as of 2012: 2.7 zettabytes*
  • Increase in the quantity of Internet data from 2005 to 2012: +1,696%
  • Percent of the world’s data created in the last two years: 90
  • Number of exabytes (=1 billion gigabytes) created every day in 2012: 2.5; that number doubles every month
  • Percent of the digital universe in 2005 created by the U.S. and western Europe vs. emerging markets: 48 vs. 20
  • Percent of the digital universe in 2012 created by emerging markets: 36
  • Percent of the digital universe in 2020 predicted to be created by China alone: 21
  • How much information in the digital universe is created and consumed by consumers (video, social media, photos, etc.) in 2012: 68%
  • Percent of that consumer-created information for which enterprises have liability or responsibility (copyright, privacy, compliance with regulations, etc.): 80
  • Amount included in the Obama Administration’s 2012 Big Data initiative: over $200 million
  • Amount the Department of Defense is investing annually on Big Data projects as of 2012: over $250 million
  • Data created per day in 2012: 2.5 quintillion bytes
  • How many terabytes* of data collected by the U.S. Library of Congress as of April 2011: 235
  • How many terabytes of data collected by Walmart per hour as of 2012: 2,560, or 2.5 petabytes*
  • Projected growth in global data generated per year, as of 2011: 40%
  • Number of IT jobs created globally by 2015 to support big data: 4.4 million (1.9 million in the U.S.)
  • Potential shortage of data scientists in the U.S. alone predicted for 2018: 140,000-190,000, in addition to 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions
  • Time needed to sequence the complete human genome (analyzing 3 billion base pairs) in 2003: ten years
  • Time needed in 2013: one week
  • The world’s annual effective capacity to exchange information through telecommunication networks in 1986, 2007, and (predicted) 2013: 281 petabytes, 65 exabytes, 667 exabytes
  • Projected amount of digital information created annually that will either live in or pass through the cloud: 1/3
  • Increase in data collection volume year-over-year in 2012: 400%
  • Increase in number of individual data collectors from 2011 to 2012: nearly double (over 300 data collection parties in 2012)

*1 zettabyte = 1 billion terabytes | 1 petabyte = 1,000 terabytes | 1 terabyte = 1,000 gigabytes | 1 gigabyte = 1 billion bytes
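As a sanity check, the footnote’s units tie several of the figures above together; for instance, the 2.5 quintillion bytes created per day is the same quantity as the 2.5 exabytes per day, and Walmart’s 2,560 terabytes per hour rounds to the 2.5 petabytes cited:

```python
# Unit arithmetic for the index above, using the footnote's definitions.
QUINTILLION = 10 ** 18
EXABYTE = 10 ** 18   # bytes
PETABYTE = 10 ** 15  # bytes
TERABYTE = 10 ** 12  # bytes

# 2.5 quintillion bytes/day and 2.5 exabytes/day are the same figure:
assert 2.5 * QUINTILLION / EXABYTE == 2.5

# Walmart's 2,560 terabytes per hour is about 2.5 petabytes:
print(2560 * TERABYTE / PETABYTE)  # 2.56
```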
