What ‘urban physics’ could tell us about how cities work


Ruth Graham at the Boston Globe: “What does a city look like? If you’re walking down the street, perhaps it looks like people and storefronts. Viewed from higher up, patterns begin to emerge: A three-dimensional grid of buildings divided by alleys, streets, and sidewalks, nearly flat in some places and scraping the sky in others. Pull back far enough, and the city starts to look like something else entirely: a cluster of molecules.

At least, that’s what it looks like to Franz-Josef Ulm, an engineering professor at the Massachusetts Institute of Technology. Ulm has built a career as an expert on the properties, patterns, and environmental potential of concrete. Taking a coffee break at MIT’s Stata Center late one afternoon, he and a colleague were looking at a large aerial photograph of a city when they had a “eureka” moment: “Hey, doesn’t that look like a molecular structure?”
With colleagues, Ulm began analyzing cities the way you’d analyze a material, looking at factors such as the arrangement of buildings, each building’s center of mass, and how they’re ordered around each other. They concluded that cities could be grouped into categories: Boston’s structure, for example, looks a lot like an “amorphous liquid.” Seattle is another liquid, and so is Los Angeles. Chicago, which was designed on a grid, looks like glass, he says; New York resembles a highly ordered crystal.
So far Ulm and his fellow researchers have presented their work at conferences, but it has not yet been published in a scientific journal. If the analogy does hold up, Ulm hopes it will give planners a new tool to understand a city’s structure, its energy use, and possibly even its resilience to climate change.
Ulm calls his new work “urban physics,” and it places him among a number of scientists now using the tools of physics to analyze the practically infinite amount of data that cities produce in the 21st century, from population density to the number of patents produced to energy bill charges. Physicist Marta González, Ulm’s colleague at MIT, recently used cellphone data to analyze traffic patterns in Boston with unprecedented complexity, for example. In 2012, a theoretical physicist was named founding director of New York University’s Center for Urban Science and Progress, whose research is devoted to “urban informatics”; one of its first projects is helping to create the country’s first “quantified community” on the West Side of Manhattan.
In Ulm’s case, he and his colleagues have used freely available data, including street layouts and building coordinates, to plot the structures of 12 cities and analogize them to existing complex materials. In physics, an “order parameter” is a number between 0 and 1 that describes how atoms are arranged in relationship to other atoms nearby; Ulm applies this idea to city layouts. Boston, he says, has an “order parameter” of 0.52, equivalent to that of a liquid like water. This means its structure is notably disordered, which may have something to do with how it developed. “Boston has grown organically,” he said. “The city, in the way its buildings are organized today, carries that information from its historical evolution.”…
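The article does not say which order metric Ulm’s group actually computes, but the 0-to-1 idea can be made concrete with a standard bond-orientational order parameter from condensed-matter physics. The sketch below is purely illustrative: the k-fold metric, the function name, and the grid/scatter demo data are all assumptions, not Ulm’s method.

```python
import numpy as np

def order_parameter(points, k=4):
    """Toy bond-orientational order metric in [0, 1] for 2-D coordinates
    (e.g. building centers of mass). For each point, take its k nearest
    neighbors and measure how closely their bearings fit a perfect k-fold
    pattern: psi_k = |mean(exp(i*k*theta))| is 1 for a perfect lattice and
    drops toward 0 for a disordered, "liquid-like" arrangement."""
    pts = np.asarray(points, dtype=float)
    psi = []
    for i in range(len(pts)):
        dists = np.linalg.norm(pts - pts[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]            # skip the point itself
        vecs = pts[nbrs] - pts[i]
        theta = np.arctan2(vecs[:, 1], vecs[:, 0])   # bearing to each neighbor
        psi.append(abs(np.mean(np.exp(1j * k * theta))))
    return float(np.mean(psi))

# A regular grid (a "crystal"-like street plan) scores much higher than the
# same number of randomly scattered points (a "liquid"-like layout).
grid = [(x, y) for x in range(10) for y in range(10)]
scatter = np.random.default_rng(0).uniform(0.0, 10.0, size=(100, 2))
grid_score, scatter_score = order_parameter(grid), order_parameter(scatter)
```

On this toy data the grid scores markedly higher than the scatter, mirroring the crystal-versus-liquid contrast the researchers draw between New York and Boston.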

Americans hate Congress. They will totally teach it a lesson by not voting.


in the Washington Post: “Americans are angry at Congress — more so than basically ever before. So it’s time to throw the bums out, right?
Well, not really. In fact, Americans appear prepared to deal with their historic unhappiness using perhaps the least-productive response: Staying home.
A new study shows that Americans are on track to set a new low for turnout in a midterm election, and a record number of states could set new records for the lowest percentage of eligible citizens casting ballots.
The study, from the Center for the Study of the American Electorate, shows turnout in the 25 states that have held statewide primaries for both parties is down by nearly one-fifth from the last midterm, in 2010. While 18.3 percent of eligible voters cast ballots back then, it has been just 14.8 percent so far this year. Similarly, 15 of the 25 states that have held statewide primaries so far have recorded record-low turnout….
This is all the more depressing when you realize that, less than 50 years ago, primary turnout was twice as high.


Courtesy: Center for the Study of the American Electorate

But, really, this isn’t all that new. As you can see above, turnout has been dropping steadily for years….
More than that, though, the poll reinforces that, no matter how upset people are with Congress, they still aren’t really feeling the need to do much of anything about it. Some might argue that they feel powerless to effect real change, but failure to even vote suggests they’re not really interested in trying — or maybe they’re not really all that mad.”

The People’s Platform


Book Review by Tim Wu in the New York Times: “Astra Taylor is a documentary filmmaker who has described her work as the “steamed broccoli” in our cultural diet. Her last film, “Examined Life,” depicted philosophers walking around and talking about their ideas. She’s the kind of creative person who was supposed to benefit when the Internet revolution collapsed old media hierarchies. But two decades since that revolution began, she’s not impressed: “We are at risk of starving in the midst of plenty,” Taylor writes. “Free culture, like cheap food, incurs hidden costs.” Instead of serving as the great equalizer, the web has created an abhorrent cultural feudalism. The creative masses connect, create and labor, while Google, Facebook and Amazon collect the cash.
Taylor’s thesis is simply stated. The pre-Internet cultural industry, populated mainly by exploitative conglomerates, was far from perfect, but at least the ancien régime felt some need to cultivate cultural institutions, and to pay for talent at all levels. Along came the web, which swept away hierarchies — as well as paychecks, leaving behind creators of all kinds only the chance to be fleetingly “Internet famous.” And anyhow, she says, the web never really threatened to overthrow the old media’s upper echelons, whether defined as superstars, like Beyoncé, big broadcast television shows or Hollywood studios. Instead, it was the cultural industry’s middle classes that have been wiped out and replaced by new cultural plantations ruled over by the West Coast aggregators.
It is hard to know if the title, “The People’s Platform,” is aspirational or sarcastic, since Taylor believes the classless aura of the web masks an unfair power structure. “Open systems can be starkly inegalitarian,” she says, arguing that the web is afflicted by what the feminist scholar Jo Freeman termed a “tyranny of structurelessness.” Because there is supposedly no hierarchy, elites can happily deny their own existence. (“We just run a platform.”) But the effects are real: The web has reduced professional creators to begging for scraps of attention from a spoiled public, and forced creators to be their own brand.

The tech industry might be tempted to dismiss Taylor’s arguments as merely a version of typewriter manufacturers’ complaints circa 1984, but that would be a mistake. “The People’s Platform” should be taken as a challenge by the new media that have long claimed to be improving on the old order. Can they prove they are capable of supporting a sustainable cultural ecosystem, in a way that goes beyond just hosting parties at the Sundance Film Festival?
We see some of this in the tech firms that have begun to pay for original content, as with Netflix’s investments in projects like “Orange Is the New Black.” It’s also worth pointing out that the support of culture is actually pretty cheap. Consider the nonprofit ProPublica, which employs investigative journalists, and has already won two Pulitzers, all on a budget of just over $10 million a year. That kind of money is a rounding error for much of Silicon Valley, where losing billions on bad acquisitions is routinely defended as “strategic.” If Google, Apple, Facebook and Amazon truly believe they’re better than the old guard, let’s see it.”
See: THE PEOPLE’S PLATFORM: Taking Back Power and Culture in the Digital Age. By Astra Taylor. 276 pp. Metropolitan Books/Henry Holt & Company.

Can Experts Solve Poverty?


The #GlobalPOV Project: We all have experts in our lives. Computer experts, plumbing experts, legal experts — you name the problem, and there is someone out there who specializes in addressing that problem. Whether it’s a broken car, a computer glitch, or even a broken heart — call the expert, and they’ll fix us right up.
So who do we call when society is broken? Who do we call when over a billion people live in poverty, unable to meet the basic requirements to sustain their lives? Or when the wealthiest 2% of the world owns 50% of the world’s assets?
We call experts, of course: poverty experts. But — who is a poverty expert, and can experts solve poverty?

The #GlobalPOV Project is a program of the Global Poverty and Practice (GPP) Minor. Based at the Blum Center for Developing Economies, University of California, Berkeley, the GPP Minor creates new ways of thinking about poverty and inequality, and new ways of undertaking poverty action.
Website: http://blumcenter.berkeley.edu/globalpov

The Quiet Movement to Make Government Fail Less Often


in The New York Times: “If you wanted to bestow the grandiose title of “most successful organization in modern history,” you would struggle to find a more obviously worthy nominee than the federal government of the United States.

In its earliest stirrings, it established a lasting and influential democracy. Since then, it has helped defeat totalitarianism (more than once), established the world’s currency of choice, sent men to the moon, built the Internet, nurtured the world’s largest economy, financed medical research that saved millions of lives and welcomed eager immigrants from around the world.

Of course, most Americans don’t think of their government as particularly successful. Only 19 percent say they trust the government to do the right thing most of the time, according to Gallup. Some of this mistrust reflects a healthy skepticism that Americans have always had toward centralized authority. And the disappointing economic growth of recent decades has made Americans less enamored of nearly every national institution.

But much of the mistrust really does reflect the federal government’s frequent failures – and progressives in particular will need to grapple with these failures if they want to persuade Americans to support an active government.

When the federal government is good, it’s very, very good. When it’s bad (or at least deeply inefficient), it’s the norm.

The evidence is abundant. Of the 11 large programs for low- and moderate-income people that have been subject to rigorous, randomized evaluation, only one or two show strong evidence of improving most beneficiaries’ lives. “Less than 1 percent of government spending is backed by even the most basic evidence of cost-effectiveness,” writes Peter Schuck, a Yale law professor, in his new book, “Why Government Fails So Often,” a sweeping history of policy disappointments.

As Mr. Schuck puts it, “the government has largely ignored the ‘moneyball’ revolution in which private-sector decisions are increasingly based on hard data.”

And yet there is some good news in this area, too. The explosion of available data has made evaluating success – in the government and the private sector – easier and less expensive than it used to be. At the same time, a generation of data-savvy policy makers and researchers has entered government and begun pushing it to do better. They have built on earlier efforts by the Bush and Clinton administrations.

The result is a flowering of experiments to figure out what works and what doesn’t.

New York City, Salt Lake City, New York State and Massachusetts have all begun efforts to link program funding to success: The more effective a program is, the more money it and its backers receive. The programs span child care, job training and juvenile recidivism.

The approach is known as “pay for success,” and it’s likely to spread to Cleveland, Denver and California soon. David Cameron’s conservative government in Britain is also using it. The Obama administration likes the idea, and two House members – Todd Young, an Indiana Republican, and John Delaney, a Maryland Democrat – have introduced a modest bill to pay for a version known as “social impact bonds.”

The White House is also pushing for an expansion of randomized controlled trials to evaluate government programs. Such trials, Mr. Schuck notes, are “the gold standard” for any kind of evaluation. Using science as a model, researchers randomly select some people to enroll in a government program and others not to enroll. The researchers then study the outcomes of the two groups….”
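The two-group design described above is easy to sketch in code. The snippet below is a minimal illustration of random assignment and the treatment-versus-control comparison; the outcome function and all names here are hypothetical stand-ins, not any agency’s actual evaluation method.

```python
import random
import statistics

def estimate_effect(people, outcome, seed=42):
    """Minimal sketch of a randomized controlled trial: randomly assign
    half the eligible people to the program (treatment) and half to no
    program (control), then compare mean outcomes across the groups."""
    rng = random.Random(seed)
    pool = list(people)
    rng.shuffle(pool)                      # random assignment
    half = len(pool) // 2
    treatment, control = pool[:half], pool[half:]
    t_mean = statistics.mean(outcome(p, enrolled=True) for p in treatment)
    c_mean = statistics.mean(outcome(p, enrolled=False) for p in control)
    return t_mean - c_mean                 # estimated average effect

# Hypothetical outcome: enrollment lifts everyone's score by 5 points,
# so the estimated effect recovers that number exactly.
effect = estimate_effect(range(100), lambda p, enrolled: 10 + (5 if enrolled else 0))
```

Randomization is what licenses the causal reading: because assignment is independent of everything else about a person, the two groups differ systematically only in enrollment.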

Networks and Hierarchies


on whether political hierarchy in the form of the state has met its match in today’s networked world in the American Interest: “…To all the world’s states, democratic and undemocratic alike, the new informational, commercial, and social networks of the internet age pose a profound challenge, the scale of which is only gradually becoming apparent. First email achieved a dramatic improvement in the ability of ordinary citizens to communicate with one another. Then the internet came to have an even greater impact on the ability of citizens to access information. The emergence of search engines marked a quantum leap in this process. The advent of laptops, smartphones, and other portable devices then emancipated electronic communication from the desktop. With the explosive growth of social networks came another great leap, this time in the ability of citizens to share information and ideas.
It was not immediately obvious how big a challenge all this posed to the established state. There was a great deal of cheerful talk about the ways in which the information technology revolution would promote “smart” or “joined-up” government, enhancing the state’s ability to interact with citizens. However, the efforts of Anonymous, Wikileaks and Edward Snowden to disrupt the system of official secrecy, directed mainly against the U.S. government, have changed everything. In particular, Snowden’s revelations have exposed the extent to which Washington was seeking to establish a parasitical relationship with the key firms that operate the various electronic networks, acquiring not only metadata but sometimes also the actual content of vast numbers of phone calls and messages. Techniques of big-data mining, developed initially for commercial purposes, have been adapted to the needs of the National Security Agency.
The most recent, and perhaps most important, network challenge to hierarchy comes with the advent of virtual currencies and payment systems like Bitcoin. Since ancient times, states have reaped considerable benefits from monopolizing or at least regulating the money created within their borders. It remains to be seen how big a challenge Bitcoin poses to the system of national fiat currencies that has evolved since the 1970s and, in particular, how big a challenge it poses to the “exorbitant privilege” enjoyed by the United States as the issuer of the world’s dominant reserve (and transaction) currency. But it would be unwise to assume, as some do, that it poses no challenge at all….”

U.S. Secretary of Commerce Penny Pritzker Announces Expansion and Enhancement of Commerce Data Programs


Press Release from the U.S. Secretary of Commerce: Department will hire first-ever Chief Data Officer

As “America’s Data Agency,” the Department of Commerce is prepared and well-positioned to foster the next phase in the open data revolution. In line with President Obama’s Year of Action, U.S. Secretary of Commerce Penny Pritzker today announced a series of steps taken to enhance and expand the data programs at the Department.
“Data is a key pillar of the Department’s ‘Open for Business Agenda,’ and for the first time, we have made it a department-wide strategic priority,” said Secretary of Commerce Penny Pritzker. “No other department can rival the reach, depth and breadth of the Department of Commerce’s data programs. The Department of Commerce is working to unleash more of its data to strengthen the nation’s economic growth; make its data easier to access, understand, and use; and maximize the return of data investments for businesses, entrepreneurs, government, taxpayers, and communities.”
Secretary Pritzker made a number of major announcements today as a special guest speaker at the Environmental Systems Research Institute’s (Esri) User Conference in San Diego, California. She discussed the power and potential of open data, recognizing that data not only enable start-ups and entrepreneurs, move markets, and empower companies large and small, but also touch the lives of Americans every day.
In her remarks, Secretary Pritzker outlined new ways the Department of Commerce is working to unlock the potential of even more open data to make government smarter, including the following:
Chief Data Officer
Today, Secretary Pritzker announced the Commerce Department will hire its first-ever Chief Data Officer. This leader will be responsible for developing and implementing a vision for the future of the diverse data resources at Commerce.
The new Chief Data Officer will pull together a platform for all data sets; instigate and oversee improvements in data collection and dissemination; and ensure that data programs are coordinated, comprehensive, and strategic.
The Chief Data Officer will hold the key to unlocking more government data to help support a data-enabled Department and economy.
Trade Developer Portal
The International Trade Administration has launched its “Developer Portal,” an online toolkit to put diverse sets of trade and investment data in a single place, making it easier for the business community to use and better tap into the 95 percent of American customers who live overseas.
In creating this portal, the Commerce Department is making its data public to software developers, giving them access to authoritative information on U.S. exports and international trade to help U.S. businesses export and expand their operations in overseas markets. The developer community will be able to integrate the data into applications and mashups to help U.S. business owners compete abroad while also creating more jobs here at home.
Data Advisory Council
Open data requires open dialogue. To facilitate this, the Commerce Department is creating a data advisory council, composed of 15 private-sector leaders who will advise the Department on the best use of government data.
This new advisory council will help Commerce maximize the value of its data by:

  • discovering how to deliver data in more usable, timely, and accessible ways;
  • improving how data is utilized and shared to make businesses and governments more responsive, cost-effective, and efficient;
  • better anticipating customers’ needs; and
  • collaborating with the private sector to develop new data products and services.

The council’s primary focus will be on the accessibility and usability of Commerce data, as well as the transformation of the Department’s supporting infrastructure and procedures for managing data.
These data leaders will represent a broad range of business interests—reflecting the wide range of scientific, statistical, and other data that the Department of Commerce produces. Members will serve two-year terms and will meet about four times a year. The advisory council will be housed within the Economics and Statistics Administration.”
Commerce data inform decisions that help make government smarter, keep businesses more competitive and better inform citizens about their own communities – with the potential to guide up to $3.3 trillion in investments in the United States each year.

Do We Choose Our Friends Because They Share Our Genes?


Rob Stein at NPR: “People often talk about how their friends feel like family. Well, there’s some new research out that suggests there’s more to that than just a feeling. People appear to be more like their friends genetically than they are to strangers, the research found.
“The striking thing here is that friends are actually significantly more similar to one another than we were expecting,” says James Fowler, a professor of medical genetics at the University of California, San Diego, who conducted the study with Nicholas A. Christakis, a social scientist at Yale University.
In fact, the study in Monday’s issue of the Proceedings of the National Academy of Sciences found that friends are as genetically similar as fourth cousins.
“It’s as if they shared a great-great-great-grandparent in common,” Fowler told Shots.
Some of the genes that friends were most likely to have in common involve smell. “We tend to smell things the same way that our friends do,” Fowler says. The study involved nearly 2,000 adults.
This suggests that as humans evolved, the ability to tolerate and be drawn to certain smells may have influenced where people hung out. Today we might call this the Starbucks effect.
“You may really love the smell of coffee. And you’re drawn to a place where other people have been drawn to who also love the smell of coffee,” Fowler says. “And so that might be the opportunity space for you to make friends. You’re all there together because you love coffee and you make friends because you all love coffee.”…”

Selected Readings on Crowdsourcing Expertise


The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of crowdsourcing was originally published in 2014.

Crowdsourcing enables leaders and citizens to work together to solve public problems in new and innovative ways. New tools and platforms enable citizens with differing levels of knowledge, expertise, experience and abilities to collaborate and solve problems together. Identifying experts (individuals with specialized skills, knowledge or abilities regarding a specific topic) and incentivizing them to contribute information, knowledge or experience toward a shared goal can make crowdsourced problem solving more efficient and effective.

Annotated Selected Reading List (in alphabetical order)

Börner, Katy, Michael Conlon, Jon Corson-Rikert, and Ying Ding. “VIVO: A Semantic Approach to Scholarly Networking and Discovery.” Synthesis Lectures on the Semantic Web: Theory and Technology 2, no. 1 (October 17, 2012): 1–178. http://bit.ly/17huggT.

  • This e-book “provides an introduction to VIVO…a tool for representing information about research and researchers — their scholarly works, research interests, and organizational relationships.”
  • VIVO is a response to the fact that, “Information for scholars — and about scholarly activity — has not kept pace with the increasing demands and expectations. Information remains siloed in legacy systems and behind various access controls that must be licensed or otherwise negotiated before access. Information representation is in its infancy. The raw material of scholarship — the data and information regarding previous work — is not available in common formats with common semantics.”
  • Providing access to structured information on the work and experience of a diversity of scholars enables improved expert finding — “identifying and engaging experts whose scholarly work is of value to one’s own.” To find experts, one needs rich data regarding one’s own work and the work of potential related experts. The authors argue that expert finding is of increasing importance since “[m]ulti-disciplinary and inter-disciplinary investigation is increasingly required to address complex problems.”

Bozzon, Alessandro, Marco Brambilla, Stefano Ceri, Matteo Silvestri, and Giuliano Vesci. “Choosing the Right Crowd: Expert Finding in Social Networks.” In Proceedings of the 16th International Conference on Extending Database Technology, 637–648. EDBT  ’13. New York, NY, USA: ACM, 2013. http://bit.ly/18QbtY5.

  • This paper explores the challenge of selecting experts within the population of social networks by considering the following problem: “given an expertise need (expressed for instance as a natural language query) and a set of social network members, who are the most knowledgeable people for addressing that need?”
  • The authors come to the following conclusions:
    • “profile information is generally less effective than information about resources that they directly create, own or annotate;
    • resources which are produced by others (resources appearing on the person’s Facebook wall or produced by people that she follows on Twitter) help increasing the assessment precision;
    • Twitter appears the most effective social network for expertise matching, as it very frequently outperforms all other social networks (either combined or alone);
    • Twitter appears as well very effective for matching expertise in domains such as computer engineering, science, sport, and technology & games, but Facebook is also very effective in fields such as locations, music, sport, and movies & tv;
    • surprisingly, LinkedIn appears less effective than other social networks in all domains (including computer science) and overall.”

Brabham, Daren C. “The Myth of Amateur Crowds.” Information, Communication & Society 15, no. 3 (2012): 394–410. http://bit.ly/1hdnGJV.

  • Unlike most of the related literature, this paper focuses on bringing attention to the expertise already being tapped by crowdsourcing efforts rather than determining ways to identify more dormant expertise to improve the results of crowdsourcing.
  • Brabham comes to two central conclusions: “(1) crowdsourcing is discussed in the popular press as a process driven by amateurs and hobbyists, yet empirical research on crowdsourcing indicates that crowds are largely self-selected professionals and experts who opt-in to crowdsourcing arrangements; and (2) the myth of the amateur in crowdsourcing ventures works to label crowds as mere hobbyists who see crowdsourcing ventures as opportunities for creative expression, as entertainment, or as opportunities to pass the time when bored. This amateur/hobbyist label then undermines the fact that large amounts of real work and expert knowledge are exerted by crowds for relatively little reward and to serve the profit motives of companies.”

Dutton, William H. Networking Distributed Public Expertise: Strategies for Citizen Sourcing Advice to Government. One of a Series of Occasional Papers in Science and Technology Policy, Science and Technology Policy Institute, Institute for Defense Analyses, February 23, 2011. http://bit.ly/1c1bpEB.

  • In this paper, a case is made for more structured and well-managed crowdsourcing efforts within government. Specifically, the paper “explains how collaborative networking can be used to harness the distributed expertise of citizens, as distinguished from citizen consultation, which seeks to engage citizens — each on an equal footing.” Instead of looking for answers from an undefined crowd, Dutton proposes “networking the public as advisors” by seeking to “involve experts on particular public issues and problems distributed anywhere in the world.”
  • Dutton argues that expert-based crowdsourcing can be successful in government for a number of reasons:
    • Direct communication with a diversity of independent experts
    • The convening power of government
    • Compatibility with open government and open innovation
    • Synergy with citizen consultation
    • Building on experience with paid consultants
    • Speed and urgency
    • Centrality of documents to policy and practice.
  • He also proposes a nine-step process for government to foster bottom-up collaboration networks:
    • Do not reinvent the technology
    • Focus on activities, not the tools
    • Start small, but capable of scaling up
    • Modularize
    • Be open and flexible in finding and going to communities of experts
    • Do not concentrate on one approach to all problems
    • Cultivate the bottom-up development of multiple projects
    • Experience networking and collaborating — be a networked individual
    • Capture, reward, and publicize success.

Goel, Gagan, Afshin Nikzad and Adish Singla. “Matching Workers with Tasks: Incentives in Heterogeneous Crowdsourcing Markets.” Under review by the International World Wide Web Conference (WWW). 2014. http://bit.ly/1qHBkdf

  • Combining the notions of crowdsourcing expertise and crowdsourcing tasks, this paper focuses on the challenge within platforms like Mechanical Turk related to intelligently matching tasks to workers.
  • The authors’ call for more strategic assignment of tasks in crowdsourcing markets is based on the understanding that “each worker has certain expertise and interests which define the set of tasks she can and is willing to do.”
    • Focusing on developing meaningful incentives based on varying levels of expertise, the authors sought to create a mechanism that “i) is incentive compatible in the sense that it is truthful for agents to report their true cost, ii) picks a set of workers and assigns them to the tasks they are eligible for in order to maximize the utility of the requester, iii) makes sure total payments made to the workers doesn’t exceed the budget of the requester.”

Gubanov, D., N. Korgin, D. Novikov and A. Kalkov. E-Expertise: Modern Collective Intelligence. Springer, Studies in Computational Intelligence 558, 2014. http://bit.ly/U1sxX7

  • In this book, the authors focus on “organization and mechanisms of expert decision-making support using modern information and communication technologies, as well as information analysis and collective intelligence technologies (electronic expertise or simply e-expertise).”
  • The book, which “addresses a wide range of readers interested in management, decision-making and expert activity in political, economic, social and industrial spheres,” is broken into five chapters:
    • Chapter 1 (E-Expertise) discusses the role of e-expertise in decision-making processes. The procedures of e-expertise are classified, their benefits and shortcomings are identified, and the efficiency conditions are considered.
    • Chapter 2 (Expert Technologies and Principles) provides a comprehensive overview of modern expert technologies. A special emphasis is placed on the specifics of e-expertise. Moreover, the authors study the feasibility and reasonability of employing well-known methods and approaches in e-expertise.
    • Chapter 3 (E-Expertise: Organization and Technologies) describes some examples of up-to-date technologies to perform e-expertise.
    • Chapter 4 (Trust Networks and Competence Networks) deals with the problems of expert finding and grouping by information and communication technologies.
    • Chapter 5 (Active Expertise) treats the problem of expertise stability against any strategic manipulation by experts or coordinators pursuing individual goals.

Holst, Cathrine. “Expertise and Democracy.” ARENA Report No 1/14, Center for European Studies, University of Oslo. http://bit.ly/1nm3rh4

  • This report contains a set of 16 papers focused on the concept of “epistocracy,” meaning the “rule of knowers.” The papers inquire into the role of knowledge and expertise in modern democracies and especially in the European Union (EU). Major themes are: expert-rule and democratic legitimacy; the role of knowledge and expertise in EU governance; and the European Commission’s use of expertise.
    • Expert-rule and democratic legitimacy
      • Papers within this theme concentrate on issues such as the “implications of modern democracies’ knowledge and expertise dependence for political and democratic theory.” Topics include the accountability of experts, the legitimacy of expert arrangements within democracies, the role of evidence in policy-making, how expertise can be problematic in democratic contexts, and “ethical expertise” and its place in epistemic democracies.
    • The role of knowledge and expertise in EU governance
      • Papers within this theme concentrate on “general trends and developments in the EU with regard to the role of expertise and experts in political decision-making, the implications for the EU’s democratic legitimacy, and analytical strategies for studying expertise and democratic legitimacy in an EU context.”
    • The European Commission’s use of expertise
      • Papers within this theme concentrate on how the European Commission uses expertise and in particular the European Commission’s “expert group system.” Topics include the European Citizens’ Initiative, analytic-deliberative processes in EU food safety, the operation of EU environmental agencies, and the autonomy of various EU agencies.

King, Andrew and Karim R. Lakhani. “Using Open Innovation to Identify the Best Ideas.” MIT Sloan Management Review, September 11, 2013. http://bit.ly/HjVOpi.

  • In this paper, King and Lakhani examine different methods for opening innovation, where, “[i]nstead of doing everything in-house, companies can tap into the ideas cloud of external expertise to develop new products and services.”
  • The three types of open innovation discussed are: opening the idea-creation process (competitions in which prizes are offered and designers bid with possible solutions); opening the idea-selection process (“approval contests” in which outsiders vote to determine which entries should be pursued); and opening both idea generation and selection, an option used especially by organizations focused on quickly changing needs.

Long, Chengjiang, Gang Hua and Ashish Kapoor. “Active Visual Recognition with Expertise Estimation in Crowdsourcing.” 2013 IEEE International Conference on Computer Vision, December 2013. http://bit.ly/1lRWFur.

  • This paper is focused on improving the crowdsourced labeling of visual datasets from platforms like Mechanical Turk. The authors note that, “Although it is cheap to obtain large quantity of labels through crowdsourcing, it has been well known that the collected labels could be very noisy. So it is desirable to model the expertise level of the labelers to ensure the quality of the labels. The higher the expertise level a labeler is at, the lower the label noises he/she will produce.”
  • Based on the need for identifying expert labelers upfront, the authors developed an “active classifier learning system which determines which users to label which unlabeled examples” from collected visual datasets.
  • The researchers’ experiments in identifying expert visual dataset labelers led to findings demonstrating that the “active selection” of expert labelers is beneficial in cutting through the noise of crowdsourcing platforms.

Noveck, Beth Simone. “’Peer to Patent’: Collective Intelligence, Open Review, and Patent Reform.” Harvard Journal of Law & Technology 20, no. 1 (Fall 2006): 123–162. http://bit.ly/HegzTT.

  • This law review article introduces the idea of crowdsourcing expertise to mitigate the challenge of patent processing. Noveck argues that, “access to information is the crux of the patent quality problem. Patent examiners currently make decisions about the grant of a patent that will shape an industry for a twenty-year period on the basis of a limited subset of available information. Examiners may neither consult the public, talk to experts, nor, in many cases, even use the Internet.”
  • Peer-to-Patent, which launched three years after this article, is based on the idea that, “The new generation of social software might not only make it easier to find friends but also to find expertise that can be applied to legal and policy decision-making. This way, we can improve upon the Constitutional promise to promote the progress of science and the useful arts in our democracy by ensuring that only worthy ideas receive that ‘odious monopoly’ of which Thomas Jefferson complained.”

Ober, Josiah. “Democracy’s Wisdom: An Aristotelian Middle Way for Collective Judgment.” American Political Science Review 107, no. 1 (2013): 104–122. http://bit.ly/1cgf857.

  • In this paper, Ober argues that, “A satisfactory model of decision-making in an epistemic democracy must respect democratic values, while advancing citizens’ interests, by taking account of relevant knowledge about the world.”
  • Ober describes an approach to decision-making that aggregates expertise across multiple domains. This “Relevant Expertise Aggregation (REA) enables a body of minimally competent voters to make superior choices among multiple options, on matters of common interest.”

Sims, Max H., Jeffrey Bigham, Henry Kautz and Marc W. Halterman. “Crowdsourcing medical expertise in near real time.” Journal of Hospital Medicine 9, no. 7 (July 2014). http://bit.ly/1kAKvq7.

  • In this article, the authors discuss the development of a mobile application called DocCHIRP, which was created because, “although the Internet creates unprecedented access to information, gaps in the medical literature and inefficient searches often leave healthcare providers’ questions unanswered.”
  • The DocCHIRP pilot project used a “system of point-to-multipoint push notifications designed to help providers problem solve by crowdsourcing from their peers.”
  • Healthcare providers (HCPs) sought to gain intelligence from the crowd, which included 85 registered users, on questions related to medication, complex medical decision making, standard of care, administrative issues, testing and referrals.
  • The authors believe that, “if future iterations of the mobile crowdsourcing applications can address…adoption barriers and support the organic growth of the crowd of HCPs,” then “the approach could have a positive and transformative effect on how providers acquire relevant knowledge and care for patients.”

Spina, Alessandro. “Scientific Expertise and Open Government in the Digital Era: Some Reflections on EFSA and Other EU Agencies.” in Foundations of EU Food Law and Policy, eds. A. Alemmano and S. Gabbi. Ashgate, 2014. http://bit.ly/1k2EwdD.

  • In this paper, Spina “presents some reflections on how the collaborative and crowdsourcing practices of Open Government could be integrated in the activities of EFSA [European Food Safety Authority] and other EU agencies,” with a particular focus on “highlighting the benefits of the Open Government paradigm for expert regulatory bodies in the EU.”
  • Spina argues that the “crowdsourcing of expertise and the reconfiguration of the information flows between European agencies and the public could represent a concrete possibility of modernising the role of agencies with a new model that has a low financial burden and an almost immediate effect on the legal governance of agencies.”
  • He concludes that, “It is becoming evident that in order to guarantee that the best scientific expertise is provided to EU institutions and citizens, EFSA should strive to use the best organisational models to source science and expertise.”

Online Tools Every Community Should Use


From NationSwell: “Larger cities like Chicago, San Francisco and New York continue to innovate civic technology and bridge the divide between citizens and government, while this progress is leaving small communities behind.
Without digital tools, staff or infrastructure in place to bring basic services online, small local governments and their citizens are suffering from a digital divide. But one Silicon Valley mind is determined to break that barrier and help smaller cities understand how they can join the digital movement…any civic technology should include the following eight tools:
Bullets: Crime-related data that give residents a sense of how safety is handled in the city.
Examples: CrimeAround.Us, Crime in Chicago, Oakland Crimespotting
Bills: Providing citizens with more transparency around legislative data.
Examples: OpenGov’s AmericaDecoded, MySociety’s SayIt, Councilmatic
Budget: Making public finances and city spending available online.
Examples: OpenGov.com, OpenSpending, Look at Cook
Buses: Transportation tools to help residents with schedules, planning, etc.
Examples: OpenTripPlanner, OneBusAway
Data: Open, organized, municipal information.
Examples: Socrata, NuData, CKAN, OpenDataCatalog, Junar
411: An online information service that plays the same role as the 411 phone line.
Examples: CityAnswers, MindMixer, OSQA
311: Non-emergency online assistance including reporting things like road repairs.
Examples: SeeClickFix, PublicStuff, Connected Bits, Service Tracker, Open311 Mobile
211: A social services hotline for services including health, job training and housing.
Examples: Aunt Bertha, Purple Binder, Connect Chicago
“The opportunity is that we have the chance to take all of these components that are being built as open-source tools and turn them into companies that offer them to cities as hosted platforms,” Nemani told Next City. “Even a 10-person shop can put in a credit card number and pay a hundred dollars a month for one of these tools.”
While Nemani admits each city will be different — some places are too small for transportation components — working towards a template is critical to make civic technology accessible for everyone. But by focusing on these eight tools, any town is off to a great start….”