The Power to Decide


Special Report by Antonio Regalado in MIT Technology Review: “Back in 1956, an engineer and a mathematician, William Fair and Earl Isaac, pooled $800 to start a company. Their idea: a score to handicap whether a borrower would repay a loan.
It was all done with pen and paper. Income, gender, and occupation produced numbers that amounted to a prediction about a person’s behavior. By the 1980s the three-digit scores were calculated on computers and instead took account of a person’s actual credit history. Today, Fair Isaac Corp., or FICO, generates about 10 billion credit scores annually, calculating them 50 times a year for many Americans.
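Fair and Isaac’s pen-and-paper method was essentially an additive scorecard: each attribute contributes points toward a total. A minimal sketch of that idea in Python, with entirely made-up attribute names, weights, and base value (nothing here reflects FICO’s actual model):

```python
# Illustrative scorecard only: attributes and weights are invented,
# not FICO's. Each attribute adds or subtracts points from a base.
WEIGHTS = {
    "late_payments": -45,    # each late payment lowers the score
    "utilization": -150,     # fraction of available credit in use
    "years_history": 8,      # longer credit history raises the score
}
BASE = 650

def score(late_payments, utilization, years_history):
    s = BASE
    s += WEIGHTS["late_payments"] * late_payments
    s += WEIGHTS["utilization"] * utilization
    s += WEIGHTS["years_history"] * years_history
    return max(300, min(850, round(s)))  # clamp to the familiar range

print(score(0, 0.1, 10))   # a low-risk borrower
print(score(3, 0.9, 2))    # a higher-risk borrower
```

The point of such a model is not any individual weight but that the whole decision reduces to arithmetic that a clerk, and later a computer, can apply uniformly.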
This machinery hums in the background of our financial lives, so it’s easy to forget that the choice of whether to lend used to be made by a bank manager who knew a man by his handshake. Fair and Isaac understood that all this could change, and that their company didn’t merely sell numbers. “We sell a radically different way of making decisions that flies in the face of tradition,” Fair once said.
This anecdote suggests a way of understanding the era of “big data”—terabytes of information from sensors or social networks, new computer architectures, and clever software. But even supercharged data needs a job to do, and that job is always about a decision.
In this business report, MIT Technology Review explores a big question: how are data and the analytical tools to manipulate it changing decision making today? On Nasdaq, trading bots exchange a billion shares a day. Online, advertisers bid on hundreds of thousands of keywords a minute, in deals greased by heuristic solutions and optimization models rather than two-martini lunches. The number of variables and the speed and volume of transactions are just too much for human decision makers.
When there’s a person in the loop, technology takes a softer approach (see “Software That Augments Human Thinking”). Think of recommendation engines on the Web that suggest products to buy or friends to catch up with. This works because Internet companies maintain statistical models of each of us, our likes and habits, and use them to decide what we see. In this report, we check in with LinkedIn, which maintains the world’s largest database of résumés—more than 200 million of them. One of its newest offerings is University Pages, which crunches résumé data to offer students predictions about where they’ll end up working depending on what college they go to (see “LinkedIn Offers College Choices by the Numbers”).
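The statistical models behind such recommendation engines can be sketched very simply. The toy example below assumes a basic user-based collaborative-filtering approach with cosine similarity; the users, items, and `recommend` helper are hypothetical illustrations, not LinkedIn’s actual system:

```python
import math

# Toy user-item "likes" matrix: 1 means the user liked the item.
ratings = {
    "alice": {"python_book": 1, "stats_book": 1, "ml_course": 1},
    "bob":   {"python_book": 1, "ml_course": 1},
    "carol": {"stats_book": 1, "novel": 1},
}

def cosine(u, v):
    """Cosine similarity between two sparse preference vectors."""
    common = set(u) & set(v)
    dot = sum(u[i] * v[i] for i in common)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

def recommend(user):
    """Suggest items liked by the most similar other users."""
    me = ratings[user]
    scores = {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(me, theirs)
        if sim <= 0:
            continue
        for item in theirs:
            if item not in me:
                scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("bob"))  # → ['stats_book']
```

Production systems differ enormously in scale and sophistication, but the underlying move is the same: model each person’s tastes as a vector and let similarity decide what they see.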
These smart systems, and their impact, are prosaic next to what’s planned. Take IBM. The company is pouring $1 billion into its Watson computer system, the one that answered questions correctly on the game show Jeopardy! IBM now imagines computers that can carry on intelligent phone calls with customers, or provide expert recommendations after digesting doctors’ notes. IBM wants to provide “cognitive services”—computers that think, or seem to (see “Facing Doubters, IBM Expands Plans for Watson”).
Andrew Jennings, chief analytics officer for FICO, says automating human decisions is only half the story. Credit scores had another major impact. They gave lenders a new way to measure the state of their portfolios—and to adjust them by balancing riskier loan recipients with safer ones. Now, as other industries get exposed to predictive data, their approach to business strategy is changing, too. In this report, we look at one technique that’s spreading on the Web, called A/B testing. It’s a simple tactic—put up two versions of a Web page and see which one performs better (see “Seeking Edge, Websites Turn to Experiments” and “Startups Embrace a Way to Fail Fast”).
Until recently, such optimization was practiced only by the largest Internet companies. Now, nearly any website can do it. Jennings calls this phenomenon “systematic experimentation” and says it will be a feature of the smartest companies. They will have teams constantly probing the world, trying to learn its shifting rules and deciding on strategies to adapt. “Winners and losers in analytic battles will not be determined simply by which organization has access to more data or which organization has more money,” Jennings has said.
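The mechanics of deciding an A/B test come down to straightforward statistics. A minimal sketch, assuming a standard two-proportion z-test on conversion counts (the function name and the numbers are illustrative, not drawn from any company described here):

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does version B convert at a
    significantly different rate from version A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# 10,000 visitors per variant; B converts 2.6% vs. A's 2.0%.
p_a, p_b, z, p = ab_test(200, 10_000, 260, 10_000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.4f}")
```

With a p-value well below 0.05, a site in this position would conclude that version B genuinely performs better, which is the whole loop of “systematic experimentation”: ship both, measure, keep the winner.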

Of course, there’s danger in letting the data decide too much. In this report, Duncan Watts, a Microsoft researcher specializing in social networks, outlines an approach to decision making that avoids the dangers of gut instinct as well as the pitfalls of slavishly obeying data. In short, Watts argues, businesses need to adopt the scientific method (see “Scientific Thinking in Business”).
To do that, they have been hiring a highly trained breed of business skeptics called data scientists. These are the people who create the databases, build the models, reveal the trends, and, increasingly, author the products. And their influence is growing in business. This could be why data science has been called “the sexiest job of the 21st century.” It’s not because mathematics or spreadsheets are particularly attractive. It’s because making decisions is powerful…”

Opening up open data: An interview with Tim O’Reilly


McKinsey: “The tech entrepreneur, author, and investor looks at how open data is becoming a critical tool for business and government, as well as what needs to be done for it to be more effective.

We’re increasingly living in a world of black boxes. We don’t understand the way things work. And open-source software and open data are critical tools. We see this in the field of computer security. People say, “Well, we have to keep this secret.” Well, it turns out that the strongest security protocols are those that are secure even when people know how they work.

It seems to me that almost every great advance is a platform advance. When we have common standards, so much more happens.
And you think about the standardization of railroad gauges, the standardization of communications protocols. Think about the standardization of roads, how fundamental those are to our society. And that’s actually kind of a bridge to my work on open government, because I’ve been thinking a lot about the notion of government as a platform.

We should define a little bit what we mean by “open,” because there’s open as in it’s open source. Anybody can take it and reuse it in whatever way they want. And I’m not sure that’s always necessary. There’s a pragmatic open and there’s an ideological open. And the pragmatic open is that it’s available. It’s available in a timely way, in a nonpreferential way, so that some people don’t get better access than others.
And if you look at so many of our apps now on the web, because they are ad-supported and free, we get a lot of the benefits of open. When the cost is low enough, it does in fact create many of the same conditions as a commons. That being said, that requires great restraint, as I said earlier, on the part of companies, because it becomes easy for them to say, “Well, actually we just need to take a little bit more of the value for ourselves. And oh, we just need a bit more of that.” And before long, it really isn’t open at all.

Eric Ries, of Lean Startup fame, talks about a start-up as a machine for learning under conditions of extreme uncertainty.
He says it doesn’t have to do with being a small company, or with being new. It’s just that whenever you’re trying to do something new, where you don’t know the answers, you have to experiment. You have to have a mechanism for measuring. You have to have mechanisms for changing what you do based on the response to that measurement…
That’s one of the biggest problems, I think, in our government today, that we put out programs. Somebody has a theory about what’s going to work and what the benefit will be. We don’t measure it. We don’t actually see if it did what we thought it was going to do. And we keep doing it. And then it doesn’t work, so we do something else. And then we layer on program after program that doesn’t actually meet its objectives. And if we actually brought in the mind-set that said, “No, actually we’re going to figure out if we actually accomplish what we set out to accomplish; and if we don’t, we’re going to change it,” that would be huge.”

Needed: A New Generation of Game Changers to Solve Public Problems


Beth Noveck: “In order to change the way we govern, it is important to train and nurture a new generation of problem solvers who possess the multidisciplinary skills to become effective agents of change. That’s why we at the GovLab have launched The GovLab Academy with the support of the Knight Foundation.
In an effort to help people in their own communities become more effective at developing and implementing creative solutions to compelling challenges, The GovLab Academy is offering two new training programs:
1) An online platform with an unbundled and evolving set of topics, modules and instructors on innovations in governance, including themes such as big and open data and crowdsourcing and forthcoming topics on behavioral economics, prizes and challenges, open contracting and performance management for governance;
2) Gov 3.0: A curated and sequenced, 14-week mentoring and training program.
While the online platform is always freely available, Gov 3.0 begins on January 29, 2014, and we invite you to participate. Please forward this email to your networks and help us spread the word about the opportunity.
Please consider applying (individuals or teams may apply) if you are:

  • an expert in communications, public policy, law, computer science, engineering, business or design who wants to expand your ability to bring about social change;

  • a public servant who wants to bring innovation to your job;

  • someone with an important idea for positive change but who lacks key skills or resources to realize the vision;

  • interested in joining a network of like-minded, purpose-driven individuals across the country; or

  • someone who is passionate about using technology to solve public problems.

The program includes live instruction and conversation every Wednesday from 5:00–6:30 PM EST for 14 weeks, starting Jan 29, 2014. You will be able to participate remotely via Google Hangout.

Gov 3.0 will allow you to apply evolving technology to the design and implementation of effective solutions to public interest challenges. It will give you an overview of the most current approaches to smarter governance and help you improve your skills in collaboration, communication, and developing and presenting innovative ideas.

Over 14 weeks, you will develop a project and a plan for its implementation, including a long and short description, a presentation deck, a persuasive video and a project blog. Last term’s projects covered such diverse issues as post-Fukushima food safety, science literacy for high schoolers and prison reform for the elderly. In every case, the goal was to identify realistic strategies for making a difference quickly.  You can read the entire Gov 3.0 syllabus here.

The program will include national experts and instructors in technology and governance both as guests and as mentors to help you design your project. Last term’s mentors included current and former officials from the White House and various state, local and international governments, academics from a variety of fields, and prominent philanthropists.

People who complete the program will have the opportunity to apply for a special fellowship to pursue their projects further.

Previously taught only on campus, Gov 3.0 is now being offered in beta as an online program. This is not a MOOC. It is a mentoring-intensive coaching experience. To maximize the quality of the experience, enrollment is limited.

Please submit your application by January 22, 2014. Accepted applicants (individuals and teams) will be notified on January 24, 2014. We hope to expand the program in the future so please use the same form to let us know if you would like to be kept informed about future opportunities.”

The Eight Key Issues of Digital Government


Andrea Di Maio (Gartner): “…Now, to set the record straight, I do believe digital government is profoundly different from e-government as well as from government 2.0 (although in some jurisdictions the latter term still looks more relevant than “digital”). While there are many differences as far as technologies and what they make possible, political will, and evolving citizen demand, my contention is that the single most fundamental difference is in the relevance of data and how new and unforeseen uses of data can truly transform the way governments deliver their services and perform their operations.
This is not at all just about government as a platform or open government, where government is primarily a provider of data that constituents – be they citizens, businesses or intermediaries – use and mash up in new ways. It is also about governments themselves inventing new ways to use their own as well as constituents’ data. It is only by striking the right balance between being a data provider and being a data broker and consumer that governments will find the right path to being truly digital.
During the Gartner Symposia I attended last fall, I had numerous interesting conversations with people who are exploring very innovative ways of using their own data, such as:

  • tax authorities contemplating using up-to-date financial information about taxpayers to proactively suggest investments that may provide tax breaks;
  • education institutions extending data about student location beyond its original purpose (giving parents information about students’ whereabouts) to provide teachers with new tools to understand behavioral patterns and relate them to more personalized learning;
  • immigration authorities leveraging data from video analysis, whose role is to flag suspicious immigrants for secondary inspection, to inform public safety authorities or the hospitality sector about specific issues and opportunities with tourists.

In the second half of 2013, Gartner government analysts focused on distilling the fundamental components of a digital government initiative, in order to be able to shape our research and advice in ways that address the most important issues that clients face. The new government research agenda has just been published (see Agenda Overview for Government, 2014); it identifies eight key issues, grouped in three distinct areas, that need to be addressed to successfully transform into a digital government organization.
Engaging Citizens

  • Service Delivery Innovation: How will governments use technology to support innovative services that produce better results for society?
  • Open Government: How will governments create and sustain a digital ecosystem that citizens can trust and want to participate in?

Connecting Agencies

  • New Digital Business Models: What data-driven business models will emerge to meet the growing needs for adequate and sustainable public services?
  • Joint Governance: How will governance coordinate IT and service decisions across independent public and private organizations?
  • Scalable Interoperability: How much interoperability is needed to support connected government services and at what cost?

Resourcing Government

  • Workforce Innovation: How will the IT organization and role transform to support government workforce innovation?
  • Adaptive Sourcing: How will government IT organizations expand their sourcing strategies to take advantage of competitive cloud-based and consumer-grade solutions?
  • Sustainable Financing: How will government IT organizations obtain and manage the financial resources required to connect government and engage citizens?”

Innovation by Competition: How Challenges and Competition Get the Most Out of the Crowd


Innocentive: “Crowdsourcing has become the 21st century’s alternative to traditional employee-based problem solving, a modern way to provide needed services, content, and ideas. Crowdsourced ideas are paving the way for today’s organizations to tackle the innovation challenges that confront them in a competitive global marketplace. To put it all in perspective: crowds used to be thought of as angry mobs. Today, crowds are more like friendly and helpful contributors. What an interesting juxtaposition, eh?
Case studies proving the effectiveness of crowdsourcing in conquering innovation challenges, particularly in the fields of science and engineering, abound. Yet despite these plentiful success stories, very few firms are really putting crowdsourcing’s full potential to use. ALS and AIDS research, to name just two fields, have made huge advances thanks to crowdsourcing.
Biologists at the University of Washington were able to map the structure of an AIDS-related virus thanks to crowdsourced collaboration. How did they do this? With the help of gamers playing a game designed to produce the information the University of Washington needed. It was a solution that had remained unattainable for over a decade, until top-notch minds from around the world were engaged through effective crowdsourcing techniques.
Dr. Seward Rutkove discovered an ALS biomarker that accurately measures the progression of the disease in patients, through a prize contest run by an organization named Prize4Life, which utilized our Challenge Driven Innovation approach to engage the crowd.
The truth is, the concept of crowdsourcing innovation has been around for centuries. But with the growing connectedness the Internet brings, the power and ability to crowdsource effectively has increased exponentially. It’s time for corporations to realize this and stop relying on stale sources of innovation…”

Crowdsourcing forecasts on science and technology events and innovations


Kurzweil News: “George Mason University launched today, Jan. 10, the largest and most advanced science and technology prediction market in the world: SciCast.
The federally funded research project aims to improve the accuracy of science and technology forecasts. George Mason research assistant professor Charles Twardy is the principal investigator of the project.
SciCast crowdsources forecasts on science and technology events and innovations from aerospace to zoology.
For example, will Amazon use drones for commercial package delivery by the end of 2017? Today, SciCast estimates the chance at slightly more than 50 percent. If you think that is too low, you can estimate a higher chance. SciCast will use your estimate to adjust the combined forecast.
Forecasters can update their forecasts at any time; in the above example, perhaps after the Federal Aviation Administration (FAA) releases its new guidelines for drones. The continually updated and reshaped information helps both the public and private sectors better monitor developments in a variety of industries. SciCast is a real-time indicator of what participants think is going to happen in the future.
“Combinatorial” prediction market better than simple average


How SciCast works (Credit: George Mason University)
The idea is that collective wisdom from diverse, informed opinions can provide more accurate predictions than individual forecasters, a notion borne out by other crowdsourcing projects. Simply taking an average is almost always better than going with the “best” expert. But in a two-year test on geopolitical questions, the SciCast method did 40 percent better than the simple average.
SciCast uses the first general “combinatorial” prediction market. In a prediction market, forecasters spend points to adjust the group forecast. Significant changes “cost” more — but “pay” more if they turn out to be right. So better forecasters gain more points and therefore more influence, improving the accuracy of the system.
In a combinatorial market like SciCast, forecasts can influence each other. For example, forecasters might have linked cherry production to honeybee populations. Then, if forecasters increase the estimated percentage of honeybee colonies lost this winter, SciCast automatically reduces the estimated 2014 cherry production. This connectivity among questions makes SciCast more sophisticated than other prediction markets.
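A common way to implement the “spend points to move the forecast” mechanic described above is a logarithmic market scoring rule (LMSR). The sketch below is a generic illustration of that rule for a single yes/no question, not SciCast’s actual implementation; the class name, liquidity parameter, and numbers are all made up:

```python
import math

class LMSRMarket:
    """Logarithmic market scoring rule for one yes/no question.
    Larger moves of the forecast cost more points, but pay more
    if the forecaster turns out to be right."""

    def __init__(self, b=100.0):
        self.b = b               # liquidity: higher b = harder to move
        self.q = [0.0, 0.0]      # outstanding shares for [no, yes]

    def _cost(self, q):
        # LMSR cost function: C(q) = b * log(sum(exp(q_i / b)))
        return self.b * math.log(sum(math.exp(x / self.b) for x in q))

    def price(self, outcome=1):
        """Current group forecast: probability of the outcome."""
        exps = [math.exp(x / self.b) for x in self.q]
        return exps[outcome] / sum(exps)

    def buy(self, outcome, shares):
        """Points a forecaster must spend to push the forecast."""
        new_q = list(self.q)
        new_q[outcome] += shares
        cost = self._cost(new_q) - self._cost(self.q)
        self.q = new_q
        return cost

m = LMSRMarket(b=100)
print(f"start: {m.price():.2f}")               # starts at 0.50
cost = m.buy(1, 50)                            # bet the event happens
print(f"after: {m.price():.2f}, cost {cost:.1f} points")
```

A combinatorial market like SciCast extends this idea to joint distributions over many linked questions, so that moving one forecast automatically updates the others; that machinery is far more involved than this single-question sketch.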
SciCast topics include agriculture, biology and medicine, chemistry, computational sciences, energy, engineered technologies, global change, information systems, mathematics, physics, science and technology business, social sciences, space sciences and transportation.

Seeking futurists to improve forecasts, pose questions


(Credit: George Mason University)
“With so many science and technology questions, there are many niches,” says Twardy, a researcher in the Center of Excellence in Command, Control, Communications, Computing and Intelligence (C4I), based in Mason’s Volgenau School of Engineering.
“We seek scientists, statisticians, engineers, entrepreneurs, policymakers, technical traders, and futurists of all stripes to improve our forecasts, link questions together and pose new questions.”
Forecasters discuss the questions, and that discussion can lead to new, related questions. For example, someone asked, “Will Amazon deliver its first package using an unmanned aerial vehicle by Dec. 31, 2017?”
An early forecaster suggested that this technology is likely to first be used in a mid-sized town with fewer obstructions or local regulatory issues. Another replied that Amazon is more likely to use robots to deliver packages within a short radius of a conventional delivery vehicle. A third offered information about an FAA report related to the subject.
Any forecaster could then write a question about upcoming FAA rulings, and link that question to the Amazon drones question. Forecasters could then adjust the strength of the link.
“George Mason University has succeeded in launching the world’s largest forecasting tournament for science and technology,” says Jason Matheny, program manager of Forecasting Science and Technology at the Intelligence Advanced Research Projects Activity, based in Washington, D.C. “SciCast can help the public and private sectors to better understand a range of scientific and technological trends.”
Collaborative but Competitive
More than 1,000 experts and enthusiasts from science and tech-related associations, universities and interest groups preregistered to participate in SciCast. The group is collaborative in spirit but also competitive. Participants are rewarded for accurate predictions by moving up on the site leaderboard, receiving more points to spend influencing subsequent prognostications. Participants can (and should) continually update their predictions as new information is presented.
SciCast has partnered with the American Association for the Advancement of Science, the Institute of Electrical and Electronics Engineers, and multiple other science and technology professional societies.
Mason members of the SciCast project team include Twardy; Kathryn Laskey, associate director for the C4I and a professor in the Department of Systems Engineering and Operations Research; associate professor of economics Robin Hanson; C4I research professor Tod Levitt; and C4I research assistant professors Anamaria Berea, Kenneth Olson and Wei Sun.
To register for SciCast, visit www.SciCast.org, or for more information, e-mail support@scicast.org. SciCast is open to anyone age 18 or older.”

New Book: Open Data Now


New book by Joel Gurin (The GovLab): “Open Data is the world’s greatest free resource–unprecedented access to thousands of databases–and it is one of the most revolutionary developments since the Information Age began. Combining two major trends–the exponential growth of digital data and the emerging culture of disclosure and transparency–Open Data gives you and your business full access to information that has never been available to the average person until now. Unlike most Big Data, Open Data is transparent, accessible, and reusable in ways that give it the power to transform business, government, and society.
Open Data Now is an essential guide to understanding all kinds of open databases–business, government, science, technology, retail, social media, and more–and using those resources to your best advantage. You’ll learn how to tap crowds for fast innovation, conduct research through open collaboration, and manage and market your business in a transparent marketplace.
Open Data is open for business–and the opportunities are as big and boundless as the Internet itself. This powerful, practical book shows you how to harness the power of Open Data in a variety of applications:

  • HOT STARTUPS: turn government data into profitable ventures
  • SAVVY MARKETING: understand how reputational data drives your brand
  • DATA-DRIVEN INVESTING: apply new tools for business analysis
  • CONSUMER INFORMATION: connect with your customers using smart disclosure
  • GREEN BUSINESS: use data to bet on sustainable companies
  • FAST R&D: turn the online world into your research lab
  • NEW OPPORTUNITIES: explore open fields for new businesses

Whether you’re a marketing professional who wants to stay on top of what’s trending, a budding entrepreneur with a billion-dollar idea and limited resources, or a struggling business owner trying to stay competitive in a changing global market–or if you just want to understand the cutting edge of information technology–Open Data Now offers a wealth of big ideas, strategies, and techniques that wouldn’t have been possible before Open Data leveled the playing field.
The revolution is here and it’s now. It’s Open Data Now.”

From Faith-Based to Evidence-Based: The Open Data 500 and Understanding How Open Data Helps the American Economy


Beth Noveck in Forbes: “Public funds have, after all, paid for their collection, and the law says that federal government data are not protected by copyright. By the end of 2009, the US and the UK had the only two open data one-stop websites where agencies could post and citizens could find open data. Now there are over 300 such portals for government data around the world with over 1 million available datasets. This kind of Open Data — including weather, safety and public health information as well as information about government spending — can serve the country by increasing government efficiency, shedding light on regulated industries, and driving innovation and job creation.

It’s becoming clear that open data has the potential to improve people’s lives. With huge advances in data science, we can take this data and turn it into tools that help people choose a safer hospital, pick a better place to live, improve the performance of their farm or business by having better climate models, and know more about the companies with whom they are doing business. Done right, people can even contribute data back, giving everyone a better understanding, for example of nuclear contamination in post-Fukushima Japan or incidences of price gouging in America’s inner cities.

The promise of open data is limitless. (see the GovLab index for stats on open data) But it’s important to back up our faith with real evidence of what works. Last September the GovLab began the Open Data 500 project, funded by the John S. and James L. Knight Foundation, to study the economic value of government Open Data extensively and rigorously.  A recent McKinsey study pegged the annual global value of Open Data (including free data from sources other than government), at $3 trillion a year or more. We’re digging in and talking to those companies that use Open Data as a key part of their business model. We want to understand whether and how open data is contributing to the creation of new jobs, the development of scientific and other innovations, and adding to the economy. We also want to know what government can do better to help industries that want high quality, reliable, up-to-date information that government can supply. Of those 1 million datasets, for example, 96% are not updated on a regular basis.

The GovLab just published an initial working list of 500 American companies that we believe to be using open government data extensively.  We’ve also posted in-depth profiles of 50 of them — a sample of the kind of information that will be available when the first annual Open Data 500 study is published in early 2014. We are also starting a similar study for the UK and Europe.

Even at this early stage, we are learning that Open Data is a valuable resource. As my colleague Joel Gurin, author of Open Data Now: the Secret to Hot Start-Ups, Smart Investing, Savvy Marketing and Fast Innovation, who directs the project, put it, “Open Data is a versatile and powerful economic driver in the U.S. for new and existing businesses around the country, in a variety of ways, and across many sectors. The diversity of these companies in the kinds of data they use, the way they use it, their locations, and their business models is one of the most striking things about our findings so far.” Companies are paradoxically building value-added businesses on top of public data that anyone can access for free….”

The full article can be found here.

Entrepreneurs Shape Free Data Into Money


Angus Loten in the Wall Street Journal: “More cities are putting information on everything from street-cleaning schedules to police-response times and restaurant inspection reports in the public domain, in the hope that people will find a way to make money off the data.
Supporters of such programs often see them as a local economic stimulus plan, allowing software developers and entrepreneurs in cities ranging from San Francisco to South Bend, Ind., to New York, to build new businesses based on the information they get from government websites.
When Los Angeles Mayor Eric Garcetti issued an executive directive last month to launch the city’s open-data program, he cited entrepreneurs and businesses as important beneficiaries. Open data promotes innovation and “gives companies, individuals, and nonprofit organizations the opportunity to leverage one of government’s greatest assets: public information,” according to the Dec. 18 directive.
A poster child for the movement might be 34-year-old Matt Ehrlichman of Seattle, who last year built an online business based in part on Seattle work permits, professional licenses and other home-construction information gathered by the city’s Department of Planning and Development.
While his website is free, his business, called Porch.com, has more than 80 employees and charges a $35 monthly fee to industry professionals who want to boost the visibility of their projects on the site.
The site gathers raw public data—such as the addresses of homes under renovation, what work is being done, who is doing it and how much they are charging—and combines it with photos and other information from industry professionals and homeowners. It then creates a searchable database that lets users compare ideas and costs for projects in their own neighborhood.
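A pipeline like the one described reduces to joining public permit records against crowd-contributed profiles and filtering by location. The following is a minimal sketch with invented field names and sample data; Porch.com’s actual schema and sources are not described in the article:

```python
# Hypothetical sketch of combining open permit data with contributed profiles.
# All field names and records here are invented for illustration.

permits = [
    {"address": "114 Pine St", "zip": "98101", "work": "kitchen remodel",
     "contractor": "Acme Builders", "value": 42000},
    {"address": "2207 Elm Ave", "zip": "98102", "work": "roof replacement",
     "contractor": "TopRoof LLC", "value": 18000},
]

# Photos/ratings contributed by professionals and homeowners.
profiles = {
    "Acme Builders": {"photos": 12, "rating": 4.6},
}

def projects_near(zip_code):
    """Return permit records in a ZIP code, enriched with any matching profile."""
    results = []
    for p in permits:
        if p["zip"] == zip_code:
            enriched = dict(p)
            enriched["profile"] = profiles.get(p["contractor"])  # may be None
            results.append(enriched)
    return results

for proj in projects_near("98101"):
    print(proj["work"], proj["value"], proj["profile"])
```

The public data supplies the skeleton (who, where, how much), and the paid-for contributed content fills in what the raw records lack, which is where the subscription value lies.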
…Ian Kalin, director of open-data services at Socrata, a Seattle-based software firm that makes the back-end applications for many of these government open-data sites, says he’s worked with hundreds of companies that were formed around open data.
Among them is Climate Corp., a San Francisco-based firm that collects weather and yield-forecasting data to help farmers decide when and where to plant crops. Launched in 2006, the firm was acquired in October by Monsanto Co., the seed-company giant, for $930 million.
Overall, the rate of new business formation declined nationally between 2006 and 2010. But according to the latest data from the Ewing Marion Kauffman Foundation, an entrepreneurship advocacy group in Kansas City, Mo., the rate of new business formation in Seattle rose 9.41% in 2011, compared with the national average of 3.9%.
Other cities where new business formation was ahead of the national average include Chicago, Austin, Texas, Baltimore, and South Bend, Ind.—all cities that also have open-data programs. Still, how effective the ventures are in creating jobs is difficult to gauge.
One wrinkle: privacy concerns about the potential for information—such as property tax and foreclosure data—to be misused.
Some privacy advocates fear that government data that include names, addresses and other sensitive information could be used by fraudsters to target victims.”

Crowdsourcing Social Problems


Article in Reason: “reCAPTCHA and Duolingo both represent a distinctly 21st-century form of distributed problem solving. These Internet-enabled approaches tend to be faster, far less expensive, and far more resilient than the heavyweight industrial-age methods of solving big social problems that we’ve grown accustomed to over the past century. They typically involve highly diverse resources (volunteer time, crowdfunding, the capabilities of multinational corporations, entrepreneurial capital, philanthropic funding) aligned around common objectives such as reducing congestion, providing safe drinking water, or promoting healthy living. Crowdsourcing offers not just a better way of doing things, but a radical challenge to the bureaucratic status quo.
Here are several ways public, private, and nonprofit organizations can use lightweight, distributed approaches to solve societal problems faster and cheaper than the existing sclerotic models.
Chunk the Problem
The genius of reCAPTCHA and Duolingo is that they divide labor into small increments, performed for free, often by people who are unaware of the project they’re helping to complete. This strategy has wide public-policy applications, even in dealing with potholes….
Meanwhile, Finland’s DigitalKoot project enlisted volunteers to digitize the national library’s archives by playing a computer game that challenged them to transcribe scans of antique manuscripts.
Governments can set up a microtasking platform, not just for citizen engagement but as a way to harness the knowledge and skills of public employees across multiple departments and agencies. If microtasking can work to connect people outside the “four walls” of an organization, think of its potential as a platform to connect people and conduct work inside an organization, even an organization as bureaucratic as government.
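The core mechanics of this kind of chunked crowdsourcing can be sketched in a few lines. reCAPTCHA’s well-known trick is to pair each unknown item with a “control” item whose answer is already known, and to count a worker’s guess only when they get the control right. The sketch below uses invented item IDs and answers and is not reCAPTCHA’s actual implementation:

```python
# Sketch of control-item validation for microtasks: a worker's answer to an
# unknown item counts only if the worker also answered a known item correctly;
# an unknown item's transcription is accepted once enough validated answers agree.
from collections import Counter

CONTROL_ANSWERS = {"ctrl-1": "morning"}  # items whose transcription is known

def accept_answers(submissions, quorum=2):
    """submissions: list of (control_id, control_answer, item_id, item_answer).
    Returns {item_id: transcription} accepted by at least `quorum` validated workers."""
    votes = {}
    for ctrl_id, ctrl_ans, item_id, item_ans in submissions:
        if CONTROL_ANSWERS.get(ctrl_id) != ctrl_ans:
            continue  # worker failed the known item; discard their guess
        votes.setdefault(item_id, Counter())[item_ans] += 1
    accepted = {}
    for item_id, counter in votes.items():
        answer, count = counter.most_common(1)[0]
        if count >= quorum:
            accepted[item_id] = answer
    return accepted

subs = [
    ("ctrl-1", "morning", "unk-7", "harbor"),
    ("ctrl-1", "morning", "unk-7", "harbor"),
    ("ctrl-1", "evening", "unk-7", "harbour"),  # failed the control, ignored
]
print(accept_answers(subs))  # {'unk-7': 'harbor'}
```

Each microtask takes seconds, requires no knowledge of the larger project, and the redundancy makes the aggregate answer reliable even though no single worker is trusted.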

Decentralize Service to the Self
A young woman slices her finger on a knife. As she compresses the wound with gauze, she needs to know whether it warrants stitches. So she calls Blue Cross’ 24-hour nurse hotline, where patients can learn whether they should see a doctor. The nurse asks her to describe the depth of the cut, then explains that she should keep compressing it with gauze and skip the ER. In aggregate, savings like this amount to millions of dollars in avoided emergency-room visits.
Since 2003, Blue Cross has been shifting the work of basic triage and risk mitigation to customers. Britain’s National Health Service (NHS) implemented a similar program, NHS Direct, in 1998. NHS estimates that the innovation has saved it £44 million a year….
Gamify Drudgery
Finland’s national library houses an enormous archive of antique texts, which officials hoped to scan and digitize into ordinary, searchable text documents. Rather than simply hire people for the tedium of correcting garbled OCR scans, the library invited the public to play a game. An online program called DigitalKoot lets people transcribe scanned words, and by typing accurately, usher a series of cartoon moles safely across a bridge….
Build a Two-Sided Market
Road infrastructure costs government five cents per driver per mile, according to the Victoria Transport Policy Institute. “That’s a dollar the government paid for the paving of that road and the maintaining of that infrastructure…just for you, not the other 3,000 people that travelled that same segment of highway in that same hour that you did,” says Sean O’Sullivan, founder of Carma, a ridesharing application.
Ridesharing companies such as Carma, Lyft, and Zimride are attempting to recruit private cars for the public transit network, by letting riders pay a small fee to carpool. A passenger waits at a designated stop, and the app alerts drivers, who can scan a profile of their potential rider. It’s a prime example of a potent new business model…
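The matching step described above, alerting drivers whose position is near a waiting rider, reduces at its simplest to a proximity filter. This toy sketch uses invented coordinates and straight-line distance; real ridesharing apps match along road networks, routes, and schedules:

```python
# Toy rider-driver matching by straight-line distance. Illustration only:
# production systems use road-network routing, not planar geometry.
import math

def distance_km(a, b):
    # Crude planar approximation (~111 km per degree), fine at city scale.
    return math.hypot(a[0] - b[0], a[1] - b[1]) * 111

def nearby_drivers(rider_stop, drivers, radius_km=2.0):
    """Return IDs of drivers currently within radius_km of the rider's stop."""
    return [d_id for d_id, pos in drivers.items()
            if distance_km(rider_stop, pos) <= radius_km]

drivers = {
    "driver-A": (47.610, -122.333),
    "driver-B": (47.700, -122.400),
}
stop = (47.609, -122.335)
print(nearby_drivers(stop, drivers))  # ['driver-A']
```

Once candidate drivers are found, the app hands each one the rider’s profile to accept or decline, which is the two-sided-market part: both sides choose, and the platform only brokers the match.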
Remove the Middleman
John McNair dropped out of high school at age 16. By his thirties, he had become an entrepreneur, producing and selling handmade guitars, but carpentry alone wouldn’t grow his business. So the founder of Red Dog Guitars enrolled in a $20 class on Skillshare.com, taught by the illustrator John Contino, to learn to brand his work with hand-lettered product labels. Soon, a fellow businessman was asking McNair for labels to market guitar pickups.
Traditionally, the U.S. government might invest in retraining someone like John. Instead, peer-to-peer technology has allowed a community of designers to help John develop his skills. Peer-to-peer strategies enable citizens to meet each other’s needs, cheaply. Peer-to-peer solutions can help fix problems, deliver services, and supplement traditional approaches.
Peer-to-peer can lessen our dependence on big finance. Kickstarter lets companies skip the effort of convincing a banker that their product is viable. They just need to convince customers…”