Crowdsourced transit app shows what time the bus will really come


Springwise: “The problem with most transport apps is that they rely on fixed data from transport company schedules and don’t truly reflect exactly what’s going on with the city’s trains and buses at any given moment. Operating like a Waze for public transport, Israel’s Ototo app crowdsources real-time information from passengers to give users the best suggestions for their commute.
The app relies on a community of ‘Riders’, who allow anonymous location data to be sent from their smartphones whenever they’re using public transport. By collating this data, Ototo offers more realistic information about bus and train routes. While a bus may be due in five minutes, a Rider currently on that bus might be located more than five minutes away, indicating that the bus isn’t on time. Ototo can then suggest a quicker route for users. According to Fast Company, the service currently has a 12,000-strong global Riders community that powers its travel recommendations. On top of this, the app is designed in an easy-to-use infographic format that quickly and efficiently tells users where they need to go and how long it will take. The app is free to download from the App Store.
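To make the mechanism concrete, here is a minimal sketch of the kind of comparison such a service might make between the schedule and a rider's reported position; the route, times, and function names are illustrative only, not Ototo's actual algorithm.

```python
# Toy sketch of the crowdsourced idea described above: compare the bus's
# scheduled arrival with how far away an on-board rider still is, and
# surface the implied delay. Illustrative only; not Ototo's real logic.
from dataclasses import dataclass

@dataclass
class RiderReport:
    route: str
    minutes_to_stop: float  # estimated travel time from the rider's current location

def estimated_delay(scheduled_eta_min: float, report: RiderReport) -> float:
    """Return how many minutes late the bus appears to be (0 if on time)."""
    return max(0.0, report.minutes_to_stop - scheduled_eta_min)

# The timetable says the bus is due in 5 minutes, but a rider on board is
# still roughly 9 minutes of travel away from the stop.
report = RiderReport(route="18", minutes_to_stop=9.0)
print(f"Estimated delay on route {report.route}: {estimated_delay(5.0, report):.0f} min")
```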


Ototo faces competition from similar services such as New York City’s Moovit, which also details how crowded buses are.”

Charities Try New Ways to Test Ideas Quickly and Polish Them Later


Ben Gose in the Chronicle of Philanthropy: “A year ago, a division of TechSoup Global began working on an app to allow donors to buy a hotel room for victims of domestic violence when no other shelter is available. Now that app is a finalist in a competition run by a foundation that combats human trafficking—and a win could mean a grant worth several hundred thousand dollars. The app’s evolution—adding a focus on sex slaves to the initial emphasis on domestic violence—was hardly accidental.
Caravan Studios, the TechSoup division that created the app, has embraced a new management approach popular in Silicon Valley known as “lean start-up.”
The principles, which are increasingly popular among nonprofits, emphasize experimentation over long-term planning and urge groups to get products and services out to clients as early as possible so the organizations can learn from feedback and make changes.
When the app, known as SafeNight, was still early in the design phase, Caravan posted details about the project on its website, including applications for grants that Caravan had not yet received. In lean-start-up lingo, Caravan put out a “minimum viable product” and hoped for feedback that would lead to a better app.
Caravan soon heard from antitrafficking organizations, which were interested in the same kind of service. Caravan eventually teamed up with the Polaris Project and the State of New Jersey, which were working on a similar app, to jointly create an app for the final round of the antitrafficking contest. Humanity United, the foundation sponsoring the contest, plans to award $1.8-million to as many as three winners later this month.
Marnie Webb, CEO of Caravan, which is building an array of apps designed to curb social problems, says lean-start-up principles help Caravan work faster and meet real needs.
“The central idea is that any product that we develop will get better if it lives as much of its life as possible outside of our office,” Ms. Webb says. “If we had kept SafeNight inside and polished it and polished it, it would have been super hard to bring on a partner because we would have invested too much.”….
Nonprofits developing new tech tools are among the biggest users of lean-start-up ideas.
Upwell, an ocean-conservation organization founded in 2011, scans the web for lively ocean-related discussions and then pushes to turn them into full-fledged movements through social-media campaigns.
Lean principles urge groups to steer clear of “vanity metrics,” such as site visits, that may sound impressive but don’t reveal much. Upwell tracks only one number—“social mentions”—the much smaller group of people who actually say something about an issue online.
After identifying a hot topic, Upwell tries to assemble a social-media strategy within 24 hours—what it calls a “minimum viable campaign.”
“We do the least amount of work to get something out the door that will get results and information,” says Rachel Dearborn, Upwell’s campaign director.
Campaigns that don’t catch on are quickly scrapped. But campaigns that do catch on get more time, energy, and money from Upwell.
After Hurricane Sandy, in 2012, a prominent writer on ocean issues and others began pushing the idea that revitalizing the oyster beds near New York City could help protect the shore from future storm surges. Upwell’s “I (Oyster) New York” campaign featured a catchy logo and led to an even bigger spike in attention.

‘Build-Measure-Learn’

Some organizations that could hardly be called start-ups are also using lean principles. GuideStar, the 20-year-old aggregator of financial information about charities, is using the lean approach to more quickly develop tools that meet the needs of its users.
The lean process promotes short “build-measure-learn” cycles, in which a group frequently updates a product or service based on what it hears from its customers.
GuideStar and the Nonprofit Finance Fund have developed a tool called Financial Scan that allows charities to see how they compare with similar groups on various financial measures, such as their mix of earned revenue and grant funds.
When it analyzed who was using the tool, GuideStar found heavy interest from both foundations and accounting firms, says Evan Paul, GuideStar’s senior director of products and marketing.
In the future, he says, GuideStar may create three versions of Financial Scan to meet the distinct interests of charities, foundations, and accountants.
“We want to get more specific about how people are using our data to make decisions so that we can help make those decisions better and faster,” Mr. Paul says….


Lean Start-Up: a Glossary of Terms for a Hot New Management Approach

Build-Measure-Learn

Instead of spending considerable time developing a product or service for a big rollout, organizations should consider using a continuous feedback loop: “build” a program or service, even if it is not fully fleshed out; “measure” how clients are affected; and “learn” by improving the program or going in a new direction. Repeat the cycle.

Minimum Viable Product

An early version of a product or service that may be lacking some features. This approach allows an organization to obtain feedback from clients and quickly determine the usefulness of a product or service and how to improve it.

Get Out of the Building

To determine whether a product or service is needed, talk to clients and share your ideas with them before investing heavily.

A/B Testing

Create two versions of a product or service, show them to different groups, and see which performs better.
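As a rough illustration of the idea (this sketch is not part of the Chronicle glossary), the snippet below scores two hypothetical variants with a standard two-proportion z-test; the names and numbers are made up.

```python
from math import sqrt, erf

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Compare two variants with a two-proportion z-test.

    Returns both conversion rates and a two-sided p-value; a small p-value
    suggests the observed difference is unlikely to be chance alone.
    """
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    normal_cdf = 0.5 * (1 + erf(abs(z) / sqrt(2)))
    p_value = 2 * (1 - normal_cdf)  # two-sided
    return rate_a, rate_b, p_value

# Hypothetical example: version B of a donation page converts 58 of 1,000
# visitors, version A converts 40 of 1,000.
print(ab_test(40, 1000, 58, 1000))
```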

Failing Fast

By quickly realizing that a product or service isn’t viable, organizations save time and money and gain valuable information for their next effort.

Pivot

Making a significant change in strategy when the early testing of a minimum viable product shows that the product or service isn’t working or isn’t needed.

Vanity Metrics

Measures that seem to provide a favorable picture but don’t accurately capture the impact of a product. An example might be a tally of website page views. A more meaningful measure—or an “actionable metric,” in the lean lexicon—might be the number of active users of an online service.
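A minimal sketch of the distinction, using a made-up event log; treating any non-view action as “active” is just one possible definition.

```python
# Vanity vs. actionable metrics on a toy event log: raw page views look
# big, while the count of distinct users who actually did something is
# the smaller but more actionable number.
events = [
    {"user": "a", "action": "page_view"},
    {"user": "a", "action": "page_view"},
    {"user": "b", "action": "page_view"},
    {"user": "a", "action": "post_comment"},
    {"user": "c", "action": "page_view"},
]

page_views = sum(1 for e in events if e["action"] == "page_view")  # vanity metric
active_users = len({e["user"] for e in events if e["action"] != "page_view"})  # actionable metric

print(f"Page views: {page_views}, active users: {active_users}")
```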
Sources: The Lean Startup, by Eric Ries; The Ultimate Dictionary of Lean for Social Good, a publication by Lean Impact”

New Field Guide Explores Open Data Innovations in Disaster Risk and Resilience


World Bank: “From Indonesia to Bangladesh to Nepal, community members armed with smartphones and GPS systems are contributing to some of the most extensive and versatile maps ever created, helping inform policy and better prepare their communities for disaster risk.
In Jakarta, more than 500 community members have been trained to collect data on thousands of hospitals, schools, private buildings, and critical infrastructure. In Sri Lanka, government and academic volunteers mapped over 30,000 buildings and 450 km of roadways using a collaborative online resource called OpenStreetMap.
These are just a few of the projects that have been catalyzed by the Open Data for Resilience Initiative (OpenDRI), developed by the World Bank’s Global Facility for Disaster Reduction and Recovery (GFDRR). Launched in 2011, OpenDRI is active in more than 20 countries today, mapping tens of thousands of buildings and urban infrastructure, providing more than 1,000 geospatial datasets to the public, and developing innovative application tools.
To expand this work, the World Bank Group has launched the OpenDRI Field Guide as a showcase of successful projects and a practical guide for governments and other organizations to shape their own open data programs….
The field guide walks readers through the steps to build open data programs based on the OpenDRI methodology. One of the first steps is data collation. Relevant datasets are often locked because of proprietary arrangements or fragmented in government bureaucracies. The field guide explores tools and methods to enable the participatory mapping projects that can fill in gaps and keep existing data relevant as cities rapidly expand.

GeoNode: Mapping Disaster Damage for Faster Recovery
One example is GeoNode, a locally controlled, open-source cataloguing tool that helps manage and visualize geospatial data. The tool, already in use in two dozen countries, can be modified and easily integrated into existing platforms, giving communities greater control over mapping information.
GeoNode was used extensively after Typhoon Yolanda (Haiyan) swept the Philippines with 300 km/hour winds and a storm surge of over six meters last fall. The storm displaced nearly 11 million people and killed more than 6,000.
An event-specific GeoNode project was created immediately and ultimately collected more than 72 layers of geospatial data, from damage assessments to situation reports. The data and quick-analysis capability contributed to recovery efforts, and the platform is still operating in response mode at Yolandadata.org.
InaSAFE: Targeting Risk Reduction
A sister project, InaSAFE, is an open, easy-to-use tool for creating impact assessments for targeted risk reduction. The assessments are based on how an impact layer – such as a tsunami, flood, or earthquake – affects exposure data, such as population or buildings.
With InaSAFE, users can generate maps and statistical information that can be easily disseminated and even fed back into projects like GeoNode for simple, open source sharing.
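The hazard-versus-exposure overlay at the heart of this kind of impact assessment can be sketched in a few lines of geospatial code. The example below is illustrative only: it is not InaSAFE's API, the layers and attribute names are invented, and it assumes the geopandas and shapely packages are installed.

```python
import geopandas as gpd
from shapely.geometry import Point, Polygon

# Hypothetical hazard layer: a single flood polygon.
flood = gpd.GeoDataFrame(
    {"hazard": ["flood"]},
    geometry=[Polygon([(0, 0), (0, 10), (10, 10), (10, 0)])],
    crs="EPSG:4326",
)

# Hypothetical exposure layer: building footprints reduced to points.
buildings = gpd.GeoDataFrame(
    {"building_id": [1, 2, 3], "type": ["school", "clinic", "house"]},
    geometry=[Point(2, 3), Point(8, 9), Point(15, 15)],
    crs="EPSG:4326",
)

# Spatial join: which buildings fall inside the hazard area?
# (Older geopandas versions use op="within" instead of predicate="within".)
affected = gpd.sjoin(buildings, flood, how="inner", predicate="within")
print(affected[["building_id", "type", "hazard"]])
print(f"{len(affected)} of {len(buildings)} buildings exposed")
```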
The initiative, developed in collaboration with AusAID and the Government of Indonesia, was put to the test in the 2012 flood season in Jakarta, and its successes provoked a rapid national rollout and widespread interest from the international community.
Open Cities: Improving Urban Planning & Resilience
The Open Cities project, another program operating under the OpenDRI platform, aims to catalyze the creation, management and use of open data to produce innovative solutions for urban planning and resilience challenges across South Asia.
In 2013, Kathmandu was chosen as a pilot city, in part because the population faces the highest mortality threat from earthquakes in the world. Under the project, teams from the World Bank assembled partners and community mobilizers to help execute the largest regional community mapping project to date. The project surveyed more than 2,200 schools and 350 health facilities, along with road networks, points of interest, and digitized building footprints – representing nearly 340,000 individual data nodes.”

Climate Data Initiative Launches with Strong Public and Private Sector Commitments


John Podesta and Dr. John P. Holdren at the White House blog:  “…today, delivering on a commitment in the President’s Climate Action Plan, we are launching the Climate Data Initiative, an ambitious new effort bringing together extensive open government data and design competitions with commitments from the private and philanthropic sectors to develop data-driven planning and resilience tools for local communities. This effort will help give communities across America the information and tools they need to plan for current and future climate impacts.
The Climate Data Initiative builds on the success of the Obama Administration’s ongoing efforts to unleash the power of open government data. Since data.gov, the central site to find U.S. government data resources, launched in 2009, the Federal government has released troves of valuable data that were previously hard to access in areas such as health, energy, education, public safety, and global development. Today these data are being used by entrepreneurs, researchers, tech innovators, and others to create countless new applications, tools, services, and businesses.
Data from NOAA, NASA, the U.S. Geological Survey, the Department of Defense, and other Federal agencies will be featured on climate.data.gov, a new section within data.gov that opens for business today. The first batch of climate data being made available will focus on coastal flooding and sea level rise. NOAA and NASA will also be announcing an innovation challenge calling on researchers and developers to create data-driven simulations to help plan for the future and to educate the public about the vulnerability of their own communities to sea level rise and flood events.
These and other Federal efforts will be amplified by a number of ambitious private commitments. For example, Esri, the company that produces the ArcGIS software used by thousands of city and regional planning experts, will be partnering with 12 cities across the country to create free and open “maps and apps” to help state and local governments plan for climate change impacts. Google will donate one petabyte—that’s 1,000 terabytes—of cloud storage for climate data, as well as 50 million hours of high-performance computing with the Google Earth Engine platform. The company is challenging the global innovation community to build a high-resolution global terrain model to help communities build resilience to anticipated climate impacts in decades to come. And the World Bank will release a new field guide for the Open Data for Resilience Initiative, which is working in more than 20 countries to map millions of buildings and urban infrastructure….”

“Open-washing”: The difference between opening your data and simply making them available


Christian Villum at the Open Knowledge Foundation Blog: “Last week, the Danish IT magazine Computerworld, in an article entitled “Check-list for digital innovation: These are the things you must know”, emphasised how more and more companies are discovering that giving your users access to your data is a good business strategy. Among other things, they wrote:

(Translation from Danish) According to Accenture, it is becoming clear to many progressive businesses that their data should be treated like any other supply chain: it should flow easily and unhindered through the whole organisation and perhaps even out into the whole ecosystem – for instance through fully open APIs.

They then use Google Maps as an example, which firstly isn’t entirely correct, as also pointed out by the Neogeografen, a geodata blogger, who explains how Google Maps isn’t offering raw data, but merely an image of the data. You are not allowed to download and manipulate the data – or run it off your own server.

But secondly I don’t think it’s very appropriate to highlight Google and their Maps project as a golden example of a business that lets its data flow unhindered to the public. It’s true that they are offering some data, but only in a very limited way – and definitely not as open data – and thereby not as progressively as the article suggests.

Surely it’s hard to accuse Google of not being progressive in general. The article states that Google Maps’ data are used by over 800,000 apps and businesses across the globe. So yes, Google has opened its silo a little bit, but only in a very controlled and limited way, which leaves these 800,000 businesses dependent on the continual flow of data from Google and denies them control of the very commodities they’re basing their businesses on. This particular way of releasing data brings me to the problem that we’re facing: knowing the difference between making data available and making them open.

Open data is characterized by not only being available, but being both legally open (released under an open license that allows full and free reuse, conditioned at most on giving credit to its source and sharing under the same license) and technically available in bulk and in machine-readable formats – contrary to the case of Google Maps. It may be that their data are available, but they’re not open. This – among other reasons – is why the global community around the 100% open alternative OpenStreetMap is growing rapidly, and an increasing number of businesses choose to base their services on this open initiative instead.

But why is it important that data are open and not just available? Open data strengthens society and builds a shared resource, where all users, citizens and businesses are enriched and empowered, not just the data collectors and publishers. “But why would businesses spend money on collecting data and then give them away?” you ask. Opening your data and making a profit are not mutually exclusive. A quick Google search reveals many businesses that both offer open data and drive a business on them – and I believe these are the ones that should be highlighted as particularly progressive in articles such as the one from Computerworld….

We are seeing a rising trend of what can be termed “open-washing” (inspired by “greenwashing”) – meaning data publishers that claim their data is open, even when it’s not, but rather just available under limiting terms. If we – at this critical time in the formative period of the data-driven society – aren’t critically aware of the difference, we’ll end up putting our vital data streams in siloed infrastructure built and owned by international corporations, and giving our praise and support to the wrong kind of unsustainable technological development.”

Expanding Opportunity through Open Educational Resources


Hal Plotkin and Colleen Chien at the White House: “Using advanced technology to dramatically expand the quality and reach of education has long been a key priority for the Obama Administration.
In December 2013, the President’s Council of Advisors on Science and Technology (PCAST) issued a report exploring the potential of Massive Open Online Courses (MOOCs) to expand access to higher education opportunities. Last month, the President announced a $2B down payment, and another $750M in private-sector commitments, to deliver on the President’s ConnectED initiative, which will connect 99% of American K-12 students to broadband by 2017 at no cost to American taxpayers.
This week, we are happy to be joining with educators, students, and technologists worldwide to recognize and celebrate Open Education Week.
Open Educational Resources (“OER”) are educational resources that are released with copyright licenses allowing for their free use, continuous improvement, and modification by others. The world is moving fast, and OER enables educators and students to access, customize, and remix high-quality course materials reflecting the latest understanding of the world and materials that incorporate state-of-the-art teaching methods – adding their own insights along the way. OER is not a silver-bullet solution to the many challenges that teachers, students and schools face. But it is a tool increasingly being used, for example by players like edX and Khan Academy, to improve learning outcomes and create scalable platforms for sharing educational resources that reach millions of students worldwide.
Launched at MIT in 2001, OER became a global movement in 2007 when thousands of educators around the globe endorsed the Cape Town Declaration on Open Educational Resources. Another major milestone came in 2011, when Secretary of Education Arne Duncan and then-Secretary of Labor Hilda Solis unveiled the four-year, $2B Trade Adjustment Assistance Community College and Career Training Grant Program (TAACCCT). It was the first Federal program to leverage OER to support the development of a new generation of affordable, post-secondary educational programs that can be completed in two years or less to prepare students for careers in emerging and expanding industries….
Building on this record of success, OSTP and the U.S. Agency for International Development (USAID) are exploring an effort to inspire and empower university students through multidisciplinary OER focused on one of the USAID Grand Challenges, such as securing clean water, saving lives at birth, or improving green agriculture. This effort promises to  be a stepping stone towards leveraging OER to help solve other grand challenges such as the NAE Grand Challenges in Engineering or Grand Challenges in Global Health.
This is great progress, but there is more work to do. We look forward to keeping the community updated right here. To see the winning videos from the U.S. Department of Education’s “Why Open Education Matters” Video Contest, click here.”

Why the wealthiest countries are also the most open with their data


Emily Badger in the Washington Post: “The Oxford Internet Institute this week posted a nice visualization of the state of open data in 70 countries around the world, reflecting the willingness of national governments to release everything from transportation timetables to election results to machine-readable national maps. The tool is based on the Open Knowledge Foundation’s Open Data Index, an admittedly incomplete but telling assessment of who willingly publishes updated, accurate national information on, say, pollutants (Sweden) and who does not (ahem, South Africa).

[Map: the state of open data in 70 countries, via the Oxford Internet Institute]
Tally up the open data scores for these 70 countries, and the picture looks like this, per the Oxford Internet Institute:
[Chart: Open Data Index scores tallied by country, via the Oxford Internet Institute]
…With apologies for the tiny, tiny type (and the fact that many countries aren’t listed here at all), a couple of broad trends are apparent. For one, there’s a prominent global “openness divide,” in the words of the Oxford Internet Institute. The high scores mostly come from Europe and North America, the low scores from Asia, Africa and Latin America. Wealth is strongly correlated with “openness” by this measure, whether we look at World Bank income groups or Gross National Income per capita. By the OII’s calculation, wealth accounts for about a third of the variation in these Open Data Index scores.
Perhaps this is an obvious correlation, but the reasons why open data looks like the luxury of rich economies are many, and they point to the reality that poor countries face a lot more obstacles to openness than do places like the United States. For one thing, openness is also closely correlated with Internet penetration. Why open your census results if people don’t have ways to access them (or means to demand them)? It’s no easy task to do this, either.”
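For readers who want to see what “accounts for about a third of the variation” means mechanically, the sketch below computes the R-squared of a simple one-variable regression; the figures are synthetic placeholders tuned so the explained share lands near one third, not the actual Open Data Index or income data.

```python
import numpy as np

rng = np.random.default_rng(0)
log_income = rng.normal(9.0, 1.2, size=70)   # 70 hypothetical countries
noise = rng.normal(0.0, 100.0, size=70)
score = 60.0 * log_income + noise            # openness score on an arbitrary scale

slope, intercept = np.polyfit(log_income, score, 1)
predicted = slope * log_income + intercept
ss_res = np.sum((score - predicted) ** 2)
ss_tot = np.sum((score - score.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot              # share of score variance explained by income
print(f"R^2 = {r_squared:.2f}")
```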

The intelligent citizen


John Bell in AlJazeera: “A quarter century after the demise of the Soviet Union, is the Western model of government under threat? …. The pressures are coming from several directions.
All states are feeling the pressure from unregulated global flows of capital that create obscene concentrations of wealth, and an inability of the nation-state to respond. Relatedly, citizens either ignore or distrust traditional institutions, and ethnic groups demand greater local autonomy.
A recent Pew survey shows that Americans aged 18-33 mostly identify as political independents and distrust institutions. The classic model is indeed frayed, and new developments have made it feel like very old cloth.
One natural reflex is to assert even greater control, a move suited to the authoritarians, such as China, Russia or General Abdel Fattah al-Sisi‘s Egypt: Strengthen the nation by any means to withstand the pressures. The reality, however, is that all systems, democracies or otherwise, were designed for an industrial age, and the management of an anonymous mass, and cannot cope with globalised economics and the information world of today.
The question remains: What can effectively replace the Western model? The answer may not lie only in the invention of new structures, but in the improvement of a key component found in all: The citizen.
The citizen today is mostly a consumer, focused on the purchase of goods or services, or the insistent consumption of virtual information, often as an ersatz politics. Occasionally, when a threat rises, he or she also becomes a demandeur of absolute security from the state. Indeed, some are using the new technologies for democratic purposes: they are better informed, criticise abuse or corruption, and organise and rally movements.
But, the vast majority is politically disengaged and cynical of political process; many others are too busy trying to survive to even care. A grand apathy has set in, the stage left vacant for a very few extremists, or pied pipers of the old tunes of nationalisms and tribal belonging disguised as leaders. The citizen is either invisible in this circus, an endangered species in the 21st century, or increasingly drawn to dark and polarising forces.
Some see the glass as half full and believe that technology and direct democracy can bridge the gaps. Indeed, the internet provides a plethora of information and a greater sense of empowerment. Lesser-known protests in Bosnia have led to direct-democracy plenums, and the Swiss regularly resort to national referenda. However, whether direct or representative, democracy will still depend on the quality of the citizen, and his or her decisions.
Opinion, dogma and bias
Opinion, dogma and bias remain the common political operating system and, as a result, our politics are still an unaffordable game of chance. The optimists may be right, but discussions on social media about issues ranging from Ukraine to gun control reveal deep bias and the lure of excitement more than the pursuit of a constructive answer.
People crave excitement in their politics. Whether it is through asserting their own opinion or in battling others, politics offers a great ground for this high. The cost, however, comes in poor judgment and dangerous decisions. George W. Bush was elected twice, Vladimir Putin has much support, climate change is denied, and an intoxicated Mayor of Toronto, Rob Ford, may be re-elected.
Few are willing to admit their role in this state of affairs, but they will gladly see the ill in others. Even fewer, including often myself, will admit that they don’t really know how to think through a challenge, political or otherwise. This may seem absurd, since thinking feels as natural as walking, but the formation of political opinion is a complex affair, a flawed duet between our minds and outside input. Media, government propaganda, family, culture, and our own unique set of personal experiences, from traumas to chance meetings, all play into the mix. High states of emotion, “excitement”, also weigh in, making us dumb and easily manipulated….
This step may also be a precursor for another that involves the ordinary person. Today being a citizen involves occasional voting, politics as spectator sport, and, among some, being a watchdog against corruption or street activism. What may be required is more citizens’ participation in local democracy, not just in assemblies or casting votes, but in local management and administration.
This will help people understand the complexities of politics, gain greater responsibility, and mitigate some of the vices of centralisation and representative democracy. It may also harmonise with the information age, where everyone, it seems, wishes to be empowered.
Do people have time in their busy lives? A rotational involvement in local affairs can help solve this, and many might even find the engagement enjoyable. This injection of a real citizen into the mix may improve the future of politics while large institutions continue to hum their tune.
In the end, a citizen who has responsibility for his actions can probably make any structure work, while rejecting any that would endanger his independence and dignity. The rise of a more intelligent and committed citizen may clarify politics, improve liberal democracies, and make populism and authoritarianism less appealing and likely paths.”

Protect the open web and the promise of the digital age


Richard Waters in the Financial Times:  “There is much to be lost if companies and nations put up fences around our digital open plains
For all the drawbacks, it is not hard to feel nostalgic about the early days of the web. Surfing between slow-loading, badly designed sites on a dial-up internet connection running at 56 kilobits per second could be frustrating. No wonder it was known as the “world wide wait”. But the “wow” factor was high. There was unparalleled access to free news and information, even if some of it was deeply untrustworthy. Then came that first, revelatory search on Google, which untangled the online jumble with almost miraculous speed.
Later, an uproarious outbreak of blogs converted what had been a passive medium into a global rant. And, with YouTube and Facebook, a mass audience found digital self-expression for the first time.
As the world wide web turns 25, it is easy to take all this for granted. For a generation that has grown up digital, it is part of the fabric of life.
It is also easy to turn away without too many qualms. More than 80 per cent of time spent on smartphones and tablets does not involve the web at all: it is whiled away in apps, which offer the instant gratification that comes from a tap or swipe of a finger.
Typing a URL on a small device, trying to stretch or shrink a web page to fit the small screen, browsing through Google links in a mobile browser: it is all starting to seem so, well, anachronistic.
But if the world wide web is coming to play a smaller part in everyday life, the significance of its relative decline should be kept in perspective. After all, the web is only one of the applications that rides on top of the internet: it is the standards and technologies of the internet itself that provide the main foundation for the modern, connected world. As long as all bits flow freely (and cheaply), the promise of the digital age will remain intact.
Before declaring the web era over and moving on, however, it is worth dwelling on what it represents – and what could be lost if this early manifestation of digital life were to be consigned to history.
Sir Tim Berners-Lee, who wrote the technical paper a quarter of a century ago that laid out the architecture of the web, certainly senses the threat. The open technical standards and open access that lie at the heart of the web – based on the freedom to link any online document to any other – are not guaranteed. What is needed, he argued this week, is nothing less than a digital bill of rights: a statement that would enshrine the ideals on which the medium was founded.
As this suggests, the web has always been much more than a technology. It is a state of mind, a dream of participation, a call to digital freedom that transcends geography. What place it finds in the connected world of tomorrow will help define what it means to be a digital citizen…”

Open Data is a Civil Right


Yo Yoshida, Founder & CEO, Appallicious in GovTech: “As Americans, we expect a certain standardization of basic services, infrastructure and laws — no matter where we call home. When you live in Seattle and take a business trip to New York, the electric outlet in the hotel you’re staying in is always compatible with your computer charger. When you drive from San Francisco to Los Angeles, I-5 doesn’t all-of-a-sudden turn into a dirt country road because some cities won’t cover maintenance costs. If you take a 10-minute bus ride from Boston to the city of Cambridge, you know the money in your wallet is still considered legal tender.

But what if these expectations of consistency were not always a given? What if cities, counties and states had absolutely zero coordination when it came to basic services? This is what it is like for us in the open data movement. There are so many important applications and products that have been built by civic startups and concerned citizens. However, all too often these efforts are confined to city limits, and unavailable to anyone outside of them. It’s time to start reimagining the way cities function and how local governments operate. There is a wealth of information housed in local governments that should be public by default to help fuel a new wave of civic participation.
Appallicious’ Neighborhood Score provides an overall health and sustainability score, block by block, for every neighborhood in the city of San Francisco. It is the first time metrics have been applied to neighborhoods, letting us judge how government allocates our resources and better plan how to move forward. But if you’re thinking about moving to Oakland, just a subway stop away from San Francisco, and want to see the score for a neighborhood, our app can’t help you, because that city has yet to release the data sets we need.
In Contra Costa County, there is the lifesaving PulsePoint app, which notifies smartphone users who are trained in CPR when someone nearby may be in need of help. This is an amazing app—for residents of Contra Costa County. But if someone in neighboring Alameda County needs CPR, the app, unfortunately, is completely useless.
Buildingeye visualizes planning and building permit data to allow users to see what projects are being proposed in their area or city. However, buildingeye is only available in a handful of places, simply because most cities have yet to make permits publicly available. Think about what this could do for the construction sector — an industry that has millions of jobs for Americans. Buildingeye also gives concerned citizens access to public documents like never before, so they can see what might be built in their cities or on their streets.
Along with other open data advocates, I have been going from city-to-city, county-to-county and state-to-state, trying to get governments and departments to open up their massive amounts of valuable data. Each time one city, or one county, agrees to make their data publicly accessible, I can’t help but think it’s only a drop in the bucket. We need to think bigger.
Every government, every agency and every department in the country that has already released this information to the public is a case study that points to the success of open data — and why every public entity should follow their lead. There needs to be a national referendum that instructs that all government data should be open and accessible to the public.
Last May, President Obama issued an executive order requiring that going forward, any data generated by the federal government must be made available to the public in open, machine-readable formats. In the executive order, Obama stated that, “openness in government strengthens our democracy, promotes the delivery of efficient and effective services to the public, and contributes to economic growth.”
If this is truly the case, Washington has an obligation to compel local and state governments to release their data as well. Many have tried to spur this effort. California Lt. Gov. Gavin Newsom created the Citizenville Challenge to speed up adoption on the local level. The U.S. Conference of Mayors has also been vocal in promoting open data efforts. But none of these initiatives could have the same effect as a federal mandate.
What I am proposing is no small feat, and it won’t happen overnight. But there should be a concerted effort by those in the technology industry, specifically civic startups, to call on Congress to draft legislation that would require every city in the country to make their data open, free and machine readable. Passing federal legislation will not be an easy task — but creating a “universal open data” law is possible. It would require little to no funding, and it is completely nonpartisan. It’s actually not a political issue at all; it is, for lack of a better word, an administrative issue.
Often good legislation is blocked because lawmakers and citizens are concerned about project funding. While there should be support to help cities and towns achieve the capability of opening their data, a lot of the time they don’t need it. In 2009, the city and county of San Francisco opened up its data with zero dollars. Many other cities have done the same. There will be cities and municipalities that will need financial assistance to accomplish this. But it is worth it, and it will not require a significant investment for a substantial return. There are free online open data portals, such as CKAN and DKAN, and a new effort from Accela, CivicData.com, to centralize open data efforts.
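As a concrete illustration of what such portals make possible, the short sketch below queries a CKAN-style portal's documented action API for datasets matching a keyword. The portal URL is a placeholder; substitute a real CKAN instance, and note that the requests package is assumed.

```python
import requests

PORTAL = "https://data.example.gov"  # hypothetical CKAN-based portal

resp = requests.get(
    f"{PORTAL}/api/3/action/package_search",
    params={"q": "building permits", "rows": 5},
    timeout=30,
)
resp.raise_for_status()
result = resp.json()["result"]

print(f"{result['count']} matching datasets")
for dataset in result["results"]:
    formats = {res.get("format", "") for res in dataset.get("resources", [])}
    print(f"- {dataset['title']} ({', '.join(sorted(formats))})")
```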
When the UK Government recently announced a £1.5 million investment to support open data initiatives, its Cabinet Office Minister said, “We know that it creates a more accountable, efficient and effective government. Open Data is a raw material for economic growth, supporting the creation of new markets, business and jobs and helping us compete in the global race.”
We should not fall behind these efforts. There is too much at stake for our citizens, not to mention our economy. A recent McKinsey report found that making data open has the potential to create $3 trillion in value worldwide.
Former Speaker Tip O’Neill famously said, “all politics is local.” But we in the civic startup space believe all data is local. Data is reporting potholes in your neighborhood and identifying high-crime areas in your communities. It’s seeing how many farmers’ markets there are in your town compared to liquor stores. Data helps predict which areas of a city are most at risk during heat waves and other natural disasters. A federal open data law would give the raw material needed to create tools to improve the lives of all Americans, not just those who are lucky enough to live in a city that has released this information on its own.
It’s a different way of thinking about how a government operates and the relationship it has with its citizens. Open data gives the public an amazing opportunity to be more involved with governmental decisions. We can increase accountability and transparency, but most importantly we can revolutionize the way local residents communicate and work with their government.
Access to this data is a civil right. If this is truly a government by, of and for the people, then its data needs to be available to all of us. By opening up this wealth of information, we will design a better government that takes advantage of the technology and skills of civic startups and innovative citizens….”