Open Data at Core of New Governance Paradigm


GovExec: “Rarely are federal agencies compared favorably with Facebook, Instagram, or other modern models of innovation, but there is every reason to believe they can harness innovation to improve mission effectiveness. After all, Aneesh Chopra, former U.S. Chief Technology Officer, reminded the Excellence in Government 2014 audience that government has a long history of innovation. From nuclear fusion to the Internet, the federal government has been at the forefront of technological development.
According to Chopra, the key to fueling innovation and economic prosperity today is open data. But to make the most of open data, government needs to adapt its culture. Chopra outlined three essential elements of doing so:

  1. Involve external experts – integrating outside ideas is second to none as a source of innovation.
  2. Leverage the experience of those on the front lines – federal employees who directly execute their agency’s mission often have the best sense of what does and does not work, and what can be done to improve effectiveness.
  3. Look to the public as a value multiplier – just as Facebook provides a platform for tens of thousands of developers to create greater value, federal agencies can provide the raw material for many more to build better citizen services.

In addition to these three broad elements, Chopra offered four specific levers government can use to help enact this paradigm shift:

  1. Democratize government data – opening government data to the public facilitates innovation. For example, the National Oceanic and Atmospheric Administration has helped generate a roughly $5 billion industry by placing almost no intellectual-property constraints on its weather data.
  2. Collaborate on technical standards – government can act as a convener of industry members to standardize technological development, and thereby increase the value of data shared.
  3. Issue challenges and prizes – incentivizing the public to get involved and participate in efforts to create value from government data enhances the government’s ability to serve the public.
  4. Launch government startups – programs like the Presidential Innovation Fellows initiative help challenge rigid bureaucratic structures and spread a culture of innovation.

Federal leaders will need a strong political platform to sustain this shift. Fortunately, this blueprint is also bipartisan, says Chopra. Political leaders on both sides of the aisle are already getting behind the movement to bring innovation to the core of government.”

The rise of open data driven businesses in emerging markets


Alla Morrison at the World Bank blog:

Key findings —

  • Many new data companies have emerged around the world in the last few years. Of these companies, the majority use some form of government data.
  • There are a large number of data companies in sectors with high social impact and tremendous development opportunities.
  • An actionable pipeline of data-driven companies exists in Latin America and in Asia. The most desired type of financing is equity, followed by quasi-equity, in amounts ranging from $100,000 to $5 million, with averages between $2 million and $3 million depending on the region. The total estimated need for financing may exceed $400 million.

“The economic value of open data is no longer a hypothesis
How can one make money with open data which is akin to air – free and open to everyone? Should the World Bank Group be in the catalyzer role for a sector that is just emerging?  And if so, what set of interventions would be the most effective? Can promoting open data-driven businesses contribute to the World Bank Group’s twin goals of fighting poverty and boosting shared prosperity?
These questions have been top of the mind since the World Bank Open Finances team convened a group of open data entrepreneurs from across Latin America to share their business models, success stories and challenges at the Open Data Business Models workshop in Uruguay in June 2013. We were in Uruguay to find out whether open data could lead to the creation of sustainable new businesses and jobs. To do so, we tested a couple of hypotheses: open data has economic value, beyond the benefits of increased transparency and accountability; and open data companies with sustainable business models already exist in emerging economies.
Encouraged by our findings in Uruguay we set out to further explore the economic development potential of open data, with a focus on:

  • Contribution of open data to countries’ GDP;
  • Innovative solutions to tackle social problems in key sectors like agriculture, health, education, transportation, climate change, financial services, especially those benefiting low income populations;
  • Economic benefits of governments’ buy-in into the commercial value of open data and resulting release of new datasets, which in turn would lead to increased transparency in public resource management (reductions in misallocations, a more level playing field in procurement) and better service delivery; and
  • Creation of data-related private sector jobs, especially suited for the tech savvy young generation.

We proposed a joint IFC/World Bank approach (From open data to development impact – the crucial role of private sector) that envisages providing financing to data-driven companies through a dedicated investment fund, as well as loans and grants to governments to create a favorable enabling environment. The concept was received enthusiastically for the most part by a wide group of peers at the Bank, the IFC, as well as NGOs, foundations, DFIs and private sector investors.
Thanks also in part to a McKinsey report last fall stating that open data could help unlock more than $3 trillion in value every year, the potential value of open data is now better understood. The acquisition of Climate Corporation (whose business model holds enormous potential for agriculture and food security, if governments open up the right data) for close to a billion dollars last November, and the findings of the Open Data 500 project led by the GovLab at NYU, further substantiated the hypothesis. These days no one asks whether open data has economic value; the focus has shifted to finding ways for companies, both startups and large corporations, and governments to unlock it. The first question, though, is: is it still too early to plan a significant intervention to spur open-data-driven economic growth in emerging markets?”

Conceptualizing Open Data ecosystems: A timeline analysis of Open Data development in the UK


New paper by Tom Heath et al: “In this paper, we conceptualize Open Data ecosystems by analysing the major stakeholders in the UK. The conceptualization is based on a review of popular Open Data definitions and business ecosystem theories, which we applied to empirical data using a timeline analysis. Our work is informed by a combination of discourse analysis and in-depth interviews, undertaken during the summer of 2013. Drawing on the UK as a best practice example, we identify a set of structural business ecosystem properties: circular flow of resources, sustainability, demand that encourages supply, and dependence developing between suppliers, intermediaries, and users. However, significant gaps and shortcomings are found to remain. Most prominently, demand is not yet fully encouraging supply and actors have yet to experience fully mutual interdependence.”

Obama Signs Nation's First 'Open Data' Law


William Welsh in Information Week: “President Barack Obama enacted the nation’s first open data law, signing into law on May 9 bipartisan legislation that requires federal agencies to publish their spending data in a standardized, machine-readable format that the public can access through USASpending.gov.
The Digital Accountability and Transparency Act of 2014 (S. 994) amends the eight-year-old Federal Funding Accountability and Transparency Act to make available to the public specific classes of federal agency spending data “with more specificity and at a deeper level than is currently reported,” a White House statement said….
Advocacy groups applauded the bipartisan legislation, which is being heralded as the nation’s first open data law and furnishes a legislative mandate for Obama’s one-year-old Open Data Policy.
“The DATA Act will unlock a new public resource that innovators, watchdogs, and citizens can mine for valuable and unprecedented insight into federal spending,” said Hudson Hollister, executive director of the Data Transparency Coalition. “America’s tech sector already has the tools to deliver reliable, standardized, open data. [The] historic victory will put our nation’s open data pioneers to work for the common good.”
The DATA Act requires agencies to establish government-wide standards for financial data, adopt accounting approaches developed by the Recovery Act’s Recovery Accountability and Transparency Board (RATB), and streamline agency reporting requirements.
The DATA Act empowers the Secretary of the Treasury to establish a data analytics center, which is modeled on the successful Recovery Operations Center. The new center will support inspectors general and law enforcement agencies in criminal and other investigations, as well as agency program offices in the prevention of improper payments. Assets of the RATB related to the Recovery Operations Center would transfer to the Treasury Department when the board’s authorization expires.
The treasury secretary and the Director of the White House’s Office of Management and Budget are jointly tasked with establishing the standards required to achieve the goals and objectives of the new statute.
To ensure that agencies comply with the reporting requirements, agency inspectors general will report on the quality and accuracy of the financial data provided to USASpending.gov. The Government Accountability Office also will report on data quality and accuracy and create a government-wide assessment of the financial data reported…”

Believe the hype: Big data can have a big social impact


Annika Small at the Guardian: “Given all the hype around so called big data at the moment, it would be easy to dismiss it as nothing more than the latest technology buzzword. This would be a mistake, given that the application and interpretation of huge – often publicly available – data sets is already supporting new models of creativity, innovation and engagement.
To date, stories of big data’s progress and successes have tended to come from government and the private sector, but we’ve heard little about its relevance to social organisations. Yet big data can fuel big social change.
It’s already playing a vital role in the charitable sector. Some social organisations are using existing open government data to better target their services, to improve advocacy and fundraising, and to support knowledge sharing and collaboration between different charities and agencies. Crowdsourcing of open data also offers a new way for not-for-profits to gather intelligence, and there is a wide range of freely available online tools to help them analyse the information.
However, realising the potential of big and open data presents a number of technical and organisational challenges for social organisations. Many don’t have the required skills, awareness and investment to turn big data to their advantage. They also tend to lack the access to examples that might help demystify the technicalities and focus on achievable results.
Overcoming these challenges can be surprisingly simple: Keyfund, for example, gained insight into what made for a successful application to their scheme through using a free, online tool to create word clouds out of all the text in their application forms. Many social organisations could use this same technique to better understand the large volume of unstructured text that they accumulate – in doing so, they would be “doing big data” (albeit in a small way). At the other end of the scale, Global Giving has developed its own sophisticated set of analytical tools to better understand the 57,000+ “stories” gathered from its network.
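The word-cloud technique Keyfund used boils down to counting word frequencies in free text, something any social organisation can replicate in a few lines. A minimal sketch in Python (the sample texts and stop-word list here are illustrative, not Keyfund’s actual data):

```python
from collections import Counter
import re

STOPWORDS = frozenset({"the", "a", "an", "to", "and", "of", "we", "our"})

def word_frequencies(texts, stopwords=STOPWORDS):
    """Count how often each word appears across a collection of free-text documents."""
    counter = Counter()
    for text in texts:
        # Split on anything that isn't a letter or apostrophe, ignoring case.
        words = re.findall(r"[a-z']+", text.lower())
        counter.update(w for w in words if w not in stopwords)
    return counter

# Illustrative application-form snippets, not real Keyfund data.
applications = [
    "We want to run a community arts project for young people.",
    "Our project will teach young people practical skills.",
]
print(word_frequencies(applications).most_common(3))
```

The same counts that drive a word cloud can also be fed into any charting tool, which is often more useful than the cloud itself for spotting what successful applications have in common.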
Innovation often happens when different disciplines collide and it’s becoming apparent that most value – certainly most social value – is likely to be created at the intersection of government, private and social sector data. That could be the combination of data from different sectors, or better “data collaboration” within sectors.
The Housing Association Charitable Trust (HACT) has produced two original tools that demonstrate this. Its Community Insight tool combines data from different sectors, allowing housing providers easily to match information about their stock to a large store of well-maintained open government figures. Meanwhile, its Housing Big Data programme is building a huge dataset by combining stats from 16 different housing providers across the UK. While Community Insight allows each organisation to gain better individual understanding of their communities (measuring well-being and deprivation levels, tracking changes over time, identifying hotspots of acute need), Housing Big Data is making progress towards a much richer network of understanding, providing a foundation for the sector to collaboratively identify challenges and quantify the impact of their interventions.
Alongside this specific initiative from HACT, it’s also exciting to see programmes such as 360giving, which forge connections between a range of private and social enterprises, and lays foundations for UK social investors to be a significant source of information over the next decade. Certainly, The Big Lottery Fund’s publication of open data late last year is a milestone which also highlights how far we have to travel as a sector before we are truly “data-rich”.
At Nominet Trust, we have produced the Social Tech Guide to demonstrate the scale and diversity of social value being generated internationally – much of which is achieved through harnessing the power of big data. From Knewton creating personally tailored learning programmes, to Cellslider using the power of the crowd to advance cancer research, there is no shortage of inspiration. The UN’s Global Pulse programme is another great example, with its focus on how we can combine private and public sources to pin down the size and shape of a social challenge, and calibrate our collective response.
These examples of data-driven social change demonstrate the huge opportunities for social enterprises to harness technology to generate insights, to drive more effective action and to fuel social change. If we are to realise this potential, we need to continue to stretch ourselves as social enterprises and social investors.”

Continued Progress and Plans for Open Government Data


Steve VanRoekel, and Todd Park at the White House:  “One year ago today, President Obama signed an executive order that made open and machine-readable data the new default for government information. This historic step is helping to make government-held data more accessible to the public and to entrepreneurs while appropriately safeguarding sensitive information and rigorously protecting privacy.
Freely available data from the U.S. government is an important national resource, serving as fuel for entrepreneurship, innovation, scientific discovery, and economic growth. Making information about government operations more readily available and useful is also core to the promise of a more efficient and transparent government. This initiative is a key component of the President’s Management Agenda and our efforts to ensure the government is acting as an engine to expand economic growth and opportunity for all Americans. The Administration is committed to driving further progress in this area, including by designating Open Data as one of our key Cross-Agency Priority Goals.
Over the past few years, the Administration has launched a number of Open Data Initiatives aimed at scaling up open data efforts across the Health, Energy, Climate, Education, Finance, Public Safety, and Global Development sectors. The White House has also launched Project Open Data, designed to share best practices, examples, and software code to assist federal agencies with opening data. These efforts have helped unlock troves of valuable data—that taxpayers have already paid for—and are making these resources more open and accessible to innovators and the public.
Other countries are also opening up their data. In June 2013, President Obama and other G7 leaders endorsed the Open Data Charter, in which the United States committed to publish a roadmap for our nation’s approach to releasing and improving government data for the public.
Building upon the Administration’s Open Data progress, and in fulfillment of the Open Data Charter, today we are excited to release the U.S. Open Data Action Plan. The plan includes a number of exciting enhancements and new data releases planned in 2014 and 2015, including:

  • Small Business Data: The Small Business Administration’s (SBA) database of small business suppliers will be enhanced so that software developers can create tools to help manufacturers more easily find qualified U.S. suppliers, ultimately reducing the transaction costs to source products and manufacture domestically.
  • Smithsonian American Art Museum Collection: The Smithsonian American Art Museum’s entire digitized collection will be opened to software developers to make educational apps and tools. Today, even museum curators do not have easily accessible information about their art collections. This information will soon be available to everyone.
  • FDA Adverse Drug Event Data: Each year, healthcare professionals and consumers submit millions of individual reports on drug safety to the Food and Drug Administration (FDA). These anonymous reports are a critical tool to support drug safety surveillance. Today, this data is only available through limited quarterly reports. But the Administration will soon be making these reports available in their entirety so that software developers can build tools to help pull potentially dangerous drugs off shelves faster than ever before.
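Once adverse-event reports like the FDA’s are published in machine-readable form, even a short script can surface drugs with unusual numbers of serious reports. A hypothetical sketch (the record layout and drug names are invented for illustration; the actual FDA feed’s schema may differ):

```python
from collections import Counter

def serious_report_counts(records):
    """Count serious adverse-event reports per drug from machine-readable records."""
    return Counter(r["drug"] for r in records if r["serious"])

# Invented sample records, not real FDA data.
reports = [
    {"drug": "drug_a", "serious": True},
    {"drug": "drug_a", "serious": False},
    {"drug": "drug_b", "serious": True},
]
print(serious_report_counts(reports))
```

The point of the plan is precisely that analyses like this, which are trivial once the data is structured, are impossible while the reports sit in quarterly summary documents.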

We look forward to implementing the U.S. Open Data Action Plan, and to continuing to work with our partner countries in the G7 to take the open data movement global.”

How Helsinki Became the Most Successful Open-Data City in the World


Olli Sulopuisto in Atlantic Cities: “If there’s something you’d like to know about Helsinki, someone in the city administration most likely has the answer. For more than a century, this city has funded its own statistics bureaus to keep data on the population, businesses, building permits, and most other things you can think of. Today, that information is stored and made freely available on the internet by an appropriately named agency, City of Helsinki Urban Facts.
There’s a potential problem, though. Helsinki may be Finland’s capital and largest city, with 620,000 people. But it’s only one of more than a dozen municipalities in a metropolitan area of almost 1.5 million. So in terms of urban data, if you’re only looking at Helsinki, you’re missing out on more than half of the picture.
Helsinki and three of its neighboring cities are now banding together to solve that problem. Through an entity called Helsinki Region Infoshare, they are bringing together their data so that a fuller picture of the metro area can come into view.
That’s not all. At the same time these datasets are going regional, they’re also going “open.” Helsinki Region Infoshare publishes all of its data in formats that make it easy for software developers, researchers, journalists and others to analyze, combine or turn into web-based or mobile applications that citizens may find useful. In four years of operation, the project has produced more than 1,000 “machine-readable” data sources such as a map of traffic noise levels, real-time locations of snow plows, and a database of corporate taxes.
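A sketch of what consuming one such machine-readable source might look like, assuming a GeoJSON-style feed of snow-plow positions (the feed layout, vehicle ids, and coordinates below are invented for illustration; Helsinki Region Infoshare’s actual datasets vary in format):

```python
import json

# Invented GeoJSON-style sample, not a real Helsinki Region Infoshare feed.
feed = '''{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [24.94, 60.17]},
     "properties": {"vehicle": "plow-1"}},
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [24.96, 60.19]},
     "properties": {"vehicle": "plow-2"}}
  ]
}'''

def plow_positions(geojson_text):
    """Map each vehicle id to its (longitude, latitude) position."""
    data = json.loads(geojson_text)
    return {f["properties"]["vehicle"]: tuple(f["geometry"]["coordinates"])
            for f in data["features"]}

print(plow_positions(feed))
```

This is the sense in which the data is “machine-readable”: a developer can go from feed to application with no manual transcription in between.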
A global leader
All of this has put the Helsinki region at the forefront of the open-data movement that is sweeping cities across much of the world. The concept is that all kinds of good things can come from assembling city data, standardizing it and publishing it for free. Last month, Helsinki Region Infoshare was presented with the European Commission’s prize for innovation in public administration.

The project is creating transparency in government and a new digital commons. It’s also fueling a small industry of third-party application developers who take all this data and turn it into consumer products.
For example, Helsinki’s city council has a paperless system called Ahjo for handling its agenda items, minutes and exhibits that accompany council debates. Recently, the datasets underlying Ahjo were opened up. The city built a web-based interface for browsing the documents, but a software developer who doesn’t even live in Helsinki created a smartphone app for it. Now anyone who wants to keep up with just about any decision Helsinki’s leaders have before them can do so easily.
Another example is a product called BlindSquare, a smartphone app that helps blind people navigate the city. An app developer took the Helsinki region’s data on public transport and services, and mashed it up with location data from the social networking app Foursquare as well as mapping tools and the GPS and artificial voice capabilities of new smartphones. The product now works in dozens of countries and languages and sells for about €17 ($24 U.S.).

Helsinki also runs competitions for developers who create apps with public-sector data. That’s nothing new — BlindSquare won the Apps4Finland and European OpenCities app challenges in 2012. But this year, they’re trying a new approach to the app challenge concept, funded by the European Commission’s prize money and Sitra.
It’s called Datademo. Instead of looking for polished but perhaps random apps to heap fame and prize money on, Datademo is trying to get developers to aim their creative energies toward general goals city leaders think are important. The current competition specifies that apps have to use open data from the Helsinki region or from Finland to make it easier for citizens to find information and participate in democracy. The competition also gives developers seed funding upfront.
Datademo received more than 40 applications in its first round. Of those, the eight best suggestions were given three months and €2,000 ($2,770 U.S.) to implement their ideas. The same process will be repeated twice, resulting in dozens of new app ideas that will get a total of €48,000 ($66,000 U.S.) in development subsidies. Keeping with the spirit of transparency, the voting and judging process is open to all who submit an idea for each round….”

Is Your City’s Crime Data Private Property?


Adam Wisnieski at the Crime Report: “In February, the Minneapolis Police Department (MPD) announced it was moving into a new era of transparency and openness with the launch of a new public crime map.
“Crime analysis and mapping data is now in the hands of the city’s citizens,” reads the first line of the press release.
According to the release, the MPD will feed incident report data to RAIDS (Regional Analysis and Information Data Sharing) Online, a nationwide crime map operated by crime analysis software company BAIR Analytics.
Since the announcement, Minneapolis residents have used RAIDS to look at reports of murder, robbery, burglary, assault, rape and other crimes reported in their neighborhoods on a sleek, easy-to-use map, which includes data as recent as yesterday.
On the surface, it’s a major leap forward for transparency in Minneapolis. But some question why the data feed is given exclusively to a single private company.
Transparency advocates argue in fact that the data is not truly in the hands of the city’s residents until citizens can download the raw data so they can analyze, chart or map it on their own.
“For it to actually be open data, it needs to be available to the public in machine readable format,” said Lauren Reid, senior public affairs manager for Code for America, a national non-profit that promotes participation in government through technology.
“Anybody should be able to go download it and read it if they want. That’s open data.”
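In practice, “machine readable” means a format a program can parse directly, such as CSV, rather than a map interface or a PDF. A minimal sketch of what citizens could do with such a feed (the column names and incident records below are invented for illustration, not the MPD’s actual schema):

```python
import csv
import io
from collections import Counter

# Invented CSV feed; a real department's columns and categories would differ.
raw = """date,offense,neighborhood
2014-05-01,burglary,Downtown
2014-05-01,assault,Downtown
2014-05-02,burglary,Northside
"""

def offenses_by_neighborhood(csv_text):
    """Tally reported offenses per neighborhood from a machine-readable feed."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["neighborhood"] for row in reader)

print(offenses_by_neighborhood(raw))
```

Aggregations like this, or by offense type, hour of day, or block, are exactly the analyses that a map-only interface forecloses.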
The Open Knowledge Foundation, a national non-profit that advocates for more government openness, argues open data is important so citizens can participate and engage government in a way that was not possible before.
“Much of the time, citizens are only able to engage with their own governance sporadically — maybe just at an election every 4 or 5 years,” reads the Open Knowledge website. “By opening up data, citizens are enabled to be much more directly informed and involved in decision-making.
“This is more than transparency: it’s about making a full ‘read/write’ society — not just about knowing what is happening in the process of governance, but being able to contribute to it.”
Minneapolis is not alone.
As Americans demand more information on criminal activity from the government, police departments are flocking to private companies to help them get the information into the public domain.
For many U.S. cities, hooking up with these third-party mapping vendors is the most public their police department has ever been. But the trend has started a messy debate about how “public” the public data actually is.
Outsourcing Makes It Easy
For police departments, outsourcing the presentation of their crime data to a private firm is an easy decision.
Most of the crime mapping sites are free or cost very little. (The Omega Group’s CrimeMapping.com charges between $600 and $2,400 per year, depending on the size of the agency.)
The department chooses what information it wants to provide. Once the system is set up, the data flows to the companies and then to the public without much effort on the part of the department.
For the most part, the move doesn’t need legislative approval, just a memorandum of understanding. A police department can even fulfill a new law requiring a public crime map by releasing report data through one of these vendors.
Commander Scott Gerlicher of the MPD’s Strategic Information and Crime Analysis Division says the software has saved the department time.
“I don’t think we are entertaining quite as many requests from the media or the public,” he told The Crime Report. “Plus the price was right: it was free.”
The companies that run some of the most popular sites — The Omega Group’s CrimeMapping.com, Public Engines’ CrimeReports and BAIR Analytics’ RAIDS — are in the business of selling crime analysis and mapping software to police departments.
Some departments buy internal software from these companies, though some cities, like Minneapolis, just use RAIDS’ free map and have no contracts with BAIR for internal software.
Susan Smith, director of operations at BAIR Analytics, said the goal of RAIDS is to create one national map that includes all crime reports from across all jurisdictions and departments (state and local police).
For people who live near or at the edge of a city line, finding relevant crime data can be hard.
The MPD’s Gerlicher said that was one reason his department chose RAIDS — because many police agencies in the Minneapolis area had already hooked up with the firm.
The operators of these crime maps say they provide a community service.
“We try to get as many agencies as we possibly can. We truly believe this is a good service for the community,” says Gabriela Coverdale, a marketing director at the Omega Group.
Raw Data ‘Off Limits’
However, the sites do not allow the public to download any of the raw data and prohibit anyone from “scraping,” using a program to automatically pull the data from their maps.
In Minneapolis, the police department continues to post PDFs and Excel spreadsheets with data, but only RAIDS gets a feed with the most recent data.
Alan Palazzolo, a Code for America fellow who works as an interactive developer for the online non-profit newspaper MinnPost, used monthly reports from the MPD to build a crime application with a map and geographic-oriented chart of crime in Minneapolis.
Nevertheless, he finds the new tool limiting.
“[The MPD’s] ability to actually put out more data, and more timely data, really opens things up,” he said. “It’s great, but they are not doing that with us.”
According to Palazzolo, the arrangement gives BAIR a market advantage that effectively prevents its data from being used for purposes it cannot control.
“Having granular, complete, and historical data would allow us to do more in-depth analysis,” wrote Palazzolo and Kaeti Hinck in an article in MinnPost last year.
“Granular data would allow us to look at smaller areas,” reads the article. “[N]eighborhoods are a somewhat arbitrary boundary when it comes to crime. Often high crime is isolated to a couple of blocks, but aggregated data does not allow us to explore this.
“More complete data would allow us to look at factors like exact locations, time of day, demographic issues, and detailed categories (like bike theft).”
The question of preference gets even messier when looking at another national crime mapping website called SpotCrime.
Unlike the other third-party mapping sites, SpotCrime is not in the business of selling crime analysis software to police departments. It operates more like a newspaper — a newspaper focused solely on the police blotter pages — and makes money off advertising.
Years ago, SpotCrime requested and received crime report data via e-mail from the Minneapolis Police Department and mapped the data on its website. According to SpotCrime owner Colin Drane, the MPD stopped sending e-mails when terminals were set up in the police department for the public to access the data.
So he instead started going through the painstaking process of transferring data from PDFs the MPD posted online and mapping them.
When the MPD hooked up with RAIDS in February, Drane asked for the same feed and was denied. He says more and more police departments around the country are hooking up with one of his competitors and not giving him the same timely data.
The MPD said it prefers RAIDS over SpotCrime and criticized some of the advertisements on SpotCrime.
“We’re not about supporting ad money,” said Gerlicher.
Drane believes all crime data in every city should be open to everyone, in order to prevent any single firm from monopolizing how the information is presented and used.
“The onus needs to be on the public agencies,” he adds. “They need to be fair with the data and they need to be fair with the public.”
Transparency advocates worry that the trend is going in the opposite direction.
Ohio’s Columbus Police Department, for example, recently discontinued its public crime statistic feed and started giving the data exclusively to RAIDS.
The Columbus Dispatch wrote that the new system had less information than the old…”

Open Data Could Unlock $230 Billion In Energy-Efficiency Savings


Jeff McMahon at Forbes: “Energy-efficiency startups just need access to existing data—on electricity usage, housing characteristics, renovations and financing—to unlock hundreds of billions of dollars in savings, two startup founders said in Chicago on Tuesday.
“One of the big barriers to scaling energy efficiency is the lack of data in the market,” said Andy Frank of Sealed, a startup that encourages efficiency improvements by guaranteeing homeowners a lower bill than they’re paying now.
In a forum hosted by the Energy Policy Institute at Chicago, Frank and Matt Gee, founder of Effortless Energy, advocated an open-energy-data warehouse that would collect anonymized data from utilities, cities, contractors, and financiers, to make the data available for research, government, and industry.
“There needs to be some sort of entity that organizes all this information and has it in some sort of standard format,” said Gee, whose startup pays for home improvements up front and then splits the savings with investors and the homeowner.
According to Gee, the current $9.5 billion energy-efficiency market operates without data on the actual savings it produces for homeowners. He outlined the current market like this:

  1. A regulatory body, usually a public utility commission, mandates that a utility spend money on efficiency.
  2. The utility passes on the cost to customers through an efficiency surcharge (this is how the $9.5 billion is raised).
  3. The utility hires a program implementer.
  4. The program implementer sends auditors to customer homes.
  5. Potential savings from improvements like new insulation or new appliances are estimated based on models.
  6. Those modeled estimates determine what the contractor can do in the home.
  7. The modeled estimates determine what financing is available.

In some cases, utilities will hire consultants to estimate the savings generated from these improvements. California utilities spend $40 million a year estimating savings, Gee said, but actual savings are neither verified nor integrated in the process.
“Nowhere in this process do actual savings enter,” Gee said. “They don’t drive anyone’s incentives, which is just absolutely astounding, right? The opportunity here is that energy efficiency actually pays for itself. It should be something that’s self-financed.”
For that to happen, the market needs reliable information on how much energy is currently being wasted and how much is saved by improvements….”
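Verifying actual savings is arithmetically simple once pre- and post-improvement meter readings are available, which is Gee’s point about the missing data. A sketch with invented figures (none of these numbers come from the article) comparing realized savings against a modeled estimate:

```python
def realized_savings(pre_kwh, post_kwh):
    """Actual annual savings: metered usage before improvements minus usage after."""
    return pre_kwh - post_kwh

# Invented example figures for one home, not data from the article.
modeled_savings_kwh = 3000      # what an auditor's model predicted
pre, post = 12000, 10200        # metered annual usage before and after retrofit

actual = realized_savings(pre, post)
realization_rate = actual / modeled_savings_kwh  # fraction of modeled savings achieved
print(actual, round(realization_rate, 2))  # 1800 0.6
```

A market that tracked realization rates like this could price efficiency financing on delivered savings rather than modeled estimates, which is exactly the self-financing loop Gee describes.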

The "Accessibility Map"


Webby 2014 Nominee: “The project’s goal is to make information about accessible venues easy to find: venues where people with disabilities can engage in sports and recreational activities, and live full lives without any barriers or stereotypes.

The Solution

To develop a website where everyone can not only find accessible venues in their city, but also add new venues to the website’s database.
An accessibility rating of Russian cities shows how accessible each city is and will help draw local governments’ attention to the problem.
The foundation of the website is an interactive map of accessible venues in Russia, which helps people with disabilities find locations where they can play sports, take classes or spend leisure time.
All you need to do is choose a city and street, and the map will show all the accessible venues there.

The Result

After a few months of operation:

  • over 14,000 venues
  • over 600 cities
  • millions of people with disabilities now able to live full lives”

Project’s Website: kartadostupnosti.ru