Adam Wisnieski at the Crime Report: “In February, the Minneapolis Police Department (MPD) announced it was moving into a new era of transparency and openness with the launch of a new public crime map.
“Crime analysis and mapping data is now in the hands of the city’s citizens,” reads the first line of the press release.
According to the release, the MPD will feed incident report data to RAIDS (Regional Analysis and Information Data Sharing) Online, a nationwide crime map operated by crime analysis software company BAIR Analytics.
Since the announcement, Minneapolis residents have used RAIDS to look at reports of murder, robbery, burglary, assault, rape and other crimes reported in their neighborhoods on a sleek, easy-to-use map, which includes data as recent as yesterday.
On the surface, it’s a major leap forward for transparency in Minneapolis. But some question why the data feed is given exclusively to a single private company.
Transparency advocates argue that the data is not truly in the hands of the city’s residents until citizens can download the raw data and analyze, chart or map it on their own.
“For it to actually be open data, it needs to be available to the public in machine readable format,” said Lauren Reid, senior public affairs manager for Code for America, a national non-profit that promotes participation in government through technology.
“Anybody should be able to go download it and read it if they want. That’s open data.”
The Open Knowledge Foundation, a national non-profit that advocates for more government openness, argues open data is important so citizens can participate and engage government in a way that was not possible before.
“Much of the time, citizens are only able to engage with their own governance sporadically — maybe just at an election every 4 or 5 years,” reads the Open Knowledge website. “By opening up data, citizens are enabled to be much more directly informed and involved in decision-making.
“This is more than transparency: it’s about making a full ‘read/write’ society — not just about knowing what is happening in the process of governance, but being able to contribute to it.”
Minneapolis is not alone.
As Americans demand more information on criminal activity from the government, police departments are flocking to private companies to help them get the information into the public domain.
For many U.S. cities, hooking up with these third-party mapping vendors is the most public their police department has ever been. But the trend has started a messy debate about how “public” the public data actually is.
Outsourcing Makes It Easy
For police departments, outsourcing the presentation of their crime data to a private firm is an easy decision.
Most of the crime mapping sites are free or cost very little. (The Omega Group’s CrimeMapping.com charges between $600 and $2,400 per year, depending on the size of the agency.)
The department chooses what information it wants to provide. Once the system is set up, the data flows to the companies and then to the public with little effort on the part of the department.
For the most part, the move doesn’t need legislative approval, just a memorandum of understanding. A police department can even fulfill a new law requiring a public crime map by releasing report data through one of these vendors.
Commander Scott Gerlicher of the MPD’s Strategic Information and Crime Analysis Division says the software has saved the department time.
“I don’t think we are entertaining quite as many requests from the media or the public,” he told The Crime Report. “Plus the price was right: it was free.”
The companies that run some of the most popular sites — The Omega Group’s CrimeMapping.com, Public Engines’ CrimeReports and BAIR Analytics’ RAIDS — are in the business of selling crime analysis and mapping software to police departments.
Some departments buy internal software from these companies, though some cities, like Minneapolis, just use RAIDS’ free map and have no contracts with BAIR for internal software.
Susan Smith, director of operations at BAIR Analytics, said the goal of RAIDS is to create one national map that includes all crime reports from across all jurisdictions and departments (state and local police).
For people who live near or at the edge of a city line, finding relevant crime data can be hard.
The MPD’s Gerlicher said that was one reason his department chose RAIDS — because many police agencies in the Minneapolis area had already hooked up with the firm.
The operators of these crime maps say they provide a community service.
“We try to get as many agencies as we possibly can. We truly believe this is a good service for the community,” says Gabriela Coverdale, a marketing director at the Omega Group.
Raw Data ‘Off Limits’
However, the sites do not allow the public to download any of the raw data and prohibit anyone from “scraping,” using a program to automatically pull the data from their maps.
In Minneapolis, the police department continues to post PDFs and Excel spreadsheets with data, but only RAIDS gets a feed with the most recent data.
Alan Palazzolo, a Code for America fellow who works as an interactive developer for the online non-profit newspaper MinnPost, used monthly reports from the MPD to build a crime application with a map and geographic-oriented chart of crime in Minneapolis.
Nevertheless, he finds the new tool limiting.
“[The MPD’s] ability to actually put out more data, and more timely data, really opens things up,” he said. “It’s great, but they are not doing that with us.”
According to Palazzolo, the arrangement gives BAIR a market advantage and effectively keeps the data from being used in ways the company cannot control.
“Having granular, complete, and historical data would allow us to do more in-depth analysis,” wrote Palazzolo and Kaeti Hinck in an article in MinnPost last year.
“Granular data would allow us to look at smaller areas,” reads the article. “[N]eighborhoods are a somewhat arbitrary boundary when it comes to crime. Often high crime is isolated to a couple of blocks, but aggregated data does not allow us to explore this.
“More complete data would allow us to look at factors like exact locations, time of day, demographic issues, and detailed categories (like bike theft).”
The question of preference gets even messier when looking at another national crime mapping website called SpotCrime.
Unlike the other third-party mapping sites, SpotCrime is not in the business of selling crime analysis software to police departments. It operates more like a newspaper — a newspaper focused solely on the police blotter pages — and makes money off advertising.
Years ago, SpotCrime requested and received crime report data via e-mail from the Minneapolis Police Department and mapped the data on its website. According to SpotCrime owner Colin Drane, the MPD stopped sending e-mails when terminals were set up in the police department for the public to access the data.
So instead he began the painstaking process of transferring data from the PDFs the MPD posted online and mapping it himself.
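The kind of work Drane and Palazzolo describe — turning report tables extracted from posted PDFs into something mappable — can be sketched in a few lines. This is a hypothetical illustration, not the MPD’s or SpotCrime’s actual pipeline: the column names and sample rows are invented, and the PDF-to-text extraction step (typically done with a tool such as Tabula) is assumed to have already happened.

```python
import csv
import io
from collections import Counter

# Illustrative only: tabular text assumed to have been extracted from a
# department's posted PDFs. Field names and rows are made up.
SAMPLE = """neighborhood,offense,date
Whittier,BURGLARY,2014-03-02
Whittier,ROBBERY,2014-03-05
Longfellow,BURGLARY,2014-03-07
"""

def count_by_neighborhood(csv_text):
    """Count reported incidents per neighborhood from extracted CSV text."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["neighborhood"] for row in reader)

counts = count_by_neighborhood(SAMPLE)
# e.g. counts["Whittier"] == 2
```

Even this toy version shows why advocates want machine-readable releases in the first place: the aggregation is trivial once the data is structured; the wasted effort is all in recovering structure from documents.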
When the MPD hooked up with RAIDS in February, Drane asked for the same feed and was denied. He says more and more police departments around the country are hooking up with one of his competitors and not giving him the same timely data.
The MPD said it prefers RAIDS over SpotCrime and criticized some of the advertisements on SpotCrime.
“We’re not about supporting ad money,” said Gerlicher.
Drane believes all crime data in every city should be open to everyone, in order to prevent any single firm from monopolizing how the information is presented and used.
“The onus needs to be on the public agencies,” he adds. “They need to be fair with the data and they need to be fair with the public.”
Transparency advocates worry that the trend is going in the opposite direction.
Ohio’s Columbus Police Department, for example, recently discontinued its public crime statistic feed and started giving the data exclusively to RAIDS.
The Columbus Dispatch wrote that the new system had less information than the old…”
Open Data Could Unlock $230 Billion In Energy-Efficiency Savings
Jeff McMahon at Forbes: “Energy-efficiency startups just need access to existing data—on electricity usage, housing characteristics, renovations and financing—to unlock hundreds of billions of dollars in savings, two founders of startups said in Chicago Tuesday.
“One of the big barriers to scaling energy efficiency is the lack of data in the market,” said Andy Frank of Sealed, a startup that encourages efficiency improvements by guaranteeing homeowners a lower bill than they’re paying now.
In a forum hosted by the Energy Policy Institute at Chicago, Frank and Matt Gee, founder of Effortless Energy, advocated an open-energy-data warehouse that would collect anonymized data from utilities, cities, contractors, and financiers, to make the data available for research, government, and industry.
“There needs to be some sort of entity that organizes all this information and has it in some sort of standard format,” said Gee, whose startup pays for home improvements up front and then splits the savings with investors and the homeowner.
According to Gee, the current $9.5 billion energy-efficiency market operates without data on the actual savings it produces for homeowners. He outlined the current market like this:
- A regulatory body, usually a public utility commission, mandates that a utility spend money on efficiency.
- The utility passes on the cost to customers through an efficiency surcharge (this is how the $9.5 billion is raised).
- The utility hires a program implementer.
- The program implementer sends auditors to customer homes.
- Potential savings from improvements like new insulation or new appliances are estimated based on models.
- Those modeled estimates determine what the contractor can do in the home.
- The modeled estimates determine what financing is available.
In some cases, utilities will hire consultants to estimate the savings generated from these improvements. California utilities spend $40 million a year estimating savings, Gee said, but actual savings are neither verified nor integrated in the process.
“Nowhere in this process do actual savings enter,” Gee said. “They don’t drive anyone’s incentives, which is just absolutely astounding, right? The opportunity here is that energy efficiency actually pays for itself. It should be something that’s self-financed.”
For that to happen, the market needs reliable information on how much energy is currently being wasted and how much is saved by improvements….”
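The gap Gee describes — modeled savings driving the whole market while realized savings never enter — can be made concrete with a small sketch. The numbers and function names below are illustrative assumptions, not figures from the article: realized savings are computable from before/after utility meter data, which the current process does not feed back in.

```python
# Hypothetical sketch of the modeled-vs-actual gap Gee describes.
# All figures below are invented for illustration.

def realized_savings(pre_kwh, post_kwh):
    """Savings actually observed on the meter, in kWh."""
    return pre_kwh - post_kwh

def realization_rate(modeled_kwh, pre_kwh, post_kwh):
    """Fraction of the modeled savings that actually showed up."""
    return realized_savings(pre_kwh, post_kwh) / modeled_kwh

# A home modeled to save 3,000 kWh/yr whose usage drops
# from 12,000 to 10,000 kWh/yr after the retrofit:
rate = realization_rate(3000, 12000, 10000)
# only about two-thirds of the modeled savings were realized
```

An open-data warehouse of the kind Gee and Frank propose would let regulators, contractors and financiers compute this realization rate routinely, instead of paying consultants to re-estimate modeled savings that are never verified.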
The "Accessibility Map"
Webby 2014 Nominee: “The project’s goal is to make information about accessible venues accessible to people.
The Solution
To develop a website where everyone can not only find accessible venues in each city, but also add new venues to the website’s database.
Creating an accessibility rating of Russian cities gives a sense of how accessible each city is, and will draw local governments’ attention to the problem.
The foundation of the website is an interactive map of accessible venues in Russia, which can help people with disabilities find locations where they can participate in sports, take classes or recreate.
All you need to do is choose the necessary city and street, and the map will show all the accessible venues in the city.
The Result
After a few months of operation:
— over 14 000 venues
— over 600 cities
— millions of people with disabilities have become able to live full lives
Project’s Website: kartadostupnosti.ru”
Sharing in a Changing Climate
Helen Goulden in the Huffington Post: “Every month, a social research agency conducts a public opinion survey of 30,000 UK households. As part of this, households are asked which issues they think are the most important: things such as crime, unemployment, inequality, public health etc. Climate change has ranked so consistently low on these surveys that they don’t bother asking any more.
On first glance, it would appear that most people don’t care about a changing climate.
Yet, that’s simply not true. Many people care deeply, but fleetingly – in the same way they may consider their own mortality before getting back to thinking about what to have for tea. And others care, but fail to change their behaviour in a way that’s proportionate to their concerns. Certainly that’s my unhappy stomping ground.
Besides, what choices do we really have? Even the most progressive large organisations have been glacial in moving towards any real form of sustainability. For many years we have struggled with the Frankenstein-like task of stitching ‘sustainability’ onto existing business and economic models, and the results, I think, speak for themselves.
That the Collaborative Economy presents us with an opportunity – in Napster-like ways – to disrupt and evolve toward something more sustainable is a compelling idea: looking out to a future filled with opportunities to reconfigure how we produce, consume and dispose of the things we want and need to live, work and play.
Whether the journey toward sustainability is short or long, it will be punctuated with a good degree of turbulence, disruption and some largely unpredictable events. How we deal with those events and what role communities, collaboration and technology play may set the framework and tone for how that future evolves. Crises and disruption to our entrenched living patterns present ripe opportunities for innovation and space for adopting new behaviours and practices.
No-one is immune from the impact of erratic and extreme weather events. And if we accept that these events are going to increase in frequency, we must draw the conclusion that emergency state and government resources may be drawn more thinly over time.
Across the world, there is a fairly well organised state and international infrastructure for dealing with emergencies, involving everyone from the Disaster Emergency Committee, the UN, central and local government and municipalities, not-for-profit organisations and, of course, the military. There is a clear reason why we need this kind of state emergency response; I’m not suggesting that we don’t.
But through the rise of open data and mass participation in platforms that share location, identity and inventory, we are creating a new kind of mesh; a social and technological infrastructure that could considerably strengthen our ability to respond to unpredictable events.
In the last few years we have seen a sharp rise in the number of tools and crowdsourcing platforms and open source sensor networks that are focused on observing, predicting or responding to extreme events:
• Apps like Shake Alert, which emits a minute’s warning that an earthquake is coming
• Rio’s sensor network, which measures rainfall outside the city and can predict flooding
• Open source sensor software Arduino, which is being used to crowd-source weather and pollution data
• Propeller Health, which is using Asthma sensors on inhalers to crowd-source pollution hotspots
• Safecast, which was developed for crowdsourcing radiation levels in Japan
Increasingly we have the ability to deploy open source, distributed and networked sensors and devices for capturing and aggregating data that can help us manage our responses to extreme weather (and indeed, other kinds of) events.
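The aggregation step behind platforms like Safecast can be sketched simply: geo-tagged readings from many independent sensors are snapped to coarse grid cells and averaged. This is a minimal illustration under assumed conventions (cell size, reading format), not any platform’s actual algorithm.

```python
from collections import defaultdict

# Minimal sketch of crowd-sourced sensor aggregation: pool geo-tagged
# readings into lat/lon grid cells and average per cell.
# Grid size and reading format are assumptions for illustration.

def grid_cell(lat, lon, size=0.1):
    """Snap a coordinate to integer grid-cell indices of `size` degrees."""
    return (int(lat / size), int(lon / size))

def aggregate(readings, size=0.1):
    """Average sensor values per grid cell.

    `readings` is an iterable of (lat, lon, value) tuples.
    """
    cells = defaultdict(list)
    for lat, lon, value in readings:
        cells[grid_cell(lat, lon, size)].append(value)
    return {cell: sum(vals) / len(vals) for cell, vals in cells.items()}

# Two nearby readings fall in one cell; a distant one stays separate.
readings = [(35.68, 139.69, 0.12), (35.69, 139.70, 0.18), (34.69, 135.50, 0.05)]
averages = aggregate(readings)
```

The value of the open, distributed approach is exactly that this pooling works regardless of who owns each sensor: any device that can report a tagged reading contributes to the shared picture.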
Look at platforms like LocalMind and Foursquare. Today, I might be using them to find out whether there’s a free table at a bar or which restaurant my friends are in. But these kinds of social locative platforms present an infrastructure that could be life-saving in any situation where you need to know where to go quickly to get out of trouble. We know that in the wake of disruptive events and disasters, like bombings, riots etc., people now intuitively and instinctively take to technology to find out what’s happening, where to go and how to co-ordinate response efforts.
During the 2013 BART strike in San Francisco, ventures like Liquid Space and SideCar enabled people to quickly find alternative places to work, or alternatives to public transport, to mitigate the inconvenience of the strike. The strike was a minor inconvenience compared to the impact of a hurricane and flood, but nevertheless, in both those instances, ventures decided to waive their fees; as did AirBnB when 1,400 New York AirBnB hosts opened their doors to people who had been left homeless by Hurricane Sandy in 2012.
The impulse to help is not new. The matching of people’s offers of help and resources to on-the-ground need, in real time, is.”
On the barriers for local government releasing open data
Paper by Peter Conradie and Dr. Sunil Choenni in Government Information Quarterly: “Due to expected benefits such as citizen participation and innovation, the release of Public Sector Information as open data is getting increased attention on various levels of government. However, currently data release by governments is still novel and there is little experience and knowledge thus far about its benefits, costs and barriers. This is compounded by a lack of understanding about how internal processes influence data release. Our aim in this paper is to get a better understanding of these processes and how they influence data release, i.e., to find determinants for the release of public sector information. For this purpose, we conducted workshops, interviews, questionnaires, desk research and practice based cases in the education program of our university, involving six local public sector organizations. We find that the way data is stored, the way data is obtained and the way data is used by a department are crucial indicators for open data release. We conclude with the lessons learned based on our research findings. These findings are: we should take a nuanced approach towards data release, avoid releasing data for its own sake, and take small incremental steps to explore data release.”
Open Government Data Gains Global Momentum
Wyatt Kash in Information Week: “Governments across the globe are deepening their strategic commitments and working more closely to make government data openly available for public use, according to public and private sector leaders who met this week at the inaugural Open Government Data Forum in Abu Dhabi, hosted by the United Nations and the United Arab Emirates, April 28-29.
Data experts from Europe, the Middle East, the US, Canada, Korea, and the World Bank highlighted how one country after another has set into motion initiatives to expand the release of government data and broaden its use. Those efforts are gaining traction due to multinational organizations, such as the Open Government Partnership, the Open Data Institute, The World Bank, and the UN’s e-government division, that are trying to share practices and standardize open data tools.
In the latest example, the French government announced April 24 that it is joining the Open Government Partnership, a group of 64 countries working jointly to make their governments more open, accountable, and responsive to citizens. The announcement caps a string of policy shifts, which began with the formal release of France’s Open Data Strategy in May 2011 and which parallel similar moves by the US.
The strategy committed France to providing “free access and reuse of public data… using machine-readable formats and open standards,” said Romain Lacombe, head of innovation for the French prime minister’s open government task force, Etalab. The French government is taking steps to end the practice of selling datasets, such as civil and case-law data, and is making them freely reusable. France launched a public data portal, Data.gouv.fr, in December 2011 and joined a G8 initiative to engage with open data innovators worldwide.
For South Korea, open data is not just about achieving greater transparency and efficiency, but is seen as digital fuel for a nation that by 2020 expects to achieve “ambient intelligence… when all humans and things are connected together,” said Dr. YoungSun Lee, who heads South Korea’s National Information Society Agency.
He foresees open data leading to a shift in the ways government will function: from an era of e-government, where information is delivered to citizens, to one where predictive analysis will foster a “creative government,” in which “government provides customized services for each individual.”
The open data movement is also propelling innovative programs in the United Arab Emirates. “The role of open data in directing economic and social decisions pertaining to investments… is of paramount importance” to the UAE, said Dr. Ali M. Al Khouri, director general of the Emirates Identity Authority. It also plays a key role in building public trust and fighting corruption, he said….”
LEG/EX – Legislative Explorer
LEG/EX: Legislative Explorer: Data-Driven Discovery: “A one-of-a-kind interactive visualization that allows anyone to explore actual patterns of lawmaking in Congress.
Compare the bills and resolutions introduced by Senators and Representatives and follow their progress from the beginning to the end of a two year Congress.
Filter by topic, type of legislation, chamber, party, member, or even search for a specific bill.
The Legislative Process from the Library of Congress
Who’s your Representative or Senator?
LegSim — a student run simulation for government courses”
Passage Of The DATA Act Is A Major Advance In Government Transparency
OpEd by Hudson Hollister in Forbes: “Even as the debate over official secrecy grows on Capitol Hill, basic information about our government’s spending remains hidden in plain sight.
Information that is technically public — federal finance, awards, and expenditures — is effectively locked within a disconnected disclosure system that relies on outdated paper-based technology. Budgets, grants, contracts, and disbursements are reported manually and separately, using forms and spreadsheets. Researchers seeking insights into federal spending must invest time and resources crafting data sets out of these documents. Without common data standards across all government spending, analyses of cross-agency spending trends require endless conversions of apples to oranges.
For a nation whose tech industry leads the world, there is no reason to allow this antiquated system to persist.
That’s why we’re excited to welcome Thursday’s unanimous Senate approval of the Digital Accountability and Transparency Act — known as the DATA Act.
The DATA Act will mandate government-wide standards for federal spending data. It will also require agencies to publish this information online, fully searchable and open to everyone.
Watchdogs and transparency advocates from across the political spectrum have endorsed the DATA Act because all Americans will benefit from clear, accessible information about how their tax dollars are being spent.
It is darkly appropriate that the only organized opposition to this bill took place behind closed doors. In January, Senate sponsors Mark Warner (D-VA) and Rob Portman (R-OH) rejected amendments offered privately by the White House Office of Management and Budget. These nonpublic proposals would have gutted the DATA Act’s key data standards requirement. But Warner and Portman went public with their opposition, and Republicans and Democrats agreed to keep a strong standards mandate.
We now await swift action by the House of Representatives to pass this bill and put it on the President’s desk.
The tech industry is already delivering the technology and expertise that will use federal spending data, once it is open and standardized, to solve problems.
If the DATA Act is fully enforced, citizens will be able to track government spending on a particular contractor or from a particular program, payment by payment. Agencies will be able to deploy sophisticated Big Data analytics to illuminate, and eliminate, waste and fraud. And states and universities will be able to automate their complex federal grant reporting tasks, freeing up more tax dollars for their intended use. Our industry can perform these tasks — as soon as we get the data.
Chairman Earl Devaney’s Recovery Accountability and Transparency Board proved this is possible. Starting in 2009, the Recovery Board applied data standards to track stimulus spending. Our members’ software used that data to help inspectors general prevent and recover over $100 million in spending on suspicious grantees and contractors. The DATA Act applies that approach across the whole of government spending.
Congress is now poised to pass this landmark legislative mandate to transform spending from disconnected documents into open data. Next, the executive branch must implement that mandate.
So our Coalition’s work continues. We will press the Treasury Department and the White House to adopt robust, durable, and nonproprietary data standards for federal spending.
And we won’t stop with spending transparency. The American people deserve access to open data across all areas of government activity — financial regulatory reporting, legislative actions, judicial filings, and much more….”
The Open Data 500: Putting Research Into Action
TheGovLab Blog: “On April 8, the GovLab made two significant announcements. At an open data event in Washington, DC, I was pleased to announce the official launch of the Open Data 500, our study of 500 companies that use open government data as a key business resource. We also announced that the GovLab is now planning a series of Open Data Roundtables to bring together government agencies with the businesses that use their data – and that five federal agencies have agreed to participate. Video of the event, which was hosted by the Center for Data Innovation, is available here.
The Open Data 500, funded by the John S. and James L. Knight Foundation, is the first comprehensive study of U.S.-based companies that rely on open government data. Our website at OpenData500.com includes searchable, sortable information on 500 of these companies. Our data about them comes from responses to a survey we’ve sent to all the companies (190 have responded) and what we’ve been able to learn from research using public information. Anyone can now explore this website, read about specific companies or groups of companies, or download our data to analyze it. The website features an interactive tool on the home page, the Open Data Compass, that shows the connections between government agencies and different categories of companies visually.
We began work on the Open Data 500 study last fall with three goals. First, we wanted to collect information that will ultimately help calculate the economic value of open data – an important question for policymakers and others. Second, we wanted to present examples of open data companies to inspire others to use this important government resource in new ways. And third – and perhaps most important – we’ve hoped that our work will be a first step in creating a dialogue between the government agencies that provide open data and the companies that use it.
That dialogue is critically important to make government open data more accessible and useful. While open government data is a huge potential resource, and federal agencies are working to make it more available, it’s too often trapped in legacy systems that make the data difficult to find and to use. To solve this problem, we plan to connect agencies to their clients in the business community and help them work together to find and liberate the most valuable datasets.
We now plan to convene and facilitate a series of Open Data Roundtables – a new approach to bringing businesses and government agencies together. In these Roundtables, which will be informed by the Open Data 500 study, companies and the agencies that provide their data will come together in structured, results-oriented meetings that we will facilitate. We hope to help figure out what can be done to make the most valuable datasets more available and usable quickly.
We’ve been gratified by the immediate positive response to our plan from several federal agencies. The Department of Commerce has committed to help plan and participate in the first of our Roundtables, now being scheduled for May. By the time we announced our launch on April 8, the Departments of Labor, Transportation, and Treasury had also signed up. And at the end of the launch event, the Deputy Chief Information Officer of the USDA publicly committed her agency to participate as well…”
Benchmarking open government: An open data perspective
Paper by N Veljković, S Bogdanović-Dinić, and L Stoimenov in Government Information Quarterly: “This paper presents a benchmark proposal for the Open Government and its application from the open data perspective using data available on the U.S. government’s open data portal (data.gov). The benchmark is developed over the adopted Open Government conceptual model, which describes Open Government through data openness, transparency, participation and collaboration. Resulting in two measures, that is, one known as the e-government openness index (eGovOI) and the other Maturity, the benchmark indicates the progress of government over time, the efficiency of recognizing and implementing new concepts and the willingness of the government to recognize and embrace innovative ideas.”