Can Business And Tech Transform The Way Our Government Works By 2020?


Ben Schiller at Co.Exist: “The rise of open data, crowd-sourcing, predictive analytics, and other big tech trends isn’t just something companies must contend with. It’s also a challenge for government. New technology gives public agencies the opportunity to develop and deliver services in new ways, track results more accurately, and open up decision-making.
Deloitte’s big new Government 2020 report looks at the trends affecting government and lays out a range of ideas for how public agencies can innovate. We picked out a few below. There are more infographics in the slide show.

Consumerization of public services

Deloitte expects entrepreneurs to “develop innovative and radically user-friendly approaches to satisfy unmet consumer demand for better public services.” Startups like Uber or Lyft “reinvigorated transportation.” Now it expects a similar “focus on seamless customer experiences” in education and health care.

Open workforce

Deloitte expects governments to become looser: collections of people doing a job, rather than large hierarchical structures. “Governments [will] expand their talent networks to include ‘partnership talent’ (employees who are parts of joint ventures), ‘borrowed talent’ (employees of contractors), ‘freelance talent’ (independent, individual contractors) and ‘open-source talent,'” the report says.

Outcome-based legislation

Just as big data analytics allows companies to measure the effectiveness of marketing campaigns, it allows governments to measure how well legislation and regulation are working. They can “shift from a concentration on processes to the achievement of specific targets.” And if the law isn’t working, someone has the data to throw it out….”

Data is Law


Mark Headd at Civic Innovations: The Future is Open: “In his famous essay on the importance of the technological underpinnings of the Internet, Lawrence Lessig described the potential threat if the architecture of cyberspace was built on values that diverged from those we believe are important to the proper functioning of our democracy. The central point of this seminal work seems to grow in importance each day as technology and the Internet become more deeply embedded into our daily lives.
But increasingly, another kind of architecture is becoming central to the way we live and interact with each other – and to the way in which we are governed and how we interact with those that govern us. This architecture is used by governments at the federal, state and local level to share data with the public.
This data – everything from weather and economic data to education, crime, and environmental data – is becoming increasingly important for how we view the world around us and our perception of how we are governed. It is quite easy for us to catalog the wide range of personal decisions – some rote, everyday decisions like what to wear based on the weather forecast, and some much more substantial like where to live or where to send our children to school – that are influenced by data collected, maintained or curated by government.
It seems to me that Lessig’s observations from a decade and a half ago about the way in which the underlying architecture of the Internet may affect our democracy can now be applied to data. Ours is the age of data – it pervades every aspect of our lives and influences how we raise our children, how we spend our time and money and who we elect to public office.
But even more fundamental to our democracy, how well our government leaders are performing the job we empower them to do depends on data. How effective is policing in reducing the number of violent crimes? How effective are environmental regulations in reducing dangerous emissions? How well are programs performing to lift people out of poverty and place them in gainful employment? How well are schools educating our children?
These are all questions that we answer – in whole or in part – by looking at data. Data that governments themselves are largely responsible for compiling and publishing….
Having access to open data is no longer an option for participating effectively in our modern democracy, it’s a requirement. Data – to borrow Lessig’s argument – has become law.”

Governments and Citizens Getting to Know Each Other? Open, Closed, and Big Data in Public Management Reform


New paper by Amanda Clarke and Helen Margetts in Policy and Internet: “Citizens and governments live increasingly digital lives, leaving trails of digital data that have the potential to support unprecedented levels of mutual government–citizen understanding, and in turn, vast improvements to public policies and services. Open data and open government initiatives promise to “open up” government operations to citizens. New forms of “big data” analysis can be used by government itself to understand citizens’ behavior and reveal the strengths and weaknesses of policy and service delivery. In practice, however, open data emerges as a reform development directed to a range of goals, including the stimulation of economic development, and not strictly transparency or public service improvement. Meanwhile, governments have been slow to capitalize on the potential of big data, while the largest data they do collect remain “closed” and under-exploited within the confines of intelligence agencies. Drawing on interviews with civil servants and researchers in Canada, the United Kingdom, and the United States between 2011 and 2014, this article argues that a big data approach could offer the greatest potential as a vehicle for improving mutual government–citizen understanding, thus embodying the core tenets of Digital Era Governance, argued by some authors to be the most viable public management model for the digital age (Dunleavy, Margetts, Bastow, & Tinkler, 2005, 2006; Margetts & Dunleavy, 2013).”

Climaps


Climaps: “This website presents the results of the EU research project EMAPS, as well as its process: an experiment to use computation and visualization to harness the increasing availability of digital data and mobilize it for public debate. To do so, EMAPS gathered a team of social and data scientists, climate experts and information designers. It also reached out beyond the walls of Academia and engaged with the actors of the climate debate.

The climate is changing. Efforts to reduce greenhouse emissions have so far been ineffective or, at least, insufficient. As the impacts of global warming emerge, our societies experience unprecedented pressure. How to live with climate change without giving up fighting it? How to share the burden of adaptation among countries, regions and communities? How to be fair to all human and non-human beings affected by such a planetary transition? Since our collective life depends on these questions, they deserve discussion, debate and even controversy. To help navigate the uncharted territories that lead to our future, here is an electronic atlas. It proposes a series of maps and stories related to climate adaptation issues. They are not exhaustive or error-proof. They are nothing but sketches of the new world in which we will have to live. Such a world remains undetermined and its atlas can be but tentative…(More)”

Gamifying Cancer Research Crowdsources the Race for the Cure


Jason Brick at PSFK: “Computer time and human hours are among the biggest obstacles to progress in the fight against cancer. Researchers have terabytes of data, but only so many processors and people with which to analyze it. Much like the SETI program (Search for Extraterrestrial Intelligence), it’s likely that big answers are already in the information we’ve collected. They’re just waiting for somebody to find them.
Reverse the Odds, a free mobile game from Cancer Research UK, accesses the combined resources of geeks and gamers worldwide. It’s a simple app game, the kind you play in line at the bank or while waiting at the dentist’s office, in which you complete mini puzzles and buy upgrades to save an imaginary world.
Each puzzle of the game is a repurposing of cancer data. Players find patterns in the data — the exact kind of analysis grad students and volunteers in a lab look for — and the results get compiled by Cancer Research UK for use in finding a cure. Errors are expected and accounted for: with thousands of players tackling the same data, the crowd smooths out the occasional mistake….(More)”
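Cancer Research UK hasn’t published the aggregation rules behind Reverse the Odds, but the error tolerance the excerpt describes is typically achieved by collecting redundant classifications and taking a majority vote. A minimal sketch of that idea, with hypothetical labels and thresholds:

```python
from collections import Counter

def consensus_label(labels, min_votes=3):
    """Aggregate redundant player labels for one image tile.

    Returns the majority label once enough players have classified
    the tile, or None if the tile still needs more plays. Individual
    mistakes are washed out by the crowd's majority.
    """
    if len(labels) < min_votes:
        return None
    winner, _count = Counter(labels).most_common(1)[0]
    return winner

# Three players tag the same tile; one makes a mistake.
print(consensus_label(["cancerous", "cancerous", "healthy"]))  # prints: cancerous
print(consensus_label(["cancerous"]))  # prints: None (needs more plays)
```

Real citizen-science pipelines usually go further, weighting votes by each player’s historical accuracy, but simple redundancy is the core mechanism.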

Launching Disasters.Data.Gov


Meredith Lee, Heather King, and Brian Forde at the OSTP Blog: “Strengthening our Nation’s resilience to disasters is a shared responsibility, with all community members contributing their unique skills and perspectives. Whether you’re a data steward who can unlock information and foster a culture of open data, an innovator who can help address disaster preparedness challenges, or a volunteer ready to join the “Innovation for Disasters” movement, we are excited for you to visit the new disasters.data.gov site, launching today.
First previewed at the White House Innovation for Disaster Response and Recovery Initiative Demo Day, disasters.data.gov is designed to be a public resource to foster collaboration and the continual improvement of disaster-related open data, free tools, and new ways to empower first responders, survivors, and government officials with the information needed in the wake of a disaster.
A screenshot from the new disasters.data.gov web portal.
Today, the Administration is unveiling the first in a series of Innovator Challenges that highlight pressing needs from the disaster preparedness community. The inaugural Innovator Challenge focuses on a need identified from firsthand experience of local emergency management, responders, survivors, and Federal departments and agencies. The challenge asks innovators across the nation: “How might we leverage real-time sensors, open data, social media, and other tools to help reduce the number of fatalities from flooding?”
In addition to this first Innovator Challenge, here are some highlights from disasters.data.gov:….(More)”

How Government Can Unlock Economic Benefits from Open Data


From GovTech: “Zillow, the fast-growing online real estate marketplace, couldn’t exist without public data. More specifically, it probably couldn’t exist without online public data relating to real estate sales information. The nation has more than 3,000 counties, each with its own registry of deeds where routine but vital data are recorded on every transaction involving the sale of homes, businesses and land. Until recently, much of that information resided in paper documents stored in filing cabinets. But as that information has moved online, its value has increased, making it possible for firms like Zillow to use the data in new ways, such as the company’s popular “zestimate” forecast of home values.

Zillow is a prime example of how open data creates economic value. The Seattle-based company has grown rapidly since its launch in 2006, generating more than $78 million in revenue in its last financial quarter and employing more than 500 workers. But real estate firms aren’t the only businesses benefiting from data collected and published by government.
GovLab, a research laboratory run by New York University, publishes the Open Data 500, a list of companies that benefit from open data produced by the federal government. The list contains more than 15 categories of businesses, ranging from health care and education to energy, finance, legal and the environment. And the data flows from all the major agencies, including NASA, Defense, Transportation, Homeland Security and Labor….
Zillow’s road to success underscores the challenges that lie ahead if local government is going to grab its share of open data’s economic bonanza. One of the company’s biggest hurdles was to create a system that could integrate government data from thousands of databases in county government. “There’s no standard format, which is very frustrating,” Stan Humphries, Zillow’s chief economist, told Computerworld.com. “It’s up to us to figure out 3,000 different ways to ingest data and make sense of it…. More at GovTech
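Zillow’s actual pipeline is proprietary, but a common pattern for the problem Humphries describes (thousands of feeds, no standard format) is a registry of per-source parsers that all normalize to one shared schema. A hypothetical sketch, where the county identifiers, column headers, and fields are all invented for illustration:

```python
import csv
import io

# Registry mapping a source identifier to its parsing function.
PARSERS = {}

def parser(county_id):
    """Decorator that registers a parser for one county's feed format."""
    def register(fn):
        PARSERS[county_id] = fn
        return fn
    return register

@parser("county_a")
def parse_county_a(raw):
    # Hypothetical format: prices in dollars under "SalePrice".
    rows = csv.DictReader(io.StringIO(raw))
    return [{"parcel": r["Parcel"], "price_usd": int(r["SalePrice"])}
            for r in rows]

@parser("county_b")
def parse_county_b(raw):
    # Hypothetical format: different headers, prices in thousands.
    rows = csv.DictReader(io.StringIO(raw))
    return [{"parcel": r["PIN"], "price_usd": int(r["Amt_k"]) * 1000}
            for r in rows]

def ingest(county_id, raw):
    """Dispatch a raw feed to its registered parser; output is uniform."""
    return PARSERS[county_id](raw)

print(ingest("county_b", "PIN,Amt_k\n123,250\n"))
```

The design point is that the normalization logic grows one small function at a time while everything downstream sees a single schema, which is roughly what “3,000 different ways to ingest data” demands.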

An Introduction to the Economic Analysis of Open Data


Research Note by Soichiro Takagi: “Open data generally refers to a movement in which public organizations provide data in a machine-readable format to the public, so that anyone can reuse the data. Open data is becoming an important phenomenon in Japan. At this moment, utilization of open data in Japan is emerging through collaborative efforts among small units of production such as individuals. Such collaborations have also been observed in the Open Source Software (OSS) movement, but collaboration in open data is somewhat different in its small-scale, distributed character. The aim of this research note is to share the phenomena of open data as an object of economic analysis with readers by describing the movement and providing a preliminary analysis. This note discusses how open data is associated with mass collaboration from the viewpoint of organizational economics. It also provides the results of empirical analysis on how the regional characteristics of municipalities affect the decision of local governments to conduct open data initiatives.”

The Free 'Big Data' Sources Everyone Should Know


Bernard Marr at Linkedin Pulse: “…The moves by companies and governments to put large amounts of information into the public domain have made large volumes of data accessible to everyone….here’s my rundown of some of the best free big data sources available today.

Data.gov

The US Government pledged last year to make all government data available freely online. This site is the first stage and acts as a portal to all sorts of amazing information on everything from climate to crime. To check it out, click here.
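Beyond the browsable portal, Data.gov’s catalog runs on CKAN, which exposes a JSON search API. A small sketch that only assembles a dataset-search URL (actually fetching it requires network access, e.g. with `urllib.request` or `requests`):

```python
from urllib.parse import urlencode

# CKAN "action API" endpoint behind the Data.gov catalog.
CKAN_SEARCH = "https://catalog.data.gov/api/3/action/package_search"

def search_url(query, rows=5):
    """Build a URL that searches the catalog for matching datasets."""
    return CKAN_SEARCH + "?" + urlencode({"q": query, "rows": rows})

print(search_url("crime"))
# https://catalog.data.gov/api/3/action/package_search?q=crime&rows=5
```

The JSON response lists matching datasets with their titles, publishing agencies, and download links, which makes the catalog scriptable rather than click-through only.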

US Census Bureau

A wealth of information on the lives of US citizens covering population data, geographic data and education. To check it out, click here.

European Union Open Data Portal

The same as above, but for data from European Union institutions. To check it out, click here.

Data.gov.uk

Data from the UK Government, including the British National Bibliography – metadata on all UK books and publications since 1950. To check it out, click here.

The CIA World Factbook

Information on history, population, economy, government, infrastructure and military of 267 countries. To check it out, click here.

Healthdata.gov

125 years of US healthcare data including claim-level Medicare data, epidemiology and population statistics. To check it out, click here.

NHS Health and Social Care Information Centre

Health data sets from the UK National Health Service. To check it out, click here.

Amazon Web Services public datasets

Huge resource of public data, including the 1000 Genomes Project, an attempt to build the most comprehensive database of human genetic information, and NASA’s database of satellite imagery of Earth. To check it out, click here.

Facebook Graph

Although much of the information on users’ Facebook profiles is private, a lot isn’t – Facebook provides the Graph API as a way of querying the huge amount of information that its users are happy to share with the world (or can’t hide because they haven’t worked out how the privacy settings work). To check it out, click here.
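As a rough illustration of the request shape: a Graph API call is an HTTP GET against a node URL with a list of fields and an access token. This sketch only assembles the URL; a real call needs a valid token issued through Facebook’s OAuth flow, and the fields returned depend on the permissions the token carries:

```python
from urllib.parse import urlencode

GRAPH_ROOT = "https://graph.facebook.com"

def graph_url(node, fields, token):
    """Build a Graph API request URL for one node (e.g. "me")."""
    query = urlencode({"fields": ",".join(fields), "access_token": token})
    return "{}/{}?{}".format(GRAPH_ROOT, node, query)

# "YOUR_TOKEN" is a placeholder; obtain a real token via Facebook login.
print(graph_url("me", ["id", "name"], "YOUR_TOKEN"))
```

The response is JSON, so the same pattern extends to any node or edge the token is allowed to read.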

Gapminder

Compilation of data from sources including the World Health Organization and World Bank covering economic, medical and social statistics from around the world. To check it out, click here.

Google Trends

Statistics on search volume (as a proportion of total search) for any given term, since 2004. To check it out, click here.

Google Finance

40 years’ worth of stock market data, updated in real time. To check it out, click here.

Google Books Ngrams

Search and analyze the full text of any of the millions of books digitised as part of the Google Books project. To check it out, click here.

National Climatic Data Center

Huge collection of environmental, meteorological and climate data sets from the US National Climatic Data Center. The world’s largest archive of weather data. To check it out, click here.

DBPedia

Wikipedia comprises millions of pieces of data, structured and unstructured, on every subject under the sun. DBPedia is an ambitious project to catalogue that data and create a public, freely distributable database that allows anyone to analyze it. To check it out, click here.
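DBpedia’s structured data is queried with SPARQL against its public endpoint at dbpedia.org/sparql, using ontology terms such as `dbo:City` and `dbo:populationTotal`. A sketch that assembles one such query (running it requires an HTTP client or a library such as SPARQLWrapper; the population threshold here is an arbitrary example):

```python
def cities_query(min_population, limit=10):
    """Build a SPARQL query for cities above a population threshold."""
    return """
    SELECT ?city ?population WHERE {{
      ?city a dbo:City ;
            dbo:populationTotal ?population .
      FILTER (?population > {n})
    }}
    ORDER BY DESC(?population)
    LIMIT {limit}
    """.format(n=min_population, limit=limit)

print(cities_query(5000000))
```

Because the data is extracted from Wikipedia infoboxes, coverage is uneven, but queries like this let you treat an encyclopedia as a database.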

Topsy

Free, comprehensive social media data is hard to come by – after all, that data is what generates profits for the big players (Facebook, Twitter, etc.), so they don’t want to give it away. However, Topsy provides a searchable database of public tweets going back to 2006, as well as several tools to analyze the conversations. To check it out, click here.

Likebutton

Mines Facebook’s public data – globally and from your own network – to give an overview of what people “Like” at the moment. To check it out, click here.

New York Times

Searchable, indexed archive of news articles going back to 1851. To check it out, click here.

Freebase

A community-compiled database of structured data about people, places and things, with over 45 million entries. To check it out, click here.

Million Song Data Set

Metadata on over a million songs and pieces of music. Part of Amazon Web Services. To check it out, click here.”
See also Bernard Marr‘s blog at Big Data Guru

4 Tech Trends Changing How Cities Operate


From Governing: “Louis Brandeis famously characterized states as laboratories of democracy, but cities could be called labs for innovation or new practices….When Government Technology magazine (produced by Governing’s parent company, e.Republic, Inc.) published its annual Digital Cities Survey, the results provided an interesting look at how local governments are using technology to improve how they deliver services, increase productivity and streamline operations…the survey also showed four technology trends changing how local government operates and serves its citizens:

1. Open Data

…Big cities were the first to open up their data and gained national attention for their transparency. New York City, which passed an open data law in 2012, leads all cities with more than 1,300 data sets open to the public; Chicago started opening up data to the public in 2010 following an executive order and is second among cities with more than 600; and San Francisco, which was the first major city to open the doors to transparency in 2009, had the highest score from the U.S. Open Data Census for the quality of its open data.
But the survey shows that a growing number of mid-sized jurisdictions are now getting involved, too. Tacoma, Wash., has a portal with 40 data sets that show how the city is spending tax dollars on public works, economic development, transportation and public safety. Ann Arbor, Mich., has a financial transparency tool that reveals what the city is spending on a daily basis, in some cases….

2. ‘Stat’ Programs and Data Analytics

…First, the so-called “stat” programs are proliferating. Started by the New York Police Department in the 1990s, CompStat was a management technique that merged data with staff feedback to drive better performance by police officers and precinct captains. Its success led to many imitations over the years and, as the digital survey shows, stat programs continue to grow in importance. For example, Louisville has used its “LouieStat” program to cut the city’s bill for unscheduled employee overtime by $23 million as well as to spot weaknesses in performance.
Second, cities are increasing their use of data analytics to measure and improve performance. Denver, Jacksonville, Fla., and Phoenix have launched programs that sift through data sets to find patterns that can lead to better governance decisions. Los Angeles has combined transparency with analytics to create an online system that tracks performance for the city’s economy, service delivery, public safety and government operations that the public can view. Robert J. O’Neill Jr., executive director of the International City/County Management Association, said that both of these tech-driven performance trends “enable real-time decision-making.” He argued that public leaders who grasp the significance of these new tools can deliver government services that today’s constituents expect.

3. Online Citizen Engagement

…Avondale, Ariz., population 78,822, is engaging citizens with a mobile app and an online forum that solicits ideas that other residents can vote up or down.
In Westminster, Colo., population 110,945, a similar forum allows citizens to vote online about community ideas and gives rewards to users who engage with the online forum on a regular basis (free passes to a local driving range or fitness program). Cities are promoting more engagement activities to combat a decline in public trust in government. A public meeting alone no longer provides sufficient citizen engagement in today’s technology-dominated world. That’s why social media tools, online surveys and even e-commerce rewards programs are popping up in cities around the country to create high-value interaction with their citizens.

4. Geographic Information Systems

… Cities now use them to analyze financial decisions to increase performance, support public safety, improve public transit, run social service activities and, increasingly, engage citizens about their city’s governance.
Augusta, Ga., won an award for its well-designed and easy-to-use transit maps. Sugar Land, Texas, uses GIS to support economic development and, as part of its citizen engagement efforts, to highlight its capital improvement projects. GIS is now used citywide by 92 percent of the survey respondents. That’s significant because GIS has long been considered a specialized (and expensive) technology primarily for city planning and environmental projects….”