City Governments Are Using Yelp to Tell You Where Not to Eat


Michael Luca and Luther Lowe at HBR Blog: “…in recent years consumer-feedback platforms like TripAdvisor, Foursquare, and Chowhound have transformed the restaurant industry (as well as the hospitality industry), becoming important guides for consumers. Yelp has amassed about 67 million reviews in the last decade. So it’s logical to think that these platforms could transform hygiene awareness too — after all, people who contribute to review sites focus on some of the same things inspectors look for.

It turns out that one way user reviews can transform hygiene awareness is by helping health departments better utilize their resources. The deployment of inspectors is usually fairly random, which means time is often wasted on spot checks at clean, rule-abiding restaurants. Social media can help narrow the search for violators.
Within a given city or area, it’s possible to merge the entire history of Yelp reviews and ratings — some of which contain telltale words or phrases such as “dirty” and “made me sick” — with the history of hygiene violations and feed them into an algorithm that can predict the likelihood of finding problems at reviewed restaurants. Thus inspectors can be allocated more efficiently.
In San Francisco, for example, we broke restaurants into the top half and bottom half of hygiene scores. In a recent paper, one of us (Michael Luca, with coauthor Yejin Choi and her graduate students) showed that we could correctly classify more than 80% of restaurants into these two buckets using only Yelp text and ratings. In the next month, we plan to hold a contest on DrivenData to get even better algorithms to help cities out (we are jointly running the contest). Similar algorithms could be applied in any city and in other sorts of prediction tasks.
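The pipeline the authors describe can be caricatured in a few lines. The sketch below is an illustrative assumption, not the paper’s actual model (which learns from full review text and ratings): it simply ranks restaurants for inspection by counting telltale phrases such as “dirty” and “made me sick” and factoring in star ratings. The keyword list and weights are invented for the example.

```python
# Illustrative sketch: prioritize restaurant inspections using review text
# and star ratings. Keywords and weights are invented, not from the paper.

HYGIENE_FLAGS = ["dirty", "made me sick", "filthy", "food poisoning"]

def risk_score(reviews):
    """reviews: list of (text, stars) tuples. Higher score = inspect sooner."""
    flag_hits = sum(
        text.lower().count(flag)
        for text, _ in reviews
        for flag in HYGIENE_FLAGS
    )
    avg_stars = sum(stars for _, stars in reviews) / len(reviews)
    # Weight keyword evidence heavily; low average ratings add a little risk.
    return flag_hits + (5 - avg_stars) * 0.2

def prioritize(restaurants):
    """restaurants: dict name -> list of (text, stars). Riskiest names first."""
    return sorted(restaurants, key=lambda n: risk_score(restaurants[n]),
                  reverse=True)
```

A real system would replace the keyword counter with a trained text classifier, but the allocation idea is the same: send inspectors down the ranked list instead of choosing restaurants at random.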
Another means for transforming hygiene awareness is through the sharing of health-department data with online review sites. The logic is simple: Diners should be informed about violations before they decide on a destination, rather than after.
Over the past two years, we have been working with cities to help them share inspection data with Yelp through an open-data standard that Yelp created in 2012 to encourage officials to put their information in places that are more useful to consumers. In San Francisco, Los Angeles, Raleigh, and Louisville, Kentucky, customers now see hygiene data alongside Yelp reviews. There’s evidence that users are starting to pay attention to this data — click-through rates are similar to those for other features on Yelp ….

And there’s no reason this type of data sharing should be limited to restaurant-inspection reports. Why not disclose data about dentists’ quality and regulatory compliance via Yelp? Why not use data from TripAdvisor to help spot bedbugs? Why not use Twitter to understand what citizens are concerned about, and what cities can do about it? Uses of social media data for policy, and widespread dissemination of official data through social media, have the potential to become important means of public accountability. (More)

Opening travel spending through civic intelligence, participation and co-creation


Joel Salas Suárez at the Open Government Partnership Blog: “When we were appointed by the Senate as Commissioners of the Access to Information Institute in Mexico (IFAI), we identified two high profile issues that had negatively affected the Institute’s image: the acquisition of its new building and the lack of transparency on international travel expenditure of the former Commissioners.

IFAI has to lead by example, so my fellow commissioners and I decided to tackle these two problems with transparency actions to send a clear message to Mexican society and the international community in our first hundred days in office. First, we created the website sede.ifai.mx to publish all the information about the new building procurement (a 45.6 million USD lease). Second, we decided to start our first civic innovation project, a joint venture with civil society organizations, to find the best way to publish information related to travel spending by IFAI’s public servants.
IFAI’s travel expenditure is comparatively small. During 2013 it amounted to 186,760 USD, or 0.5% of the Institute’s budget (38.2 million USD). However, this expenditure has historically been of public interest, and it should be. According to the 2013 Mexican Government Expenditure Review (the latest available), the federal level (the Executive, Legislative and Judicial powers, plus autonomous organs) spent close to 633 million USD on official travel (Chapter 3000, concept 3700). Therefore, we decided to tackle the problem and design a platform that would allow us to effectively publish information on the public money spent on travel by public officials and the results obtained during these trips.
In order to do this, we worked with civil society experts in public participation, accountability and technology, Codeando México, SocialTIC and IMCO. Together we launched a public challenge to create an open source web application to publish information on official travel spending.
The challenge, #RetoViajesTransparentes, was a very successful experience. Close to a hundred participants registered 14 projects, competing to develop an app that IFAI would officially use and to win a 3,500 USD prize. The jury selected three finalists, who presented their projects in a public Google Hangout. The winning app, named Viajes Claros, is being used to publish IFAI’s travel expenditure information at viajesclaros.ifai.mx.
This challenge has allowed us to shift focus from the inputs of official travel (i.e. the money spent) to the outputs or results attained in each trip. Viajes Claros opens relevant information to understand and evaluate the activities performed by the public servants during their trips. It also allowed us to co-create with society an open source tool that can be replicated in Mexico and other countries….(More)”.

Is Transparency a Recipe for Innovation?


Paper by Dr. Bastiaan Heemsbergen: “Innovation is a key driver of organizational sustainability, and yes, openness and transparency are a recipe for innovation. But, according to Tapscott and Williams, “when it comes to innovation, competitive advantage and organizational success, ‘openness’ is rarely the first word one would use to describe companies and other societal organizations like government agencies or medical institutions. For many, words like ‘insular,’ ‘bureaucratic,’ ‘hierarchical,’ ‘secretive’ and ‘closed’ come to mind instead.”1 And yet a few months ago the Tesla Model S became the world’s first open-source car. Elon Musk, CEO of Tesla Motors, shared all the patents on Tesla’s electric-car technology, allowing anyone — including competitors — to use them without fear of litigation. Musk wrote in his post: “Yesterday, there was a wall of Tesla patents in the lobby of our Palo Alto headquarters. That is no longer the case. They have been removed, in the spirit of the open source movement, for the advancement of electric vehicle technology.”2
In the public sector, terms such as open government, citizen sourcing, and wiki government are also akin to the notion of open innovation and transparency. As Hilgers and Ihl report, “a good example of this approach is the success of the Future Melbourne program, a Wiki and blog-based approach to shaping the future urban landscape of Australia’s second largest city. The program allowed citizens to directly edit and comment on the plans for the future development of the city. It attracted more than 30,000 individuals, who submitted hundreds of comments and suggestions (futuremelbourne.com.au). Basically, problems concerning design and creativity, future strategy and local culture, and even questions of management and service innovation can be broadcasted on such web-platforms.”3 The authors suggest that there are three dimensions to applying the concept of open innovation to the public sector: citizen ideation and innovation (tapping knowledge and creativity), collaborative administration (user generated new tasks and processes), and collaborative democracy (improve public participation in the policy process)….(More)”.

inBloom and the Failure of Innovation 1.0


Blog by Steven Hodas at The Center on Reinventing Public Education (CRPE): “Michael Horn’s recent piece on the failure of inBloom captures why it was the very opposite of a disruptive innovation from a markets perspective, as well as the fatal blind spots and judgment errors present from its inception.
inBloom was a textbook example of what I call “Innovation 1.0”, which thinks of innovation as a noun, a thing with transformative transitive properties that magically make its recipient “innovative.” It’s the cargo-cult theory of innovation: I give you this innovative thing (a tablet, a data warehouse, an LMS) and you thereby become innovative yourself. This Innovation 1.0 approach to both product and policy has characterized a great deal of foundation and Federal efforts over the past ten years.
But as Michael points out (and as real innovators and entrepreneurs understand viscerally), “innovation” is not a noun but a verb. It is not a thing but a process, a frame of mind, a set of reflexes. He correctly notes the essential iterative approach that characterizes innovation-as-a-verb, its make-something-big-by-making-something-small theory of action (this is fundamentally different from piloting or focus-grouping, but that’s another topic).
But it’s important to go deeper and understand why iteration is important. Simply, it is a means to bake into the process, the product, or the policy a respect for users’ subjectivity and autonomy. In short, functional iteration requires that you listen.
True, durable innovation, “Innovation 2.0,” is not some thing I can give to you, do to you, or even do for you: it must be a process I do with you. Lean Startup theory—with its emphasis on iteration, an assumption of the innovator’s fallibility and limited perspective, and the importance of low-cost, low-stakes discovery of product-market fit that Michael describes—is essentially a cookbook for baking empathy into the development of products, services, or policies…..
That doesn’t mean inBloom was a bad idea. But the failure to anticipate its vehement visceral rejection—however misinformed and however cynically exploited by those with larger agendas—was a profound failure of imagination, of empathy, of the respect for user subjectivity that characterizes Innovation 2.0….(More).”

New Implementation Guide for Local Government Innovation


Living Cities Blog and Press Release: “Living Cities, with support from the Citi Foundation, today released a toolkit to help local governments adopt cutting-edge approaches to innovation as part of the City Accelerator program. The implementation guide offers practical guidance to local government officials on how to build a durable culture and practice of innovation, drawing on leading practices with promising results from cities around the United States as well as from the private sector. The guide was developed as part of the City Accelerator, a $3 million program of Living Cities and the Citi Foundation to speed the spread of local government innovations with the potential to benefit low-income people. The implementation guide – authored by Nigel Jacob, co-founder of the Mayor’s Office of New Urban Mechanics in Boston and Urban Technologist-in-Residence at Living Cities – addresses some of the key barriers that local governments face when looking to incorporate innovation in their cities, and introduces fresh ideas as well…(More)”

‘Frontier methods’ offer a powerful but accessible approach for measuring the efficiency of public sector organisations


EUROPP Blog of the LSE: “How can the efficiency of public sector organisations best be measured? Jesse Stroobants and Geert Bouckaert write that while the efficiency of an organisation is typically measured using performance indicators, there are some notable problems with this approach, such as the tendency for different indicators to produce conflicting conclusions on organisational performance. As an alternative, they outline so-called ‘frontier methods’, which use direct comparisons between different organisations to create a benchmark or standard for performance. They argue that the frontier approach not only alleviates some of the problems associated with performance indicators, but is also broadly accessible for those employed in public administration….
However, despite their merits, there are some drawbacks to using performance indicators. First, they provide only an indirect or partial indication of performance: with respect to efficiency, for instance, indicators are typically single-input/single-output measures. Second, they may provide conflicting results: an organisation that appears to do well on one indicator may perform less successfully when considered using another.
In this context, ‘frontier methods’ offer alternative techniques for measuring and evaluating the performance of a group of comparable entities. Unlike single factor measures that reflect only partial aspects of performance, frontier techniques can be applied to assess overall performance by handling multiple inputs and outputs at the same time. Specifically, Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH) have proven to be useful tools for assessing the relative efficiency of entities….
At this point you may be thinking that the term ‘frontier methods’ sounds overly complex or that these techniques are only likely to be of any use to academic specialists. Yet there are a number of reasons why this interpretation would be incorrect. It is indeed true that DEA and FDH have been used predominantly by economists and econometricians, and only rarely by those employed in public administration. We should re-establish this bridge. Therefore, in a recent article, we have provided a step-by-step application of DEA/FDH to benchmark the efficiency of comparable public sector organisations (in the article’s case: public libraries in Flanders). With this gradual approach, we want to offer both academics and practitioners a basic grounding in more advanced efficiency measurement techniques….(More)”.
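For readers curious what these techniques actually involve, Free Disposal Hull is the easier of the two to sketch. Under an input orientation, a unit is benchmarked only against observed units that produce at least as much of every output, and its score is the largest proportional input contraction that would bring it down to the best such peer. The code below is a minimal illustration with invented library data, not the authors’ Flanders dataset; it assumes strictly positive inputs.

```python
def fdh_input_efficiency(inputs, outputs, i):
    """Input-oriented FDH efficiency of unit i.

    inputs[j], outputs[j]: lists of input/output quantities for unit j
    (all inputs assumed strictly positive). Score is in (0, 1];
    1.0 means no observed unit dominates unit i.
    """
    scores = []
    for j in range(len(inputs)):
        # Compare only against units producing at least as much of every output.
        if all(yj >= yi for yj, yi in zip(outputs[j], outputs[i])):
            # Proportional contraction needed to match unit j's input usage.
            scores.append(max(xj / xi for xj, xi in zip(inputs[j], inputs[i])))
    return min(scores)  # j == i is always included, so the minimum is <= 1.0

# Invented example: three libraries with inputs [staff, budget], output [loans].
libraries_in = [[10, 100], [5, 80], [12, 90]]
libraries_out = [[1000], [1000], [900]]
```

Here library 1 produces as many loans as library 0 with half the staff and less budget, so library 0 scores below 1.0. DEA differs mainly in also admitting convex combinations of observed units as benchmarks, which yields a smoother frontier.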

The 18F Hub


18F Blog: “Clear, organized, and easy access to information is critical to supporting team growth and promoting the kind of culture change that 18F, the U.S. Digital Service, and fellow Innovators and Early Adopters aim to produce throughout federal IT development. As a small step towards that goal, we’d like to announce the 18F Hub, a Jekyll-based documentation platform that aims to help development teams organize and easily share their information, and to enable easy exploration of the connections between team members, projects, and skill sets. It’s still very much alpha-stage software, but over time, we’ll incrementally improve it until it serves as the go-to place for all our team’s working information, whether that information is integrated into the Hub directly or provided as links to other sources. It also serves as a lightweight tool that other teams can experiment with and deploy with a minimum of setup.

While we at 18F strongly believe in the value of transparency and collaboration across government, the details of our team, our projects, and our activities aren’t the real reason we’re launching the Hub: the exposure of our domain knowledge, working models and processes through tangible artifacts that people can adapt to their own environments is… (More).

Open policy making in action: Empowering divorcing couples and separating families to create sustainable solutions


at Open Policy Making Blog (UK Cabinet Office): “Set up in April 2014, Policy Lab brings new tools and techniques, new insights and practical experimentation to policy-making. Over the past two months, this second demonstrator project has generated learning about how policy professionals can work in a more open, user-centred way to engage with others and generate novel solutions to policy issues.
The project, with the Ministry of Justice (MoJ), is concerned with family mediation during divorce and separation….
The main findings from the Lab’s perspective are in three areas.
 Clarifying what user perspectives bring to policy-making.
The project gave us some insights into the potential value of ethnography in policy-making. It was centred around people’s whole experience of divorce or separation, not just their interactions with mediators or lawyers. The research explored what it was like for people now, and the creative activities in the workshop proposed what it could be like for people in the future.  Unexpected insights included that some people going through separation and divorce lacked confidence in their ability to make decisions about their futures.
Using person-centred techniques in the workshop made participants accountable to the users.  Their stories were read, interpreted and discussed at the start. Throughout the workshop, participants repeatedly raised questions about what a proposed new solution might be like for these personas. It was as if these participants were now accountable to these individuals.
Reconstituting the issue of family mediation.
Another result of this project was to shift from seeing policy-making primarily as the province of the MoJ towards a collective activity in which many actors and different kinds of expertise needed to be involved. The project constituted policy-making as a complex configuration of socio-cultural, organizational and technological actors, processes, data and resources – more of a living system than a mechanical object with inputs, outputs and policy “levers”.
Starting and ending with people’s lives, not government-funded or delivered services, as the driver to innovate.  
Finally, this Lab project looked broadly at people’s lives, not just as users of mediation or court services…. (More)”

Launching Disasters.Data.Gov


Meredith Lee, Heather King, and Brian Forde at the OSTP Blog: “Strengthening our Nation’s resilience to disasters is a shared responsibility, with all community members contributing their unique skills and perspectives. Whether you’re a data steward who can unlock information and foster a culture of open data, an innovator who can help address disaster preparedness challenges, or a volunteer ready to join the “Innovation for Disasters” movement, we are excited for you to visit the new disasters.data.gov site, launching today.
First previewed at the White House Innovation for Disaster Response and Recovery Initiative Demo Day, disasters.data.gov is designed to be a public resource to foster collaboration and the continual improvement of disaster-related open data, free tools, and new ways to empower first responders, survivors, and government officials with the information needed in the wake of a disaster.
A screenshot from the new disasters.data.gov web portal.
Today, the Administration is unveiling the first in a series of Innovator Challenges that highlight pressing needs from the disaster preparedness community. The inaugural Innovator Challenge focuses on a need identified from firsthand experience of local emergency management, responders, survivors, and Federal departments and agencies. The challenge asks innovators across the nation: “How might we leverage real-time sensors, open data, social media, and other tools to help reduce the number of fatalities from flooding?”
In addition to this first Innovator Challenge, here are some highlights from disasters.data.gov:….(More)”

The Free 'Big Data' Sources Everyone Should Know


Bernard Marr at LinkedIn Pulse: “…The moves by companies and governments to put large amounts of information into the public domain have made large volumes of data accessible to everyone…. Here’s my rundown of some of the best free big data sources available today.

Data.gov

The US Government pledged last year to make all government data available freely online. This site is the first stage and acts as a portal to all sorts of amazing information on everything from climate to crime. To check it out, click here.

US Census Bureau

A wealth of information on the lives of US citizens covering population data, geographic data and education. To check it out, click here.

European Union Open Data Portal

As above, but with data from European Union institutions. To check it out, click here.

Data.gov.uk

Data from the UK Government, including the British National Bibliography – metadata on all UK books and publications since 1950. To check it out, click here.

The CIA World Factbook

Information on history, population, economy, government, infrastructure and military of 267 countries. To check it out, click here.

Healthdata.gov

125 years of US healthcare data including claim-level Medicare data, epidemiology and population statistics. To check it out, click here.

NHS Health and Social Care Information Centre

Health data sets from the UK National Health Service. To check it out, click here.

Amazon Web Services public datasets

Huge resource of public data, including the 1000 Genomes Project, an attempt to build the most comprehensive database of human genetic information, and NASA’s database of satellite imagery of Earth. To check it out, click here.

Facebook Graph

Although much of the information on users’ Facebook profiles is private, a lot isn’t – Facebook provides the Graph API as a way of querying the huge amount of information that its users are happy to share with the world (or can’t hide because they haven’t worked out how the privacy settings work). To check it out, click here.
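As a rough sketch of what querying the Graph API looks like: requests go to graph.facebook.com over HTTPS with the object of interest, the fields wanted, and an access token in the query string. The version prefix and field names below are illustrative assumptions, not an authoritative description of the API.

```python
# Hypothetical sketch of building a Graph API request URL.
# Version prefix and field names are assumptions for illustration.
from urllib.parse import urlencode

GRAPH_ROOT = "https://graph.facebook.com/v2.2"

def graph_url(object_id, fields, access_token):
    """Build a Graph API request URL for one object and a list of fields."""
    query = urlencode({"fields": ",".join(fields),
                       "access_token": access_token})
    return f"{GRAPH_ROOT}/{object_id}?{query}"
```

Fetching that URL (with a valid token) would return the requested fields as JSON; the point is simply that the “huge amount of information” is exposed as ordinary parameterized HTTP queries.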

Gapminder

Compilation of data from sources including the World Health Organization and World Bank covering economic, medical and social statistics from around the world. To check it out, click here.

Google Trends

Statistics on search volume (as a proportion of total search) for any given term, since 2004. To check it out, click here.

Google Finance

40 years’ worth of stock market data, updated in real time. To check it out, click here.

Google Books Ngrams

Search and analyze the full text of any of the millions of books digitised as part of the Google Books project. To check it out, click here.

National Climatic Data Center

Huge collection of environmental, meteorological and climate data sets from the US National Climatic Data Center. The world’s largest archive of weather data. To check it out, click here.

DBPedia

Wikipedia comprises millions of pieces of data, structured and unstructured, on every subject under the sun. DBpedia is an ambitious project to catalogue this data and create a public, freely distributable database allowing anyone to analyze it. To check it out, click here.

Topsy

Free, comprehensive social media data is hard to come by – after all, their data is what generates profits for the big players (Facebook, Twitter, etc.), so they don’t want to give it away. However, Topsy provides a searchable database of public tweets going back to 2006, as well as several tools to analyze the conversations. To check it out, click here.

Likebutton

Mines Facebook’s public data – globally and from your own network – to give an overview of what people “Like” at the moment. To check it out, click here.

New York Times

Searchable, indexed archive of news articles going back to 1851. To check it out, click here.

Freebase

A community-compiled database of structured data about people, places and things, with over 45 million entries. To check it out, click here.

Million Song Data Set

Metadata on over a million songs and pieces of music. Part of Amazon Web Services. To check it out, click here.”
See also Bernard Marr‘s blog at Big Data Guru