Springwise: “Transport apps such as Ototo make it easier than ever for passengers to stay informed about problems with public transport, but real-time information can only help so much — by the time users find out about a delayed service, it is often too late to take an alternative route. Now, Stockholmstag — the company that runs Sweden’s trains — has found a solution in the form of an algorithm called ‘The Commuter Prognosis’, which can predict network delays up to two hours in advance, giving train operators time to issue extra services or provide travelers with adequate warning.
The system was created by mathematician Wilhelm Landerholm. It uses historical data to predict how a small delay, even as little as two minutes, will affect the running of the rest of the network. Often the initial late train causes a ripple effect, with subsequent services being delayed to accommodate new platform arrival times, which in turn affect the trains behind them, and so on. But soon, using ‘The Commuter Prognosis’, Stockholmstag train operators will be able to make the necessary adjustments to prevent this. In addition, the information will be relayed to commuters, enabling them to take a different train and therefore reducing overcrowding. The prediction tool is expected to be put into use in Sweden by the end of the year….(More)”
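The ripple effect described above is easy to illustrate. The sketch below is a toy model (not Landerholm’s actual algorithm, whose details are not public): trains that must keep a minimum headway behind the train ahead inherit part of an initial delay, which then decays down the line. The headway value and schedule are illustrative assumptions.

```python
# Toy model of delay propagation on a single line: a late train pushes
# back the trains behind it when a minimum headway must be respected.
# This is an illustration only, not the actual 'Commuter Prognosis'.

MIN_HEADWAY = 3  # assumed minimum minutes between platform arrivals


def propagate_delay(scheduled, initial_delays):
    """Given scheduled arrival minutes and per-train initial delays,
    return actual arrivals after enforcing the minimum headway."""
    actual = []
    for sched, delay in zip(scheduled, initial_delays):
        arrival = sched + delay
        if actual and arrival < actual[-1] + MIN_HEADWAY:
            arrival = actual[-1] + MIN_HEADWAY  # held back by the train ahead
        actual.append(arrival)
    return actual


# Trains scheduled every 4 minutes; the first is 2 minutes late.
scheduled = [0, 4, 8, 12, 16]
actual = propagate_delay(scheduled, [2, 0, 0, 0, 0])
print([a - s for a, s in zip(actual, scheduled)])  # → [2, 1, 0, 0, 0]
```

A two-minute delay on the first train costs the second train one minute before the ripple dies out; with tighter headways or denser schedules, the same initial delay spreads much further, which is what makes early prediction valuable.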
Can Yelp Help Government Win Back the Public’s Trust?
Tod Newcombe at Governing: “Look out, DMV, IRS and TSA. Yelp, the popular review website that’s best known for its rants or cheers regarding restaurants and retailers, is about to make it easier to review and rank government services.
Last month, Yelp and the General Services Administration (GSA), which manages the basic functions of the federal government, announced that government workers will soon be able to read and respond to their agencies’ Yelp reviews — and, hopefully, incorporate the feedback into service improvements.
At first glance, the news might not seem so special. There already are Yelp pages for government agencies like Departments of Motor Vehicles, which have been particularly popular. San Francisco’s DMV office, for example, has received more than 450 reviews and has a three-star rating. But federal agencies and workers haven’t been allowed to respond to reviewers, nor could they collect data from the pages, because Yelp hadn’t been approved by the GSA. The agreement changes that situation, also making it possible for agencies to set up new Yelp pages….
Yelp has been posting online reviews about restaurants, bars, nail salons and other retailers since 2004. Despite its reputation as a place to vent about bad service, more than two-thirds of the 82 million reviews posted since Yelp started have been positive with most rated at either four or five stars, according to the company’s website. And when businesses boost their Yelp rating by one star, revenues have increased by as much as 9 percent, according to a 2011 study by Harvard Business School Professor Michael Luca.
Now the public sector is about to start paying more attention to those rankings. More importantly, agencies will find out whether engaging the public in a timely fashion changes its perception of government.
While all levels of government have become active with social media, direct interaction between an agency and citizens is still the exception rather than the rule. Agencies typically use Facebook and Twitter to inform followers about services or to provide information updates, not as a feedback mechanism. That’s why having a more direct connection between the comments on a Yelp page and a government agency represents a shift in engagement….(More)”
The era of “Scientific Urban Management” is approaching
Francesco Ferrero: “As members of the Smart City Strategic Program at Istituto Superiore Mario Boella, we strongly believe in the concept of Scientific Urban Management. This concept means that through new ICT trends such as the massive diffusion of sensors, wireless broadband and tools for data collection and analysis, the administration of urban spaces can get closer to being an exact science, i.e. urban decision-makers can exploit these technologies to practice evidence-based decision making. We are devoting considerable effort to research on Decision Support Systems integrating different Modelling & Simulation (M&S) techniques, to better predict and measure the impact of alternative Smart City initiatives on the path towards the social, economic and environmental sustainability target.
We believe that these tools will allow urban decision makers, solution providers and investment managers to predict, on the basis of a scientific approach, what initiatives will better contribute to implement the local Smart City strategies, and to satisfy the real needs of the citizens, thus reducing the risks associated with the deployment of large-scale innovations in the urban context.
Following this research roadmap, we have achieved an important result on the theme of urban mobility simulation, by means of open-source software and open data. Our work, done in collaboration with CNR-IEIIT, the University of Bologna and EURECOM, has recently been published in IEEE Transactions on Vehicular Technology.
Simulation is an important technique for analysing the complex urban mobility system and for developing decision-support tools on top of it. However, the effectiveness of this approach relies on the realism of the mobility traces used to feed the traffic simulators, and there is a lack of publicly available reference mobility traces. The main reasons for this lack are that the tools used to generate realistic road traffic are complex to configure and operate, and the real-world input data to be fed to such tools is hard to retrieve….(More)”
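To make the “input data” point concrete: a typical traffic simulator is fed origin–destination trips sampled from zone-to-zone demand. The sketch below is a hypothetical, minimal trip generator of that kind; the zone names, weights and one-hour departure window are illustrative assumptions, not data from the published study.

```python
# Minimal sketch of the kind of input a traffic simulator needs:
# origin-destination trips sampled from relative zone-to-zone demand.
# Illustrative only; real generators also model routes, vehicle types, etc.
import random


def generate_trips(demand, n_trips, seed=0):
    """Sample (origin, destination, depart_time) tuples, where `demand`
    maps (origin, destination) pairs to relative weights."""
    rng = random.Random(seed)  # seeded for reproducible traces
    pairs = list(demand)
    weights = [demand[p] for p in pairs]
    trips = []
    for _ in range(n_trips):
        origin, dest = rng.choices(pairs, weights=weights)[0]
        trips.append((origin, dest, rng.uniform(0, 3600)))  # depart within 1 h
    return trips


demand = {("centre", "suburb"): 5, ("suburb", "centre"): 3}
trips = generate_trips(demand, 100)
print(len(trips))  # → 100
```

The hard part, as the excerpt notes, is not the sampling but obtaining realistic demand weights from real-world data, which is exactly what publicly available reference traces would provide.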
Using Big Data to Understand the Human Condition: The Kavli HUMAN Project
Okan Azmak et al. in the journal Big Data: “Until now, most large-scale studies of humans have either focused on very specific domains of inquiry or have relied on between-subjects approaches. While these previous studies have been invaluable for revealing important biological factors in cardiac health or social factors in retirement choices, no single repository contains anything like a complete record of the health, education, genetics, environmental, and lifestyle profiles of a large group of individuals at the within-subject level. This seems critical today because emerging evidence about the dynamic interplay between biology, behavior, and the environment points to a pressing need for just the kind of large-scale, long-term synoptic dataset that does not yet exist at the within-subject level. At the same time that the need for such a dataset is becoming clear, there is also growing evidence that just such a synoptic dataset may now be obtainable—at least at moderate scale—using contemporary big data approaches. To this end, we introduce the Kavli HUMAN Project (KHP), an effort to aggregate data from 2,500 New York City households in all five boroughs (roughly 10,000 individuals) whose biology and behavior will be measured using an unprecedented array of modalities over 20 years. It will also richly measure environmental conditions and events that KHP members experience using a geographic information system database of unparalleled scale, currently under construction in New York. In this manner, KHP will offer both synoptic and granular views of how human health and behavior coevolve over the life cycle and why they evolve differently for different people. In turn, we argue that this will allow for new discovery-based scientific approaches, rooted in big data analytics, to improving the health and quality of human life, particularly in urban contexts….(More)”
Open data is not just for startups
Mike Altendorf at CIO: “…Surely open data is just for start-ups, market research companies and people that want to save the world? Well, there are two reasons why I wanted to dedicate a bit of time to the subject of open data. First, one of the major barriers to internal innovation that I hear about all the time is the inability to use internal data to inform that innovation. This is usually because data is deemed too sensitive, too complex, too siloed or too difficult to make usable. Leaving aside the issues that any of those problems are going to cause for the organisation more generally, it is easy to see how this can create a problem. So why not use someone else’s data?
The point of creating internal labs and innovation centres is to explore the art of the possible. I quite agree that insight from your own data is a good place to start but it isn’t the only place. You could also argue that by using your own data you are restricting your thinking because you are only looking at information that already relates to your business. If the point of a lab is to explore ideas for supporting the business then you may be better off looking outwards at what is happening in the world around you rather than inwards into the constrained world of the industry you already inhabit….
The fact is there are vast amounts of freely available data sets that can be made to work for you, if you just apply the creativity and technical smarts to them.
My second point is less about open data than about opening up data. Organisations collect information on their business operations, customers and suppliers all the time. The smart ones know how to use it to build competitive advantage but the really smart ones also know that there is significant extra value to be gained from sharing that data with the customer or supplier that it relates to. The customer or supplier can then use it to make informed decisions themselves. Some organisations have been doing this for a while. Customers of First Direct have been able to analyse their own spending patterns for years (although the data has been somewhat limited). The benefit to the customer is that they can make informed decisions based on actual data about their past behaviours and so adapt their spending habits accordingly (or put their head firmly in the sand and carry on as before in my case!). The benefit to the bank is that they are able to suggest ideas for how to improve a customer’s financial health alongside the data. Others have looked at how they can help customers by sharing (anonymised) information about what people with similar lifestyles/needs are doing/buying so customers can learn from each other. Trials have shown that customers welcomed the insight….(More)”
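The First Direct example above amounts to handing customers a simple aggregation of their own transaction data. A minimal sketch of that kind of summary, with made-up categories and amounts purely for illustration:

```python
# Hypothetical sketch of the spending-pattern summary a bank could hand
# back to a customer from that customer's own transaction data.
from collections import defaultdict


def spending_by_category(transactions):
    """Sum transaction amounts per category from (category, amount) pairs."""
    totals = defaultdict(float)
    for category, amount in transactions:
        totals[category] += amount
    return dict(totals)


txns = [("groceries", 42.50), ("travel", 18.00), ("groceries", 12.50)]
print(spending_by_category(txns))  # → {'groceries': 55.0, 'travel': 18.0}
```

The point of the “opening up data” argument is that this computation runs on data the organisation already holds; the shift is simply returning the result to the person it describes, so they can act on it.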
Sustainable Value of Open Government Data
PhD thesis by Thorhildur Jetzek: “The impact of the digital revolution on our societies can be compared to the ripples caused by a stone thrown in water: spreading outwards and affecting a larger and larger part of our lives with every year that passes. One of the many effects of this revolution is the emergence of an unprecedented amount of digital data that is accumulating exponentially. Moreover, a central affordance of digitization is the ability to distribute, share and collaborate, and we have thus seen an “open theme” gaining currency in recent years. These trends are reflected in the explosion of Open Data Initiatives (ODIs) around the world. However, while hundreds of national and local governments have established open data portals, there is a general feeling that these ODIs have not yet lived up to their true potential. This feeling is not without good reason; the recent Open Data Barometer report highlights that strong evidence on the impacts of open government data is almost universally lacking (Davies, 2013). This lack of evidence is disconcerting for government organizations that have already expended money on opening data, and might even result in the termination of some ODIs. It also raises some relevant questions regarding the nature of value generation in the context of free data and sharing of information over networks. Do we have the right methods, the right intellectual tools, to understand and reflect the value that is generated in such ecosystems?
This PhD study addresses the question of how value is generated from open data through a mixed-methods, macro-level approach. For the qualitative analysis, I conducted two longitudinal case studies in two different contexts. The first is the case of the Basic Data Program (BDP), a Danish ODI. For this case, I studied the supply side of open data publication, from the creation of the open data strategy through to the dissemination and use of data. The second case is a demand-side study of the energy tech company Opower. Opower has been an open data user for many years and has used open data to create and disseminate personalized information on energy use. This information has already contributed to a measurable worldwide reduction in CO2 emissions as well as monetary savings. Furthermore, to complement the insights from these two cases, I analyzed quantitative data from 76 countries for the years 2012 and 2013. I have used these diverse sources of data to uncover the most important relationships or mechanisms that can explain how open data are used to generate sustainable value….(More)”
Flutrack.org: Open-source and linked data for epidemiology
Chorianopoulos K and Talvis K in the Health Informatics Journal: “Epidemiology has made advances thanks to the availability of real-time surveillance data and by leveraging the geographic analysis of incidents. There are many health information systems that visualize the symptoms of influenza-like illness on a digital map, which is suitable for end-users but does not afford further processing and analysis. Existing systems have emphasized the collection, analysis, and visualization of surveillance data, but they have neglected a modular and interoperable design that integrates high-resolution geo-location with real-time data. As a remedy, we have built an open-source project and have been operating an open service that detects flu-related symptoms and shares the data in real time with anyone who wants to build upon this system. An analysis of a small number of precisely geo-located status updates (e.g. Twitter) correlates closely with Google Flu Trends and the Centers for Disease Control and Prevention flu-positive reports. We suggest that public health information systems should embrace an open-source approach and offer linked data, in order to facilitate the development of an ecosystem of applications and services, and in order to be transparent to the general public interest…(More)” See also http://www.flutrack.org/
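The correlation claim in the abstract is a standard computation: line up weekly counts of geo-located symptom tweets against official flu-positive reports and compute a Pearson coefficient. The sketch below is not Flutrack’s code, and the weekly counts are invented for illustration.

```python
# Sketch (not Flutrack's implementation): Pearson correlation between
# weekly counts of geo-located symptom tweets and flu-positive reports.
import math


def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


tweet_counts = [12, 30, 45, 80, 60, 25]   # illustrative weekly tweet counts
cdc_positive = [10, 28, 50, 75, 55, 20]   # illustrative flu-positive reports
print(round(pearson(tweet_counts, cdc_positive), 3))
```

A coefficient near 1 across many weeks is the kind of evidence the authors cite; because Flutrack publishes its data openly, anyone can rerun such a check against the official series.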
Revolution Delayed: The Impact of Open Data on the Fight against Corruption
Report by RiSSC – Research Centre on Security and Crime (Italy): “In recent years, the demand for Open Data has picked up steam among stakeholders seeking to increase the transparency and accountability of the public sector. Governments are supporting Open Data supply to achieve social and economic benefits, return on investment, and political consensus.
While it is self-evident that Open Data contributes to greater transparency – as it makes data more available and easier for the public and governments to use – its impact on fighting corruption largely depends on the ability to analyse it and to develop initiatives that trigger both social accountability mechanisms and government responsiveness against illicit or inappropriate behaviours.
To date, the Open Data revolution against corruption is delayed. The impact of Open Data on the prevention and repression of corruption, and on the development of anti-corruption tools, appears to be limited, and the return on investment is not yet forthcoming. Evidence remains anecdotal, and a better understanding of the mechanisms and dynamics of using Open Data against corruption is needed.
The overall objective of this exploratory study is to provide evidence on the results achieved by Open Data, together with recommendations for the European Commission and Member States’ authorities on implementing effective anti-corruption strategies based on transparency and openness, so as to unlock the potential impact of the “Open Data revolution” against corruption.
The project has explored the legal framework and the status of implementation of Open Data policies in four EU countries – Italy, the United Kingdom, Spain, and Austria. The TACOD project has searched for evidence of Open Data’s role in law-enforcement cooperation, anti-corruption initiatives, public campaigns, and investigative journalism against corruption.
RiSSC – Research Centre on Security and Crime (Italy), the University of Oxford and the University of Nottingham (United Kingdom), Transparency International (Italy and United Kingdom), the Institute for Conflict Resolution (Austria), and Blomeyer&Sanz (Spain) carried out the research between January 2014 and February 2015, under an agreement with the European Commission – DG Migration and Home Affairs. The project was coordinated by RiSSC, with the support of a European Working Group of Experts chaired by Prof. Richard Rose and an external evaluator, Mr. Andrea Menapace, and it has benefited from the contribution of many experts, activists, and representatives of institutions in the four countries….(More)
What should governments require for their open data portals?
Luke Fretwell at GovFresh: “Johns Hopkins University’s new Center for Government Excellence is developing a much-needed open data portal requirements resource to serve as a “set of sample requirements to help governments evaluate, develop (or procure), deploy, and launch an open data web site (portal).”
As many governments ramp up their open data initiatives, this is an important project: we often see open data platform decisions made without a holistic approach or an awareness of what government should purchase (or have the flexibility to develop on its own).
“The idea here is that any interested city can use this as a baseline and make their own adjustments before proceeding,” said GovEx Director of Open Data Andrew Nicklin via email. “Perhaps with this we can create some common denominators amongst open data portals and eventually push the whole movement forwards.”
My fundamental suggestion is that government-run open data platforms be fully open source. There are a number of technical and financial reasons for this, which I will address in the future, but I believe strongly that if the platform you’re hosting data on doesn’t adhere to the same licensing standards you hold for your data, you’re only doing open data half right.
With both CKAN and DKAN continuing to grow in adoption, we’re seeing the emergence of reliable solutions that adequately meet the same technical and procurement requirements as proprietary options (full disclosure: I work with NuCivic on DKAN and NuCivic Data).
Learn more about the GovEx open data portal standards project”
Smarter as the New Urban Agenda: A Comprehensive View of the 21st Century City
Book edited by Gil-Garcia, J. Ramon, Pardo, Theresa A., Nam, Taewoo: “This book provides one of the first comprehensive approaches to the study of smart city governments, offering theories and concepts for understanding and researching 21st-century city governments, as well as innovative methodologies for the analysis and evaluation of smart city initiatives. The term “smart city” is now generally used to represent efforts that in different ways describe a comprehensive vision of a city for the present and future. A smarter city infuses information into its physical infrastructure to improve conveniences, facilitate mobility, add efficiencies, conserve energy, improve the quality of air and water, identify problems and fix them quickly, recover rapidly from disasters, collect data to make better decisions, deploy resources effectively and share data to enable collaboration across entities and domains. These and other similar efforts are expected to make cities more intelligent in terms of efficiency, effectiveness, productivity, transparency, and sustainability, among other important aspects. Given this changing social, institutional and technological environment, it seems feasible and desirable to attain smarter cities and, by extension, smarter governments: virtually integrated, networked, interconnected, responsive, and efficient. This book helps build the bridge between sound research and practice expertise in the area of smarter cities, and will be of interest to researchers and students in e-government, public administration, political science, communication, information science, administrative sciences and management, sociology, computer science, and information technology, as well as to government officials and public managers, who will find practical recommendations based on rigorous studies, with insights and guidance for the development, management, and evaluation of complex smart city and smart government initiatives.…(More)”