Personalised Health and Care 2020: Using Data and Technology to Transform Outcomes for Patients and Citizens


Report and Framework of Action by the UK National Information Board: “One of the greatest opportunities of the 21st century is the potential to safely harness the power of the technology revolution, which has transformed our society, to meet the challenges of improving health and providing better, safer, sustainable care for all. To date the health and care system has only begun to exploit the potential of using data and technology at a national or local level. Our ambition is for a health and care system that enables people to make healthier choices, to be more resilient, to deal more effectively with illness and disability when it arises, and to have happier, longer lives in old age; a health and care system where technology can help tackle inequalities and improve access to services for the vulnerable.
The purpose of this paper is to consider what progress the health and care system has already made and what can be learnt from other industries and the wider economy…”

How to use the Internet to end corrupt deals between companies and governments


Stella Dawson at the Thomson Reuters Foundation: “Every year governments worldwide spend more than $9.5 trillion on public goods and services, but finding out who won those contracts, why and whether they deliver as promised is largely invisible.
Enter the Open Contracting Data Standard (OCDS).
Canada, Colombia, Costa Rica and Paraguay became the first countries to announce on Tuesday that they have adopted the new global standards for publishing contracts online as part of a project to shine a light on how public money is spent and to combat massive corruption in public procurement.
“The mission is to end secret deals between companies and governments,” said Gavin Hayman, the incoming executive director for Open Contracting Partnership.
The concept is simple. Under Open Contracting, the government publishes online the projects it is putting out for bid and the terms; companies submit bids online; the winning contract is published including the reasons why; and then citizens can monitor performance according to the terms of the contract.
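The cycle described above lends itself to a single public record per procurement. Below is a minimal, illustrative sketch in Python; the field names are loosely modelled on the OCDS but are simplified, and a real OCDS release carries many more required fields (release dates, parties, identifiers and so on).

```python
# One illustrative open-contracting record covering the stages described
# above: tender publication, award (with rationale), contract, and
# performance monitoring. Values are invented for the example.
record = {
    "ocid": "ocds-example-0001",
    "tender": {
        "title": "Road resurfacing, District 4",
        "value": {"amount": 250000, "currency": "USD"},
        "status": "complete",
    },
    "awards": [{
        "suppliers": [{"name": "Example Construction Ltd"}],
        "value": {"amount": 238000, "currency": "USD"},
        "rationale": "Lowest compliant bid",
    }],
    "contracts": [{
        "status": "active",
        "implementation": {"milestones": [{"title": "Phase 1", "status": "met"}]},
    }],
}

# Because every stage lives in one public record, a citizen monitor can
# compare what was promised at tender with what was actually awarded.
awarded = record["awards"][0]["value"]["amount"]
tendered = record["tender"]["value"]["amount"]
print(f"Awarded at {awarded / tendered:.0%} of the tender estimate")
```

The point of the standard is exactly this kind of cross-stage comparison: because tender, award and contract share one structure, a monitoring script can be run unchanged against any publisher's data.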
The Open Contracting initiative, developed by the World Wide Web Foundation with the support of the World Bank and Omidyar Network, has been several years in the making and is part of a broader global movement to increase the accountability of governments by using Internet technologies to make them more transparent.
A pioneer in data transparency was the Extractive Industries Transparency Initiative, a global coalition of governments, companies and civil society that works on improving accountability by publishing the revenues received in 35 member countries for their natural resources.
Publish What You Fund is a similar initiative for the aid industry. It delivered a common open standard in 2011 for donor countries to publish how much money they gave in development aid and details of which projects that money funded and where.
There’s also the Open Government Partnership, an international forum of 65 countries, each of which adopts an action plan laying out how it will improve the quality of government through collaboration with civil society, frequently using new technologies.
All of these initiatives have helped crack open the door of government.
What’s important about Open Contracting is the sheer scale of impact it could have. Public procurement accounts for about 15 percent of global GDP and according to Anne Jellema, CEO of the World Wide Web Foundation which seeks to expand free access to the web worldwide and backed the OCDS project, corruption adds an estimated $2.3 trillion to the cost of those contracts every year.
A study by the Center for Global Development, a Washington-based think tank, looked at four countries already publishing their contracts online — the United Kingdom, Georgia, Colombia and Slovakia. It found that open contracting increased visibility and encouraged more companies to submit bids, that quality and price competitiveness improved, and that citizen monitoring led to better service delivery….”
 

Cities Find Rewards in Cheap Technologies


Nanette Byrnes at MIT Technology Review: “Cities around the globe, whether rich or poor, are in the midst of a technology experiment. Urban planners are pulling data from inexpensive sensors mounted on traffic lights and park benches, and from mobile apps on citizens’ smartphones, to analyze how their cities really operate. They hope the data will reveal how to run their cities better and improve urban life. City leaders and technology experts say that managing the growing challenges of cities well and affordably will be close to impossible without smart technology.
Fifty-four percent of humanity lives in urban centers, and almost all of the world’s projected population growth over the next three decades will take place in cities, including many very poor cities. Because of their density and often strained infrastructure, cities have an outsize impact on the environment, consuming two-thirds of the globe’s energy and contributing 70 percent of its greenhouse-gas emissions. Urban water systems are leaky. Pollution levels are often extreme.
But cities also contribute most of the world’s economic production. Thirty percent of the world’s economy and most of its innovation are concentrated in just 100 cities. Can technology help manage rapid population expansion while also nurturing cities’ all-important role as an economic driver? That’s the big question at the heart of this Business Report.
Selling answers to that question has become a big business. IBM, Cisco, Hitachi, Siemens, and others have taken aim at this market, publicizing successful examples of cities that have used their technology to tackle the challenges of parking, traffic, transportation, weather, energy use, water management, and policing. Cities already spend a billion dollars a year on these systems, and that’s expected to grow to $12 billion a year or more in the next 10 years.
To justify this kind of outlay, urban technologists will have to move past the test projects that dominate discussions today. Instead, they’ll have to solve some of the profound and growing problems of urban living. Cities leaning in that direction are using various technologies to ease parking, measure traffic, and save water (see “Sensing Santander”), reduce rates of violent crime (see “Data-Toting Cops”), and prepare for ever more severe weather patterns.
There are lessons to be learned, too, from cities whose grandiose technological ideas have fallen short, like the eco-city initiative of Tianjin, China (see “China’s Future City”), which has few residents despite great technology and deep government support.
The streets are similarly largely empty in the experimental high-tech cities of Songdo, South Korea; Masdar City, Abu Dhabi; and Paredes, Portugal, which are being designed to have minimal impact on the environment and offer high-tech conveniences such as solar-powered air-conditioning and pneumatic waste disposal systems instead of garbage trucks. Meanwhile, established cities are taking a much more incremental, less ambitious, and perhaps more workable approach, often benefiting from relatively inexpensive and flexible digital technologies….”

Principles for 21st Century Government


Dan Hon at Code for America: “I’m proud to share the beta of our principles for 21st century government. In this update, we’ve incorporated feedback we received from the 2014 Summit, as well as work from the U.S. Digital Service and Gov.UK that we think applies to the problems faced by local governments.
In the last few decades, the combination of agile and lean ways of working with digital technology and the internet has allowed businesses to serve people’s needs better than ever before. When people interact with their government, though, it’s clear that their expectations aren’t being met.
Part of our work at Code for America is to make building digital government easy to understand and easy to copy.
We believe these seven principles help governments understand the values required to build digital government. They are critical for governments of any size or structure to deliver more effective, efficient, and inclusive services to their community. We’ve seen their importance over the last four years, in 32 Fellowship cities big and small across America, and in conversation with those around the world who have been transforming government.
In the past, we’ve described these concepts as “capabilities” — the abilities of governments to work or act in a certain way. But we have realised that there is something more fundamental than just the ability to work or act in a certain way.
We call these principles because it is only when governments agree to, follow, and adopt them at every level that they genuinely change and improve the way they work. Together, they provide a clear sense of direction that can then be acted upon….”

Spain is trialling city monitoring using sound


Springwise: “There’s more traffic on today’s city streets than ever before, and managing it all can prove to be a headache for local authorities and transport bodies. In the past, we’ve seen the City of Calgary in Canada detect drivers’ Bluetooth signals to develop a map of traffic congestion. Now the EAR-IT project in Santander, Spain, is using acoustic sensors to measure the sounds of city streets and determine real-time activity on the ground.
Launched as part of the autonomous community’s SmartSantander initiative, the experimental scheme placed hundreds of acoustic processing units around the region. These pick up the sounds being made in any given area and, when processed through an audio recognition engine, can provide data about what’s going on on the street. Smaller ‘motes’ were also developed to provide more accurate location information about each sound.
Created by members of Portugal’s UNINOVA institute and IT consultants EGlobalMark, the system was able to use city noises to detect things such as traffic congestion, parking availability and the location of emergency vehicles based on their sirens. It could then automatically trigger smart signs to display up-to-date information, for example.
The team particularly focused on a junction near the city hospital that’s a hotspot for motor accidents. Rather than force ambulance drivers to risk passing through a red light and into lateral traffic, the sensors were able to detect when and where an emergency vehicle was coming through and automatically change the lights in their favor.
The system could also be used to pick up ‘sonic events’ such as gunshots or explosions and detect their location. The researchers have also trialled an indoor version that can sense whether an elderly resident has fallen over, or turn lights off when a room falls silent.”
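The examples above all follow the same pattern: a recognised sound event triggers a city-level action. A much-simplified sketch of that trigger layer is below; the classifier is a stub keyed on a label, whereas the real EAR-IT system runs an audio recognition engine over live microphone data, and the action strings here are purely illustrative.

```python
# Map a recognised sound event to a city-level action, as described above:
# sirens change traffic lights, congestion updates smart signs, gunshots
# raise an alert, and indoor silence turns lights off.
def react_to_event(label: str, location: str) -> str:
    actions = {
        "siren": f"set lights green along approach to {location}",
        "gunshot": f"alert police, event detected at {location}",
        "heavy_traffic": f"update smart signs near {location}",
        "silence": f"turn off lights at {location}",
    }
    return actions.get(label, "no action")

# e.g. an ambulance approaching the accident-prone hospital junction:
print(react_to_event("siren", "hospital junction"))
```

In the deployed system the interesting work is in the recognition step (distinguishing a siren from general traffic noise) and in localising the source via the smaller ‘motes’; the action mapping itself can stay this simple.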

Politics, Policy and Privatisation in the Everyday Experience of Big Data in the NHS


Chapter by Andrew Goffey, Lynne Pettinger and Ewen Speed in Martin Hand and Sam Hillyard (eds.), Big Data? Qualitative Approaches to Digital Research (Studies in Qualitative Methodology, Volume 13): “This chapter explains how fundamental organisational change in the UK National Health Service (NHS) is being effected by new practices of digitised information gathering and use. It analyses the taken-for-granted IT infrastructures that lie behind digitisation and considers the relationship between digitisation and big data.
Design/methodology/approach

Qualitative research methods including discourse analysis, ethnography of software and key informant interviews were used. Actor-network theories, as developed by Science and Technology Studies (STS) researchers, were used to inform the research questions, data gathering and analysis. The chapter focuses on the aftermath of legislation to change the organisation of the NHS.

Findings

The chapter shows the benefits of qualitative research into specific manifestations of information technology. It explains how apparently ‘objective’ and ‘neutral’ quantitative data gathering and analysis is mediated by complex software practices. It considers the political power of claims that data is neutral.

Originality/value

The chapter provides insight into a specific case of healthcare data. It makes explicit the role of politics and the State in digitisation and shows how STS approaches can be used to understand political and technological practice.”

Gov.uk quietly disrupts the problem of online identity login


The Guardian: “A new “verified identity” scheme for gov.uk is making it simpler to apply for a new driving licence, passport or to file a tax return online, allowing users to register securely using one login that connects and securely stores their personal data.
After nearly a year of closed testing with a few thousand Britons, the “Gov.UK Verify” scheme quietly opened to general users on 14 October, expanding across more services. It could have as many as half a million users within a year.
The most popular services are expected to be one for tax credit renewals, and CAP farm information — both expected to have around 100,000 users by April next year, together making up nearly half of total use.
The team behind the system claim this is a world first. Those countries that have developed advanced government services online, such as Estonia, rely on state identity cards – which the UK has rejected.
“This is a federated model of identity, not a centralised one,” said Janet Hughes, head of policy and engagement at the Government Digital Service’s identity assurance program, which developed and tested the system.
How it works
The Verify system has taken three years to develop, and involves checking a user’s identity against details from a range of sources, including credit reference agencies, utility bills, driving licences and mobile provider bills.
But it does not retain those pieces of information, and the credit checking companies do not know what service is being used. Only a mobile or landline number is kept in order to send verification codes for subsequent logins.
When people subsequently log in, they provide a user ID and password, and verify their identity by entering a code sent to the stored phone number.
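The returning-user flow described above amounts to a password check plus a one-time code texted to the number kept on file. A hypothetical sketch is below; the names, storage and parameters are illustrative only, and Verify's actual implementation is not public in this detail.

```python
import hashlib
import hmac
import secrets

# Illustrative user record: only a password hash and a phone number are
# retained, mirroring the article's point that identity documents are
# checked at enrolment but not stored.
def hash_password(password: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = secrets.token_bytes(16)
user = {
    "user_id": "alice",
    "password_hash": hash_password("correct horse", salt),
    "phone": "+44…",  # used only to send verification codes
}

def login(user_id: str, password: str, submitted_code: str, sent_code: str) -> bool:
    """Step 1: user ID + password. Step 2: one-time code sent to the phone."""
    ok_password = hmac.compare_digest(
        user["password_hash"], hash_password(password, salt)
    )
    ok_code = hmac.compare_digest(submitted_code, sent_code)
    return user_id == user["user_id"] and ok_password and ok_code

code = f"{secrets.randbelow(10**6):06d}"  # code texted to the stored number
print(login("alice", "correct horse", code, code))  # succeeds
print(login("alice", "wrong pass", code, code))     # fails
```

Constant-time comparisons (`hmac.compare_digest`) are used for both factors so that a failed check leaks nothing about how close the guess was.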
To enrol in the system, users have to be over 19, living in the UK, and have been resident for over 12 months. A faked passport would not be sufficient: “they would need a very full false ID, and have to not appear on any list of fraudulent identities,” one source at the GDS told the Guardian.
Banks now following gov.uk’s lead
Government developers are confident that it presents a higher barrier to authentication than any other digital service – so that fraudulent transactions will be minimised. That has interested banks, which are understood to be expressing interest in using the same service to verify customer identities through an arms-length verification system.
The government system would not pass on people’s data, but would instead verify that someone is who they claim to be, much like Twitter and Facebook verify users’ identity to log in to third party sites, yet don’t share their users’ data.
The US, Canada and New Zealand have also expressed interest in following the UK’s lead with the system, in which users provide separate pieces of verified information about themselves from different sources.
The system then cross-references that verified information with credit reference agencies and other sources, which can include a mobile phone provider, passport, bank account, utility bill or driving licence.
The level of confidence in an individual’s identity is split into four levels. The lowest is for the creation of simple accounts to receive reports or updates: “we don’t need to know who it is, only that it’s the same person returning,” said Hughes.
Level 2 requires that “on the balance of probability” someone is who they say they are – which is the level to which Verify will be able to identify people. Hughes says that this will cover the majority of services.
Level 3 requires identity “beyond reasonable doubt” – perhaps including the first application for a passport – and Level 4 would require biometric information to confirm individual identity.
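The four levels described above form an ordered scale, so a service can state a minimum level and accept anything at or above it. A small illustrative encoding (names paraphrased from the article, not an official API):

```python
from enum import IntEnum

# The four assurance levels as described in the article, lowest to highest.
class AssuranceLevel(IntEnum):
    RETURNING_USER = 1           # same person returning; identity not known
    BALANCE_OF_PROBABILITY = 2   # Verify's level, covering most services
    BEYOND_REASONABLE_DOUBT = 3  # e.g. a first passport application
    BIOMETRIC = 4                # biometric confirmation required

def meets(required: AssuranceLevel, achieved: AssuranceLevel) -> bool:
    """A service accepts any assurance level at or above its requirement."""
    return achieved >= required

# A Level 2 service accepts a user verified to Level 2 or higher:
print(meets(AssuranceLevel.BALANCE_OF_PROBABILITY, AssuranceLevel.BIOMETRIC))
```

Using `IntEnum` keeps the ordering explicit, so comparisons like `achieved >= required` read exactly as the policy does.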

Could digital badges clarify the roles of co-authors?


At AAAS Science Magazine: “Ever look at a research paper and wonder how the half-dozen or more authors contributed to the work? After all, it’s usually only the first or last author who gets all the media attention or the scientific credit when people are considered for jobs, grants, awards, and more. Some journals try to address this issue with the “authors’ contributions” sections within a paper, but a collection of science, publishing, and software groups is now developing a more modern solution—digital “badges,” assigned on publication of a paper online, that detail what each author did for the work and that the authors can link to their profiles elsewhere on the Web.


Those organizations include publishers BioMed Central and the Public Library of Science; The Wellcome Trust research charity; software development groups Mozilla Science Lab (a group of researchers, developers, librarians, and publishers) and Digital Science (a software and technology firm); and ORCID, an effort to assign researchers digital identifiers. The collaboration presented its progress on the project at the Mozilla Festival in London that ended last week. (Mozilla is the open software community behind the Firefox browser and other programs.)
The infrastructure of the badges is still being established, with early prototypes scheduled to launch early next year, according to Amye Kenall, the journal development manager of open data initiatives and journals at BioMed Central. She envisions the badge process in the following way: Once an article is published, the publisher would alert software maintained by Mozilla to automatically set up an online form, where authors fill out roles using a detailed contributor taxonomy. After the authors have completed this, the badges would then appear next to their names on the journal article, and double-clicking on a badge would lead to the ORCID site for that particular author, where the author’s badges, integrated with their publishing record, live….
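The workflow Kenall describes — published article, role form filled from a contributor taxonomy, badges linking back to ORCID — implies a simple per-author record. A hypothetical sketch is below; the field names and role labels are illustrative, not the prototype's actual schema (the ORCID iDs used are ORCID's own published example iDs).

```python
# Illustrative post-publication record: each author lists roles from a
# contributor taxonomy, and each badge links to that author's ORCID profile.
article = {
    "doi": "10.1234/example",  # invented DOI for the sketch
    "authors": [
        {"name": "A. Researcher",
         "orcid": "0000-0002-1825-0097",
         "roles": ["software", "data curation", "writing"]},
        {"name": "B. Supervisor",
         "orcid": "0000-0001-5109-3700",
         "roles": ["funding acquisition", "supervision"]},
    ],
}

def badge_link(author: dict) -> str:
    """Double-clicking a badge would lead to the author's ORCID page."""
    return f"https://orcid.org/{author['orcid']}"

for a in article["authors"]:
    print(a["name"], "→", ", ".join(a["roles"]), "|", badge_link(a))
```

Keeping roles as structured data rather than free text is what makes the badges machine-readable, so they can be aggregated across an author's whole publishing record.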
The parties behind the digital badge effort are “looking to change behavior” of scientists in the competitive dog-eat-dog world of academia by acknowledging contributions, says Kaitlin Thaney, director of Mozilla Science Lab. Amy Brand, vice president of academic and research relations and VP of North America at Digital Science, says that the collaboration believes that the badges should be optional, to accommodate old-fashioned or less tech-savvy authors. She says that the digital credentials may improve lab culture, countering situations where junior scientists are caught up in lab politics and the “star,” who didn’t do much of the actual research apart from obtaining the funding, gets to be the first author of the paper and receive the most credit. “All of this calls out for more transparency,” Brand says….”

Open Data – Searching for the right questions


Talk by Boyan Yurukov at TEDxBG: “Working on various projects, Boyan started a sort of quest for better transparency. It came with the promise of access that would yield answers to what is wrong and what is right with governments today. Over time, he realised that better transparency and more open data bring us almost no relevant answers. Instead, we get more questions, and that’s great news. Questions help us see what is relevant, what is hidden, what our assumptions are. That’s the true value of data.
Boyan Yurukov is a software engineer and open data advocate based in Frankfurt. He graduated in Computational Engineering with Data Mining from TU Darmstadt and is involved in data liberation, crowdsourcing and visualisation projects focused on various issues in Bulgaria, as well as open data legislation….

City slicker


The Economist on how “Data are slowly changing the way cities operate…WAITING for a bus on a drizzly winter morning is miserable. But for London commuters Citymapper, an app, makes it a little more bearable. Users enter their destination into a search box and a range of different ways to get there pop up, along with real-time information about when a bus will arrive or when the next Tube will depart. The app is an example of how data are changing the way people view and use cities. Local governments are gradually starting to catch up.
Nearly all big British cities have started to open up access to their data. On October 23rd the second version of the London Datastore, a huge trove of information on everything from crime statistics to delays on the Tube, was launched. In April Leeds City council opened an online “Data Mill” which contains raw data on such things as footfall in the city centre, the number of allotment sites or visits to libraries. Manchester also releases chunks of data on how the city region operates.
Mostly these websites act as tools for developers and academics to play around with. Since the first Datastore was launched in 2010, around 200 apps, such as Citymapper, have sprung up. Other initiatives have followed. “Whereabouts”, which also launched on October 23rd, is an interactive map by the Future Cities Catapult, a non-profit group, and the Greater London Authority (GLA). It uses 235 data sets, some 150 of them from the Datastore, from the age and occupation of London residents to the number of pubs or types of restaurants in an area. In doing so it suggests a different picture of London neighbourhoods based on eight different categories (see map, and its website: whereaboutslondon.org)….”