UK’s Digital Strategy


Executive Summary: “This government’s Plan for Britain is a plan to build a stronger, fairer country that works for everyone, not just the privileged few. …Our digital strategy now develops this further, applying the principles outlined in the Industrial Strategy green paper to the digital economy. The UK has a proud history of digital innovation: from the earliest days of computing to the development of the World Wide Web, the UK has been a cradle for inventions which have changed the world. And from Ada Lovelace – widely recognised as the first computer programmer – to the pioneers of today’s revolution in artificial intelligence, the UK has always been at the forefront of invention. …

Maintaining the UK government as a world leader in serving its citizens online

From personalised services in health, to safer care for the elderly at home, to tailored learning in education and access to culture – digital tools, techniques and technologies give us more opportunities than ever before to improve the vital public services on which we all rely.

The UK is already a world leader in digital government, but we want to go further and faster. The new Government Transformation Strategy published on 9 February 2017 sets out our intention to serve the citizens and businesses of the UK with a better, more coherent experience when using government services online – one that meets the raised expectations set by the many other digital services and tools they use every day. So, we will continue to develop single cross-government platform services, including by working towards 25 million GOV.UK Verify users by 2020 and adopting new services onto the government’s GOV.UK Pay and GOV.UK Notify platforms.

We will build on the ‘Government as a Platform’ concept, ensuring we make greater reuse of platforms and components across government. We will also continue to move towards common technology, ensuring that where it is right we are consuming commodity hardware or cloud-based software instead of building something that is needlessly government specific.

We will also continue to work, across government and the public sector, to harness the potential of digital to radically improve the efficiency of our public services – enabling us to provide a better service to citizens and service users at a lower cost. In education, for example, we will address the barriers faced by schools in regions not connected to appropriate digital infrastructure and we will invest in the Network of Teaching Excellence in Computer Science to help teachers and school leaders build their knowledge and understanding of technology. In transport, we will make our infrastructure smarter, more accessible and more convenient for passengers. At Autumn Statement 2016 we announced that the National Productivity Investment Fund would allocate £450 million from 2018-19 to 2020-21 to trial digital signalling technology on the rail network. And in policing, we will enable police officers to use biometric applications to match fingerprint and DNA from scenes of crime and return results including records and alerts to officers over mobile devices at the crime scene.

Read more about digital government.

Unlocking the power of data in the UK economy and improving public confidence in its use

As part of creating the conditions for sustainable growth, we will take the actions needed to make the UK a world-leading data-driven economy, where data fuels economic and social opportunities for everyone, and where people can trust that their data is being used appropriately.

Data is a global commodity and we need to ensure that our businesses can continue to compete and communicate effectively around the world. To maintain our position at the forefront of the data revolution, we will implement the General Data Protection Regulation by May 2018. This will ensure a shared and higher standard of protection for consumers and their data.

Read more about data….(More)”

Watchdog to launch inquiry into misuse of data in politics


Alice Gibbs and others in The Guardian: “The UK’s privacy watchdog is launching an inquiry into how voters’ personal data is being captured and exploited in political campaigns, cited as a key factor in both the Brexit and Trump victories last year.

The intervention by the Information Commissioner’s Office (ICO) follows revelations in last week’s Observer that a technology company part-owned by a US billionaire played a key role in the campaign to persuade Britons to vote to leave the European Union.

It comes as privacy campaigners, lawyers, politicians and technology experts express fears that electoral laws are not keeping up with the pace of technological change.

“We are conducting a wide assessment of the data-protection risks arising from the use of data analytics, including for political purposes, and will be contacting a range of organisations,” an ICO spokeswoman confirmed. “We intend to publicise our findings later this year.”

The ICO spokeswoman confirmed that it had approached Cambridge Analytica over its apparent use of data following the story in the Observer. “We have concerns about Cambridge Analytica’s reported use of personal data and we are in contact with the organisation,” she said….

In the US, companies are free to use third-party data without seeking consent. But Gavin Millar QC, of Matrix Chambers, said this was not the case in Europe. “The position in law is exactly the same as when people would go canvassing from door to door,” Millar said. “They have to say who they are, and if you don’t want to talk to them you can shut the door in their face. That’s the same principle behind the Data Protection Act. It’s why if telephone canvassers ring you, they have to say that whole long speech. You have to identify yourself explicitly.”…

Dr Simon Moores, visiting lecturer in the applied sciences and computing department at Canterbury Christ Church University and a technology ambassador under the Blair government, said the ICO’s decision to shine a light on the use of big data in politics was timely.

“A rapid convergence in the data mining, algorithmic and granular analytics capabilities of companies like Cambridge Analytica and Facebook is creating powerful, unregulated and opaque ‘intelligence platforms’. In turn, these can have enormous influence to affect what we learn, how we feel, and how we vote. The algorithms they may produce are frequently hidden from scrutiny and we see only the results of any insights they might choose to publish.” …(More)”

Americans have lost faith in institutions. That’s not because of Trump or ‘fake news.’


Bill Bishop in the Washington Post: “…Trust in American institutions, however, has been in decline for some time. Trump is merely feeding on that sentiment.

The leaders of once-powerful institutions are desperate to resurrect the faith of the people they serve. They act like they have misplaced a credit card and must find the number so that a replacement can be ordered and then FedEx-ed, if possible overnight.

But that delivery truck is never coming. The decline in trust isn’t because of what the press (or politicians or scientists) did or didn’t do. Americans didn’t lose their trust because of some particular event or scandal. And trust can’t be regained with a new app or even an outbreak of competence. To believe so is to misunderstand what was lost.

In 1964, 3 out of 4 Americans trusted their government to do the right thing most of the time. By 1976, that number had dropped to 33 percent. It was a decline that political scientist Walter Dean Burnham described as “among the largest ever recorded in opinion surveys.”…

Everything about modern life works against community and trust. Globalization and urbanization put people in touch with the different and the novel. Our economy rewards initiative over conformity, so that the weight of convention and tradition doesn’t squelch the latest gizmo from coming to the attention of the next Bill Gates. Whereas parents in the 1920s said it was most important for their children to be obedient, that quality has declined in importance, replaced by a desire for independence and autonomy. Widespread education gives people the tools to make up their own minds. And technology offers everyone the chance to be one’s own reporter, broadcaster and commentator.

We have become, in Polish sociologist Zygmunt Bauman’s description, “artists of our own lives,” ignoring authorities and booting traditions while turning power over to the self. The shift in outlook has been all-encompassing. It has changed the purpose of marriage (once a practical arrangement, now a means of personal fulfillment). It has altered the relationship between citizens and the state (an all-volunteer fighting force replacing the military draft). It has transformed the understanding of art (craftsmanship and assessment are out; free-range creativity and self-promotion are in). It has even inverted the orders of humanity and divinity (instead of obeying a god, now we choose one).

People enjoy their freedoms. There’s no clamoring for a return to gray flannel suits and deferential housewives. Constant social retooling and choice come with costs, however. Without the authority and guidance of institutions to help order their lives, many people feel overwhelmed and adrift. “Depression is truly our modern illness,” writes French sociologist Alain Ehrenberg, with rates 20 to 30 times what they were just two generations ago.

Sustained collective action has also become more difficult. Institutions are turning to behavioral “nudges,” hoping to move an increasingly suspicious public to do what once could be accomplished by command or law. As groups based on tradition and consistent association dwindle, they are being replaced by “event communities,” temporary gatherings that come and go without long-term commitment (think Burning Man). The protests spawned by Trump’s election are more about passion than organization and focus. Today’s demonstrations are sometimes compared to civil-rights-era marches, but they have more in common with L.A.’s Sunset Strip riots of 1966, when more than 1,000 young people gathered to object to a 10 p.m. curfew. “There’s something happening here,” goes the Buffalo Springfield song “For What It’s Worth,” commemorating the riots. “What it is ain’t exactly clear.” In our new politics, expression is a purpose itself….(More)”.

Digital Media Integration for Participatory Democracy


Book by Rocci Luppicini and Rachel Baarda: “Digital technology has revitalized the landscape of political affairs. As e-government continues to become more prominent in society, conducting further research in this realm is vital to promoting democratic advancements.

Digital Media Integration for Participatory Democracy provides a comprehensive examination of the latest methods and trends used to engage citizens with the political world through new information and communication technologies. Highlighting innovative practices and applications across a variety of areas such as technoethics, civic literacy, virtual reality, and social networking, this book is an ideal reference source for government officials, academicians, students, and researchers interested in the enhancement of citizen engagement in modern democracies….(More)”

The Whatsapp-inspired, Facebook-investor funded app tackling India’s doctor shortage


From TechInAsia: “A problem beyond India’s low doctor-to-patient ratio is the distribution of those doctors. Most, particularly specialists, congregate in bigger cities and get seen by patients in the surrounding areas. Only 19 percent of specialists are available in community health centers across India, and most fall well below the country’s requirement for specialists. Community health centers are located in smaller towns and help patients in the area decide if they need to visit a larger, better-equipped city facility….

The IIT-Madras grad’s company, DocsApp, co-founded with fellow IIT-Madras alum Enbasekar D (CTO), joins startups like Practo, DocDoc, and Medinfi in helping patients find physicians. However, the app’s main focus is specialists, and it lets patients chat with doctors and get consultations.

DocsApp’s name is directly inspired by WhatsApp. As long as you have a chat screen on your phone, you can input your problems and location, find a doctor, and ask questions. A user can pay for his or her own appointment over mobile. If treatment requires a physical visit, the user’s money is refunded….

Doctor profiles include the physician’s experience, medical council ID, patient reviews, specialty, and languages – DocsApp covers 17 different languages. DocsApp has 1,200 doctors in 15 specialties. All doctors on the platform are verified through a certification check, an interview, and a facilities review.

If a consultation reveals that a patient needs a prescription, the doctor can provide a digitally-signed e-prescription. DocsApp can deliver medicines within two days to any location in India, says Satish.

Once a user has access to one of the doctors, he or she can message the doctor 24/7 and get a response in 30 minutes – Satish says that the company’s average is now 18 minutes. The team of 55 is aiming for a minute or less….

Telemedicine is one of the ways tech is combatting India’s doctor shortage. Other startups in the industry in the country include Visit, which focuses on both physical and mental health, and SeeDoc, a physician video consultation app.

A chat is a little less personal than a physical visit, which can open the door for patients who want to discuss more taboo topics in India, like mental health and fertility questions. Satish adds that women who live in locations where it’s best to be accompanied by a man when going out also find convenience, as they don’t necessarily need to wait for a husband to come back from work before addressing a medical question about their child…(More)”.

Restoring Trust in Expertise


Minouche Shafik at Project Syndicate: “…public confidence in experts is at a crossroads. With news becoming more narrowly targeted to individual interests and preferences, and with people increasingly choosing whom to trust and follow, the traditional channels for sharing expertise are being disrupted. Who needs experts when you have Facebook, Google, Mumsnet, and Twitter?

Actually, we all do. Over the course of human history, the application of expertise has helped tackle disease, reduce poverty, and improve human welfare. If we are to build on this progress, we need reliable experts to whom the public can confidently turn.

Restoring confidence requires, first, that those describing themselves as “experts” embrace uncertainty. Rather than pretending to be certain and risk frequently getting it wrong, commentators should be candid about uncertainty. Over the long term, such an approach will rebuild credibility. A good example of this is the use of “fan charts” in forecasts produced by the Bank of England’s Monetary Policy Committee (MPC), which show the wide range of possible outcomes for issues such as inflation, growth, and unemployment.

Yet conveying uncertainty increases the complexity of a message. This is a major challenge. It is easy to tweet “BoE forecasts 2% growth.” The fan chart’s true meaning – “If economic circumstances identical to today were to prevail on 100 occasions, the MPC’s best collective judgment is that the mature estimate of GDP growth would lie above 2% on 50 occasions and below 2% on 50 occasions” – doesn’t even fit within Twitter’s 140-character limit.
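
To make the fan-chart idea concrete, here is a minimal sketch in Python (not the Bank of England’s methodology; the 2% central estimate and the spread are invented for illustration) showing how a forecast can be reported as percentile bands rather than a single number:

```python
# Illustrative only: summarise a forecast distribution as percentile bands,
# which is the information a fan chart conveys. All numbers are made up.
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical distribution of GDP growth outcomes, in percent.
simulated_growth = rng.normal(loc=2.0, scale=0.8, size=10_000)

bands = {p: np.percentile(simulated_growth, p) for p in (5, 25, 50, 75, 95)}
print(f"Central estimate (median): {bands[50]:.1f}%")
print(f"Central 50% band: {bands[25]:.1f}% to {bands[75]:.1f}%")
print(f"Central 90% band: {bands[5]:.1f}% to {bands[95]:.1f}%")
```

The output is a set of ranges rather than a headline figure – precisely the nuance that is hard to compress into a tweet.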

This underscores the need for sound principles and trustworthy practices to become more widespread as technology changes the way we consume information. Should journalists and bloggers be exposed for reporting or recirculating falsehoods or rumors? Perhaps principles and practices widely used in academia – such as peer review, competitive processes for funding research, transparency about conflicts of interests and financing sources, and requirements to publish underlying data – should be adapted and applied more widely to the world of think tanks, websites, and the media….

Schools and universities will have to do more to educate students to be better consumers of information. Striking research by the Stanford History Education Group, based on tests of thousands of students across the US, described as “bleak” their findings about young people’s ability to evaluate information they encounter online. Fact-checking websites appraising the veracity of claims made by public figures are a step in the right direction, and have some similarities to peer review in academia.

Listening to the other side is crucial. Social media exacerbates the human tendency of groupthink by filtering out opposing views. We must therefore make an effort to engage with opinions that are different from our own and resist algorithmic channeling to avoid difference. Perhaps technology “experts” could code algorithms that burst such bubbles.
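
As a rough illustration of what such a bubble-bursting algorithm might look like (a hypothetical sketch only; the `leaning` field and the item data are invented), a feed could simply interleave items from outside a user’s usual sources:

```python
# Hypothetical sketch: interleave items from outside a user's usual "leaning"
# into a ranked feed, one simple way to counteract algorithmic channeling.
from typing import Dict, List

def diversify(ranked: List[Dict], user_leaning: str, every_n: int = 3) -> List[Dict]:
    """Insert one item from a different leaning after every `every_n` familiar items."""
    familiar = [it for it in ranked if it["leaning"] == user_leaning]
    unfamiliar = [it for it in ranked if it["leaning"] != user_leaning]
    feed, u = [], 0
    for i, item in enumerate(familiar, start=1):
        feed.append(item)
        if i % every_n == 0 and u < len(unfamiliar):
            feed.append(unfamiliar[u])
            u += 1
    return feed

feed = diversify(
    [{"title": "A", "leaning": "left"}, {"title": "B", "leaning": "left"},
     {"title": "C", "leaning": "right"}, {"title": "D", "leaning": "left"},
     {"title": "E", "leaning": "right"}],
    user_leaning="left", every_n=2)
print([item["title"] for item in feed])  # ['A', 'B', 'C', 'D'] – one opposing item mixed in
```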

Finally, the boundary between technocracy and democracy needs to be managed more carefully. Not surprisingly, when unelected individuals steer decisions that have huge social consequences, public resentment may not be far behind. Problems often arise when experts try to be politicians or politicians try to be experts. Clarity about roles – and accountability when boundaries are breached – is essential.

We need expertise more than ever to solve the world’s problems. The question is not how to manage without experts, but how to ensure that expertise is trustworthy. Getting this right is vital: if the future is not to be shaped by ignorance and narrow-mindedness, we need knowledge and informed debate more than ever before….(More)”.

Global Patterns of Synchronization in Human Communications


Alfredo J. Morales, Vaibhav Vavilala, Rosa M. Benito, and Yaneer Bar-Yam in the Journal of the Royal Society Interface: “Social media are transforming global communication and coordination and provide unprecedented opportunities for studying socio-technical domains. Here we study global dynamical patterns of communication on Twitter across many scales. Underlying the observed patterns is both the diurnal rotation of the earth, day and night, and the synchrony required for contingency of actions between individuals. We find that urban areas show a cyclic contraction and expansion that resembles heartbeats linked to social rather than natural cycles. Different urban areas have characteristic signatures of daily collective activities. We show that the differences detected are consistent with a new emergent global synchrony that couples behavior in distant regions across the world. Although local synchrony is the major force that shapes the collective behavior in cities, a larger-scale synchronization is beginning to occur….(More)”.
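
As a toy illustration of the kind of analysis described (not the authors’ actual pipeline; the city names, peak hours, and activity counts below are synthetic), one can compare cities’ average hourly activity profiles and use their correlation as a crude synchrony measure:

```python
# Illustrative sketch: average hourly activity profile per city, then pairwise
# correlation as a rough synchrony score. All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24)

def daily_profile(peak_hour: int, n_days: int = 30) -> np.ndarray:
    """Average hourly activity over n_days, peaking around peak_hour (UTC)."""
    d = np.abs(hours - peak_hour)
    d = np.minimum(d, 24 - d)                          # circular distance in hours
    base = np.exp(-d**2 / (2 * 3.0**2))                # smooth daily peak
    days = base + 0.05 * rng.standard_normal((n_days, 24))  # noisy daily samples
    return days.mean(axis=0)

cities = {"New York": 19, "London": 14, "Tokyo": 5}    # hypothetical peak hours (UTC)
profiles = {c: daily_profile(p) for c, p in cities.items()}

names = list(profiles)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        r = np.corrcoef(profiles[names[i]], profiles[names[j]])[0, 1]
        print(f"{names[i]} vs {names[j]}: r = {r:.2f}")
```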

What Makes for Successful Open Government Co-Creation?


Panthea Lee at Reboot: “The promise of open government is unlocked when diverse actors work together toward a common vision. It requires engagement by citizens, government, civil society, the private sector, and others with a stake in good governance. Yet while collaborators may share values of transparency, participation, accountability, and innovation, the actual practice of co-creating solutions to advance these ideals can be messy…. we’ve surfaced some insights on what leads to successful co-creation; a sample is shared here, illustrated with snapshots from our remarkable partners. The issues each grappled with will be familiar to anyone working in open government, and we hope that their approaches to addressing the issues will inspire. Finally, we were excited to see OGP release draft co-creation standards to help strengthen government and civil society collaborations on the open government agenda, and we hope these stories help illuminate those guidelines.

When setting a vision: Build on existing priorities and opportunities

Successful open government programs don’t start from scratch—they align with existing political mandates and institutional assets. By building upon current initiatives, and taking advantage of windows of political opportunity, initiatives can have more widespread and sustainable wins….(More)”.

AI, machine learning and personal data


Jo Pedder at the Information Commissioner’s Office Blog: “Today sees the publication of the ICO’s updated paper on big data and data protection.

But why now? What’s changed in the two and a half years since we first visited this topic? Well, quite a lot actually:

  • big data is becoming the norm for many organisations, using it to profile people and inform their decision-making processes, whether that’s to determine your car insurance premium or to accept/reject your job application;
  • artificial intelligence (AI) is stepping out of the world of science-fiction and into real life, providing the ‘thinking’ power behind virtual personal assistants and smart cars; and
  • machine learning algorithms are discovering patterns in data that traditional data analysis couldn’t hope to find, helping to detect fraud and diagnose diseases (a simple illustration of this kind of pattern-finding follows below).
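
As a generic illustration of that last point (a sketch with synthetic data, assuming scikit-learn is available; it is not drawn from any ICO guidance), unsupervised anomaly detection is one common way machine learning flags potentially fraudulent transactions:

```python
# Illustrative sketch: flag unusual transactions with an Isolation Forest.
# The feature values below are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Columns: transaction amount, hour of day, days since last transaction
normal = np.column_stack([
    rng.normal(60, 20, 1000),      # typical amounts
    rng.integers(8, 22, 1000),     # daytime activity
    rng.exponential(3, 1000),      # frequent use
])
suspicious = np.array([[4500, 3, 0.01], [3900, 4, 0.02]])  # large, middle-of-night, rapid
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
flags = model.predict(X)           # -1 = anomaly, 1 = normal
print("Flagged rows:", np.where(flags == -1)[0])
```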

The complexity and opacity of these types of processing operations mean that it’s often hard to know what’s going on behind the scenes. This can be problematic when personal data is involved, especially when decisions are made that have significant effects on people’s lives. The combination of these factors has led some to call for new regulation of big data, AI and machine learning, to increase transparency and ensure accountability.

In our view though, whilst the means by which the processing of personal data are changing, the underlying issues remain the same. Are people being treated fairly? Are decisions accurate and free from bias? Is there a legal basis for the processing? These are issues that the ICO has been addressing for many years, through oversight of existing European data protection legislation….(More)”

When the Big Lie Meets Big Data


Peter Bruce in Scientific American: “…The science of predictive modeling has come a long way since 2004. Statisticians now build “personality” models and tie them into other predictor variables. … One such model bears the acronym “OCEAN,” standing for the personality characteristics (and their opposites) of openness, conscientiousness, extroversion, agreeableness, and neuroticism. Using Big Data at the individual level, machine learning methods might classify a person as, for example, “closed, introverted, neurotic, not agreeable, and conscientious.”
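
As an illustrative sketch of the kind of modelling described here (synthetic data throughout; this is not Cambridge Analytica’s actual model or data), one could train one classifier per OCEAN trait on behavioural features and read off a high/low profile:

```python
# Illustrative sketch: one binary classifier per OCEAN trait, trained on
# synthetic behavioural features. Assumes scikit-learn is available.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(1)
TRAITS = ["openness", "conscientiousness", "extroversion", "agreeableness", "neuroticism"]

# Hypothetical behavioural features (e.g. page likes, posting frequency, purchases).
X = rng.normal(size=(500, 20))
# Synthetic high/low labels per trait (in practice these would come from survey-linked data).
y = (X @ rng.normal(size=(20, 5)) + rng.normal(scale=0.5, size=(500, 5)) > 0).astype(int)

model = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X, y)

new_person = rng.normal(size=(1, 20))
pred = model.predict(new_person)[0]
profile = {t: ("high" if p else "low") for t, p in zip(TRAITS, pred)}
print(profile)  # e.g. {'openness': 'low', 'extroversion': 'high', ...}
```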

Alexander Nix, CEO of Cambridge Analytica (owned by Trump’s chief donor, Rebekah Mercer), says he has thousands of data points on you, and every other voter: what you buy or borrow, where you live, what you subscribe to, what you post on social media, etc. At a recent Concordia Summit, using the example of gun rights, Nix described how messages will be crafted to appeal specifically to you, based on your personality profile. Are you highly neurotic and conscientious? Nix suggests the image of a sinister gloved hand reaching through a broken window.

In his presentation, Nix noted that the goal is to induce behavior, not communicate ideas. So where does truth fit in? Johan Ugander, Assistant Professor of Management Science at Stanford, suggests that, for Nix and Cambridge Analytica, it doesn’t. In counseling the hypothetical owner of a private beach how to keep people off his property, Nix eschews the merely factual “Private Beach” sign, advocating instead a lie: “Sharks sighted.” Ugander, in his critique, cautions all data scientists against “building tools for unscrupulous targeting.”

The warning is needed, but may be too late. What Nix described in his presentation involved carefully crafted messages aimed at his target personalities. His messages pulled subtly on various psychological strings to manipulate us, and they obeyed no boundary of truth, but they required humans to create them.  The next phase will be the gradual replacement of human “craftsmanship” with machine learning algorithms that can supply targeted voters with a steady stream of content (from whatever source, true or false) designed to elicit desired behavior. Cognizant of the Pandora’s box that data scientists have opened, the scholarly journal Big Data has issued a call for papers for a future issue devoted to “Computational Propaganda.”…(More)”