Why citizen input is crucial to the government design process


Mark Forman in NextGov: “…Whether agencies are implementing an application or an enterprisewide solution, end-user input (from both citizens and government workers) is a requirement for success. In fact, the only path to success in digital government runs through the ‘moment of truth’: the point of interaction when a government delivers a service or solves a problem for its citizens.

A recent example illustrates this challenge. A national government recently deployed a new application that enables citizens to submit questions to agency offices using their mobile devices. The mobile application, while functional and working to specifications, failed to address the core issue: Most citizens prefer asking questions via email, an option that was terminated when the new app was deployed.

Digital technologies offer government agencies numerous opportunities to cut costs and improve citizen services. But in the rush to implement new capabilities, IT professionals often neglect to consider fully their users’ preferences, knowledge, limitations and goals.

When developing new ways to deliver services, designers must expand their focus beyond the agency’s own operating interests to ensure they also create a satisfying experience for citizens. If not, the applications will likely be underutilized or even ignored, thus undermining the anticipated cost-savings and performance gains that set the project in motion.

Government executives also must recognize that merely relying on user input creates a risk of “paving the cowpath”: innovations cannot significantly improve the customer experience unless users recognize the value of new technologies in simplifying a task, making it more worthwhile, or eliminating it altogether.

Many digital government playbooks and guidance documents direct IT organizations to create a satisfying citizen experience by incorporating user-centered design (UCD) methodology into their projects. UCD is a process for ensuring a new solution or tool is designed from the perspective of its users. Rather than forcing government workers or the public to adapt to the new solution, UCD helps create a solution tailored to their abilities, preferences and needs…. Effective UCD is built upon four primary principles or guidelines:

  • Focus on the moment of truth. A new application or service must be something citizens actually want and need, delivered through a channel they use, not merely something easy to use.
  • Optimize outcomes, not just processes. True transformation occurs when citizens’ expectations and needs remain the constant center of focus. Merely overlaying new technology on business as usual may provide a prettier interface, but success requires a clear benefit for the public at the moment of truth in the interaction with government.
  • Evolve processes over time to help citizens adapt to new applications. In most instances, citizens will make a smoother transition to new services when processes are changed gradually to be more intuitive rather than with an abrupt, flip-of-the-switch approach.
  • Combine UCD with robust DevOps. Agencies need a strong DevOps process to incorporate what they learn about citizens’ preferences and needs as they develop, test and deploy new citizen services….(More)”

Three ways to grow the open data economy


Nigel Shadbolt in The Guardian: “…here are three areas where action by the UK government can help to support and promote a flourishing open data economy:

Strengthen our data infrastructure

We are used to thinking of areas like transport and energy requiring physical infrastructure. From roads and rail networks to the national grid and power stations, we understand that investment and management of these vital parts of an infrastructure are essential to the economic wellbeing and future prosperity of the nation.

This is no less true of key data assets. Our data infrastructure is a core part of our national infrastructure. From lists of legally constituted companies to the country’s geospatial data, our data infrastructure needs to be managed, maintained, in some cases built and in all cases made as widely available as possible.

To maximise the benefits to the UK’s economy and to reduce costs in delivery of public services, the data we rely on needs to be adaptable, trustworthy, and as open as possible….

While we do have some excellent examples of infrastructure data from the likes of Companies House, Land Registry, Ordnance Survey and Defra, core parts of the data infrastructure that we need within the UK are missing, unreliable, or of a low quality. The government must invest here just as it invests in our other traditional infrastructure.

Support and promote data innovation

If we are to make best use of data, we need a bridge between academic research, public, private and third sectors, and a thriving startup ecosystem where new ideas and approaches can grow.

We have learned that properly targeted challenges can not only identify potential savings for government – similar to Prescribing Analytics, an ODI-incubated startup that used publicly available data to identify £200m per year in potential NHS prescription savings – but, more importantly, translate that potential into procurable products and services that can deliver those savings.

A data challenge series run at a larger scale, funded by Innovate UK, openly contested and independently managed, would stimulate the creation of new companies, jobs, products and services. It would also act as a forcing function to strengthen data infrastructure around key challenges, and raise awareness and capacity for those working to solve them. The data needed to satisfy the challenges would have to be made available and usable, bringing data innovation into government and bolstering the offer of the startups and SMEs who take part.

Invest in data literacy

In order to take advantage of the data revolution, policymakers, businesses and citizens need to understand how to make use of data. In other words, they must become data literate.

Data literacy is needed through our whole educational system and society more generally. Crucially, policymakers are going to need to be informed by insights that can only be gleaned through understanding and analysing data effectively….(More)”

Learning Privacy Expectations by Crowdsourcing Contextual Informational Norms


 at Freedom to Tinker: “The advent of social apps, smartphones and ubiquitous computing has brought a great transformation to our day-to-day life. The incredible pace at which new and disruptive services continue to emerge challenges our perception of privacy. To keep pace with this rapidly evolving cyber reality, we need to devise agile methods and frameworks for developing privacy-preserving systems that align with users’ evolving privacy expectations.

Previous efforts have tackled this with the assumption that privacy norms are provided through existing sources such as law, privacy regulations and legal precedents. They have focused on formally expressing privacy norms and devising a corresponding logic to enable automatic inconsistency checks and efficient enforcement of the logic.

However, because many of the existing regulations and privacy handbooks were enacted well before the Internet revolution took place, they often lag behind and do not adequately cover the information flows of modern systems. For example, the Family Educational Rights and Privacy Act (FERPA) was enacted in 1974, long before Facebook, Google and many other online applications were used in an educational context. More recent legislation faces similar challenges, as novel services introduce new ways to exchange information and consequently shape new, unconsidered information flows that can change our collective perception of privacy.

Crowdsourcing Contextual Privacy Norms

In our work, armed with the theory of Contextual Integrity (CI), we are exploring ways to uncover societal norms by leveraging advances in crowdsourcing technology.

In our recent paper, we present a methodology that we believe can be used to extract a societal notion of privacy expectations. The results can be used to fine-tune existing privacy guidelines as well as to gain a better perspective on users’ expectations of privacy.

CI defines privacy as a collection of norms (privacy rules) that reflect appropriate information flows between different actors. A norm captures who shares what, with whom, in what role, and under which conditions. For example, while you may be comfortable sharing your medical information with your doctor, you might be less inclined to do so with your colleagues.

We use CI as a proxy to reason about privacy in the digital world and a gateway to understanding how people perceive privacy in a systematic way. Crowdsourcing is a great tool for this method. We are able to ask hundreds of people how they feel about a particular information flow, and then we can capture their input and map it directly onto the CI parameters. We used a simple template to write Yes-or-No questions to ask our crowdsourcing participants:

“Is it acceptable for the [sender] to share the [subject’s] [attribute] with [recipient] [transmission principle]?”

For example:

“Is it acceptable for the student’s professor to share the student’s record of attendance with the department chair if the student is performing poorly?”
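
As an illustration of how the template maps CI parameters onto survey questions, here is a minimal sketch; the `ci_question` helper and the second example flow are hypothetical illustrations, not code or questions from the paper itself:

```python
# Sketch: instantiating the CI template as Yes-or-No survey questions.
# The parameter names mirror the template in the excerpt; articles
# ("the") are inserted for readability, and the example flows are
# illustrative, not drawn from the paper's actual question set.

def ci_question(sender, subject, attribute, recipient, principle):
    """Fill the CI template with one contextual information flow."""
    return (f"Is it acceptable for the {sender} to share "
            f"the {subject}'s {attribute} with the {recipient} "
            f"{principle}?")

flows = [
    ("student's professor", "student", "record of attendance",
     "department chair", "if the student is performing poorly"),
    ("patient's doctor", "patient", "medical history",
     "patient's employer", "only with the patient's explicit consent"),
]

questions = [ci_question(*flow) for flow in flows]
for q in questions:
    print(q)
```

Each generated question corresponds to one (sender, subject, attribute, recipient, transmission principle) tuple, so sweeping over combinations of parameter values yields the large question batteries the study deploys.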

In our experiments, we used Amazon’s Mechanical Turk (AMT) to ask 450 turkers over 1,400 such questions. Each question represents a specific contextual information flow that participants can approve, disapprove or mark as “Doesn’t Make Sense”; the last category can be used when 1) the sender is unlikely to have the information, 2) the recipient would already have the information, or 3) the question is ambiguous….(More)”

The Participatory Condition in the Digital Age


Book edited by Darin Barney, Gabriella Coleman, Christine Ross, Jonathan Sterne, and Tamar Tembeck:

The Participatory Condition in the Digital Age

“Just what is the “participatory condition”? It is the situation in which taking part in something with others has become both environmental and normative. The fact that we have always participated does not mean we have always lived under the participatory condition. What is distinctive about the present is the extent to which the everyday social, economic, cultural, and political activities that comprise simply being in the world have been thematized and organized around the priority of participation.

Structured along four axes investigating the relations between participation and politics, surveillance, openness, and aesthetics, The Participatory Condition in the Digital Age comprises fifteen essays that explore the promises, possibilities, and failures of contemporary participatory media practices as related to power; Occupy Wall Street and the Arab Spring uprisings; worker-owned cooperatives for the post-Internet age; paradoxes of participation; media activism; open source projects; participatory civic life; commercial surveillance; contemporary art and design; and education.

This book represents the most comprehensive and transdisciplinary endeavor to date to examine the nature, place, and value of participation in the digital age. Just as in 1979, when Jean-François Lyotard proposed that “the postmodern condition” was characterized by the questioning of historical grand narratives, The Participatory Condition in the Digital Age investigates how participation has become a central preoccupation of our time….(More)”

The Digital City and Mediated Urban Ecologies


 Book by Kristin Scott: “…This book examines the phenomenon of the “digital city” in the US by looking at three case studies: New York City, San Antonio, and Seattle. Kristin Scott considers how digital technologies are increasingly built into the logic and organization of urban spaces and argues that while each city articulates ideals such as those of open democracy, civic engagement, efficient governance, and enhanced security, competing capitalist interests attached to many of these digital technological programs make the “digital city” problematic….(More)”

How to ensure smart cities benefit everyone


 at the Conversation Global: “By 2030, 60 percent of the world’s population is expected to live in mega-cities. How all those people live, and what their lives are like, will depend on important choices leaders make today and in the coming years.

Technology has the power to help people live in communities that are more responsive to their needs and that can actually improve their lives. For example, Beijing, notorious for air pollution, is testing a 23-foot-tall air purifier that vacuums up smog, filters the bad particles and releases clean air.

This isn’t a vision of life like on “The Jetsons.” It’s real urban communities responding in real time to changing weather, times of day and citizen needs. These efforts can span entire communities and range from monitoring traffic to keep cars moving efficiently to measuring air quality to warn residents (or turn on massive air purifiers) when pollution levels climb.

Using data and electronic sensors in this way is often referred to as building “smart cities,” which are the subject of a major global push to improve how cities function. In part a response to incoherent infrastructure design and urban planning of the past, smart cities promise real-time monitoring, analysis and improvement of city decision-making. The results, proponents say, will improve efficiency, environmental sustainability and citizen engagement.

Smart city projects are big investments that are supposed to drive social transformation. Decisions made early in the process determine what exactly will change. But most research and planning regarding smart cities is driven by the technology, rather than the needs of the citizens. Little attention is given to the social, policy and organizational changes required to ensure smart cities are not just technologically savvy but intelligently adaptive to their residents’ needs. Design will make the difference between smart city projects that deliver on their promise and ones that reinforce, or even widen, the existing inequities in how cities serve their residents.

City benefits from efficiency

A key feature of smart cities is that they create efficiency. Well-designed technology tools can benefit government agencies, the environment and residents. Smart cities can improve the efficiency of city services by eliminating redundancies, finding ways to save money and streamlining workers’ responsibilities. The results can provide higher-quality services at lower cost….

Environmental effects

Another way to save money involves real-time monitoring of energy use, which can also identify opportunities for environmental improvement.

The city of Chicago has begun implementing an “Array of Things” initiative by installing boxes on municipal light poles with sensors and cameras that can capture air quality, sound levels, temperature, water levels on streets and gutters, and traffic.

The data collected are expected to serve as a sort of “fitness tracker for the city,” identifying ways to save energy, address urban flooding and improve living conditions.

Helping residents

Perhaps the largest potential benefit from smart cities will come from enhancing residents’ quality of life. The opportunities cover a broad range of issues, including housing and transportation, happiness and optimism, educational services, environmental conditions and community relationships.

Efforts along this line can include tracking and mapping residents’ health, using data to fight neighborhood blight, identifying instances of discrimination and deploying autonomous vehicles to increase residents’ safety and mobility….(More)“.

Big Data Is Not a Monolith


Book edited by Cassidy R. Sugimoto, Hamid R. Ekbia and Michael Mattioli: “Big data is ubiquitous but heterogeneous. Big data can be used to tally clicks and traffic on web pages, find patterns in stock trades, track consumer preferences, and identify linguistic correlations in large corpora of texts. This book examines big data not as an undifferentiated whole but contextually, investigating the varied challenges posed by big data for health, science, law, commerce, and politics. Taken together, the chapters reveal a complex set of problems, practices, and policies.

The advent of big data methodologies has challenged the theory-driven approach to scientific knowledge in favor of a data-driven one. Social media platforms and self-tracking tools change the way we see ourselves and others. The collection of data by corporations and government threatens privacy while promoting transparency. Meanwhile, politicians, policy makers, and ethicists are ill-prepared to deal with big data’s ramifications. The contributors look at big data’s effect on individuals as it exerts social control through monitoring, mining, and manipulation; big data and society, examining both its empowering and its constraining effects; big data and science, considering issues of data governance, provenance, reuse, and trust; and big data and organizations, discussing data responsibility, “data harm,” and decision making….(More)”

Tackling Corruption with People-Powered Data


Sandra Prüfer at Mastercard Center for Inclusive Growth: “Informal fees plague India’s “free” maternal health services. In Nigeria, village households don’t receive the clean cookstoves their government paid for. Around the world, corruption – coupled with the inability to find and share information about it – stymies development in low-income communities.

Now, digital transparency platforms – supplemented with features illiterate and rural populations can use – make it possible for traditionally excluded groups to make their voices heard and access tools they need to grow.

Mapping Corruption Hot Spots in India

One of the problems surrounding access to information is the lack of reliable information in the first place: a popular method to create knowledge is crowdsourcing and enlisting the public to monitor and report on certain issues.

The Mera Swasthya Meri Aawaz platform, which means “Our Health, Our Voice”, is an interactive map in Uttar Pradesh launched by the Indian non-profit organization SAHAYOG. It enables women to anonymously report illicit fees charged for services at maternal health clinics using their mobile phones.

To reduce infant mortality and deaths in childbirth, the Indian government provides free prenatal care and cash incentives to use maternal health clinics, but many charge illegal fees anyway – cutting mothers off from lifesaving healthcare and inhibiting communities’ growth. An estimated 45,000 women in India died in 2015 from complications of pregnancy and childbirth – one of the highest rates of any country in the world; low-income women are disproportionately affected….“Documenting illegal payment demands in real time and aggregating the data online increased governmental willingness to listen,” Sandhya says. “Because the data is linked to technology, its authenticity is not questioned.”

Following the Money in Nigeria

In Nigeria, Connected Development (CODE) also champions open data to combat corruption in infrastructure building, health and education projects. Its mission is to improve access to information and empower local communities to share data that can expose financial irregularities. Since 2012, the Abuja-based watchdog group has investigated twelve capital projects, successfully pressuring the government to release funds including $5.3 million to treat 1,500 lead-poisoned children.

“People activate us: if they know about any project that is supposed to be in their community, but isn’t, they tell us they want us to follow the money – and we’ll take it from there,” says CODE co-founder Oludotun Babayemi.

Users alert the watchdog group directly through its webpage, which publishes open-source data about development projects that are supposed to be happening, based on reports from freedom of information requests to Nigeria’s federal minister of environment, World Bank data and government press releases.

Last year, as part of their #WomenCookstoves reporting campaign, CODE revealed an apparent scam by tracking a $49.8 million government project that was supposed to purchase 750,000 clean cookstoves for rural women. Smoke inhalation diseases disproportionately affect women who spend time cooking over wood fires; according to the World Health Organization, almost 100,000 people die yearly in Nigeria from inhaling wood smoke, the country’s third biggest killer after malaria and AIDS.

“After three months, we found out that only 15 percent of the $48 million was given to the contractor – meaning there were only 45,000 cookstoves out of 750,000 in the country,” Babayemi says….(More)”

How to Succeed in the Networked World: A Grand Strategy for the Digital Age


 in Foreign Affairs: “Foreign policy experts have long been taught to see the world as a chessboard, analyzing the decisions of great powers and anticipating rival states’ reactions in a continual game of strategic advantage. Nineteenth-century British statesmen openly embraced this metaphor, calling their contest with Russia in Central Asia “the Great Game.” Today, the TV show Game of Thrones offers a particularly gory and irresistible version of geopolitics as a continual competition among contending kingdoms.

Think of a standard map of the world, showing the borders and capitals of the world’s 190-odd countries. That is the chessboard view.

Now think of a map of the world at night, with the lit-up bursts of cities and the dark swaths of wilderness. Those corridors of light mark roads, cars, houses, and offices; they mark the networks of human relationships, where families and workers and travelers come together. That is the web view. It is a map not of separation, marking off boundaries of sovereign power, but of connection.

To see the international system as a web is to see a world not of states but of networks. It is the world of terrorism; of drug, arms, and human trafficking; of climate change and declining biodiversity; of water wars and food insecurity; of corruption, money laundering, and tax evasion; of pandemic disease carried by air, sea, and land. In short, it is the world of many of the most pressing twenty-first-century global threats… (More)”.

Predicting judicial decisions of the European Court of Human Rights: a Natural Language Processing perspective


 et al at PeerJ Computer Science: “Recent advances in Natural Language Processing and Machine Learning provide us with the tools to build predictive models that can be used to unveil patterns driving judicial decisions. This can be useful, for both lawyers and judges, as an assisting tool to rapidly identify cases and extract patterns that lead to certain decisions. This paper presents the first systematic study on predicting the outcome of cases tried by the European Court of Human Rights based solely on textual content. We formulate a binary classification task where the input of our classifiers is the textual content extracted from a case and the target output is the actual judgment as to whether there has been a violation of an article of the European Convention on Human Rights. Textual information is represented using contiguous word sequences, i.e., N-grams, and topics. Our models can predict the court’s decisions with strong accuracy (79% on average). Our empirical analysis indicates that the formal facts of a case are the most important predictive factor. This is consistent with the theory of legal realism suggesting that judicial decision-making is significantly affected by the stimulus of the facts. We also observe that the topical content of a case is another important feature in this classification task and explore this relationship further by conducting a qualitative analysis….(More)”
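
The paper’s feature representation can be made concrete with a small sketch. This pure-Python fragment shows only the contiguous-word-sequence (N-gram) extraction step; the actual study vectorizes such features and trains an SVM classifier on the ECHR case texts, which is not reproduced here:

```python
# Sketch of N-gram feature extraction: every contiguous word
# sequence of length 1..n_max becomes a feature. A real pipeline
# would feed (weighted) counts of these features to a classifier.

def ngrams(text, n_max=4):
    """Return all contiguous word sequences of length 1 to n_max."""
    words = text.lower().split()
    feats = []
    for n in range(1, n_max + 1):
        for i in range(len(words) - n + 1):
            feats.append(" ".join(words[i:i + n]))
    return feats

# Toy excerpt standing in for a case's textual content.
features = ngrams("the applicant was detained", n_max=2)
print(features)
# -> ['the', 'applicant', 'was', 'detained',
#     'the applicant', 'applicant was', 'was detained']
```

Representing each case as a bag of such N-grams is what lets a linear classifier pick up on recurring factual phrasings, which is consistent with the paper’s finding that the facts section carries most of the predictive signal.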