Government data does not mean data governance: Lessons learned from a public sector application audit


Paper by Nik Thompson, Ravi Ravindran, and Salvatore Nicosia: “Public sector agencies routinely store large volumes of information about individuals in the community. The storage and analysis of this information benefits society, as it enables relevant agencies to make better informed decisions and to address the individual’s needs more appropriately. Members of the public often assume that the authorities are well equipped to handle personal data; however, due to implementation errors and lack of data governance, this is not always the case. This paper reports on an audit conducted in Western Australia, focusing on findings in the Police Firearms Management System and the Department of Health Information System. In the case of the Police, the audit revealed numerous data protection issues leading the auditors to report that they had no confidence in the accuracy of information on the number of people licensed to possess firearms or the number of licensed firearms. Similarly alarming conclusions were drawn in the Department of Health as auditors found that they could not determine which medical staff member was responsible for clinical data entries made. The paper describes how these issues often do not arise from existing business rules or the technology itself, but from a lack of sound data governance. Finally, a discussion section presents key data governance principles and best practices that may guide practitioners involved in data management. These cases highlight the very real data management concerns, and the associated recommendations provide the context to spark further interest in the applied aspects of data protection….(More)”

 

Civic open data at a crossroads: Dominant models and current challenges


Renee E. Sieber and Peter A. Johnson in Government Information Quarterly: “As open data becomes more widely provided by government, it is important to ask questions about the future possibilities and forms that government open data may take. We present four models of open data as they relate to changing relations between citizens and government. These models include: a status quo ‘data over the wall’ form of government data publishing; a form of ‘code exchange’, with government acting as an open data activist; open data as a civic issue tracker; and participatory open data. These models represent multiple end points that can be currently viewed from the unfolding landscape of government open data. We position open data at a crossroads, with significant concerns of the conflicting motivations driving open data, the shifting role of government as a service provider, and the fragile nature of open data within the government space. We emphasize that the future of open data will be driven by the negotiation of the ethical-economic tension that exists between provisioning governments, citizens, and private sector data users….(More)”

 

Confidence in U.S. Institutions Still Below Historical Norms


Jeffrey M. Jones at Gallup: “Americans’ confidence in most major U.S. institutions remains below the historical average for each one. Only the military (72%) and small business (67%) — the highest-rated institutions in this year’s poll — are currently rated higher than their historical norms, based on the percentage expressing “a great deal” or “quite a lot” of confidence in the institution.

Confidence in U.S. Institutions, 2015 vs. Historical Average for Each Institution

These results are based on a June 2-7 Gallup poll that included Gallup’s latest update on confidence in U.S. institutions. Gallup first measured confidence ratings in 1973 and has updated them each year since 1993.

Americans’ confidence in most major institutions has been down for many years as the nation has dealt with prolonged wars in Iraq and Afghanistan, a major recession and sluggish economic improvement, and partisan gridlock in Washington. In fact, 2004 was the last year most institutions were at or above their historical average levels of confidence. Perhaps not coincidentally, 2004 was also the last year Americans’ satisfaction with the way things are going in the United States averaged better than 40%. Currently, 28% of Americans are satisfied with the state of the nation.

From a broad perspective, Americans’ confidence in all institutions over the last two years has been the lowest since Gallup began systematic updates of a larger set of institutions in 1993. The average confidence rating of the 14 institutions asked about annually since 1993 — excluding small business, asked annually since 2007 — is 32% this year. This is one percentage point above the all-institution average of 31% last year. Americans were generally more confident in all institutions in the late 1990s and early 2000s as the country enjoyed a strong economy and a rally in support for U.S. institutions after the 9/11 terrorist attacks.

Trend: Average Confidence Rating Across All Institutions, by Year

Confidence in Political, Financial and Religious Institutions Especially Low

Today’s confidence ratings of Congress, organized religion, banks, the Supreme Court and the presidency show the greatest deficits compared with their historical averages, all running at least 10 points below that mark. Americans’ frustration with the government’s performance has eroded the trust they have in all U.S. political institutions….(More)”

The Climatologist’s Almanac


Clara Chaisson at onEarth: “Forget your weather app with its five- or even ten-day forecasts—a supercomputer at NASA has just provided us with high-resolution climate projections through the end of the century. The massive new 11-terabyte data set combines historical daily temperatures and precipitation measurements with climate simulations under two greenhouse gas emissions scenarios. The project spans from 1950 to 2100, but users can easily zero in on daily timescales for their own locales—which is precisely the point.

The projections are hosted on Amazon Web Services, free for all to see and plan by. The space agency hopes that developing nations and poorer communities that may not have any spare supercomputers lying around will use the info to predict and prepare for climate change. …(More)”

Why open data should be central to Fifa reform


Gavin Starks in The Guardian: “Over the past two weeks, Fifa has faced mounting pressure to radically improve its transparency and governance in the wake of corruption allegations. David Cameron has called for reforms including expanding the use of open data.

Open data is information made available by governments, businesses and other groups for anyone to read, use and share. Data.gov.uk was launched as the home of UK open government data in January 2010 and now has almost 21,000 published datasets, including on government spending.
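As a concrete aside: data.gov.uk is built on the open-source CKAN platform, whose catalogues are typically queryable programmatically. The sketch below is a hedged illustration, not something stated in the article — it assumes the standard CKAN Action API route `/api/3/action/package_search` and simply builds a search URL for spending-related datasets.

```python
from urllib.parse import urlencode

# Assumption: data.gov.uk exposes the standard CKAN Action API at this route.
BASE = "https://data.gov.uk/api/3/action/package_search"

def search_url(query, rows=5):
    """Build a CKAN package_search URL returning up to `rows` matching datasets."""
    return BASE + "?" + urlencode({"q": query, "rows": rows})

url = search_url("government spending")
print(url)
```

Fetching that URL (with any HTTP client) would, under the same assumption, return a JSON envelope whose `result.results` array lists matching datasets.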

Allowing citizens to freely access data related to the institutions that govern them is essential to a well-functioning democratic society. It is the first step towards holding leaders to account for failures and wrongdoing.

Fifa has a responsibility for the shared interests of millions of fans around the world. Football’s popularity means that Fifa’s governance has wide-ranging implications for society, too. This is particularly true of decisions about hosting the World Cup, which is often tied to large-scale government investment in infrastructure and even extends to law-making. Brazil spent up to £10bn hosting the 2014 World Cup and had to legalise the sale of beer at matches.

Following Sepp Blatter’s resignation, Fifa will gather its executive committee in July to plan for a presidential election, expected to take place in mid-December. Open data should form the cornerstone of any prospective candidate’s manifesto. It can help Fifa make better spending decisions, ensure partners deliver value for money, and restore the trust of the international football community.

Fifa’s lengthy annual financial report gives summaries of financial expenditure, budgeted at £184m for operations and governance alone in 2016, but individual transactions are not published. Publishing spending data incentivises better spending decisions. If all Fifa’s outgoings – which totalled around £3.5bn between 2011 and 2014 – were made open, it would encourage much more efficiency….(more)”

Architecting Transparency: Back to the Roots – and Forward to the Future?


Paper by Dieter Zinnbauer: “Where to go next in research and practice on information disclosure and institutional transparency? Where to learn and draw inspiration from? How about if we go back to the roots and embrace an original, material notion of transparency as the quality of a substance or element to be see-through? How about if we then explore how the deliberate use and assemblage of such physical transparency strategies in architecture and design connects to – or could productively connect to – the institutional, political notions of transparency that we are concerned with in our area of institutional or political transparency? Or, put more simply and zooming in on one core aspect of the conversation: what has the arrival of glass and its siblings done for democracy, and what can we still hope they will do for open, transparent governance now and in the future?

This paper embarks upon this exploratory journey in four steps. It starts out (section 2.1) by revisiting the historic relationship between architecture, design and the built environment on the one side and institutional ambitions for democracy, openness, transparency and collective governance on the other. Quite surprisingly, it finds a very close and ancient relationship between the two. Physical and political transparency have through the centuries been joined at the hip, and this relationship – overlooked as it typically is – has persisted in very important ways in our contemporary institutions of governance. As a second step I seek to trace the major currents in the architectural debate and practice on transparency over the last century and ask three principal questions:

– How have architects as the master-designers of the built environment in theory, criticism and practice historically grappled with the concept of transparency? To what extent have they linked material notions and building strategies of transparency to political and social notions of transparency as tools for emancipation and empowerment? (section 2.2)

– What is the status of transparency in architecture today and what is the degree of cross-fertilisation between physical and institutional/political transparency? (section 3)

– Where could a closer connection between material and political transparency lead us in terms of inspiring fresh experimentation and action, in order to broaden the scope of available transparency tools and spawn fresh ideas and innovation? (section 4)

Along the way I will scan the fragmented empirical evidence base for the actual impact of physical transparency strategies and also flag interesting areas for future research. As it turns out, an obsession with material transparency in architecture and the built environment has evolved in parallel and in many ways predates the rising popularity of transparency in political science and governance studies. There are surprising parallels in the hype-and-skepticism curve, common challenges, interesting learning experiences and a rich repertoire of ideas for cross-fertilisation and joint ideation that is waiting to be tapped. However, this will require finding ways to bridge the current disconnect between the physical and institutional transparency professions and to move beyond the current pessimism about the actual potential of physical transparency beyond empty gestures or deployment for surveillance, notions that seem to linger on both sides. But the analysis shows that this bridge-building could be an extremely worthwhile endeavor. Both the available empirical data and the ideas that even just this first brief excursion into physical transparency has yielded bode well for embarking on this cross-disciplinary conversation about transparency. And as the essay also shows, help from three very unexpected corners might be on the way to re-ignite the spark for taking the physical dimension of transparency seriously again. Back to the roots has a bright future….(More)

Exploring Open Energy Data in Urban Areas


The Worldbank: “…Energy efficiency – using less energy input to deliver the same level of service – has been described by many as the ‘first fuel’ of our societies. However, lack of adequate data to accurately predict and measure energy efficiency savings, particularly at the city level, has limited the realization of its promise over the past two decades.
Why Open Energy Data?
Open Data can be a powerful tool to reduce information asymmetry in markets, increase transparency and help achieve local economic development goals. Several sectors like transport, public sector management and agriculture have started to benefit from Open Data practices. Energy markets are often characterized by less-than-optimal conditions with high system inefficiencies, misaligned incentives and low levels of transparency. As such, the sector has a lot to potentially gain from embracing Open Data principles.
The United States is a leader in this field with its ‘Energy Data’ initiative. This initiative makes data easy to find, understand and apply, helping to fuel a clean energy economy. For example, the Energy Information Administration’s (EIA) open application programming interface (API) has more than 1.2 million time series of data and is frequently visited by users from the private sector, civil society and media. In addition, the Green Button initiative is empowering American citizens to have access to their own energy usage data, and OpenEI.org is an Open Energy Information platform to help people find energy information, share their knowledge and connect to other energy stakeholders.
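To make the API mention concrete, each of those EIA time series can be fetched with a simple keyed REST call. The sketch below is a hedged illustration only: the `/series/` route and `series_id` parameter follow the EIA's published v1 API shape, the series id shown is illustrative, and `DEMO_KEY` is a placeholder for a real (free) API key.

```python
from urllib.parse import urlencode

# Hedged sketch of a request to the EIA open API (api.eia.gov).
# Assumptions: the v1-style `/series/` route and `series_id` parameter;
# the series id is illustrative and DEMO_KEY is a placeholder key.
BASE = "https://api.eia.gov/series/"

def series_url(series_id, api_key="DEMO_KEY"):
    """Build a request URL for a single EIA time series."""
    return BASE + "?" + urlencode({"api_key": api_key, "series_id": series_id})

url = series_url("ELEC.PRICE.US-ALL.M")  # e.g. a monthly U.S. electricity price series
print(url)
```

Under these assumptions, fetching the URL returns JSON whose `series[0].data` field holds (period, value) pairs ready for analysis.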
Introducing the Open Energy Data Assessment
To address this data gap in emerging and developing countries, the World Bank is conducting a series of Open Energy Data Assessments in urban areas. The objective is to identify important energy-related data, raise awareness of the benefits of Open Data principles and improve the flow of data between traditional energy stakeholders and others interested in the sector.
The first cities we assessed were Accra, Ghana and Nairobi, Kenya. Both are among the fastest-growing cities in the world, with dynamic entrepreneurial and technology sectors, and both are capitals of countries with an ongoing National Open Data Initiative. The two cities have also been selected to be part of the Negawatt Challenge, a World Bank international competition supporting technology innovation to solve local energy challenges.
The ecosystem approach
The starting point for the exercise was to consider the urban energy sector as an ecosystem, comprised of data suppliers, data users, key datasets, a legal framework, funding mechanisms, and ICT infrastructure. The methodology that we used adapted the established World Bank Open Data Readiness Assessment (ODRA), which highlights valuable connections between data suppliers and data demand. The assessment showcases how to match pressing urban challenges with the opportunity to release and use data to address them, creating a longer-term commitment to the process. Mobilizing key stakeholders to provide quick, tangible results is also key to this approach….(More) …See also World Bank Open Government Data Toolkit.”

Flawed Humans, Flawed Justice


Adam Benforado in the New York Times on using …”lessons from behavioral science to make police and courts more fair…. WHAT would it take to achieve true criminal justice in America?

Imagine that we got rid of all of the cops who cracked racist jokes and prosecutors blinded by a thirst for power. Imagine that we cleansed our courtrooms of lying witnesses and foolish jurors. Imagine that we removed every judge who thought the law should bend to her own personal agenda and every sadistic prison guard.

We would certainly feel just then. But we would be wrong.

We would still have unarmed kids shot in the back and innocent men and women sentenced to death. We would still have unequal treatment, disregarded rights and profound mistreatment.

The reason is simple and almost entirely overlooked: Our legal system is based on an inaccurate model of human behavior. Until recently, we had no way of understanding what was driving people’s thoughts, perceptions and actions in the criminal arena. So, we built our institutions on what we had: untested assumptions about what deceit looks like, how memories work and when punishment is merited.

But we now have tools — from experimental methods and data collection approaches to brain-imaging technologies — that provide an incredible opportunity to establish a new and robust foundation.

Our justice system must be reconstructed upon scientific fact. We can start by acknowledging what the data says about the fundamental flaws in our current legal processes and structures.

Consider the evidence that we treat as nearly unassailable proof of guilt at trial — an unwavering eyewitness, a suspect’s signed confession or a forensic match to the crime scene.

While we charge tens of thousands of people with crimes each year after they are identified in police lineups, research shows that eyewitnesses choose an innocent person roughly one-third of the time. Our memories can fail us because we’re frightened. They can be altered by the word choice of a detective. They can be corrupted by previously seeing someone’s image on a social media site.

Picking out lying suspects from their body language is ineffective. And trying then to gain a confession by exaggerating the strength of the evidence and playing down the seriousness of the offense can encourage people to admit to terrible things they didn’t do.

Even seemingly objective forensic analysis is far from incorruptible. Recent data shows that fingerprint — and even DNA — matches are significantly more likely when the forensic expert is aware that the sample comes from someone the police believe is guilty.

With the aid of psychology, we see there’s a whole host of seemingly extraneous forces influencing behavior and producing systematic distortions. But they remain hidden because they don’t fit into our familiar legal narratives.

We assume that the specific text of the law is critical to whether someone is convicted of rape, but research shows that the details of the criminal code — whether it includes a “force” requirement or excuses a “reasonably mistaken” belief in consent — can be irrelevant. What matters are the backgrounds and identities of the jurors.

When a black teenager is shot by a police officer, we expect to find a bigot at the trigger.

But studies suggest that implicit bias, rather than explicit racism, is behind many recent tragedies. Indeed, simulator experiments show that the biggest danger posed to young African-American men may not be hate-filled cops, but well-intentioned police officers exposed to pervasive, damaging stereotypes that link the concepts of blackness and violence.

Likewise, Americans have been sold a myth that there are two kinds of judges — umpires and activists — and that being unbiased is a choice that a person makes. But the truth is that all judges are swayed by countless forces beyond their conscious awareness or control. It should have no impact on your case, for instance, whether your parole hearing is scheduled first thing in the morning or right before lunch, but when scientists looked at real parole boards, they found that judges were far more likely to grant petitions at the beginning of the day than they were midmorning.

The choice of where to place the camera in an interrogation room may seem immaterial, yet experiments show that it can affect whether a confession is determined to be coerced. When people watch a recording with the camera behind the detective, they are far more likely to find that the confession was voluntary than when watching the interactions from the perspective of the suspect.

With such challenges to our criminal justice system, what can possibly be done? The good news is that an evidence-based approach also illuminates the path forward.

Once we have clear data that something causes a bias, we can then figure out how to remove that influence. …(More)

The Civic Organization and the Digital Citizen


New book by Chris Wells: “The powerful potential of digital media to engage citizens in political actions has now crossed our news screens many times. But scholarly focus has tended to be on “networked,” anti-institutional forms of collective action, to the neglect of advocacy and service organizations. This book investigates the changing fortunes of the citizen-civil society relationship by exploring how social changes and innovations in communication technology are transforming the information expectations and preferences of many citizens, especially young citizens. In doing so, it is the first work to bring together theories of civic identity change with research on civic organizations. Specifically, it argues that a shift in “information styles” may help to explain the disjuncture felt by many young people when it comes to institutional participation and politics.

The book theorizes two paradigms of information style: a dutiful style, which was rooted in the society, communication system and citizen norms of the modern era, and an actualizing style, which constitutes the set of information practices and expectations of the young citizens of late modernity for whom interactive digital media are the norm. Hypothesizing that civil society institutions have difficulty adapting to the norms and practices of the actualizing information style, two empirical studies apply the dutiful/actualizing framework to innovative content analyses of organizations’ online communications, on their websites and through Facebook. Results demonstrate that with intriguing exceptions, most major civil society organizations use digital media more in line with dutiful information norms than actualizing ones: they tend to broadcast strategic messages to an audience of receivers, rather than encouraging participation or exchange among an active set of participants. The book concludes with a discussion of the tensions inherent in bureaucratic organizations trying to adapt to an actualizing information style, and recommendations for how they may more successfully do so….(More)”

The Code Issue: Special Multi-platform Package on Demystifying Code


 


“Bloomberg Businessweek released The Code Issue, a special double issue containing a single essay by writer and programmer Paul Ford. ….

Code directs the fate of everything from media to e-commerce to banking, and is arguably the most important phenomenon for the twenty-first century businessperson to understand. Yet it remains an intimidating mystery to most execs. In The Code Issue introduction, Bloomberg Businessweek editor Josh Tyrangiel writes, “Software has been around since the 1940s. Which means that people have been faking their way through meetings about software, and the code that builds it, for generations… ignorance is no longer acceptable.”

Tyrangiel says of The Code Issue, “There’s some technical language along with a few pretty basic mathematical concepts. There are also lots of solid jokes and lasting insights. It may take a few hours to read, but that’s a small price to pay for adding decades to your career.”

Chapters in The Code Issue include:

  • From Hardware to Software and How Does Code Become Software?
  • What Is an Algorithm?
  • What’s With All These Conferences, Anyway? (and why are there so many men in this field and why is it so hard for them to be in groups with female programmers and behave in a typical, adult way?)
  • Why Are Programmers So Intense About Languages?
  • What Do Different Languages Do? and The Importance of C
  • Why Are Coders Angry?
  • The Legend of the 10X Programmer (which details the accoutrements of the coder)
  • The Time You Attended the E-mail Address Validation Meeting
  • The Language of White Collars
  • Briefly on the Huge Subject Of Microsoft
  • What About JavaScript?
  • How Are Apps Made?
  • What Is Debugging?
  • Managing Programmers
  • Should You Learn to Code?

….An animated and interactive treatment of the essay allows web and mobile readers to dive deeper into code, to manipulate it and see the results. Among the demos and widgets are a Tinder for code, a fun Easter egg, and a certificate of completion you can share with friends. The code for the “What Is Code?” essay has been published on GitHub.”