Architecting Transparency: Back to the Roots – and Forward to the Future?


Paper by Dieter Zinnbauer: “Where to go next in research and practice on information disclosure and institutional transparency? Where to learn and draw inspiration from? How about if we go back to the roots and embrace an original, material notion of transparency as the quality of a substance or element to be see-through? How about if we then explore how the deliberate use and assemblage of such physical transparency strategies in architecture and design connects to – or could productively connect to – the institutional, political notions of transparency that we are concerned with in our area of institutional or political transparency? Or, put more simply and zooming in on one core aspect of the conversation: what has the arrival of glass and its siblings done for democracy, and what can we still hope it will do for open, transparent governance now and in the future?

This paper embarks upon this exploratory journey in four steps. It starts out (section 2.1) by revisiting the historic relationship between architecture, design and the built environment on the one side and institutional ambitions for democracy, openness, transparency and collective governance on the other. Quite surprisingly, it finds a very close and ancient relationship between the two. Physical and political transparency have through the centuries been joined at the hip, and this relationship – overlooked as it typically is – has persisted in very important ways in our contemporary institutions of governance. As a second step I seek to trace the major currents in the architectural debate and practice on transparency over the last century and ask three principal questions:

– How have architects as the master-designers of the built environment in theory, criticism and practice historically grappled with the concept of transparency? To what extent have they linked material notions and building strategies of transparency to political and social notions of transparency as tools for emancipation and empowerment? (section 2.2)

– What is the status of transparency in architecture today and what is the degree of cross-fertilisation between physical and institutional/political transparency? (section 3)

– Where could a closer connection between material and political transparency lead us in terms of inspiring fresh experimentation and action, in order to broaden the scope of available transparency tools and spawn fresh ideas and innovation? (section 4)

Along the way I will scan the fragmented empirical evidence base for the actual impact of physical transparency strategies and also flag interesting areas for future research. As it turns out, an obsession with material transparency in architecture and the built environment has evolved in parallel with – and in many ways predates – the rising popularity of transparency in political science and governance studies. There are surprising parallels in the hype-and-skepticism curve, common challenges, interesting learning experiences and a rich repertoire of ideas for cross-fertilisation and joint ideation waiting to be tapped. However, this will require finding ways to bridge the current disconnect between the physical and institutional transparency professions and moving beyond the current pessimism about the actual potential of physical transparency beyond empty gestures or deployment for surveillance – notions that seem to linger on both sides. But the analysis shows that this bridge-building could be an extremely worthwhile endeavor. Both the available empirical data and the ideas that even this first brief excursion into physical transparency has yielded bode well for embarking on this cross-disciplinary conversation about transparency. And as the essay also shows, help from three very unexpected corners might be on the way to re-ignite the spark for taking the physical dimension of transparency seriously again. Back to the roots has a bright future….(More)”

Exploring Open Energy Data in Urban Areas


The World Bank: “…Energy efficiency – using less energy input to deliver the same level of service – has been described by many as the ‘first fuel’ of our societies. However, lack of adequate data to accurately predict and measure energy efficiency savings, particularly at the city level, has limited the realization of its promise over the past two decades.
Why Open Energy Data?
Open Data can be a powerful tool to reduce information asymmetry in markets, increase transparency and help achieve local economic development goals. Several sectors like transport, public sector management and agriculture have started to benefit from Open Data practices. Energy markets are often characterized by less-than-optimal conditions with high system inefficiencies, misaligned incentives and low levels of transparency. As such, the sector has a lot to potentially gain from embracing Open Data principles.
The United States is a leader in this field with its ‘Energy Data’ initiative. This initiative makes data easy to find, understand and apply, helping to fuel a clean energy economy. For example, the Energy Information Administration’s (EIA) open application programming interface (API) offers more than 1.2 million time series of data and is frequently visited by users from the private sector, civil society and media. In addition, the Green Button initiative is empowering American citizens to access their own energy usage data, and OpenEI.org is an Open Energy Information platform that helps people find energy information, share their knowledge and connect with other energy stakeholders.
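To make the idea of an open energy API concrete, the sketch below shows how a client might compose a query URL and flatten a JSON time-series response into (period, value) pairs. The base URL, route name, parameter names and response shape are illustrative assumptions in the general style of the EIA's v2 API, not verified against the live service; the parsing step uses a hard-coded sample payload so the logic can be exercised without a network call.

```python
# Hedged sketch of an open-energy-data client. The EIA_BASE URL, the route
# string, and the JSON layout below are assumptions for illustration only.
import json
from urllib.parse import urlencode

EIA_BASE = "https://api.eia.gov/v2"  # assumed v2-style base URL


def build_series_url(route: str, api_key: str, **params) -> str:
    """Compose a query URL for a hypothetical EIA v2-style route."""
    query = urlencode({"api_key": api_key, **params})
    return f"{EIA_BASE}/{route}?{query}"


def extract_points(payload: dict) -> list:
    """Flatten an assumed {'response': {'data': [...]}} payload
    into a list of (period, value) pairs with numeric values."""
    rows = payload.get("response", {}).get("data", [])
    return [(row["period"], float(row["value"])) for row in rows]


if __name__ == "__main__":
    # A sample payload standing in for a live API response.
    sample = json.loads(
        '{"response": {"data": ['
        '{"period": "2015-01", "value": "352.1"},'
        '{"period": "2015-02", "value": "341.7"}]}}'
    )
    url = build_series_url("electricity/retail-sales/data",
                          api_key="DEMO_KEY", frequency="monthly")
    print(url)
    print(extract_points(sample))
```

Keeping URL construction and payload parsing separate from the network call makes each piece easy to test offline, which matters when the upstream data portal is rate-limited or evolving.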
Introducing the Open Energy Data Assessment
To address this data gap in emerging and developing countries, the World Bank is conducting a series of Open Energy Data Assessments in urban areas. The objective is to identify important energy-related data, raise awareness of the benefits of Open Data principles and improve the flow of data between traditional energy stakeholders and others interested in the sector.
The first cities we assessed were Accra, Ghana and Nairobi, Kenya. Both are among the fastest-growing cities in the world, with dynamic entrepreneurial and technology sectors, and both are capitals of countries with an ongoing National Open Data Initiative. The two cities have also been selected to be part of the Negawatt Challenge, a World Bank international competition supporting technology innovation to solve local energy challenges.
The ecosystem approach
The starting point for the exercise was to consider the urban energy sector as an ecosystem, comprised of data suppliers, data users, key datasets, a legal framework, funding mechanisms, and ICT infrastructure. The methodology that we used adapted the established World Bank Open Data Readiness Assessment (ODRA), which highlights valuable connections between data suppliers and data demand. The assessment showcases how to match pressing urban challenges with the opportunity to release and use data to address them, creating a longer-term commitment to the process. Mobilizing key stakeholders to provide quick, tangible results is also key to this approach….(More) …See also World Bank Open Government Data Toolkit.”

Flawed Humans, Flawed Justice


Adam Benforado in the New York Times on using “…lessons from behavioral science to make police and courts more fair…. WHAT would it take to achieve true criminal justice in America?

Imagine that we got rid of all of the cops who cracked racist jokes and prosecutors blinded by a thirst for power. Imagine that we cleansed our courtrooms of lying witnesses and foolish jurors. Imagine that we removed every judge who thought the law should bend to her own personal agenda and every sadistic prison guard.

We would certainly feel just then. But we would be wrong.

We would still have unarmed kids shot in the back and innocent men and women sentenced to death. We would still have unequal treatment, disregarded rights and profound mistreatment.

The reason is simple and almost entirely overlooked: Our legal system is based on an inaccurate model of human behavior. Until recently, we had no way of understanding what was driving people’s thoughts, perceptions and actions in the criminal arena. So, we built our institutions on what we had: untested assumptions about what deceit looks like, how memories work and when punishment is merited.

But we now have tools — from experimental methods and data collection approaches to brain-imaging technologies — that provide an incredible opportunity to establish a new and robust foundation.

Our justice system must be reconstructed upon scientific fact. We can start by acknowledging what the data says about the fundamental flaws in our current legal processes and structures.

Consider the evidence that we treat as nearly unassailable proof of guilt at trial — an unwavering eyewitness, a suspect’s signed confession or a forensic match to the crime scene.

While we charge tens of thousands of people with crimes each year after they are identified in police lineups, research shows that eyewitnesses chose an innocent person roughly one-third of the time. Our memories can fail us because we’re frightened. They can be altered by the word choice of a detective. They can be corrupted by previously seeing someone’s image on a social media site.

Picking out lying suspects from their body language is ineffective. And trying then to gain a confession by exaggerating the strength of the evidence and playing down the seriousness of the offense can encourage people to admit to terrible things they didn’t do.

Even seemingly objective forensic analysis is far from incorruptible. Recent data shows that fingerprint — and even DNA — matches are significantly more likely when the forensic expert is aware that the sample comes from someone the police believe is guilty.

With the aid of psychology, we see there’s a whole host of seemingly extraneous forces influencing behavior and producing systematic distortions. But they remain hidden because they don’t fit into our familiar legal narratives.

We assume that the specific text of the law is critical to whether someone is convicted of rape, but research shows that the details of the criminal code — whether it includes a “force” requirement or excuses a “reasonably mistaken” belief in consent — can be irrelevant. What matters are the backgrounds and identities of the jurors.

When a black teenager is shot by a police officer, we expect to find a bigot at the trigger.

But studies suggest that implicit bias, rather than explicit racism, is behind many recent tragedies. Indeed, simulator experiments show that the biggest danger posed to young African-American men may not be hate-filled cops, but well-intentioned police officers exposed to pervasive, damaging stereotypes that link the concepts of blackness and violence.

Likewise, Americans have been sold a myth that there are two kinds of judges — umpires and activists — and that being unbiased is a choice that a person makes. But the truth is that all judges are swayed by countless forces beyond their conscious awareness or control. It should have no impact on your case, for instance, whether your parole hearing is scheduled first thing in the morning or right before lunch, but when scientists looked at real parole boards, they found that judges were far more likely to grant petitions at the beginning of the day than they were midmorning.

The choice of where to place the camera in an interrogation room may seem immaterial, yet experiments show that it can affect whether a confession is determined to be coerced. When people watch a recording with the camera behind the detective, they are far more likely to find that the confession was voluntary than when watching the interactions from the perspective of the suspect.

With such challenges to our criminal justice system, what can possibly be done? The good news is that an evidence-based approach also illuminates the path forward.

Once we have clear data that something causes a bias, we can then figure out how to remove that influence. …(More)

The Civic Organization and the Digital Citizen


New book by Chris Wells: “The powerful potential of digital media to engage citizens in political actions has now crossed our news screens many times. But scholarly focus has tended to be on “networked,” anti-institutional forms of collective action, to the neglect of advocacy and service organizations. This book investigates the changing fortunes of the citizen-civil society relationship by exploring how social changes and innovations in communication technology are transforming the information expectations and preferences of many citizens, especially young citizens. In doing so, it is the first work to bring together theories of civic identity change with research on civic organizations. Specifically, it argues that a shift in “information styles” may help to explain the disjuncture felt by many young people when it comes to institutional participation and politics.

The book theorizes two paradigms of information style: a dutiful style, which was rooted in the society, communication system and citizen norms of the modern era, and an actualizing style, which constitutes the set of information practices and expectations of the young citizens of late modernity for whom interactive digital media are the norm. Hypothesizing that civil society institutions have difficulty adapting to the norms and practices of the actualizing information style, two empirical studies apply the dutiful/actualizing framework to innovative content analyses of organizations’ online communications, on their websites and through Facebook. Results demonstrate that, with intriguing exceptions, most major civil society organizations use digital media more in line with dutiful information norms than actualizing ones: they tend to broadcast strategic messages to an audience of receivers, rather than encouraging participation or exchange among an active set of participants. The book concludes with a discussion of the tensions inherent in bureaucratic organizations trying to adapt to an actualizing information style, and recommendations for how they may more successfully do so….(More)”

The Code Issue: Special Multi-platform Package on Demystifying Code



“Bloomberg Businessweek released The Code Issue, a special double issue containing a single essay by writer and programmer Paul Ford. ….

Code directs the fate of everything from media to e-commerce to banking, and is arguably the most important phenomenon for the twenty-first century businessperson to understand. Yet it remains an intimidating mystery to most execs. In The Code Issue introduction, Bloomberg Businessweek editor Josh Tyrangiel writes, “Software has been around since the 1940s. Which means that people have been faking their way through meetings about software, and the code that builds it, for generations… ignorance is no longer acceptable.”

Tyrangiel says of The Code Issue, “There’s some technical language along with a few pretty basic mathematical concepts. There are also lots of solid jokes and lasting insights. It may take a few hours to read, but that’s a small price to pay for adding decades to your career.”

Chapters in The Code Issue include:

  • From Hardware to Software and How Does Code Become Software?
  • What Is an Algorithm?
  • What’s With All These Conferences, Anyway? (and why are there so many men in this field and why is it so hard for them to be in groups with female programmers and behave in a typical, adult way?)
  • Why Are Programmers So Intense About Languages?
  • What Do Different Languages Do? and The Importance of C
  • Why Are Coders Angry?
  • The Legend of the 10X Programmer (which details the accoutrements of the coder)
  • The Time You Attended the E-mail Address Validation Meeting
  • The Language of White Collars
  • Briefly on the Huge Subject Of Microsoft
  • What About JavaScript?
  • How Are Apps Made?
  • What Is Debugging?
  • Managing Programmers
  • Should You Learn to Code?

….An animated and interactive treatment of the essay allows web and mobile readers to dive deeper into code, to manipulate it and see the results. Among the demos and widgets are tinder for code, a fun Easter egg, and a certificate of completion you can share with friends. The code for the “What Is Code?” essay has been published on GitHub.”

The death of data science – and rise of the citizen scientist


Ben Rossi at Information Age: “The notion of data science was born from the recent idea that if you have enough data, you don’t need much (if any) science to divine the truth and foretell the future – as opposed to the long-established rigours of statistical or actuarial science, which most times require painstaking efforts and substantial time to produce their version of ‘the truth’. …. Rather than embracing this untested and, perhaps, doomed form of science, and aimlessly searching for unicorns (also known as data scientists) to pay vast sums to, many organisations are now embracing the idea of making everyone data and analytics literate.

This leads me to what my column is really meant to focus on: the rise of the citizen scientist. 

The citizen scientist is not a new idea, having seen action in the space and earth sciences world for decades now, and has really come into its own as we enter the age of open data.

Cometh the hour

Given the exponential growth of open data initiatives across the world – the UK remains the leader, but has growing competition from all locations – the need for citizen scientists is now paramount. 

As governments open up vast repositories of new data of every type, the opportunity for these same governments (and commercial interests) to leverage the passion, skills and collective know-how of citizen scientists to help garner deeper insights into the scientific and civic challenges of the day is substantial. 

They can then take this knowledge and the collective energy of the citizen scientist community to develop common solution sets and applications to meet the needs of all their constituencies without expending much in terms of financial resources or suffering substantial development time lags. 

This can be a windfall of benefits for every level or type of government found around the world. The use of citizen scientists to tackle so-called ‘grand challenge’ problems has been a driving force behind many governments’ commitment to and investment in open data to date. 

There are so many challenges in governing today that it would be foolish not to employ these very capable resources to help tackle them. 

The benefits manifested from this approach are substantial and well proven. Many are well articulated in the open data success stories to date. 

Additionally, you only need to attend a local ‘hack fest’ to see how engaged citizen scientists of any age, gender and race can be, and to feel the sense of community that these events foster as everyone focuses on the challenges at hand and works diligently to surmount them using very creative approaches. 

As open data becomes pervasive in use and matures in respect to the breadth and richness of the data sets being curated, the benefits returned to both government and its constituents will be manifold. 

The catalyst to realising these benefits and achieving return on investment will be the role of citizen scientists, who are not going to be statisticians, actuaries or so-called data gurus, but ordinary people with a passion for science and learning and a desire to contribute to solving the many grand challenges facing society at large….(More)

How Crowdsourcing Can Help Us Fight ISIS


 at the Huffington Post: “There’s no question that ISIS is gaining ground. …So how else can we fight ISIS? By crowdsourcing data – i.e. asking a relevant group of people for their input via text or the Internet on specific ISIS-related issues. In fact, ISIS has been using crowdsourcing to enhance its operations since last year in two significant ways. Why shouldn’t we?

First, ISIS is using its crowd of supporters in Syria, Iraq and elsewhere to help strategize new policies. Last December, the extremist group leveraged its global crowd via social media to brainstorm ideas on how to kill 26-year-old Jordanian coalition fighter pilot Moaz al-Kasasba. ISIS supporters used the hashtag “Suggest a Way to Kill the Jordanian Pilot Pig” and “We All Want to Slaughter Moaz” to make their disturbing suggestions, which included decapitation, running al-Kasasba over with a bulldozer and burning him alive (which was the winner). Yes, this sounds absurd and was partly a publicity stunt to boost ISIS’ image. But the underlying strategy to crowdsource new strategies makes complete sense for ISIS as it continues to evolve – which is what the US government should consider as well.

In fact, in February, the US government tried to crowdsource more counterterrorism strategies. Via its official blog, DipNote, the State Department asked the crowd – in this case, US citizens – for their suggestions for solutions to fight violent extremism. This inclusive approach to policymaking was obviously important for strengthening democracy, with more than 180 entries posted over two months from citizens across the US. But did this crowdsourcing exercise actually improve US strategy against ISIS? Not really. What might help is if the US government asked a crowd of experts across varied disciplines and industries about counterterrorism strategies specifically against ISIS, also giving these experts the opportunity to critique each other’s suggestions to reach one optimal strategy. This additional, collaborative, competitive and interdisciplinary expert insight can only help President Obama and his national security team to enhance their anti-ISIS strategy.

Second, ISIS has been using its crowd of supporters to collect intelligence information to better execute its strategies. Since last August, the extremist group has crowdsourced data via a Twitter campaign specifically on Saudi Arabia’s intelligence officials, including names and other personal details. This apparently helped ISIS in its two suicide bombing attacks during prayers at a Shiite mosque last month; it also presumably helped ISIS infiltrate a Saudi Arabian border town via Iraq in January. This additional, collaborative approach to intelligence collection can only help President Obama and his national security team to enhance their anti-ISIS strategy.

In fact, last year, the FBI used crowdsourcing to spot individuals who might be travelling abroad to join terrorist groups. But what if we asked the crowd of US citizens and residents to give us information specifically on where they’ve seen individuals get lured by ISIS in the country, as well as on specific recruitment strategies they may have noted? This might also lead to more real-time data points on ISIS defectors returning to the US – who are they, why did they defect and what can they tell us about their experience in Syria or Iraq? Overall, crowdsourcing such data (if verifiable) would quickly create a clearer picture of trends in recruitment and defectors across the country, which can only help the US enhance its anti-ISIS strategies.

This collaborative approach to data collection could also be used in Syria and Iraq with texts and online contributions from locals helping us to map ISIS’ movements….(More)”

In The Information Debate, Openness and Privacy Are The Same Thing


 at TechCrunch: “We’ve been framing the debate between openness and privacy the wrong way.

Rather than positioning privacy and openness as opposing forces, the fact is they’re different sides of the same coin – and equally important. This might seem simple, but it might also be the key to moving things forward around this crucial debate.

Open data advocates often suggest that openness should be the default for all human knowledge. We should share, re-use and compare data freely and in doing so reap the benefits of innovation, cost savings and increased citizen participation — to name just a few gains.

And although it might sound a little utopian, the promise is being realized in many corners of the world…. But as we all know, even if we accept all the possible benefits of open data, concerns about privacy, especially personal information, still exist as a counterweight to the open data evangelists. People worry that the path of openness could lead to an Orwellian world where all our information is shared with everyone, permanently.

There is a way to turn the conversation from the face-value clash between openness and privacy to how they can be complementary forces. Gus Hosein, CEO of Privacy International, has explained that privacy is “the governing framework to control access to, collection and usage of information.” Basically, privacy laws enable knowledge and control of data about citizens and their surroundings.

This is strikingly similar to the argument that open data increases service delivery efficiency and personalization. Openness and privacy both share the same impulse: I want to be in control of my life, I want to know and choose whether a hospital or school is a good hospital or school and be in control of my choice of services.

Another strong thread in conversations around open data is that transparency should be proportionate to power. This makes sense on one level and seems simple enough: Politicians should be held accountable which means a heightened level of transparency.

But who is ‘powerful’, how do you define ‘power’ and who is in charge of defining this?

Politicians have chosen to run for public office and submit themselves to public scrutiny, but what about the CEO of a listed company, the leader of a charity, the anonymous owner of a Cayman-islands’ registered corporation? In practice, it is very difficult to apply the ‘transparency is proportionate to power’ rule outside democratic politics.

The closest we get is with so-called PEPs (politically exposed persons) databases: individuals who are the close family and kin, and close business associates, of politicians. But even that defines power as derivative from political power, and not commercial, social or other forms of power.

We need to stop making a binary distinction between freedom of information laws and data protection; between open data policies and privacy policies. We need a single policy framework that controls, as well as encourages, the use of ‘open’ data.

And what about personal data? Should personal data ever be open?

Omidyar Network put this question to 200 guests at a convention on openness and privacy last year. The audience was split down the middle: 50% thought personal data could never be open data; 50% thought it should be, and that foregoing the opportunity to release it would block the promise of economic gains, better services and other benefits. Open data experts, including the 1,000 who attended a recent meeting in Ottawa, ultimately disagree on this fundamental issue.

Herein lies the challenge. Many of us, including the general public, are uncomfortable with open personal data, even despite the gains it can bring….(More)”

Waze and the Traffic Panopticon


 in the New Yorker: “In April, during his second annual State of the City address, Los Angeles Mayor Eric Garcetti announced a data-sharing agreement with Waze, the Google-owned, Israel-based navigation service. Waze is different from most navigation apps, including Google Maps, in that it relies heavily on real-time, user-generated data. Some of this data is produced actively—a driver or passenger sees a stalled vehicle, then uses a voice command or taps a stalled-vehicle icon on the app to alert others—while other data, such as the user’s location and average speed, is gathered passively, via smartphones. The agreement will see the city provide Waze with some of the active data it collects, alerting drivers to road closures, construction, and parades, among other things. From Waze, the city will get real-time data on traffic and road conditions. Garcetti said that the partnership would mean “less congestion, better routing, and a more livable L.A.” Di-Ann Eisnor, Waze’s head of growth, acknowledged to me that these kinds of deals can cause discomfort to the people working inside city government. “It’s exciting, but people inside are also fearful because it seems like too much work, or it seems so unknown,” she said.

Indeed, the deal promises to help the city improve some of its traffic and infrastructure systems (L.A. still uses paper to manage pothole patching, for example), but it also acknowledges Waze’s role in the complex new reality of urban traffic planning. Traditionally, traffic management has been a largely top-down process. In Los Angeles, it is coördinated in a bunker downtown, several stories below the sidewalk, where engineers stare at blinking lights representing traffic and live camera feeds of street intersections. L.A.’s sensor-and-algorithm-driven Automated Traffic Surveillance and Control System is already one of the world’s most sophisticated traffic-mitigation tools, but it can only do so much to manage the city’s eternally unsophisticated gridlock. Los Angeles appears to see its partnership with Waze as an important step toward improving the bridge between its subterranean panopticon and the rest of the city still further, much like other metropolises that have struck deals with Waze under the company’s Connected Cities program.

Among the early adopters is Rio de Janeiro, whose urban command center tracks everything from accidents to hyperlocal weather conditions, pulling data from thirty departments and private companies, including Waze. “In Rio,” Eisnor said, traffic managers “were able to change the garbage routes, figure out where to install cameras, and deploy traffic personnel” because of the program. She also pointed out that Connected Cities has helped municipal workers in Washington, D.C., patch potholes within forty-eight hours of their being identified on Waze. “We’re helping reframe city planning through not just space but space and time,” she said…..(More)

Did Performance Measurement Cause America’s Police Problem?


Katherine Barrett and Richard Greene in Governing: “You’ve doubtless heard the maxim “what gets measured, gets managed.” Sometimes it’s attributed to management guru Peter Drucker, though others also get credit for it. But whoever actually coined the phrase, we remember the first time we became aware of it, about a quarter of a century ago.

It seemed like a purely positive sentiment to us back in the days when we naively believed that performance measurement could cure most governmental ills. If gathering data about inputs, outputs and outcomes could solve all management problems, then cities and states had access to a golden key to a more effective and efficient future. Then reality intervened and we recognized that even good measurements don’t necessarily result in the right policy or practice changes.

But, somewhat more ominously, we’ve become aware of a troubling question that lurks in the field of performance measurement: What happens if we’re not measuring the right things in the first place? If Drucker — or whoever — was right, doesn’t that mean that we may manage government programs in a way that leads to more problems? Sometimes, for example, states and localities focus their measurements on the speed with which a service is delivered. Faster always seems better. But often delivering a service quickly means doing so less effectively.

For fire departments, response times are a commonly used measure of service quality. But “the requirement for low response times may incentivize firefighters to drive fast,” said Amy Donahue, professor and vice-provost for academic operations at the University of Connecticut. “And it has been shown that while speeding saves very little in terms of total driving time, it is much more dangerous — both to those in the emergency vehicle and other innocents who might get in their way. The potential for accidents is high, and when they happen, the consequences can be very tragic.”

As the field has become aware of these dangers, many agencies are trying to mitigate them by improving education, prohibiting responders from exceeding speed limits, and requiring responders to participate in emergency vehicle operators programs.

Examples like this one are everywhere. But we just came across something in the March 2015 edition of New Perspectives in Policing that had never occurred to us before and that seems to be widely ignored by public safety organizations around the country. It was written by Malcolm K. Sparrow, professor of practice of public management at the John F. Kennedy School of Government at Harvard University.

As violent incidents in several of America’s cities show the underlying tensions between police and the public they serve, Sparrow argues that some of this dissonance has actually been encouraged by the fact that most police departments are pushed to measure crime clearance and enforcement. These are important factors, but they have little to do with community satisfaction. Meanwhile, he points out that “a few departments now use citizen satisfaction surveys on a regular basis, but most do not.”…(More)”