## Visualizing the legislative process with Sankey diagrams

Kamil Gregor at OpeningParliament.org: “The process of shaping the law often resembles an Indiana Jones maze. Bills and amendments run through an elaborate system of committees, sessions and hearings filled with booby traps before finally reaching the golden idol of a final approval.
Parliamentary monitoring organizations and researchers are often interested in how various pieces of legislation survive in this environment and what strategies can either kill or aid them. This specifically means answering two questions: What is the probability of a bill being approved, and what factors determine this probability?
The legislative process is usually hierarchical: successful completion of a step is conditional on completing all previous steps. Therefore, we may also want to know the probability of completion at each consecutive step and its determinants.
A simple way to give a satisfying answer to these questions without wandering into the land of nonlinear logistic regressions is the Sankey diagram, a well-known flow chart in which a process is visualized using arrows. Relative quantities of outcomes in the process are represented by the arrows’ widths.
A famous example is a Sankey diagram of Napoleon’s invasion of Russia. We can clearly see how the Grand Army was gradually shrinking as French soldiers were dying or defecting. Another well-known example is the Google Analytics flow chart. It shows how many visitors enter a webpage and then either leave or continue to a different page on the same website. As the number of consecutive steps increases, the number of visitors remaining on the website decreases.
The legislative process can be visualized in the same way. The progress of bills is represented by streams between the steps in the process, and the width of each stream corresponds to the number of bills. A bill can either complete all the steps of the process, or it can “drop out” at some point if it is rejected.
Let’s take a look…”
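The streams and widths described in the excerpt reduce to a few simple quantities: per-link bill counts (the stream widths) and per-step conditional completion probabilities. A minimal Python sketch of these quantities — the stage names and bill counts are hypothetical, invented purely for illustration:

```python
# Hypothetical stages of a legislative process and the number of bills
# still alive at the start of each stage (illustration data only).
stages = ["introduced", "committee", "floor vote", "signed"]
counts = [400, 220, 130, 90]

def flows(stages, counts):
    """(from, to, width) triples: each stream's width is a bill count."""
    out = []
    for i in range(len(counts) - 1):
        out.append((stages[i], stages[i + 1], counts[i + 1]))           # survived
        out.append((stages[i], "rejected", counts[i] - counts[i + 1]))  # dropped out
    return out

def step_probabilities(counts):
    """Conditional probability of completing each consecutive step."""
    return [counts[i + 1] / counts[i] for i in range(len(counts) - 1)]

def overall_probability(counts):
    """Probability that a bill completes the whole process."""
    return counts[-1] / counts[0]

for frm, to, width in flows(stages, counts):
    print(f"{frm} -> {to}: width {width}")
print("step completion probabilities:",
      [round(p, 2) for p in step_probabilities(counts)])
print("overall approval probability:", overall_probability(counts))
```

In an actual diagram, the (from, to, width) triples become the drawn links — for example, the node/link arrays of a plotting library’s Sankey trace — while the conditional probabilities answer the two questions posed above.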

## How to make a city great

New video and report by McKinsey: “What makes a great city? It is a pressing question because by 2030, 5 billion people—60 percent of the world’s population—will live in cities, compared with 3.6 billion today, turbocharging the world’s economic growth. Leaders in developing nations must cope with urbanization on an unprecedented scale, while those in developed ones wrestle with aging infrastructures and stretched budgets. All are fighting to secure or maintain the competitiveness of their cities and the livelihoods of the people who live in them. And all are aware of the environmental legacy they will leave if they fail to find more sustainable, resource-efficient ways of managing these cities.

To understand the core processes and benchmarks that can transform cities into superior places to live and work, McKinsey developed and analyzed a comprehensive database of urban economic, social, and environmental performance indicators. The research included interviewing 30 mayors and other leaders in city governments on four continents and synthesizing the findings from more than 80 case studies that sought to understand what city leaders did to improve processes and services from urban planning to financial management and social housing.
The result is How to make a city great (PDF–2.1MB), a new report arguing that leaders who make important strides in improving their cities do three things really well:

• They achieve smart growth.
• They do more with less. Great cities secure all revenues due, explore investment partnerships, embrace technology, make organizational changes that eliminate overlapping roles, and manage expenses. Successful city leaders have also learned that, if designed and executed well, private–public partnerships can be an essential element of smart growth, delivering lower-cost, higher-quality infrastructure and services.
• They win support for change. Change is not easy, and its momentum can even attract opposition. Successful city leaders build a high-performing team of civil servants, create a working environment where all employees are accountable for their actions, and take every opportunity to forge a stakeholder consensus with the local population and business community. They take steps to recruit and retain top talent, emphasize collaboration, and train civil servants in the use of technology.”

## From Potholes to Policies: Technology, Civic Engagement and the Path to Peer-Produced Governance

Chris Osgood and Nigel Jacob at Living Cities: “There’s been tremendous energy behind the movement to change the way that local governments use technology to better connect with residents. Civic hackers, Code for America Fellows, concerned residents, and offices such as ours, the Mayor’s Office of New Urban Mechanics in Boston, are working together to create a more collaborative environment in which these various players can develop new kinds of solutions to urban challenges…

These initiatives have shown a lot of promise. Now we need to build on these innovations to bring public participation into the heart of policymaking.
This is not going to happen overnight, nor is the path to changing the interface between citizens and government an obvious one. However, reflecting on the work we’ve done over the past few years, we are starting to see a set of design principles that can help guide our efforts. These are emergent, and so imperfect, but we share them here in the hopes of getting feedback to improve them:

1. The reasons for engagement must be clear: It is incumbent on us as creators and purveyors of civic technologies to be crystal-clear about what policies we are trying to rewrite, why, and what role the public plays in that process. With the Boston Public Schools, the Community PlanIt game was built to engage residents both online and in person to co-design school performance metrics; the result was an approach that was different, and better, than what had originally been proposed, with less discord than was happening in traditional town hall meetings.
2. Channels must be high-quality and appropriately receptive: When you use Citizens Connect to report quality-of-life issues in Boston, you get an email saying: “Thank you for reporting this pothole. It has now been fixed.” You can’t just cut and paste that email to say: “Thank you for your views on this policy. The policy has now been fixed.” The channel has to make it possible for the City to make meaning of and act on resident input, and then to communicate back to users what has been done and why. And as our friends at Code for America say, they must be “simple, beautiful and easy to use.”
3. Transparency is vital: Transparency around how the process works and why fosters greater public trust in the system and consequently makes people more likely to engage. Local leaders must therefore be very clear up-front about these points, and communicate them repeatedly and consistently in the face of potential mistrust and misunderstanding.”

## How X Prize Contestants Will Hunt Down The Health Sensors Of The Future

Ariel Schwartz in Co.Exist: “The \$10 million Qualcomm Tricorder X Prize asks entrants to perform an incredibly difficult feat: accurately diagnose 15 diseases in 30 patients in three days using only a mobile platform. To do that, competing teams need to have access to sophisticated sensors and related software.
Some of those sensors may be found among the finalists of the \$2.25 million Nokia Sensing XCHALLENGE, a set of two consecutive competitions that challenges teams to advance sensing technology for gathering data about human health and the environment. The finalists for the first challenge, announced in early August, are diverse, though they do share one common trait: They’re all lab-on-a-chip technologies. “They’re small enough to be body wearable and programmable, but they use different methods,” says Mark Winter, senior director of the Nokia Sensing XCHALLENGE.”

## Strengthening Local Capacity for Data-Driven Decisionmaking

A report by the National Neighborhood Indicators Partnership (NNIP): “A large share of public decisions that shape the fundamental character of American life are made at the local level; for example, decisions about controlling crime, maintaining housing quality, targeting social services, revitalizing low-income neighborhoods, allocating health care, and deploying early childhood programs. Enormous benefits would be gained if a much larger share of these decisions were based on sound data and analysis.
In the mid-1990s, a movement began to address the need for data for local decisionmaking. Civic leaders in several cities funded local groups to start assembling neighborhood and address-level data from multiple local agencies. For the first time, it became possible to track changing neighborhood conditions, using a variety of indicators, year by year between censuses. These new data intermediaries pledged to use their data in practical ways to support policymaking and community building and give priority to the interests of distressed neighborhoods. Their theme was “democratizing data,” which in practice meant making the data accessible to residents and community groups (Sawicki and Craig 1996).

The initial groups that took on this work formed the National Neighborhood Indicators Partnership (NNIP) to further develop these capacities and spread them to other cities. By 2012, NNIP partners were established in 37 cities, and similar capacities were in development in a number of others. The Urban Institute (UI) serves as the secretariat for the network. This report documents a strategic planning process undertaken by NNIP in 2012 and early 2013. The network’s leadership and funders re-examined the NNIP model in the context of 15 years of local partner experiences and the dramatic changes in technology and policy approaches that have occurred over that period. The first three sections explain NNIP functions and institutional structures and examine the potential role for NNIP in advancing the community information field in today’s environment.”

## Collaboration In Biology's Century

Todd Sherer, Chief Executive Officer of The Michael J. Fox Foundation for Parkinson’s Research, in Forbes: “The problem is, we all still work in a system that feeds on secrecy and competition. It’s hard enough work just to dream up win/win collaborative structures; getting them off the ground can feel like pushing a boulder up a hill. Yet there is no doubt that the realities of today’s research environment — everything from the accumulation of big data to the ever-shrinking availability of funds — demand new models for collaboration. Call it “collaboration 2.0.”…I share a few recent examples in the hope of increasing the reach of these initiatives, inspiring others like them, and encouraging frank commentary on how they’re working.
Open-Access Data
The successes of collaborations in the traditional sense, coupled with advanced techniques such as genomic sequencing, have yielded masses of data. Consortia of clinical sites around the world are working together to collect and characterize data and biospecimens through standardized methods, leading to ever-larger pools — more like Great Lakes — of data. Study investigators draw their own conclusions, but there is so much more to discover than any individual lab has the bandwidth for….
Crowdsourcing
A great way to grow engagement with resources you’re willing to share? Ask for it. Collaboration 2.0 casts a wide net. We dipped our toe in the crowdsourcing waters earlier this year with our Parkinson’s Data Challenge, which asked anyone interested to download a set of data that had been collected from PD patients and controls using smart phones. …
Cross-Disciplinary Collaboration 2.0
The more we uncover about the interconnectedness and complexity of the human system, the more proof we are gathering that findings and treatments for one disease may provide invaluable insights for others. We’ve seen some really intriguing crosstalk between the Parkinson’s and Alzheimer’s disease research communities recently…
The results should be: More ideas. More discovery. Better health.”

## International Principles on the Application of Human Rights to Communications Surveillance

Final version, 10 July 2013:  “As technologies that facilitate State surveillance of communications advance, States are failing to ensure that laws and regulations related to communications surveillance adhere to international human rights and adequately protect the rights to privacy and freedom of expression. This document attempts to explain how international human rights law applies in the current digital environment, particularly in light of the increase in and changes to communications surveillance technologies and techniques. These principles can provide civil society groups, industry, States and others with a framework to evaluate whether current or proposed surveillance laws and practices are consistent with human rights.
These principles are the outcome of a global consultation with civil society groups, industry and international experts in communications surveillance law, policy and technology.”

## Understanding Smart Data Disclosure Policy Success: The Case of Green Button

New Paper by Djoko Sigit Sayogo and Theresa Pardo: “Open data policies are expected to promote innovations that stimulate social, political and economic change. In pursuit of innovation potential, open data has expanded to a wider environment involving government, business and citizens. The US government recently launched such collaboration through a smart data policy supporting energy efficiency called Green Button. This paper explores the implementation of Green Button and identifies motivations and success factors facilitating successful collaboration between public and private organizations to support smart disclosure policy. Analyzing qualitative data from semi-structured interviews with experts involved in Green Button initiation and implementation, this paper presents some key findings. The success of Green Button can be attributed to the interaction between internal and external factors. The external factors consist of both market and non-market drivers: economic factors, technology-related factors, regulatory contexts and policy incentives, and some factors that stimulate imitative behavior among the adopters. The external factors create the necessary institutional environment for the Green Button implementation. On the other hand, the acceptance and adoption of Green Button itself is influenced by the fit of Green Button capability to the strategic mission of energy and utility companies in providing energy efficiency programs. We also identify the different roles of government during the different stages of Green Button implementation.”
[Recipient of Best Management/Policy Paper Award, dgo2013]

## ICANN Strategy Panels Launched

ICANN Press Release: “During today’s opening ceremony of ICANN 47 in Durban, South Africa, President and CEO Fadi Chehadé announced the creation of five new ICANN Strategy Panels that will serve as an integral part of a framework for cross-community dialogue on strategic matters. The ICANN Strategy Panels will convene subject matter experts, thought leaders and industry practitioners to support development of ICANN‘s strategic and operational plans, in coordination with many other global players, and will be comprised of up to seven members including the chair for an anticipated one-year timeframe…
In its fourteen-year history, ICANN has grown to reflect a changing landscape of continued innovation, interconnectedness, and unprecedented growth in the DNS ecosystem, one that transcends groups and borders to serve the public interest. Yet, the Internet is at a critical inflection point as billions of new people are expected to join the global network in the next few years and as the nature of its usage matures dramatically. With this in mind, the ICANN Strategy Panels are expected to help catalyze transformation and advance ICANN‘s role in the context of a dynamic, increasingly complex global environment.”