Citizen Science for Citizen Access to Law


Paper by Michael Curtotti, Wayne Weibel, Eric McCreath, Nicolas Ceynowa, Sara Frug, and Tom R Bruce: “This paper sits at the intersection of citizen access to law, legal informatics and plain language. It reports the results of a joint project of the Cornell University Legal Information Institute and the Australian National University which collected thousands of crowdsourced assessments of the readability of law through the Cornell LII site. The aim of the project is to enhance accuracy in the prediction of the readability of legal sentences. The study asked readers on legislative pages of the LII site to rate passages from the United States Code, the Code of Federal Regulations and other texts for readability and other characteristics. The research provides insight into who uses legal rules and how they do so, and enables conclusions to be drawn as to the current readability of law and the spread of readability across legal rules. The research is intended to enable the creation of a dataset of legal rules labelled by human judges as to readability. Such a dataset, in combination with machine learning, will assist in identifying factors in legal language which impede readability and access for citizens. As far as we are aware, this is the largest-ever study of the readability and usability of legal language and the first to apply crowdsourcing to such an investigation. The research is an example of the possibilities open for enhancing access to law through engagement of end users in the online legal publishing environment and through collaboration between legal publishers and researchers….(More)”
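The abstract does not specify a modelling approach; purely as a hedged illustration of what a crowd-labelled readability dataset makes possible, the sketch below trains a toy scikit-learn classifier on invented sentence/label pairs. The sentences, labels, features and model choice are all assumptions, not the authors' method.

```python
# Illustrative only: a minimal readability-prediction pipeline of the kind a
# crowdsourced label set could support. Data, features and model are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical crowdsourced labels: 1 = readers rated the sentence hard to read.
sentences = [
    "The Secretary may, by regulation, prescribe such requirements as are necessary.",
    "You must file the form by 1 July.",
    "Notwithstanding subsection (a), no deduction shall be allowed under this chapter.",
    "Keep a copy of your application.",
]
labels = [1, 0, 1, 0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # word and bigram features
    LogisticRegression(),
)
model.fit(sentences, labels)

# Predict the readability class of an unseen legal sentence.
print(model.predict(["The provisions of this part shall not apply to any person."]))
```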

White House Releases 150 Data Sets to Fight Climate Change


At GovTech: “To support the president’s Climate Data Initiative, the White House revealed on Tuesday, April 7, a series of data projects and partnerships that includes more than 150 new open data sets, as well as commitments from Google, Microsoft and others to cultivate climate analysis.

The undertakings were released at a White House climate and health conference where John Holdren, director of the White House Office of Science and Technology Policy, pressed the need for more data to drive reductions in greenhouse gas emissions.

“This is a science-based administration, a fact-based administration, and our climate policies have to be based on fact, have to be based on data, and we want to make those data available to everybody,” Holdren said.

The data initiative touches multiple agencies — including NASA, the Centers for Disease Control and Prevention, the National Institutes of Health and the Environmental Protection Agency — and is part of the White House proclamation of a new National Public Health Week, from April 6 to April 12, to spur national health solutions and awareness.

The 150-plus data sets are all connected to health, and are among the 560 climate-related data sets available on Data.gov, the U.S. government’s open data portal. Accompanying the release, the Department of Health and Human Services added a Health Care Facilities Toolkit on Toolkit.climate.gov, a site that delivers climate resilience techniques, strategies, case studies and tools for organizations attempting climate change initiatives.

Holdren was followed by White House Chief Data Scientist D.J. Patil, who moderated a tech industry panel with representatives from Google, Microsoft and GIS mapping software company Esri.

Google Earth Outreach Program Manager Allison Lieber confirmed that Google will continue to provide 10 million hours of high-performance computing for climate data projects — down from 50 million in 2014 — and the company will likewise provide climate data hosting on Google Earth….(More)”

Rebooting Democracy


 John Boik, Lorenzo Fioramonti, and Gary Milante at Foreign Policy: “….The next generation of political and economic systems may look very different from the ones we know today.

Some changes along these lines are already happening. Civil society groups, cities, organizations, and government agencies have begun to experiment with a host of innovations that promote decentralization, redundancy, inclusion, and diversity. These include participatory budgeting, where residents of a city democratically choose how public monies are spent. They also include local currency systems, open-source development, open-design, open-data and open-government, public banking, “buy local” campaigns, crowdfunding, and socially responsible business models.

Such innovations are a type of churning on the edges of current systems. But in complex systems, changes at the periphery can cascade to changes at the core. Further, the speed of change is increasing. Consider the telephone, first introduced by Bell in 1876. It took about 75 years to reach adoption by 50 percent of the market. A century later the Internet did the same in about 35 years. We can expect that the next major innovations will be adopted even faster.

Following the examples of the telephone and Internet, it appears likely that the technology of new economic and political decision-making systems will first be adopted by small groups, then spread virally. Indeed, small groups, such as neighborhoods and cities, are among today’s leaders in innovation. The influence of larger bodies, such as big corporations and non-governmental organizations, is also growing steadily as nation states increasingly share their powers, willingly or not.

Changes are evident even within large corporations. Open-source software development has become the norm, for example, and companies as large as Toyota have announced plans to freely share their intellectual property.

While these innovations represent potentially important parts of new political and economic systems, they are only the tip of the iceberg. Systems engineering design could eventually integrate these and other innovations into efficient, user-friendly, scalable, and resilient whole systems. But the need for this kind of innovation is not yet universally acknowledged. In its list of 14 grand challenges for the 21st century, the U.S. National Academy of Engineering addresses many of the problems caused by poor decision making, such as climate change, but not the decision-making systems themselves. The work has only just begun.

The development of new options will dramatically alter how democracy is used, adjusted, and exported. Attention will shift toward groups, perhaps at the city/regional level, who wish to apply the flexible tools freely available on the Internet. Future practitioners of democracy will invest more time and resources to understand what communities want and need — helping them adapt designs to make them fit for their purpose — and to build networked systems that beneficially connect diverse groups into larger political and economic structures. In time, when the updates to next-generation political and economic systems near completion, we might find ourselves more fully embracing the notion “engage local, think global.”…(More)

Sensor Law


Paper by Sandra Braman: For over two decades, information policy-making for human society has been increasingly supplemented, supplanted, and/or superseded by machinic decision-making; it has been over three decades since legal decision-making was explicitly put in place to serve machinic rather than social systems; and over four decades since designers of the Internet took the position that they were serving non-human (machinic, or daemon) users in addition to humans. As the “Internet of Things” becomes more and more of a reality, these developments increasingly shape the nature of governance itself. This paper’s discussion of contemporary trends in these diverse modes of human-computer interaction at the system level — interactions between social systems and technological systems — introduces the changing nature of the law as a sociotechnical problem in itself. In such an environment, technological innovations are often also legal innovations, and legal developments require socio-technical analysis as well as social, legal, political, and cultural approaches.

Examples of areas in which sensors are already receiving legal attention are rife. A non-comprehensive listing includes privacy concerns beginning but not ending with those raised by sensors embedded in phones and geolocation devices, which are the most widely discussed and those of which the public is most aware. Sensor issues arise in environmental law, health law, marine law and intellectual property law, and are raised by new technologies used for national security purposes, including confidence- and security-building measures intended for peacekeeping. They are raised by liability issues for objects that range from cars to ovens. And sensor issues are at the core of concerns about “telemetric policing” as it comes into use not only in North America and Europe, but in societies such as that of Brazil as well.

Sensors are involved in every stage of legal processes, from identification of persons of interest to determination of judgments and consequences of judgments. Their use significantly alters the historically-developed distinction among types of decision-making meant to come into use at different stages of the process, raising new questions about when, and how, human decision-making needs to dominate and when, and how, technological innovation might need to be shaped by the needs of social rather than human systems.

This paper will focus on the legal dimensions of sensors used in ubiquitous embedded computing….(More)”

Open-Data Project Adds Transparency to African Elections


Jessica Weiss at the International Center for Journalists: “An innovative tool developed to help people register to vote in Kenya is proving to be a valuable asset to voters across the African continent.

GotToVote was created in 2012 by two software developers under the guidance of ICFJ’s Knight International Journalism Fellow Justin Arenstein for use during Kenya’s general elections. In just 24 hours, the developers took voter registration information from a government PDF and turned it into a simple website with usable data that helped people locate the nearest voting center where they could register for elections. Kenyan media drove a large audience to the site, which resulted in a major boost in voter registrations.
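GotToVote’s own source is not reproduced here; the following is a minimal sketch, under assumed data, of the kind of nearest-centre lookup such a tool performs once registration centres have been pulled out of a government PDF. The centre names and coordinates are invented.

```python
# Illustrative sketch (not GotToVote's code): find the nearest registration
# centre to a voter's location from a list extracted from a government PDF.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

# Hypothetical centres: (name, latitude, longitude).
centres = [
    ("Central Primary School", -1.286, 36.817),
    ("Community Hall", -1.300, 36.850),
    ("District Office", -1.250, 36.800),
]

def nearest_centre(lat, lon):
    """Return the centre closest to the given voter location."""
    return min(centres, key=lambda c: haversine_km(lat, lon, c[1], c[2]))

print(nearest_centre(-1.29, 36.82))  # -> the closest centre to the voter
```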

Since then, GotToVote has helped people register to vote in Malawi and Zimbabwe. Now, it is being adapted for use in national elections in Ghana and Uganda in 2016.

Ugandan civic groups led by the African Freedom of Information Centre are planning to use it to help people register, verify registrations and run SMS registration drives. They are also proposing new features, including digital applications that help citizens post issues of concern and compare the political positions of parties and candidates, so that voters better understand the choices they are being offered.

In Ghana, GotToVote is helping citizens find their nearest registration center to make sure they are eligible to vote in that country’s 2016 national elections. The tool, which is optimized for mobile devices, makes voter information easily accessible to the public. It explains who is eligible to register for the 2016 general elections and gives a simple overview of the voter registration process. It also tells users what documentation to take with them to register…..

Last year, Malawi’s national government used GotToVote to check whether voters were correctly registered. As a result, more than 20,000 were found to be incorrectly registered, because they were not qualified voters or were registered in the wrong constituency. In 2013, thousands used GotToVote via their mobile and tablet devices to find their polling places in Zimbabwe.

The successful experiment provides a number of lessons about the power and feasibility of open data projects, showing that they don’t require large teams, big budgets or a lot of time to build…(More)

Eight ways to make government more experimental


Jonathan Breckon et al at NESTA: “When the banners and bunting have been tidied away after the May election, and a new bunch of ministers sit at their Whitehall desks, could they embrace a more experimental approach to government?

Such an approach requires a degree of humility: facing up to the fact that we don’t have all the answers for the next five years. We need to test things out, evaluate new ways of doing things with the best of social science, grow what works, and drop policies that fail.

But how best to go about it?  Here are our 8 ways to make it a reality:

  1. Make failure OK. A more benign attitude to risk is central to experimentation.  As a 2003 Cabinet Office review entitled Trying it Out said, a pilot that reveals a policy to be flawed should be ‘viewed as a success rather than a failure, having potentially helped to avert a potentially larger political and/or financial embarrassment’. Pilots are particularly important in fast moving areas such as technology to try promising fresh ideas in real-time. Our ‘Visible Classroom’ pilot tried an innovative approach to teacher CPD developed from technology for television subtitling.
  2. Avoid making policies that are set in stone. Allowing policy to be more project-based, flexible and time-limited could encourage room for manoeuvre, according to a previous Nesta report, State of Uncertainty: Innovation policy through experimentation. The Department for Work and Pensions’ Employment Retention and Advancement pilot scheme to help people back to work was designed to influence the shape of legislation. It allowed for amendments and learning as it was rolled out. We need more policy experiments like this.
  3. Work with the grain of the current policy environment. Experimenters need to be opportunists. We need to be nimble and flexible, ready to seize windows of opportunity to experiment. Some services have to be rolled out in stages due to budget constraints, which offers opportunities to try things out before going national. For instance, the Mexican Oportunidades anti-poverty experiments, which eventually reached 5.8 million households in all Mexican states, had to be trialled first in a handful of areas. Greater devolution is creating a patchwork of different policy priorities, funding and delivery models – so-called ‘natural experiments’. Let’s seize the opportunity to deliberately test and compare across different jurisdictions. What about a trial of basic income in Northern Ireland, for example, along the lines of recent Finnish proposals, or universal free childcare in Scotland?
  4. Experiments need robust and appropriate evaluation methods, such as randomised controlled trials where suitable. Other methods, such as qualitative research, may be needed to pry open the ‘black box’ of policies – to learn about why and how things are working. Civil servants should use the government trial advice panel as a source of expertise when setting up experiments.
  5. Grow the public debate about the importance of experimentation. Facebook had to apologise after a global backlash to psychological experiments on 689,000 of its users. Approval by ethics committees – normal practice for trials in hospitals and universities – is essential, but we can’t just rely on experts. We need dedicated programmes for public understanding of experimentation, perhaps run by the Evidence Matters or Ask for Evidence campaigns at Sense about Science. Taking part in an experiment can itself be a learning opportunity, creating an appetite among the public, something we have found from running an RCT with schools.
  6. Create ‘skunkworks’ institutions. New or improved institutional structures within government can also help with experimentation. The Behavioural Insights Team, located in Nesta, operates a classic ‘skunkworks’ model, semi-detached from day-to-day bureaucracy. The nine UK What Works Centres help try things out semi-detached from central power; the Education Endowment Foundation, for example, sources innovations widely from across the public and private sectors – including Nesta – rather than generating ideas exclusively in-house or in government.
  7. Find low-cost ways to experiment. People sometimes worry that trials are expensive and complicated. This does not have to be the case. Experiments to encourage organ donation by the Government Digital Service and Behavioural Insights Team cost an estimated £20,000, because the digital experiments didn’t involve setting up expensive new interventions – just changing messages on web pages for existing services (a rough sketch of how such a two-arm message test can be analysed appears after this list). Some programmes do, however, need significant funding to evaluate, and budgets need to be found for it. A memo from the White House Office of Management and Budget has asked new Government schemes seeking funding to allocate a proportion of their budgets to ‘randomized controlled trials or carefully designed quasi-experimental techniques’.
  8. Be bold. A criticism of some experiments is that they only deal with the margins of policy and delivery. Government officials and researchers should set up more ambitious experiments on nationally important big-ticket issues, from counter-terrorism to innovation in jobs and housing….(More)
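As a rough illustration of point 7, here is a minimal sketch of how cheaply a two-arm web-page message test can be analysed, using a two-proportion z-test. The counts are invented and are not the organ-donation trial’s data.

```python
# Illustrative sketch: comparing sign-up rates between two web-page messages.
# Counts are invented; this is not the organ-donation trial's data.
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in proportions between arms A and B."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Control message vs. variant message, hypothetical traffic split.
print(two_proportion_z(success_a=1200, n_a=100_000, success_b=1350, n_b=100_000))
```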

Twitter for government: Indonesians get social media for public services


Medha Basu at FutureGov: “Among the world’s largest users of social media, Indonesians are taking it a step further with a new social network just for public services.

Enda Nasution and his team have built an app called Sebangsa, or Same Nation, featuring Facebook-like timelines (or Twitter-like feeds) where citizens can share information about public services.

They want to introduce an idea they call “social government” in Indonesia, Nasution told FutureGov, going beyond e-government and open government to build a social relationship between the government and citizens….

It has two features that stand out. One, called Sebangsa911, is for Indonesians to post emergencies, much as they might on Twitter or Facebook when they see an accident on the road or a crowd turning violent, for instance. Indonesia does not have a single national emergency number.

Another feature, called Sebangsa1800, is a channel for people to post reviews, questions and complaints about public services and consumer products.

Why another social network?
But why build another social network when there are millions of users on Facebook and Twitter already? One reason is to provide a service that focuses on Indonesians, Nasution said – the app is in Bahasa.

Another is because existing social networks are not built specifically for public services. If you post a photo of an accident on Twitter, how many people see it, and how quickly, depends on how many followers you have, Nasution said. These reports are also unstructured because they are “scattered all over Twitter”, he said. The app “introduces a little bit of structure to the reports”….(More)”

Growing Data Collection Inspires Openness at NGA


At Secrecy News: “A flood of information from the ongoing proliferation of space-based sensors and ground-based data collection devices is promoting a new era of transparency in at least one corner of the U.S. intelligence community.

The “explosion” of geospatial information “makes geospatial intelligence increasingly transparent because of the huge number and diversity of commercial and open sources of information,” said Robert Cardillo, director of the National Geospatial-Intelligence Agency (NGA), in a speech last month.

Hundreds of small satellites are expected to be launched within the next three years — what Mr. Cardillo called a “darkening of the skies” — and they will provide continuous, commercially available coverage of the entire Earth’s surface.

“The challenges of taking advantage of all of that data are daunting for all of us,” Mr. Cardillo said.

Meanwhile, the emerging “Internet of Things” is “spreading rapidly as more people carry more handheld devices to more places”, generating an abundance of geolocation data.

This is, of course, a matter of intelligence interest since “Every local, regional, and global challenge — violent extremism in the Middle East and Africa, Russian aggression, the rise of China, Iranian and North Korean nuclear weapons, cyber security, energy resources, and many more — has geolocation at its heart.”

Consequently, “We must open up GEOINT far more toward the unclassified world,” Director Cardillo said in another speech last week.

“In the past, we have excelled in our closed system. We enjoyed a monopoly on sources and methods. That monopoly has long since ended. Today and in the future, we must thrive and excel in the open.”

So far, NGA has already distinguished itself in the area of disaster relief, Mr. Cardillo said.

“Consider Team NGA’s response to the Ebola crisis. We are the first intelligence agency to create a World Wide Web site with access to our relevant unclassified content. It is open to everyone — no passwords, no closed groups.”

NGA provided “more than a terabyte of up-to-date commercial imagery.”

“You can imagine how important it is for the Liberian government to have accurate maps of the areas hardest hit by the Ebola epidemic as well as the medical and transportation infrastructure to combat the disease,” Mr. Cardillo said.

But there are caveats. Just because information is unclassified does not mean that it is freely available.

“Although 99 percent of all of our Ebola data is unclassified, most of that is restricted by our agreements [with commercial providers],” Mr. Cardillo said. “We are negotiating with many sources to release more data.”

Last week, Director Cardillo announced a new project called GEOINT Pathfinder that will attempt “to answer key intelligence questions using only unclassified data.”….(More)

Methods to Protect and Secure “Big Data” May Be Unknowingly Corrupting Research


New paper by John M. Abowd and Ian M. Schmutte: “…As the government and private companies increase the amount of data made available for public use (e.g. Census data, employment surveys, medical data), efforts to protect privacy and confidentiality (through statistical disclosure limitation, or SDL) can distort and compromise economic research and analysis, particularly in cases where data properties are unclear to the end user.

Data swapping, a particularly insidious method of SDL frequently used by important data aggregators such as the Census Bureau and the National Center for Health Statistics, interferes with the results of empirical analysis in ways that few economists and other social scientists are aware of.
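To make the mechanism concrete, here is a toy sketch of data swapping; it is not any agency’s actual SDL procedure. A sensitive attribute is exchanged between randomly chosen pairs of records, which leaves marginal counts intact but perturbs the relationships an analyst would estimate from the released file.

```python
# Toy illustration of data swapping (not any agency's actual SDL procedure):
# exchange a sensitive attribute between randomly chosen pairs of records.
import random

records = [
    {"id": 1, "county": "A", "income": 30_000},
    {"id": 2, "county": "B", "income": 85_000},
    {"id": 3, "county": "A", "income": 55_000},
    {"id": 4, "county": "B", "income": 42_000},
]

def swap_attribute(rows, key, swap_rate, seed=0):
    """Swap `key` between random pairs covering roughly `swap_rate` of rows."""
    rng = random.Random(seed)
    rows = [dict(r) for r in rows]          # don't mutate the caller's data
    idx = list(range(len(rows)))
    rng.shuffle(idx)
    n_pairs = int(len(rows) * swap_rate / 2)
    for i in range(n_pairs):
        a, b = idx[2 * i], idx[2 * i + 1]
        rows[a][key], rows[b][key] = rows[b][key], rows[a][key]
    return rows

swapped = swap_attribute(records, key="county", swap_rate=0.5)
# Marginal counts of "county" are unchanged, but the county-income
# relationship estimated from the swapped file is distorted.
print(swapped)
```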

To encourage more transparency, the authors call for both government statistical agencies and the private sector (Amazon, Google, Microsoft, Netflix, Yahoo!, etc.) to release more information about the parameters used in SDL methods, and insist that journals and editors publishing such research require documentation of the author’s entire methodological process….(More)

VIDEO:

Why governments need guinea pigs for policies


Jonathan Breckon in the Guardian: “People are unlikely to react positively to the idea of using citizens as guinea pigs; many will be downright disgusted. But there are times when government must experiment on us in the search for knowledge and better policy….

Though history calls into question the ethics of experimentation, unless we try things out, we will never learn. The National Audit Office says that £66bn worth of government projects have no plans to evaluate their impact. It is unethical to roll out policies in this arbitrary way. We have to experiment on a small scale to have a better understanding of how things work before rolling out policies across the UK. This is just as relevant to social policy, as it is to science and medicine, as set out in a new report by the Alliance for Useful Evidence.

Whether it’s the best ways to teach our kids to read, designing programmes to get unemployed people back to work, or encouraging organ donation – if the old ways don’t work, we have to test new ones. And that testing can’t always be done by a committee in Whitehall or in a university lab.

Experimentation can’t happen in isolation. What works in Lewisham or Londonderry might not work in Lincoln – or indeed across the UK. For instance, there is a huge amount of debate around the current practice of teaching children to read and spell using phonics, which was based on a small-scale study in Clackmannanshire, as well as evidence from the US. A government-commissioned review of the evidence for phonics led Professor Carole Torgerson, then at York University, to warn against making national policy off the back of just one small Scottish trial.

One way round this problem is to do larger experiments. The increasing use of the internet in public services allows for more and faster experimentation, on a larger scale and at lower cost – the randomised controlled trial on voter mobilisation that reached 61 million users in the 2010 US midterm elections, for example. However, the use of the internet doesn’t get us off the ethical hook. Facebook had to apologise after a global backlash to secret psychological tests on 689,000 of its users.

Contentious experiments should be approved by ethics committees – normal practice for trials in hospitals and universities.

We are also not interested in freewheeling trial-and-error; robust and appropriate research techniques to learn from experiments are vital. It’s best to see experimentation as a continuum, ranging from the messiness of attempts to try something new to experiments using the best available social science, such as randomised controlled trials.

Experimental government means avoiding an approach where everything is fixed from the outset. What we need is “a spirit of experimentation, unburdened by promises of success”, as recommended by the late professor Roger Jowell, author of the 2003 Cabinet Office report, Trying it out [pdf]….(More)”