Crowdfunding sites aim to make the law accessible to all


Jonathan Ford at the Financial Times: “Using the internet to harness the financial power of crowds is hardly novel. Almost since the first electronic impulse pinged its way across the world wide web, entrepreneurs have been dreaming up sites to facilitate everything from charitable donation to hard-nosed investment.

Peer-to-peer lending is now almost part of the mainstream. JustGiving, the charitable portal, has been going since 2000. But employing the web to raise money for legal actions remains a less well ploughed piece of virtual terrain.

At first glance, you might wonder why this is. There is already a booming offline trade in the commercial funding of litigation, especially in America and Britain, whether through lawyers’ no-win, no-fee arrangements or third party investment. And, indeed, a few pioneering crowdfunding vehicles have recently emerged in the US. One such is Invest4Justice, a site that boldly touts returns of “500 per cent plus in a few months”.

Whether these eye-catching figures are ultimately deliverable is — as lawyers like to say — moot. But there are risks in seeking to share the fruits of a third party’s action that can make it perilous for the crowdfunding investor. One is that when actions fail, those same backers might have to pay not only their own, but the successful party’s, costs….

But not all crowdfunding ventures seek to reward participants in the currency of cold financial return. Crowdjustice, Britain’s first legal crowdfunding website, seeks to scratch quite a different itch in the psyches of its participants….Among the causes it has taken up are a criminal appeal and a planning dispute in Lancashire involving a landfill site. The only real requirement for consideration is that the legal David confronting the corporate or governmental Goliath must have already engaged a lawyer to take on their case….This certainly means the risk of being dragged into proceedings is far lower. But it also raises a question: why would the public want to donate money to lawyers in the first place?

Ms Salasky thinks it ranges from a sense of justice to enlightened self-interest. “Donors can be people who take human rights seriously, but they could also be those who worry that something which is happening to someone else could also happen to them,” she says. It is one reason why perhaps the most potent application is seen to be in the fields of environmental and planning law. …(More)”


100 parliaments as open data, ready for you to use


Myfanwy Nixon at mySociety’s blog and OpeningParliament: “If you need data on the people who make up your parliament, another country’s parliament, or indeed all parliaments, you may be in luck.

Every Politician, the latest Poplus project, aims to collect, store and share information about every parliament in the world, past and present—and it already contains 100 of them.

What’s more, it’s all provided as Open Data to anyone who would like to use it to power a civic tech project. We’re thinking parliamentary monitoring organisations, journalists, groups who run access-to-democracy sites like our own WriteToThem, and especially researchers who want to do analysis across multiple countries.

But isn’t that data already available?

Yes and no. There’s no doubt that you can find details of most parliaments online, whether on official government websites, on Wikipedia, or in a variety of other places.

But, as you might expect from data that’s coming from hundreds of different sources, it’s in a multitude of different formats. That makes it very hard to work with in any kind of consistent fashion.

Every Politician standardises all of its data into the Popolo standard and then provides it in two simple downloadable formats:

  • CSV, which contains basic data that’s easy to work with in spreadsheets
  • JSON, which contains richer data on each person and is ideal for developers

This standardisation means that it should now be a lot easier to work on projects across multiple countries, or to compare one country’s data with another. It also means that data works well with other Poplus Components….(More)”
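
To make the download formats concrete, here is a minimal, purely illustrative Python sketch that reads one of the per-term CSV files and counts legislators by party. The filename and the "group" column name are assumptions made for illustration; check the header row of the file you actually download, as the real column names may differ.

```python
# A minimal sketch, assuming a downloaded EveryPolitician-style term CSV.
# The filename and the "group" column name are illustrative assumptions;
# inspect the real file's header row before relying on them.
import csv
from collections import Counter

def party_breakdown(csv_path):
    """Count legislators per party/group in a term CSV."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = csv.DictReader(f)
        return Counter(row.get("group", "unknown") for row in rows)

if __name__ == "__main__":
    counts = party_breakdown("term-latest.csv")  # hypothetical filename
    for group, n in counts.most_common():
        print(f"{group}: {n}")
```

Because every country's file follows the same Popolo-derived layout, the same few lines should work unchanged across parliaments, which is precisely the benefit of the standardisation described above.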

Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings


Future of Privacy Forum: “In the wake of last year’s news about the Facebook “emotional contagion” study and the subsequent public debate about the role of A/B testing and ethical concerns around the use of Big Data, FPF Senior Fellow Omer Tene participated in a December symposium on corporate consumer research hosted by Silicon Flatirons. This past month, the Colorado Technology Law Journal published a series of papers that emerged out of the symposium, including “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings.”

“Beyond the Common Rule,” by Jules Polonetsky, Omer Tene, and Joseph Jerome, continues the Future of Privacy Forum’s effort to build on the notion of consumer subject review boards first advocated by Ryan Calo at FPF’s 2013 Big Data symposium. It explores how researchers, increasingly in corporate settings, are analyzing data and testing theories using often sensitive personal information. Many of these new uses of PII are simply natural extensions of current practices, and fall within either the expectations of individuals or the bounds of the FIPPs. Yet many of these projects could involve surprising applications or uses of data that exceed user expectations, and offering notice and obtaining consent may not be feasible.

This article expands on ideas and suggestions put forward around the recent discussion draft of the White House Consumer Privacy Bill of Rights, which espouses “Privacy Review Boards” as a safety valve for noncontextual data uses. It explores how existing institutional review boards within the academy and for human testing research could offer lessons for guiding principles, providing accountability and enhancing consumer trust, and offers suggestions for how companies — and researchers — can pursue both knowledge and data innovation responsibly and ethically….(More)”

Local open data ecosystems – a prototype map


Ed Parkes and Gail Dawes at Nesta: “It is increasingly recognised that some of the most important open data is published by local authorities (LAs) – data which matters to us, like bin collection days, planning applications and even where your local public toilet is. Given the likely move towards greater decentralisation, firstly through devolution to cities, the publication of local open data could arguably become more important over the next couple of years. In addition, as of 1st April there is a new transparency code for local government requiring local authorities to publish further information on things ranging from spending to local land assets. To pre-empt this likely renewed focus on local open data, we have begun to develop a prototype map to highlight the UK’s local open data ecosystem.

Already there is some great practice in the publication of open data at a local level – such as Leeds Data Mill, London Datastore, and Open Data Sheffield. This regional activity is characterised not just by high-quality data publication, but also by pulling together, through hackdays, challenges and meetups, a community interested in the power of open data. This creates an ecosystem of publishers and re-users at a local level. Some of the best practice in developing such an ecosystem was recognised by the last government in the announcement of a group of Local Authority Open Data Champions. Some of these were also recipients of project funding from both the Cabinet Office and the Open Data User Group.

Outside of this best practice it isn’t always easy to understand how developed smaller, less urban open data agendas are. Other than looking at each council’s website, or at the data portals that forward-thinking councils increasingly provide, there is a surprisingly large number of places where local authorities can make their open data available. The best known of these is the Openly Local project, but at the time of writing it appears to have been retired. Perhaps the best catalogue of local authority data is on Data.gov.uk itself. This has 1,449 datasets published by LAs across 200 different organisations. Following that there is the Open Data Communities website, which hosts links to LA linked datasets. Using data from the latter, Steve Peters has developed the local data dashboard (which was itself based on the UK Local Government Open Data resource map from Owen Boswarva). In addition, local authorities can also register their open data in the LGA’s Open Data Inventory Service and take it through the ODI’s data certification process.
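
As a rough illustration of how a researcher might query one of these catalogues programmatically, the sketch below runs a free-text search against a CKAN-based catalogue such as Data.gov.uk. It is a sketch only: the base URL and API path are assumptions, so check the portal’s current API documentation before relying on them.

```python
# A hedged sketch of searching a CKAN-based catalogue (e.g. Data.gov.uk) for
# local-authority datasets. The base URL and API path are assumptions; consult
# the portal's API documentation for the current endpoint.
import json
import urllib.parse
import urllib.request

CKAN_SEARCH = "https://data.gov.uk/api/3/action/package_search"  # assumed endpoint

def search_datasets(query, rows=10):
    """Return the titles of datasets matching a free-text query."""
    params = urllib.parse.urlencode({"q": query, "rows": rows})
    with urllib.request.urlopen(f"{CKAN_SEARCH}?{params}") as resp:
        payload = json.load(resp)
    return [pkg["title"] for pkg in payload["result"]["results"]]

if __name__ == "__main__":
    for title in search_datasets("council spending"):
        print(title)
```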

Prototype map of local open data ecosystems

To try to highlight patterns in local authority open data publication we decided to make a map of activity around the country (although in the first instance we’ve focused on England)….(More)

Why transparency can be a dirty word


Francis Fukuyama in the Financial Times: “It is hard to think of a political good that is more universally praised than government transparency. Whereas secrecy shelters corruption, abuse of power, undue influence and a host of other evils, transparency allows citizens to keep their rulers accountable. Or that is the theory.

It is clear that there are vast areas in which modern governments should reveal more. Edward Snowden’s revelations of eavesdropping by the National Security Agency have encouraged the belief that the US government has not been nearly transparent enough. But is it possible to have too much transparency? The answer is clearly yes: demands for certain kinds of transparency have hurt government effectiveness, particularly with regard to its ability to deliberate.

The US has a number of statutes mandating transparency passed decades ago in response to perceived government abuses, and motivated by perfectly reasonable expectations that the government should operate under greater scrutiny. Yet they have had a number of unfortunate consequences.

The Federal Advisory Committee Act, for example, places onerous requirements on any public agency seeking to consult a group outside the government, requiring that they are formally approved and meet various criteria for political balance. Meetings must be held in public. The Government in the Sunshine Act stipulates that, with certain exceptions, “every portion of every meeting of an agency shall be open to public observation”.

These obligations put a serious damper on informal consultations with citizens, and even make it difficult for officials to talk to one another. Deliberation, whether in the context of a family or a federal agency, requires people to pose hypotheticals and, when trying to reach agreement, make concessions.

When the process itself is open to public scrutiny, officials fear being hounded for a word taken out of context. They resort to cumbersome methods of circumventing the regulations, such as having one-on-one discussions so as not to trigger a group rule, or having subordinates do all the serious work.

The problem with the Freedom of Information Act is different. It was meant to serve investigative journalists looking into abuses of power. But today a large number of FOIA requests are filed by corporate sleuths trying to ferret out secrets for competitive advantage, or simply by individuals curious to find out what the government knows about them. The FOIA can be “weaponised”, as when the activist group Judicial Watch used it to obtain email documents on the Obama administration’s response to the 2012 attack on the US compound in Benghazi…..

National security aside, the federal government’s executive branch is probably one of the most transparent organisations on earth — no corporation, labour union, lobbying group or non-profit organisation is subject to such scrutiny. The real problem, as Professor John DiIulio of the University of Pennsylvania has pointed out, is that most of the work of government has been outsourced to contractors who face none of the transparency requirements of the government itself. It is an impossible task even to establish the number of such contractors in a single American city, much less how they are performing their jobs.

In Europe, where there is no equivalent to the FACA or the Sunshine Act, governments can consult citizens’ groups more flexibly. There is, of course, a large and growing distrust of European institutions by citizens. But America’s experience suggests that greater transparency requirements do not necessarily lead to more trust in government….(More)”


ENGAGE: Building and Harnessing Networks for Social Impact


Faizal Karmali and Claudia Juech at the Rockefeller Foundation: “Have you heard of ‘X’ organization? They’re doing interesting work that you should know about. You might even want to work together.”

Words like these abound between individuals at conferences, at industry events, in email, and, all too often, trapped in the minds of those who see the potential in connecting the dots. Bridging individuals, organizations, or ideas is fulfilling because these connections often result in value for everyone, sometimes immediately, but often over the long term. While many of us can think of that extraordinary network connector in our personal or professional circles, if asked to identify an organization that plays a similar role at scale, across multiple sectors, we may be hard-pressed to name more than a few—let alone understand how they do it well….

In an effort to capture and codify the growing breadth of knowledge and experience around leveraging networks for social impact, the Monitor Institute, a part of Deloitte Consulting, with support from The Rockefeller Foundation, has produced ENGAGE: How Funders Can Support and Leverage Networks for Social Impact, an online guide which offers a series of frameworks, tools, insights, and stories to help funders explore the critical questions around using networks as part of their grantmaking strategy, particularly as a means of accelerating impact….

ENGAGE draws on the experience and knowledge of over 40 leaders and practitioners in the field who are using networks to create change; digs into the deep pool of writing on the topic; and mines the significant experience in working with networks that is resident in both Monitor Institute and The Rockefeller Foundation. The result is an aggregation and synthesis of some of the leading thinking in both the theory and practice of engaging with networks as a grantmaker.

Compelling examples of how the Foundation leverages the power of networks can be seen in the creation of formal network institutions like the Global Impact Investing Network (GIIN) and the Joint Learning Network for Universal Health Coverage, but also through more targeted and time-bound network engagement activities, such as enabling greater connectivity among grantees and unleashing the power of technology to surface innovation from loosely curated crowds.

Building and harnessing networks is more an art than a science. It is our hope that ENGAGE will enable grantmakers and other network practitioners to be more deliberate and thoughtful about how and when a network can help accelerate their work…. (More)

IMF Publishes Worldwide Government Revenue Database


IMF Press Release: “The IMF today published for the first time the World Revenue Longitudinal Dataset (WoRLD), which provides data on tax and non-tax revenues for 186 countries over the period 1990-2013. The database offers broad country and time coverage, and is the result of combining, in a consistent manner, data from two other IMF publications, the IMF Government Finance Statistics and the World Economic Outlook (WEO), and drawing on the OECD’s Revenue Statistics and Revenue Statistics in Latin America and the Caribbean.

Vitor Gaspar, Director of the IMF’s Fiscal Affairs Department, said the purpose of releasing the database for general use is to “encourage and facilitate informed discussion and analysis of tax policy and administration for the full range of countries, the need for which was highlighted most recently during the Financing for Development conference in Addis Ababa.”

Constructing the database was a challenging exercise. An accompanying background note will be released in the coming weeks to explain the methodology. The database will be updated annually and will include information from IMF staff reports.

The database is available for download free of charge on the IMF e-Library data portal (http://data.imf.org/revenues).”
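
For analysts wondering what working with the data might look like, here is a brief, hedged pandas sketch that summarises a downloaded export. The filename and the column names ("country", "tax_revenue") are assumptions made for illustration; the actual export from the portal may label its fields differently.

```python
# A minimal sketch, assuming a CSV export downloaded from the WoRLD portal.
# The filename and the column names used below are illustrative assumptions;
# check the actual export for the real field names.
import pandas as pd

df = pd.read_csv("world_revenue_longitudinal.csv")  # hypothetical filename

# Average tax revenue per country over the 1990-2013 period in the dataset.
summary = (
    df.groupby("country")["tax_revenue"]
      .mean()
      .sort_values(ascending=False)
)
print(summary.head(10))
```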


Postal Service’s Futuristic Vision for the Internet of Things


Mohana Ravindranath at NextGov: “The U.S. Postal Service is betting a new device will soon enter high-tech homes, alongside the self-adjusting thermostats, text-enabled washing machines, and fridges notifying owners when groceries run out: the smart mailbox.

It’s the future of mail delivery, according to USPS’ Office of the Inspector General: a mailbox equipped with tiny sensors that can collect data on mail delivery and pick-up times, or the outside temperature. Owners might control the box’s internal temperature and locking mechanism through a smartphone app.

The smart mailbox is just one element of the Postal Service’s larger vision for the Internet of Things, a term for a connected network of devices and sensors. In a report about that vision — dubbed the “Internet of Postal Things” — the Inspector General’s Office, in collaboration with IBM, paints a picture of post office systems that auto-fill paperwork for customers as they walk in, delivery vehicles that monitor themselves for maintenance, and sensors that notify package recipients upon delivery, among other scenarios.

The IG’s office recommended the Postal Service “start experimenting with Internet of Things technologies” to modernize its business, “as well as develop new business models to stay relevant in the digital age.”

A more advanced technological infrastructure could prepare the agency for same-day delivery and real-time rerouting for packages, options which “have the potential to grow in importance as consumers continue to buy more items online,” the report said.

For instance, sensors in a smart mailbox could scan a barcode or read an RFID tag on a letter, which could confirm delivery and replace the hand scan and delivery signature process, the report said. If the mailbox were temperature controlled, employees could deliver groceries or temperature-sensitive medicine.

These new mailboxes could also make the agency money. “If 5 percent of the 117 million U.S. households rented such a box for $3 a month, the product would generate $210 million a year in revenue,” the report said.
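
The report’s estimate is easy to reproduce from the figures it quotes; the short sketch below simply works through that arithmetic.

```python
# Reproducing the report's back-of-the-envelope revenue estimate.
households = 117_000_000   # U.S. households, as cited in the report
adoption_rate = 0.05       # 5 percent rent a smart mailbox
monthly_fee = 3            # dollars per month

annual_revenue = households * adoption_rate * monthly_fee * 12
print(f"${annual_revenue / 1e6:.1f} million per year")  # ~$210 million, as the report states
```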

Postal Service vehicles could use sensors to drive routing systems, modeled after DHL’s SmartTruck program, which can recalculate routes based on real-time events such as traffic, weather, or new pick-up requests. Vehicle sensors could also weigh cargo to make sure the truck is fully packed.

In the 2012 fiscal year, the Postal Service spent $926 million on fuel for more than 15,000 highway routes, some of which are coast-to-coast — sensors could help manage those contracts better, the report said.

In-vehicle sensors could also “incentivize contract drivers as well as its own carriers to adopt fuel-efficient behavior.”

The sensors could additionally collect large volumes of data on routes, helping the agency identify operational inefficiencies. The Postal Service has about 200,000 vehicles traveling more than 1.2 billion miles annually, often the same route six days a week — and plans to acquire about 10,000 vehicles over the next two years, the report said.

The IG also suggested post offices themselves get a tech upgrade. More efficient lighting or air conditioning systems might turn off when the office is empty. During business hours, beacon technology could detect when a customer enters the post office, send them a push notification through a smartphone app, and direct them to the right counter.

Customers could also use an app to pre-fill paperwork, such as customs forms, which could then be displayed on a clerk’s monitor. They may also use a smartphone app to pay for postage….(More)”

The New Science of Sentencing


Anna Maria Barry-Jester et al at the Marshall Project: “Criminal sentencing has long been based on the present crime and, sometimes, the defendant’s past criminal record. In Pennsylvania, judges could soon consider a new dimension: the future.

Pennsylvania is on the verge of becoming one of the first states in the country to base criminal sentences not only on what crimes people have been convicted of, but also on whether they are deemed likely to commit additional crimes. As early as next year, judges there could receive statistically derived tools known as risk assessments to help them decide how much prison time — if any — to assign.

Risk assessments have existed in various forms for a century, but over the past two decades, they have spread through the American justice system, driven by advances in social science. The tools try to predict recidivism — repeat offending or breaking the rules of probation or parole — using statistical probabilities based on factors such as age, employment history and prior criminal record. They are now used at some stage of the criminal justice process in nearly every state. Many court systems use the tools to guide decisions about which prisoners to release on parole, for example, and risk assessments are becoming increasingly popular as a way to help set bail for inmates awaiting trial.

But Pennsylvania is about to take a step most states have until now resisted for adult defendants: using risk assessment in sentencing itself. A state commission is putting the finishing touches on a plan that, if implemented as expected, could allow some offenders considered low risk to get shorter prison sentences than they would otherwise or avoid incarceration entirely. Those deemed high risk could spend more time behind bars.

Pennsylvania, which already uses risk assessment in other phases of its criminal justice system, is considering the approach in sentencing because it is struggling with an unwieldy and expensive corrections system. Pennsylvania has roughly 50,000 people in state custody, 2,000 more than it has permanent beds for. Thousands more are in local jails, and hundreds of thousands are on probation or parole. The state spends $2 billion a year on its corrections system — more than 7 percent of the total state budget, up from less than 2 percent 30 years ago. Yet recidivism rates remain high: 1 in 3 inmates is arrested again or reincarcerated within a year of being released.

States across the country are facing similar problems — Pennsylvania’s incarceration rate is almost exactly the national average — and many policymakers see risk assessment as an attractive solution. Moreover, the approach has bipartisan appeal: Among some conservatives, risk assessment appeals to the desire to spend tax dollars on locking up only those criminals who are truly dangerous to society. And some liberals hope a data-driven justice system will be less punitive overall and correct for the personal, often subconscious biases of police, judges and probation officers. In theory, using risk assessment tools could lead to both less incarceration and less crime.

There are more than 60 risk assessment tools in use across the U.S., and they vary widely. But in their simplest form, they are questionnaires — typically filled out by a jail staff member, probation officer or psychologist — that assign points to offenders based on anything from demographic factors to family background to criminal history. The resulting scores are based on statistical probabilities derived from previous offenders’ behavior. A low score designates an offender as “low risk” and could result in lower bail, less prison time or less restrictive probation or parole terms; a high score can lead to tougher sentences or tighter monitoring.
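
To make the mechanics concrete, here is a deliberately simplified, purely illustrative sketch of a point-based score of the general kind described above. The factors, weights and cut-offs are invented for illustration only and do not correspond to Pennsylvania’s proposed instrument or to any real tool.

```python
# A purely illustrative point-based risk score. The factors, weights and
# cut-offs below are invented for illustration and reflect no real instrument.
ILLUSTRATIVE_WEIGHTS = {
    "prior_convictions": 2,  # points per prior conviction
    "age_under_25": 3,       # flat points if the offender is under 25
    "unemployed": 2,         # flat points if currently unemployed
}

def risk_score(prior_convictions, age, employed):
    """Sum weighted points for a handful of made-up factors."""
    score = prior_convictions * ILLUSTRATIVE_WEIGHTS["prior_convictions"]
    if age < 25:
        score += ILLUSTRATIVE_WEIGHTS["age_under_25"]
    if not employed:
        score += ILLUSTRATIVE_WEIGHTS["unemployed"]
    return score

def risk_band(score, low_cutoff=3, high_cutoff=7):
    """Map a numeric score to the low/medium/high bands a judge would see."""
    if score <= low_cutoff:
        return "low risk"
    if score >= high_cutoff:
        return "high risk"
    return "medium risk"

if __name__ == "__main__":
    s = risk_score(prior_convictions=1, age=30, employed=True)
    print(s, risk_band(s))  # 2 -> low risk
```

In real tools the weights are derived statistically from historical outcomes rather than set by hand, which is exactly where the questions about fairness and data quality raised below come in.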

The risk assessment trend is controversial. Critics have raised numerous questions: Is it fair to make decisions in an individual case based on what similar offenders have done in the past? Is it acceptable to use characteristics that might be associated with race or socioeconomic status, such as the criminal record of a person’s parents? And even if states can resolve such philosophical questions, there are also practical ones: What to do about unreliable data? Which of the many available tools — some of them licensed by for-profit companies — should policymakers choose?…(More)”

Public Participation in Selected Civilizations: Problems and Potentials


Paper by Sulaimon Adigun Muse and Sagie Narsiah: “Public participation is not a recent phenomenon. It has spanned centuries, cultures and civilizations. The aim of this paper is to present a historical overview of public participation in selected civilizations across the globe. The paper is conceptually premised on participatory democracy, and adopts an analytical and historical approach. Scholars have recognized that public participation remains a relevant concept globally. The concept is not unproblematic, but there is enormous potential for substantive democratization of the public sphere. Hence, one of the key recommendations of the paper is that the potential of public participation has to be fully explored and exploited….(More)”