New million dollar fund for participatory budgeting in South Australia


Medha Basu at Future Gov: “A new programme in South Australia is allowing citizens to determine which community projects should get funding.

The Fund My Community programme has a pool of AU$1 million (US$782,130) to fund projects by non-profit organisations aimed at supporting disadvantaged South Australians.

Organisations can nominate their projects for funding from this pool and anyone in the state can vote for the projects on the YourSAy web site.

All information about the projects submitted by the organisations will be available online to make the process transparent. “We hope that by providing the community with the right information about grant applications, people will support projects that will have the biggest impact in addressing disadvantage across South Australia,” the Fund My Community web site says.

The window to nominate community projects for funding is open until 2 April. Eligible applications will be opened for community assessment from 23 April to 4 May. The outcome will be announced and grants will be given out in June. See the full timeline here:

Fund my Community South Australia

There is a catch here, though. The projects that receive the most support from the community are suggested for funding, but due to “a legal requirement” the final decision and grant approval come from the Board of the Charitable and Social Welfare Fund, according to the YourSAy web site….(More)”

Scenario Planning Case Studies Using Open Government Data


New Paper by Robert Power, Bella Robinson, Lachlan Rudd, and Andrew Reeson: “The opportunity for improved decision making has been enhanced in recent years through the public availability of a wide variety of information. In Australia, government data is routinely made available and maintained in the http://data.gov.au repository. This is a single point of reference for data that can be reused for purposes beyond those originally considered by the data custodians. Similarly, a wealth of citizen information is available from the Australian Bureau of Statistics. Combining this data allows informed decisions to be made through planning scenarios.

We present two case studies that demonstrate the utility of data integration and web mapping. As a simple proof of concept the user can explore different scenarios in each case study by indicating the relative weightings to be used for the decision making process. Both case studies are demonstrated as a publicly available interactive map-based website….(More)”

The Trouble With Disclosure: It Doesn’t Work


Jesse Eisinger at ProPublica: “Louis Brandeis was wrong. The lawyer and Supreme Court justice famously declared that sunlight is the best disinfectant, and we have unquestioningly embraced that advice ever since.
All this sunlight is blinding. As new scholarship is demonstrating, the value of all this information is unproved. Paradoxically, disclosure can be useless — and sometimes actually harmful or counterproductive.
“We are doing disclosure as a regulatory move all over the board,” says Adam J. Levitin, a law professor at Georgetown. “The funny thing is, we are doing this despite very little evidence of its efficacy.”…
Of course, some disclosure works. Professor Levitin cites two examples. The first is an olfactory disclosure: methane doesn’t have any scent, but a foul smell is added to alert people to a gas leak. The second is ATM fees. A study in Australia showed that once fees were disclosed, people avoided the high-fee machines and withdrew more cash when they did have to use them.
But to Omri Ben-Shahar, co-author of a recent book, “More Than You Wanted to Know: The Failure of Mandated Disclosure,” these are cherry-picked examples in a world awash in useless disclosures. Of course, information is valuable. But disclosure as a regulatory mechanism doesn’t work nearly well enough, he argues.
First, it really works only when things are simple. As soon as transactions become complex, disclosure starts to stumble. Buying a car, for instance, turns out to be several transactions: the purchase itself, the financing, maybe the trade-in of an old car, and various insurance and warranty decisions. These are all subject to various disclosure rules, but making the choices clear and useful has proved nigh impossible.
In complex transactions, we then must rely on intermediaries to give us advice. Because they are often conflicted, they, too, become subject to disclosure obligations. Ah, even more boilerplate to puzzle over!
And then there’s the harm. Over the years, banks that sold complex securities often stuck impossible-to-understand clauses deep in prospectuses that “disclosed” what was really going on. When the securities blew up, as they often did, banks then fended off lawsuits by arguing they had done everything the law required and were therefore not liable.
“That’s the harm of disclosure,” Professor Ben-Shahar said. “It provides a safe harbor for practices that smell bad. It sanitizes every bad practice.”
The anti-disclosure movement is taking on the “Nudge” school, embraced by the Obama administration and promoted most prominently by Cass R. Sunstein, a scholar at Harvard, and Richard H. Thaler, an economist at the University of Chicago. These nudgers believe that small policies will prod people to do what’s in their best interests.
The real-world evidence in favor of nudging is thin. …
The ever-alluring notion is that we are just one or two changes away from having meaningful disclosure. If we could only have annual Securities and Exchange Commission filings in plain English, we could finally understand what’s going on at corporations. A University of San Diego Law School professor, Frank Partnoy, and I called for better bank disclosure in an article in The Atlantic a few years ago.
Professor Ben-Shahar mocks it. “‘Plain English!’ ‘Make it simple.’ That is the deus ex machina, the god that will solve everything,” he said.
Complex things are, sadly, complex. A mortgage is not an easy transaction to understand. People are not good at predicting their future behavior and so don’t know what options are best for them. “The project of simplification is facing a very poor empirical track record and a very powerful theoretical problem,” he said.
What to do instead? Hard and fast rules. If lawmakers want to end a bad practice, ban it. Merely requiring firms to disclose it is not enough. (More)”

Schemes used by South Australia to include citizens in policy making


Joshua Chambers at Future Gov Asia: “…South Australia has pioneered a number of innovative methods to try to include its residents in policymaking. …The highest-profile participatory programme run by the state government is the Citizens’ Jury initiative. …The Citizens’ Jury takes a randomly selected, representative group of citizens through a process to hear arguments and evidence, much like a jury in a trial, before writing an independent report that makes recommendations to government.
There were 37 members of the jury, hearing evidence on Thursday evenings and Saturdays over a five week period. They heard from motorists associations, cycling associations, and all sorts of other interested groups.
They used Basecamp software to ensure that jurors stayed connected when not at meetings, hosting discussions in a private space to consider the evidence they heard. …The jurors prepared 21 recommendations, ranging from decreasing speed in the city to a schools programme…. The Government supports the majority of the recommendations and will investigate the remaining three.
The government has also committed to provide jurors with an update every 6 months on the progress being made in this area.
Lessons and challenges
As would be expected with an innovative new scheme, it hasn’t always been smooth. One lesson learned from the first initiative was that affected agencies need to be engaged in advance, and briefed throughout the process, so that they can prepare their responses and resources. ….
Aside from the Citizens’ Jury, the Government of South Australia is also pioneering other approaches to include citizens in policy making. Fund My Idea is a crowdsourcing site that allows citizens to propose new projects. …(More)”

The downside of Open Data


Joshua Chambers at FutureGov: “…Inaccurate public datasets can cause big problems, because apps that feed off of them could be giving out false information. I was struck by this when we reported on an app in Australia that was issuing alerts for forest fires that didn’t exist. The data was coming from public emergency calls, but wasn’t verified before being displayed. This meant that app users would be alerted to all possible fires, but it could also cause unnecessary panic. The government takes the view that more alerts are better than slower, verified ones, but there is the potential for people to become less likely to trust any alerts on the app.
No-one wants to publish inaccurate data, but accuracy takes time and costs money. So we come to a central tension in discussions about open data: is it better to publish more data, with the risk of inaccuracy, or limit publication to datasets which are accurate?
The United Kingdom takes the view that more data is best. I interviewed the UK’s lead official on open data, Paul Maltby, a couple of years ago, and he told me that: “There’s a misnomer here that everything has to be perfect before you can put it out,” adding that “what we’re finding is that, actually, some of the datasets are a bit messy. We try to keep them as high-quality as we can; but other organisations then clean up the data and sell it on”.
Indeed, he noted that some officials use data accuracy as an excuse to not publish information that could hold their departments to account. “There’s sometimes a reluctance to get data out from the civil service; and whilst we see many examples of people understanding the reasons why data has been put to use, I’d say the general default is still not pro-release”.
Other countries take a different view, however. Singapore, for example, publishes much less data than Britain, but has more of a push on making its data accurate to assist startups and app builders….(More)”

The Next 5 Years in Open Data: 3 Key Trends to Watch


Kevin Merritt (Socrata Inc.) at GovTech: “2014 was a pivotal year in the evolution of open data for one simple and powerful reason – it went mainstream and was widely adopted on just about every continent. Open data is now table stakes. Any government that is not participating in open data is behind its peers…The move toward data-driven government will absolutely accelerate between 2015 and 2020, thanks to three key trends.

1. Comparative Analytics for Government Employees

The first noteworthy trend that will drive open data change in 2015 is that open data technology offerings will deliver first-class benefits to public-sector employees. This means government employees will be able to derive enormous insights from their own data and act on them in a deep, meaningful and analytical way. Until only recently, the primary beneficiaries of open data initiatives were external stakeholders: developers and entrepreneurs; scientists, researchers, analysts, journalists and economists; and ordinary citizens lacking technical training. The open data movement, until now, has ignored an important class of stakeholders – government employees….

2. Increased Global Expansion for Open Data

The second major trend fueling data-driven government is that 2015 will be a year of accelerating adoption of open data internationally.
Right now, for example, open data is being adopted prolifically in Europe, Latin America, Australia, New Zealand and Canada.
….
We will continue to see international governments adopt open data in 2015 for a variety of reasons. Northern European governments, for instance, are interested in efficiency and performance right now; Southern European governments, on the other hand, are currently focused on transparency, trust, and credibility. Despite the different motivations, the open data technology solutions are the same. And, looking out beyond 2015, it’s important to note that Southern European governments will also adopt open data to help increase job creation and improve delivery of services.

3. “Open Data” Will Simply Become “Government Data”

The third trend that we’ll see in the arena of open data lies a little further out on the horizon, and it will be surprising. In my opinion, the term “open data” may disappear within a decade; and in its place will simply be the term “government data.”
That’s because virtually all government data will be open data by 2020; and government data will be everywhere it needs to be – available to the public as fast as it’s created, processed and accumulated….(More).”

The Global Open Data Index 2014


Open Knowledge Foundation: “The Global Open Data Index ranks countries based on the availability and accessibility of information in ten key areas, including government spending, election results, transport timetables, and pollution levels.
The UK tops the 2014 Index, retaining its pole position with an overall score of 96%, closely followed by Denmark and then France at number 3, up from 12th last year. Finland comes in 4th, while Australia and New Zealand share 5th place. Impressive results were seen from India at #10 (up from #27) and Latin American countries like Colombia and Uruguay, which came in joint 12th.
Sierra Leone, Mali, Haiti and Guinea rank lowest of the countries assessed, but there are many countries where the governments are less open but that were not assessed because of lack of openness or a sufficiently engaged civil society.
Overall, whilst there is meaningful improvement in the number of open datasets (from 87 to 105), the percentage of open datasets across all the surveyed countries remained low at only 11%.
Even amongst the leaders on open government data there is still room for improvement: the US and Germany, for example, do not provide a consolidated, open register of corporations. There was also a disappointing degree of openness around the details of government spending with most countries either failing to provide information at all or limiting the information available – only two countries out of 97 (the UK and Greece) got full marks here. This is noteworthy as in a period of sluggish growth and continuing austerity in many countries, giving citizens and businesses free and open access to this sort of data would seem to be an effective means of saving money and improving government efficiency.
Explore the Global Open Data Index 2014 for yourself!”

Participatory sensing: enabling interactive local governance through citizen engagement


New White Paper by the Institute for a Broadband-Enabled Society (Australia): “Local government (such as the City of Melbourne) is accountable and responsible for establishment, execution and oversight of strategic objectives and resource management in the metropolis. Faced with a rising population, Council has in place a number of strategic plans to ensure it is able to deliver services that maintain (and ideally improve) the quality of life for its citizens (including residents, workers and visitors). This publication explores participatory sensing (PS) and issues associated with governance in the light of new information-gathering capabilities that directly engage citizens in collecting data and providing contextual insight that has the potential to greatly enhance Council operations in managing these environments.” Download: Participatory Sensing: Enabling interactive local governance through citizen engagement (PDF, 2.3 MB)

Gov.uk quietly disrupts the problem of online identity login


The Guardian: “A new “verified identity” scheme for gov.uk is making it simpler to apply for a new driving licence, passport or to file a tax return online, allowing users to register securely using one log in that connects and securely stores their personal data.
After nearly a year of closed testing with a few thousand Britons, the “Gov.UK Verify” scheme quietly opened to general users on 14 October, expanding across more services. It could have as many as half a million users within a year.
The most popular services are expected to be one for tax credit renewals, and CAP farm information – both expected to have around 100,000 users by April next year, and on their own making up nearly half of the total use.
The team behind the system claim this is a world first. Those countries that have developed advanced government services online, such as Estonia, rely on state identity cards – which the UK has rejected.
“This is a federated model of identity, not a centralised one,” said Janet Hughes, head of policy and engagement at the Government Digital Service’s identity assurance program, which developed and tested the system.
How it works
The Verify system has taken three years to develop, and involves checking a user’s identity against details from a range of sources, including credit reference agencies, utility bills, driving licences and mobile provider bills.
But it does not retain those pieces of information, and the credit checking companies do not know what service is being used. Only a mobile or landline number is kept in order to send verification codes for subsequent logins.
When people subsequently log in, they have to provide a user ID and password, and verify their identity by entering a code sent to the stored phone number.
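The returning-login step described here is a standard second-factor pattern: the service keeps only a phone number, sends a short-lived one-time code to it, and requires both the password check and the code to pass. A minimal Python sketch of that pattern, with invented function names (the article does not describe GDS's actual implementation):

```python
import hmac
import secrets

def issue_login_code() -> str:
    """Generate a 6-digit one-time code to be sent by SMS to the stored number."""
    return f"{secrets.randbelow(10**6):06d}"

def verify_login(stored_code: str, submitted_code: str, password_ok: bool) -> bool:
    """Both factors must pass: the password check and the one-time code.
    compare_digest avoids leaking the code through timing differences."""
    return password_ok and hmac.compare_digest(stored_code, submitted_code)
```

Using `secrets` rather than `random` gives cryptographically strong codes, and the constant-time comparison prevents an attacker from guessing the code digit by digit via response timing.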
To enrol in the system, users have to be over 19, living in the UK, and have been resident for over 12 months. A faked passport would not be sufficient: “they would need a very full false ID, and have to not appear on any list of fraudulent identities,” one source at the GDS told the Guardian.
Banks now following gov.uk’s lead
Government developers are confident that it presents a higher barrier to authentication than any other digital service – so that fraudulent transactions will be minimised. That has interested banks, which are understood to be expressing interest in using the same service to verify customer identities through an arms-length verification system.
The government system would not pass on people’s data, but would instead verify that someone is who they claim to be, much like Twitter and Facebook verify users’ identity to log in to third party sites, yet don’t share their users’ data.
The US, Canada and New Zealand have also expressed interest in following the UK’s lead with the system, which requires users to provide separate pieces of verified information about themselves from different sources.
The system then cross-references that verified information with credit reference agencies and other sources, which can include a mobile phone provider, passport, bank account, utility bill or driving licence.
The level of confidence in an individual’s identity is split into four levels. The lowest is for the creation of simple accounts to receive reports or updates: “we don’t need to know who it is, only that it’s the same person returning,” said Hughes.
Level 2 requires that “on the balance of probability” someone is who they say they are – which is the level to which Verify will be able to identify people. Hughes says that this will cover the majority of services.
Level 3 requires identity “beyond reasonable doubt” – perhaps including the first application for a passport – and Level 4 would require biometric information to confirm individual identity….(More)”
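The four confidence levels can be pictured as a function of how many independent, verified evidence sources back an identity claim. A sketch in Python, where the source names and level thresholds are illustrative assumptions (the article describes the levels only qualitatively):

```python
from dataclasses import dataclass

# Hypothetical evidence sources, loosely following those named in the article.
EVIDENCE_SOURCES = {"passport", "driving_licence", "bank_account",
                    "utility_bill", "mobile_contract", "credit_reference"}

@dataclass
class EvidenceItem:
    source: str      # e.g. "passport"
    verified: bool   # did the issuing source confirm the claimed details?

def assurance_level(evidence: list[EvidenceItem]) -> int:
    """Map verified, independent evidence sources to a confidence level.
    Thresholds here are invented for illustration; the real scheme's
    rules are not spelled out in the article."""
    distinct = {e.source for e in evidence
                if e.verified and e.source in EVIDENCE_SOURCES}
    if not distinct:
        return 1   # simple returning-user account: no identity claim checked
    if len(distinct) < 3:
        return 2   # "balance of probability" -- Verify's target level
    return 3       # "beyond reasonable doubt" (level 4 would add biometrics)
```

The point of the federated design is that this scoring happens without the service retaining the underlying documents: only the resulting level, and a phone number for later logins, need be kept.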

Traversing Digital Babel


New book by Alon Peled: “The computer systems of government agencies are notoriously complex. New technologies are piled on older technologies, creating layers that call to mind an archaeological dig. Obsolete programming languages and closed mainframe designs offer barriers to integration with other agency systems. Worldwide, these unwieldy systems waste billions of dollars, keep citizens from receiving services, and even—as seen in interoperability failures on 9/11 and during Hurricane Katrina—cost lives. In this book, Alon Peled offers a groundbreaking approach for enabling information sharing among public sector agencies: using selective incentives to “nudge” agencies to exchange information assets. Peled proposes the establishment of a Public Sector Information Exchange (PSIE), through which agencies would trade information.
After describing public sector information sharing failures and the advantages of incentivized sharing, Peled examines the U.S. Open Data program, and the gap between its rhetoric and results. He offers examples of creative public sector information sharing in the United States, Australia, Brazil, the Netherlands, and Iceland. Peled argues that information is a contested commodity, and draws lessons from the trade histories of other contested commodities—including cadavers for anatomical dissection in nineteenth-century Britain. He explains how agencies can exchange information as a contested commodity through a PSIE program tailored to an individual country’s needs, and he describes the legal, economic, and technical foundations of such a program. Touching on issues from data ownership to freedom of information, Peled offers pragmatic advice to politicians, bureaucrats, technologists, and citizens for revitalizing critical information flows.”