New Book on 25 Years of Participatory Budgeting


Tiago Peixoto at Democracy Spot: “A little while ago I mentioned the launch of the Portuguese version of the book organized by Nelson Dias, “Hope for Democracy: 25 Years of Participatory Budgeting Worldwide”.

The good news is that the English version is finally out. Here’s an excerpt from the introduction:

This book represents the effort of more than forty authors, and many other direct and indirect contributions spread across different continents, seeking to provide an overview of Participatory Budgeting (PB) in the world. They do so from different backgrounds. Some are researchers, others are consultants, and others are activists connected to several groups and social movements. The texts reflect this diversity of approaches and perspectives well, and we do not try to influence that.
(….)
The pages that follow are an invitation to a fascinating journey on the path of democratic innovation in very diverse cultural, political, social and administrative settings. From North America to Asia, Oceania to Europe, from Latin America to Africa, the reader will find many reasons to closely follow the proposals of the different authors.

The book can be downloaded here [PDF]. I had the pleasure of being one of the book’s contributors, co-authoring an article with Rafael Sampaio on the use of ICT in PB processes: “Electronic Participatory Budgeting: False Dilemmas and True Complexities” [PDF]...”

The Emerging Science of Computational Anthropology


Emerging Technology From the arXiv: The increasing availability of big data from mobile phones and location-based apps has triggered a revolution in the understanding of human mobility patterns. This data shows the ebb and flow of the daily commute in and out of cities, the pattern of travel around the world and even how disease can spread through cities via their transport systems.
So there is considerable interest in looking more closely at human mobility patterns to see just how well it can be predicted and how these predictions might be used in everything from disease control and city planning to traffic forecasting and location-based advertising.
Today we get an insight into the kind of detail that is possible thanks to the work of Zimo Yang at Microsoft Research in Beijing and a few pals. These guys start with the hypothesis that people who live in a city have a pattern of mobility that is significantly different from that of people who are merely visiting. By dividing travelers into locals and non-locals, the researchers dramatically improve their ability to predict where people are likely to visit.
Zimo and co begin with data from a Chinese location-based social network called Jiepang.com. This is similar to Foursquare in the US. It allows users to record the places they visit and to connect with friends at these locations and to find others with similar interests.
The data points are known as check-ins, and the team downloaded more than 1.3 million of them from five big cities in China: Beijing, Shanghai, Nanjing, Chengdu and Hong Kong. They then used 90 per cent of the data to train their algorithms and the remaining 10 per cent to test them. The Jiepang data includes the users’ hometowns, so it’s easy to see whether an individual is checking in in their own city or somewhere else.
The question that Zimo and co want to answer is the following: given a particular user and their current location, where are they most likely to visit in the near future? In practice, that means analysing the user’s data, such as their hometown and the locations recently visited, and coming up with a list of other locations that they are likely to visit based on the type of people who visited these locations in the past.
Zimo and co used their training dataset to learn the mobility pattern of locals and non-locals and the popularity of the locations they visited. The team then applied this to the test dataset to see whether their algorithm was able to predict where locals and non-locals were likely to visit.
They found that their best results came from analysing the pattern of behaviour of a particular individual and estimating the extent to which this person behaves like a local. That produced a weighting called the indigenization coefficient that the researchers could then use to determine the mobility patterns this person was likely to follow in future.
In fact, Zimo and co say they can spot non-locals in this way without even knowing their home location. “Because non-natives tend to visit popular locations, like the Imperial Palace in Beijing and the Bund in Shanghai, while natives usually check in around their homes and workplaces,” they add.
The team say this approach considerably outperforms the mixed algorithms that use only individual visiting history and location popularity. “To our surprise, a hybrid algorithm weighted by the indigenization coefficients outperforms the mixed algorithm accounting for additional demographical information.”
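The idea can be sketched in a few lines of code. This is only a rough illustration, not the paper's actual formulation: the function names, the proxy definition of the indigenization coefficient (fraction of check-ins made outside a city's tourist hotspots), and the simple linear weighting are all assumptions made for the example.

```python
def indigenization_coefficient(user_checkins, popular_locations):
    """Rough proxy for how 'local' a user's behaviour looks: the
    fraction of their check-ins made outside the city's tourist
    hotspots. 1.0 looks fully local, 0.0 fully tourist-like."""
    if not user_checkins:
        return 0.5  # no evidence either way
    non_popular = sum(1 for loc in user_checkins if loc not in popular_locations)
    return non_popular / len(user_checkins)

def rank_candidates(user_checkins, local_popularity, nonlocal_popularity,
                    popular_locations):
    """Score candidate locations as a mix of where locals and non-locals
    go, weighted by the user's estimated indigenization coefficient,
    and return them best-first."""
    theta = indigenization_coefficient(user_checkins, popular_locations)
    candidates = set(local_popularity) | set(nonlocal_popularity)
    scores = {
        loc: theta * local_popularity.get(loc, 0)
             + (1 - theta) * nonlocal_popularity.get(loc, 0)
        for loc in candidates
    }
    return sorted(scores, key=scores.get, reverse=True)
```

On this toy scheme, a user whose history is all tourist hotspots gets ranked toward locations favoured by non-locals, while a user who mostly checks in at ordinary spots gets ranked toward locations favoured by locals, without ever needing the user's declared hometown.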
It’s easy to imagine how such an algorithm might be useful for businesses who want to target certain types of travelers or local people. But there is a more interesting application too.
Zimo and co say that it is possible to monitor the way an individual’s mobility patterns change over time. So if a person moves to a new city, it should be possible to see how long it takes them to settle in.
One way of measuring this is in their mobility patterns: whether they are more like those of a local or a non-local. “We may be able to estimate whether a non-native person will behave like a native person after a time period and if so, how long in average a person takes to become a native-like one,” say Zimo and co.
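Continuing the toy sketch above (again an illustration, not the authors' method): one could bucket a mover's check-ins into fixed time windows and compute the proxy coefficient per window; a rising series would suggest the person is settling in.

```python
from collections import defaultdict

def coefficient_over_time(timestamped_checkins, popular_locations,
                          window_days=30):
    """Track how a mover's behaviour shifts from tourist-like to
    local-like: bucket (day_number, location) check-ins into fixed
    windows and, per window, compute the fraction of check-ins made
    outside the tourist hotspots. Returns (window_start_day, fraction)
    pairs in chronological order."""
    buckets = defaultdict(list)
    for day, loc in timestamped_checkins:
        buckets[day // window_days].append(loc)
    return [
        (w * window_days,
         sum(loc not in popular_locations for loc in locs) / len(locs))
        for w, locs in sorted(buckets.items())
    ]
```

The day at which the series crosses some threshold would then be a crude, operational answer to "how long does it take to become native-like".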
That could have a fascinating impact on the way anthropologists study migration and the way immigrants become part of a local community. This is computational anthropology, a science that is clearly in its early stages but one that has huge potential for the future.”
Ref: arxiv.org/abs/1405.7769 : Indigenization of Urban Mobility

A brief history of open data


Article by Luke Fretwell in FCW: “In December 2007, 30 open-data pioneers gathered in Sebastopol, Calif., and penned a set of eight open-government data principles that inaugurated a new era of democratic innovation and economic opportunity.
“The objective…was to find a simple way to express values that a bunch of us think are pretty common, and these are values about how the government could make its data available in a way that enables a wider range of people to help make the government function better,” Harvard Law School Professor Larry Lessig said. “That means more transparency in what the government is doing and more opportunity for people to leverage government data to produce insights or other great business models.”
The eight simple principles — that data should be complete, primary, timely, accessible, machine-processable, nondiscriminatory, nonproprietary and license-free — still serve as the foundation for what has become a burgeoning open-data movement.

The benefits of open data for agencies

  • Save time and money when responding to Freedom of Information Act requests.
  • Avoid duplicative internal research.
  • Use complementary datasets held by other agencies.
  • Empower employees to make better-informed, data-driven decisions.
  • Attract positive attention from the public, media and other agencies.
  • Generate revenue and create new jobs in the private sector.

Source: Project Open Data

In the seven years since those principles were released, governments around the world have adopted open-data initiatives and launched platforms that empower researchers, journalists and entrepreneurs to mine this new raw material and its potential to uncover new discoveries and opportunities. Open data has drawn civic hacker enthusiasts around the world, fueling hackathons, challenges, apps contests, barcamps and “datapaloozas” focused on issues as varied as health, energy, finance, transportation and municipal innovation.
In the United States, the federal government initiated the beginnings of a wide-scale open-data agenda on President Barack Obama’s first day in office in January 2009, when he issued his memorandum on transparency and open government, which declared that “openness will strengthen our democracy and promote efficiency and effectiveness in government.” The president gave federal agencies three months to provide input into an open-government directive that would eventually outline what each agency planned to do with respect to civic transparency, collaboration and participation, including specific objectives related to releasing data to the public.
In May of that year, Data.gov launched with just 47 datasets and a vision to “increase public access to high-value, machine-readable datasets generated by the executive branch of the federal government.”
When the White House issued the final draft of its federal Open Government Directive later that year, the U.S. open-government data movement got its first tangible marching orders, including a 45-day deadline to open previously unreleased data to the public.
Now five years after its launch, Data.gov boasts more than 100,000 datasets from 227 local, state and federal agencies and organizations….”

Big Data, new epistemologies and paradigm shifts


Paper by Rob Kitchin in the journal Big Data &amp; Society: “This article examines how the availability of Big Data, coupled with new data analytics, challenges established epistemologies across the sciences, social sciences and humanities, and assesses the extent to which they are engendering paradigm shifts across multiple disciplines. In particular, it critically explores new forms of empiricism that declare ‘the end of theory’, the creation of data-driven rather than knowledge-driven science, and the development of digital humanities and computational social sciences that propose radically different ways to make sense of culture, history, economy and society. It is argued that: (1) Big Data and new data analytics are disruptive innovations which are reconfiguring in many instances how research is conducted; and (2) there is an urgent need for wider critical reflection within the academy on the epistemological implications of the unfolding data revolution, a task that has barely begun to be tackled despite the rapid changes in research practices presently taking place. After critically reviewing emerging epistemological positions, it is contended that a potentially fruitful approach would be the development of a situated, reflexive and contextually nuanced epistemology.”

How Long Is Too Long? The 4th Amendment and the Mosaic Theory


Law and Liberty Blog: “Volume 8.2 of the NYU Journal of Law and Liberty has been sent to the printer and physical copies will be available soon, but the articles in the issue are already available online here. One article that has gotten a lot of attention so far is by Steven Bellovin, Renee Hutchins, Tony Jebara, and Sebastian Zimmeck titled “When Enough is Enough: Location Tracking, Mosaic Theory, and Machine Learning.” A direct link to the article is here.
The mosaic theory is a modern corollary accepted by some academics – and by the D.C. Circuit Court of Appeals in U.S. v. Maynard – as a twenty-first-century extension of the Fourth Amendment’s prohibition on unreasonable searches and seizures. Proponents of the mosaic theory argue that at some point enough individual data collections, compiled and analyzed together, become a Fourth Amendment search. Thirty years ago the Supreme Court upheld the warrantless use of a tracking device for three days; however, the proliferation of GPS tracking in cars and smartphones has made it significantly easier for the police to access a treasure trove of information about our location at any given time.
It is easy to see why this theory has attracted some support. Humans are creatures of habit – if our public locations are tracked for a few days, weeks, or a month, it is pretty easy for machines to learn our ways and assemble a fairly detailed report for the government about our lives. Once they know your habits, machines could basically predict when you will leave your house for work, what route you will take, and when and where you go grocery shopping, all before you even do it. A policeman could of course observe you moving about in public without a warrant, but limited manpower will always reduce the probability of continuous mass surveillance. With current technology, a handful of trained experts could easily monitor hundreds of people at a time from behind a computer screen and gather even more information than most searches requiring a warrant. The Supreme Court indicated a willingness to consider the mosaic theory in U.S. v. Jones, but has yet to embrace it…”

The article in Law &amp; Liberty details the need to determine the point at which machine learning creates an intrusion into our reasonable expectations of privacy, and even discusses an experiment that could be run to determine how long data collection can proceed before it becomes an intrusion. If there is a line at which individual data collection becomes a search, we need to discover where that line is. One of the article’s authors, Steven Bellovin, has argued that the line is probably at one week – at that point your weekday and weekend habits would be known. The nation’s leading legal expert on criminal law, Professor Orin Kerr, fired back on the Volokh Conspiracy that Bellovin’s one-week argument is not in line with previous iterations of the mosaic theory.

Open Government Will Reshape Latin America


Alejandro Guerrero at Medium: “When people think of the place for innovation, they typically think of innovation spurred by large firms and small startups based in the US, and particularly in that narrow stretch of land and water called Silicon Valley.
However, the flux of innovation taking place in the intersection between technology and government is phenomenal and emerging everywhere. From the marble hallways of parliaments everywhere —including Latin America’s legislative houses— to office hubs of tech-savvy non-profits full of enthusiastic social changers —also including Latin American startups— a driving force is starting to challenge our conception of how government and citizens can and should interact. And few people are discussing or analyzing these developments.
Open Government in Latin America
The potential for Open Government to improve governments’ decision-making and performance is huge. It is particularly immense in middle-income countries such as the ones in Latin America, where the combination of growing incomes, more sophisticated citizens’ demands, and broken public services is generating large bottom-up pressure and demanding more creative solutions from governments to meet the enormous social needs, while cutting down corruption and improving governance.
It is unsurprising that citizens all over Latin America are increasingly taking to the streets and demanding better public services and more transparent institutions.
While these protests are necessarily short-lived and unarticulated —a product of growing frustration with government— they are a symptom of deeper causes that won’t easily go away; these protests will most likely come back with increasing frequency, and the unresolved frustration may eventually transmute into political platforms with more radical ideas to challenge the status quo.
Behind the scenes, governments across the region still face enormous weaknesses in public management; ill-prepared and underpaid public officials carry on with their duties as the platonic idea of a demotivated workforce, and the opportunities for corruption, waste, and nepotism are plenty. The growing segment of more affluent citizens simply opts out of government and resorts to private alternatives, thus exacerbating inequalities in the already most unequal region in the world. The crumbling middle classes and the poor can just resort to voicing their complaints. And they are increasingly doing so.
And here is where open government initiatives might play a transformative role, disrupting the way governments make decisions and work while empowering citizens in the process.
The preconditions for OpenGov are almost here
In Latin America, connectivity rates are growing fast (reaching 61% in 2013 for the Americas as a whole), close to 90% of the population owns a cellphone, and access to higher levels of education keeps growing (as an example, the latest PISA report indicates that Mexico’s high-school enrollment went from 58% in 2003 to 70% in 2012). The social conditions for a stronger citizen role in government are increasingly there.
Moreover, most Latin American countries passed transparency laws during the 2000s, creating the enabling environment for open government initiatives to flourish. It is thus unsurprising that the next generation of young government bureaucrats, on average more internet-savvy and better educated than its predecessors, is taking over and embracing innovations in government. And they are finding echo (and suppliers of ideas and apps!) among local startups and civil society groups, while also being courted by large tech corporations (think of Google or Microsoft) chasing the lucrative government contracts associated with this form of “doing good”.
This is an emerging galaxy of social innovators, technologically savvy bureaucrats, and engaged citizens, providing a large crowd-sourcing community and an opportunity to test different approaches. And the underlying tectonic shifts are pushing governments in that direction. For a sampler, check out the latest developments for Brazil, Argentina, Peru, Mexico, Colombia, Paraguay, Chile, Panama, Costa Rica, Guatemala, Honduras, Dominican Republic, Uruguay and (why not?) my own country, which I will often include in the review, given the surprisingly limited progress of open government in this OECD member, which shares similar institutions and challenges with Latin America.

A Road Full of Promise…and Obstacles

Most of the progress in Latin America is quite recent, and the real impact is often still more limited once you abandon the halls of the Digital Government directorates and secretarías or look beyond the typical government data portal. The resistance to change is as human as laughing, but it is particularly intense among the public-sector side of human beings. Politics also typically plays an enormous role in resisting transparency and open government, and in a context of weak institutions and pervasive corruption, the temptation to politically block or water down open data/open government projects is just too high. Selective release of data (if any) is too frequent, government agencies often act as silos by not sharing information with other government departments, and irrational fears by policy-makers combined with adoption barriers (well explained here) all contribute to deter the progress of the open government promise in Latin America…”

Special Issue on Innovation through Open Data


A Review of the State-of-the-Art and an Emerging Research Agenda in the Journal of Theoretical and Applied Electronic Commerce Research:

  • Going Beyond Open Data: Challenges and Motivations for Smart Disclosure in Ethical Consumption (Djoko Sigit Sayogo, Jing Zhang, Theresa A. Pardo, Giri K. Tayi, Jana Hrdinova, David F. Andersen and Luis Felipe Luna-Reyes)
  • Shaping Local Open Data Initiatives: Politics and Implications (Josefin Lassinantti, Birgitta Bergvall-Kåreborn and Anna Ståhlbröst)
  • A State-of-the-Art Analysis of the Current Public Data Landscape from a Functional, Semantic and Technical Perspective (Michael Petychakis, Olga Vasileiou, Charilaos Georgis, Spiros Mouzakitis and John Psarras)
  • Using a Method and Tool for Hybrid Ontology Engineering: an Evaluation in the Flemish Research Information Space (Christophe Debruyne and Pieter De Leenheer)
  • A Metrics-Driven Approach for Quality Assessment of Linked Open Data (Behshid Behkamal, Mohsen Kahani, Ebrahim Bagheri and Zoran Jeremic)
  • Open Government Data Implementation Evaluation (Peter Parycek, Johann Höchtl and Michael Ginner)
  • Data-Driven Innovation through Open Government Data (Thorhildur Jetzek, Michel Avital and Niels Bjorn-Andersen)

Technological Innovations and Future Shifts in International Politics


Paper by Askar Akaev and Vladimir Pantin in International Studies Quarterly: “How are large technological changes and important shifts in international politics interconnected? It is shown in the article that primary technological innovations, which take place in each Kondratieff cycle, change the balance of power between the leading states and cause shifts in international politics. In the beginning of the twenty-first century, the genesis and initial development of the cluster of new technologies takes place in periods of crisis and depression. Therefore, the authors forecast that the period 2013–2020 will be marked by the advancement of important technological innovations and massive geopolitical shifts in many regions of the world.”

Who Influences Whom? Reflections on U.S. Government Outreach to Think Tanks


Jeremy Shapiro at Brookings: “The U.S. government makes a big effort to reach out to important think tanks, often through the little noticed or understood mechanism of small, private and confidential roundtables. Indeed, for the ambitious Washington think-tanker nothing quite gets the pulse racing like the idea of attending one of these roundtables with the most important government officials. The very occasion is full of intrigue and ritual.

When the Government Calls for Advice

First, an understated e-mail arrives from some polite underling inviting you in to a “confidential, off-the-record” briefing with some official with an impressive title—a deputy secretary or a special assistant to the president, maybe even (heaven forfend) the secretary of state or the national security advisor. The thinker’s heart leaps, “they read my article; they finally see the light of my wisdom, I will probably be the next national security advisor.”
He clears his schedule of any conflicting brown bags on separatism in South Ossetia and, after a suitable interval to keep the government guessing as to his availability, replies that he might be able to squeeze it into his schedule. Citizenship data and social security numbers are provided for security purposes, times are confirmed and ground rules are established in a multitude of emails with a seemingly never-ending array of staffers, all of whose titles include the word “special.” The thinker says nothing directly to his colleagues, but searches desperately for opportunities to obliquely allude to the meeting: “I’d love to come to your roundtable on uncovered interest rate parity, but I unfortunately have a meeting with the secretary of defense.”
On the appointed day, the thinker arrives early as instructed at an impressively massive and well-guarded government building, clears his way through multiple layers of redundant security, and is ushered into a wood-paneled room that reeks of power and Pine-Sol. (Sometimes it is a futuristic conference room filled with television monitors and clocks that give the time wherever the President happens to be.) Nameless peons in sensible suits clutch government-issue notepads around the outer rim of the room as the thinker takes his seat at the center table, only somewhat disappointed to see so many other familiar thinkers in the room—including some to whom he had been obliquely hinting about the meeting the day before.
At the appointed hour, an officious staffer arrives to announce that “He” (the lead government official goes only by personal pronoun—names are unnecessary at this level) is unfortunately delayed at another meeting on the urgent international crisis of the day, but will arrive just as soon as he can get break away from the president in the Situation Room. He is, in fact, just reading email, but his long career has taught him the advantage of making people wait.
After 15 minutes of stilted chit-chat with colleagues that the thinker has the misfortune to see at virtually every event he attends in Washington, the senior government official strides calmly into the room, plops down at the head of the table and declares solemnly what an honor it is to have such distinguished experts to help with this critical area of policy. He very briefly details how very hard the U.S. government is working on this highest-priority issue and declares that “we are in listening mode and are anxious to hear your sage advice.” A brave thinker raises his hand and speaks truth to power by reciting the thesis of his latest article. From there, the group is off to the races as the thinkers each struggle to get into the conversation and rehearse their well-worn positions.
Forty-three minutes later, the thinkers’ “hour” is up because, the officious staffer interjects, “He” must attend a Principals Committee meeting. The senior government official thanks the experts for coming, compliments them on their fruitful ideas and their full and frank debate, instructs a nameless peon at random to assemble “what was learned here” for distribution in “the building” and strides purposefully out of the room.
The pantomime then ends and the thinker retreats back to his office to continue his thoughts. But what precisely has happened behind the rituals? Have we witnessed the vaunted academic-government exchange that Washington is so famous for? Is this how fresh ideas re-invigorate stale government groupthink?..”

Blueprint on "The Open Data Era in Health and Social Care"


The GovLab Press Release: NHS England and The Governance Lab at NYU (The GovLab) have today launched a blueprint – The Open Data Era in Health and Social Care – for accelerating the use of open data in health and care settings.
The availability of open data can empower citizens and help care providers, patients and researchers make better decisions, spur new innovations and identify efficiencies. The report was commissioned by NHS England and written by The GovLab, part of New York University and world leaders in the field of open data usage. It puts forward a proposal for how the health and care system can maximise the impact of sharing open data through establishing priorities and clear ways of measuring benefits.
Tim Kelsey, National Director for Patients and Information for NHS England, said:
“There’s an urgent need for the NHS to use better information and evidence to guide decision-making and investment. We know with scientific and medical research, the rate of discovery is accelerated by better access to data. This report will kick off a conversation about how we can use open data in the NHS to build a meaningful evidence base to support better investment in health and care services. Over the coming months, I’m keen to hear the views of colleagues on how we can take this forward and build an evidence base to improve outcomes for patients.”
Stefaan Verhulst, Co-founder and Chief Research and Development Officer of The GovLab, said:
“The blueprint lays out a detailed plan to start a conversation about how to gather the evidence needed to understand and assess the shape and size of the impact of open health data. It is important to pay a comparable level of attention to an analysis of open data’s potential benefits, as well as potential risks.”
Download the full report: thegovlab.org/nhs