Heteromation and its (dis)contents: The invisible division of labor between humans and machines


Paper by Hamid Ekbia and Bonnie Nardi in First Monday: “The division of labor between humans and computer systems has changed along both technical and human dimensions. Technically, there has been a shift from technologies of automation, the aim of which was to disallow human intervention at nearly all points in the system, to technologies of “heteromation” that push critical tasks to end users as indispensable mediators. As this has happened, the large population of human beings who were driven out by the first type of technology are drawn back into the computational fold by the second type. Turning artificial intelligence on its head, one technology fills the gap created by the other, but with a vengeance that unsettles established mechanisms of reward, fulfillment, and compensation. In this fashion, replacement of human beings and their irrelevance to technological systems has given way to new “modes of engagement” with remarkable social, economic, and ethical implications. In this paper we provide a historical backdrop for heteromation and explore and explicate some of these displacements through analysis of a number of cases, including Mechanical Turk, the video games FoldIt and League of Legends, and social media.”


Why Governments Should Adopt a Digital Engagement Strategy


Lindsay Crudele at StateTech: “Government agencies increasingly value digital engagement as a way to transform a complaint-based relationship into one of positive, proactive constituent empowerment. An engaged community is a stronger one.
Creating a culture of participatory government, as we strive to do in Boston, requires a data-driven infrastructure supported by IT solutions. Data management and analytics solutions translate a huge stream of social media data, drive conversations and creative crowdsourcing, and support transparency.
More than 50 departments across Boston host public conversations using a multichannel, multidisciplinary portfolio of accounts. We integrate these using an enterprise digital engagement management tool that connects and organizes them to break down silos and boost collaboration. Moreover, the technology provides a lens into ways to expedite workflow and improve service delivery.

A Vital Link in Times of Need

Committed and creative daily engagement builds trusting collaboration that, in turn, is vital in an inevitable crisis. As we saw during the tragic events of the 2013 Boston Marathon bombings and recent major weather events, rapid response through digital media clarifies the situation, provides information about safety and manages constituent expectations.
Boston’s enterprise model supports coordinated external communication and organized monitoring, intake and response. This provides a superadmin with access to all accounts for governance and the ability to easily amplify central messaging across a range of cultivated communities. These communities will later serve in recovery efforts.
The conversations must be seeded by a keen, creative and data-driven content strategy. For an agency to determine the correct strategy for the organization and the community it serves, a growing crop of social analytics tools can provide efficient insight into performance factors: type of content, deployment schedule, sentiment, service-based response time and team performance, to name a few. For example, in February, the city of Boston learned that tweets from our mayor with video saw 300 percent higher engagement than those without.
These insights can inform resource deployment, eliminating guesswork to more directly reach constituents by their preferred methods. Being truly present in a conversation demonstrates care and awareness and builds trust. This increased positivity can be measured through sentiment analysis, including change over time, and should be monitored for fluctuation.
During a major event, engagement managers may see activity reach new peaks in volume. IT solutions can interpret Big Data and bring a large-scale digital conversation back into perspective, identifying public safety alerts and emerging trends, needs and community influencers who can be engaged as amplifying partners.
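Engagement comparisons like the video finding above are straightforward to compute once post-level metrics are exported from an analytics tool. The sketch below is purely illustrative; the data layout and field names (`has_video`, `engagements`) are assumptions, not Boston's actual schema.

```python
# Hypothetical sketch: compare mean engagement for posts with and
# without video, expressed as a percentage lift. Field names are
# illustrative assumptions, not a real analytics export format.

def engagement_lift(posts):
    """Return the % lift in mean engagement for posts with video vs. without."""
    with_video = [p["engagements"] for p in posts if p["has_video"]]
    without = [p["engagements"] for p in posts if not p["has_video"]]
    if not with_video or not without:
        return None  # not enough data to compare
    mean = lambda xs: sum(xs) / len(xs)
    return 100 * (mean(with_video) - mean(without)) / mean(without)

# Toy data chosen to mirror the 300 percent figure reported in the article.
posts = [
    {"has_video": True, "engagements": 400},
    {"has_video": True, "engagements": 440},
    {"has_video": False, "engagements": 100},
    {"has_video": False, "engagements": 110},
]
print(f"Video lift: {engagement_lift(posts):.0f}%")  # Video lift: 300%
```

The same pattern extends to the other performance factors the article lists (deployment schedule, response time), by grouping posts on a different field.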

Running Strong One Year Later

Throughout the 2014 Boston Marathon, we used three monitoring tools to deliver smart alerts to key partners across the organization:
• An engagement management tool organized conversations for account performance and monitoring.
• A brand listening tool scanned for emerging trends across the city and uncovered related conversations.
• A location-based predictive tool identified early alerts to discover potential problems along the marathon route.
With the team and tools in place, policy-based training supports the sustained growth and operation of these conversation channels. A data-driven engagement strategy unearths all of our stories, where we, as public servants and neighbors, build better communities together….”

New Book on 25 Years of Participatory Budgeting


Tiago Peixoto at Democracy Spot: “A little while ago I mentioned the launch of the Portuguese version of the book organized by Nelson Dias, “Hope for Democracy: 25 Years of Participatory Budgeting Worldwide”.

The good news is that the English version is finally out. Here’s an excerpt from the introduction:

This book represents the effort of more than forty authors, and many other direct and indirect contributors spread across different continents, to provide an overview of Participatory Budgeting (PB) in the world. They do so from different backgrounds. Some are researchers, others are consultants, and others are activists connected to several groups and social movements. The texts reflect this diversity of approaches and perspectives well, and we do not try to influence that.
(….)
The pages that follow are an invitation to a fascinating journey on the path of democratic innovation in very diverse cultural, political, social and administrative settings. From North America to Asia, Oceania to Europe, from Latin America to Africa, the reader will find many reasons to closely follow the proposals of the different authors.

The book can be downloaded here [PDF]. I had the pleasure of being one of the book’s contributors, co-authoring an article with Rafael Sampaio on the use of ICT in PB processes: “Electronic Participatory Budgeting: False Dilemmas and True Complexities” [PDF]...”

The Emerging Science of Computational Anthropology


Emerging Technology From the arXiv: The increasing availability of big data from mobile phones and location-based apps has triggered a revolution in the understanding of human mobility patterns. This data shows the ebb and flow of the daily commute in and out of cities, the pattern of travel around the world and even how disease can spread through cities via their transport systems.
So there is considerable interest in looking more closely at human mobility patterns to see just how well it can be predicted and how these predictions might be used in everything from disease control and city planning to traffic forecasting and location-based advertising.
Today we get an insight into the kind of detail that is possible thanks to the work of Zimo Yang at Microsoft Research in Beijing and a few pals. These guys start with the hypothesis that people who live in a city have a pattern of mobility that is significantly different from that of those who are merely visiting. By dividing travelers into locals and non-locals, their ability to predict where people are likely to visit dramatically improves.
Zimo and co begin with data from a Chinese location-based social network called Jiepang.com. This is similar to Foursquare in the US. It allows users to record the places they visit and to connect with friends at these locations and to find others with similar interests.
The data points are known as check-ins, and the team downloaded more than 1.3 million of them from five big cities in China: Beijing, Shanghai, Nanjing, Chengdu and Hong Kong. They then used 90 per cent of the data to train their algorithms and the remaining 10 per cent to test them. The Jiepang data includes the users’ hometowns, so it’s easy to see whether an individual is checking in in their own city or somewhere else.
The question that Zimo and co want to answer is the following: given a particular user and their current location, where are they most likely to visit in the near future? In practice, that means analysing the user’s data, such as their hometown and the locations recently visited, and coming up with a list of other locations that they are likely to visit based on the type of people who visited these locations in the past.
Zimo and co used their training dataset to learn the mobility pattern of locals and non-locals and the popularity of the locations they visited. The team then applied this to the test dataset to see whether their algorithm was able to predict where locals and non-locals were likely to visit.
They found that their best results came from analysing the pattern of behaviour of a particular individual and estimating the extent to which this person behaves like a local. That produced a weighting called the indigenization coefficient that the researchers could then use to determine the mobility patterns this person was likely to follow in future.
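One way to picture the idea is a score that rises as a user's check-ins drift away from a city's tourist hotspots. The sketch below is a hedged illustration of that intuition, not the paper's actual definition of the indigenization coefficient, and the venue data is invented.

```python
# Hedged sketch of an "indigenization coefficient": the fraction of a
# user's check-ins at locations that are NOT among the city's most
# popular venues. Locals tend to check in near home and work, while
# visitors cluster at landmarks. This scoring rule is an illustration
# of the intuition only, not the formula used in the paper.
from collections import Counter

def indigenization(user_checkins, all_checkins, top_k=2):
    """Score in [0, 1]: higher means more local-like behavior."""
    popularity = Counter(all_checkins)
    hotspots = {loc for loc, _ in popularity.most_common(top_k)}
    off_hotspot = sum(1 for loc in user_checkins if loc not in hotspots)
    return off_hotspot / len(user_checkins)

# Toy city-wide check-in log (venue names are invented for the example).
city = ["Bund", "Bund", "Bund", "Imperial Palace", "Imperial Palace",
        "office", "home", "cafe"]
tourist = ["Bund", "Imperial Palace", "Bund"]
local = ["home", "office", "cafe", "Bund"]
print(indigenization(tourist, city))  # 0.0  - every check-in at a hotspot
print(indigenization(local, city))    # 0.75 - mostly off the beaten path
```

A weighting like this can then blend a "local" prediction model with a "visitor" one, which is the spirit of the hybrid algorithm the team describes.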
In fact, Zimo and co say they can spot non-locals in this way without even knowing their home location, “because non-natives tend to visit popular locations, like the Imperial Palace in Beijing and the Bund in Shanghai, while natives usually check in around their homes and workplaces,” they add.
The team say this approach considerably outperforms the mixed algorithms that use only individual visiting history and location popularity. “To our surprise, a hybrid algorithm weighted by the indigenization coefficients outperforms the mixed algorithm accounting for additional demographical information.”
It’s easy to imagine how such an algorithm might be useful for businesses who want to target certain types of travelers or local people. But there is a more interesting application too.
Zimo and co say that it is possible to monitor the way an individual’s mobility patterns change over time. So if a person moves to a new city, it should be possible to see how long it takes them to settle in.
One way of measuring this is in their mobility patterns: whether they are more like those of a local or a non-local. “We may be able to estimate whether a non-native person will behave like a native person after a time period and if so, how long in average a person takes to become a native-like one,” say Zimo and co.
That could have a fascinating impact on the way anthropologists study migration and the way immigrants become part of a local community. This is computational anthropology, a science that is clearly in its early stages but one that has huge potential for the future.”
Ref: arxiv.org/abs/1405.7769 : Indigenization of Urban Mobility

A brief history of open data


Article by Luke Fretwell in FCW: “In December 2007, 30 open-data pioneers gathered in Sebastopol, Calif., and penned a set of eight open-government data principles that inaugurated a new era of democratic innovation and economic opportunity.
“The objective…was to find a simple way to express values that a bunch of us think are pretty common, and these are values about how the government could make its data available in a way that enables a wider range of people to help make the government function better,” Harvard Law School Professor Larry Lessig said. “That means more transparency in what the government is doing and more opportunity for people to leverage government data to produce insights or other great business models.”
The eight simple principles — that data should be complete, primary, timely, accessible, machine-processable, nondiscriminatory, nonproprietary and license-free — still serve as the foundation for what has become a burgeoning open-data movement.
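The eight principles lend themselves to a simple audit checklist for any dataset an agency publishes. The sketch below is a hypothetical illustration; the boolean metadata fields are assumptions for the example, not part of any real metadata standard.

```python
# Illustrative audit against the eight Sebastopol open-data principles.
# The metadata fields are assumptions for this sketch, not a real schema.
PRINCIPLES = ["complete", "primary", "timely", "accessible",
              "machine_processable", "nondiscriminatory",
              "nonproprietary", "license_free"]

def audit(dataset_meta):
    """Return the principles a dataset's metadata does not yet satisfy."""
    return [p for p in PRINCIPLES if not dataset_meta.get(p, False)]

# Hypothetical dataset record: stale and published in a proprietary format.
meta = {"complete": True, "primary": True, "timely": False,
        "accessible": True, "machine_processable": True,
        "nondiscriminatory": True, "nonproprietary": False,
        "license_free": True}
print(audit(meta))  # ['timely', 'nonproprietary']
```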

The benefits of open data for agencies

  • Save time and money when responding to Freedom of Information Act requests.
  • Avoid duplicative internal research.
  • Use complementary datasets held by other agencies.
  • Empower employees to make better-informed, data-driven decisions.
  • Attract positive attention from the public, media and other agencies.
  • Generate revenue and create new jobs in the private sector.

Source: Project Open Data

In the seven years since those principles were released, governments around the world have adopted open-data initiatives and launched platforms that empower researchers, journalists and entrepreneurs to mine this new raw material and its potential to uncover new discoveries and opportunities. Open data has drawn civic hacker enthusiasts around the world, fueling hackathons, challenges, apps contests, barcamps and “datapaloozas” focused on issues as varied as health, energy, finance, transportation and municipal innovation.
In the United States, the federal government initiated the beginnings of a wide-scale open-data agenda on President Barack Obama’s first day in office in January 2009, when he issued his memorandum on transparency and open government, which declared that “openness will strengthen our democracy and promote efficiency and effectiveness in government.” The president gave federal agencies three months to provide input into an open-government directive that would eventually outline what each agency planned to do with respect to civic transparency, collaboration and participation, including specific objectives related to releasing data to the public.
In May of that year, Data.gov launched with just 47 datasets and a vision to “increase public access to high-value, machine-readable datasets generated by the executive branch of the federal government.”
When the White House issued the final draft of its federal Open Government Directive later that year, the U.S. open-government data movement got its first tangible marching orders, including a 45-day deadline to open previously unreleased data to the public.
Now five years after its launch, Data.gov boasts more than 100,000 datasets from 227 local, state and federal agencies and organizations….”

Big Data, new epistemologies and paradigm shifts


Paper by Rob Kitchin in the journal Big Data & Society: “This article examines how the availability of Big Data, coupled with new data analytics, challenges established epistemologies across the sciences, social sciences and humanities, and assesses the extent to which they are engendering paradigm shifts across multiple disciplines. In particular, it critically explores new forms of empiricism that declare ‘the end of theory’, the creation of data-driven rather than knowledge-driven science, and the development of digital humanities and computational social sciences that propose radically different ways to make sense of culture, history, economy and society. It is argued that: (1) Big Data and new data analytics are disruptive innovations which are reconfiguring in many instances how research is conducted; and (2) there is an urgent need for wider critical reflection within the academy on the epistemological implications of the unfolding data revolution, a task that has barely begun to be tackled despite the rapid changes in research practices presently taking place. After critically reviewing emerging epistemological positions, it is contended that a potentially fruitful approach would be the development of a situated, reflexive and contextually nuanced epistemology”

How Long Is Too Long? The 4th Amendment and the Mosaic Theory


Law and Liberty Blog: “Volume 8.2 of the NYU Journal of Law and Liberty has been sent to the printer and physical copies will be available soon, but the articles in the issue are already available online here. One article that has gotten a lot of attention so far is by Steven Bellovin, Renee Hutchins, Tony Jebara, and Sebastian Zimmeck titled “When Enough is Enough: Location Tracking, Mosaic Theory, and Machine Learning.” A direct link to the article is here.
The mosaic theory is a modern corollary accepted by some academics – and the D.C. Circuit Court of Appeals in Maynard v. U.S. – as a twenty-first century extension of the Fourth Amendment’s prohibition on unreasonable searches and seizures. Proponents of the mosaic theory argue that at some point enough individual data collections, compiled and analyzed together, become a Fourth Amendment search. Thirty years ago the Supreme Court upheld the use of a tracking device for three days without a warrant; however, the proliferation of GPS tracking in cars and smartphones has made it significantly easier for the police to access a treasure trove of information about our location at any given time.
It is easy to see why this theory has attracted some support. Humans are creatures of habit – if our public locations are tracked for a few days, weeks, or a month, it is pretty easy for machines to learn our ways and assemble a fairly detailed report for the government about our lives. Machines could basically predict when you will leave your house for work, what route you will take, when and where you go grocery shopping, all before you even do it, once it knows your habits. A policeman could observe you moving about in public without a warrant of course, but limited manpower will always reduce the probability of continuous mass surveillance. With current technology, a handful of trained experts could easily monitor hundreds of people at a time from behind a computer screen, and gather even more information than most searches requiring a warrant. The Supreme Court indicated a willingness to consider the mosaic theory in U.S. v. Jones, but has yet to embrace it…”
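The habit-learning the passage describes needs nothing exotic: even counting where a person has been, keyed by day and hour, yields a predictive routine. This is a minimal sketch of that idea, with an invented location-history format; real GPS traces would be far denser and real models far more sophisticated.

```python
# Minimal sketch of routine-learning from location history: for each
# (weekday, hour) slot, remember the place most often visited, then use
# that table to "predict" where a person will be. The data format is an
# assumption for the example, not any real surveillance system's.
from collections import Counter, defaultdict

def learn_routine(history):
    """history: list of ((weekday, hour), place) pairs.
    Returns the most common place for each time slot."""
    slots = defaultdict(Counter)
    for slot, place in history:
        slots[slot][place] += 1
    return {slot: counts.most_common(1)[0][0] for slot, counts in slots.items()}

# Invented trace: a few days of observed check-ins.
history = [(("Mon", 8), "home"), (("Mon", 9), "office"),
           (("Mon", 9), "office"), (("Mon", 9), "cafe"),
           (("Sat", 10), "grocery")]
routine = learn_routine(history)
print(routine[("Mon", 9)])   # office  - predicted before it happens
print(routine[("Sat", 10)])  # grocery
```

The longer the history, the more slots the table fills in, which is exactly why the length of collection is where the constitutional line may lie.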

The article in Law & Liberty details the need to determine at which point machine learning creates an intrusion into our reasonable expectations of privacy, and even discusses an experiment that could be run to determine how long data collection can proceed before it is an intrusion. If there is a line at which individual data collection becomes a search, we need to discover where that line is. One of the article’s authors, Steven Bellovin, has argued that the line is probably at one week – at that point your weekday and weekend habits would be known. The nation’s leading legal expert on criminal law, Professor Orin Kerr, fired back on the Volokh Conspiracy that Bellovin’s one-week argument is not in line with previous iterations of the mosaic theory.

Open Government Will Reshape Latin America


Alejandro Guerrero at Medium: “When people think of the place for innovation, they typically think of innovation being spurred by large firms and small startups based in the US. And particularly in that narrow stretch of land and water called Silicon Valley.
However, the flux of innovation taking place in the intersection between technology and government is phenomenal and emerging everywhere. From the marble hallways of parliaments everywhere —including Latin America’s legislative houses— to office hubs of tech-savvy non-profits full of enthusiastic social changers —also including Latin American startups— a driving force is starting to challenge our conception of how government and citizens can and should interact. And few people are discussing or analyzing these developments.
Open Government in Latin America
The potential for Open Government to improve government’s decision-making and performance is huge. And it is particularly immense in middle-income countries such as the ones in Latin America, where the combination of growing incomes, more sophisticated citizen demands, and broken public services is generating large bottom-up pressure and demanding more creative solutions from governments to meet the enormous social needs, while cutting down corruption and improving governance.
It is unsurprising that citizens from all over Latin America are increasingly taking the streets and demanding better public services and more transparent institutions.
While these protests are necessarily short-lived and unarticulated —a product of growing frustration with government— they are a symptom of deeper causes that won’t easily go away. They will most likely recur with increasing frequency, and the unresolved frustration may eventually transmute into political platforms with more radical ideas to challenge the status quo.
Behind the scene, governments across the region still face enormous weaknesses in public management, ill-prepared and underpaid public officials carry on with their duties as the platonic idea of a demotivated workforce, and the opportunities for corruption, waste, and nepotism are plenty. The growing segment of more affluent citizens simply opt out from government and resort to private alternatives, thus exacerbating inequalities in the already most unequal region in the world. The crumbling middle classes and the poor can just resort to voicing their complaints. And they are increasingly doing so.
And here is where open government initiatives might play a transformative role, disrupting the way governments make decisions and work while empowering citizens in the process.
The preconditions for OpenGov are almost here
In Latin America, connectivity rates are growing fast (reaching 61% in 2013 for the Americas as a whole), close to 90% of the population owns a cellphone, and access to higher levels of education keeps growing (as an example, the latest PISA report indicates that Mexico’s high-school enrollment rose from 58% in 2003 to 70% in 2012). The social conditions for a stronger role of citizens in government are increasingly there.
Moreover, most Latin American countries passed transparency laws during the 2000s, creating the enabling environment for open government initiatives to flourish. It is thus unsurprising that the next generation of young government bureaucrats, on average more internet-savvy and better educated than its predecessors, is taking over and embracing innovations in government. And they are finding an echo (and suppliers of ideas and apps!) among local startups and civil society groups, while also being courted by large tech corporations (think of Google or Microsoft) pursuing lucrative government contracts associated with this form of “doing good”.
This is an emerging galaxy of social innovators, technologically savvy bureaucrats, and engaged citizens providing a large crowdsourcing community and an opportunity to test different approaches. And the underlying tectonic shifts are pushing governments in that direction. For a sampler, check out the latest developments for Brazil, Argentina, Peru, Mexico, Colombia, Paraguay, Chile, Panama, Costa Rica, Guatemala, Honduras, Dominican Republic, Uruguay and (why not?) my own country, which I will often include in the review for the surprisingly limited progress of open government in this OECD member, which shares similar institutions and challenges with Latin America.

A Road Full of Promise…and Obstacles

Most of the progress in Latin America is quite recent, and the real impact is often more limited once you abandon the halls of the Digital Government directorates and secretarías, or look beyond the typical government data portal. The resistance to change is as human as laughing, but it is particularly intense on the public-sector side of human beings. Politics also typically plays an enormous role in resisting transparency and open government, and in a context of weak institutions and pervasive corruption, the temptation to politically block or water down open data/open government projects is just too high. Selective release of data (if any) is too frequent, government agencies often act as silos by not sharing information with other government departments, and irrational fears among policy-makers combined with adoption barriers (well explained here) all contribute to deter the progress of the open government promise in Latin America…”

Special Issue on Innovation through Open Data


A Review of the State-of-the-Art and an Emerging Research Agenda in the Journal of Theoretical and Applied Electronic Commerce Research:

  • Going Beyond Open Data: Challenges and Motivations for Smart Disclosure in Ethical Consumption (Djoko Sigit Sayogo, Jing Zhang, Theresa A. Pardo, Giri K. Tayi, Jana Hrdinova, David F. Andersen and Luis Felipe Luna-Reyes)
  • Shaping Local Open Data Initiatives: Politics and Implications (Josefin Lassinantti, Birgitta Bergvall-Kåreborn and Anna Ståhlbröst)
  • A State-of-the-Art Analysis of the Current Public Data Landscape from a Functional, Semantic and Technical Perspective (Michael Petychakis, Olga Vasileiou, Charilaos Georgis, Spiros Mouzakitis and John Psarras)
  • Using a Method and Tool for Hybrid Ontology Engineering: an Evaluation in the Flemish Research Information Space (Christophe Debruyne and Pieter De Leenheer)
  • A Metrics-Driven Approach for Quality Assessment of Linked Open Data (Behshid Behkamal, Mohsen Kahani, Ebrahim Bagheri and Zoran Jeremic)
  • Open Government Data Implementation Evaluation (Peter Parycek, Johann Höchtl and Michael Ginner)
  • Data-Driven Innovation through Open Government Data (Thorhildur Jetzek, Michel Avital and Niels Bjorn-Andersen)

Technological Innovations and Future Shifts in International Politics


Paper by Askar Akaev and Vladimir Pantin in International Studies Quarterly: “How are large technological changes and important shifts in international politics interconnected? It is shown in the article that primary technological innovations, which take place in each Kondratieff cycle, change the balance of power between the leading states and cause shifts in international politics. In the beginning of the twenty-first century, the genesis and initial development of the cluster of new technologies takes place in periods of crisis and depression. Therefore, the authors forecast that the period 2013–2020 will be marked by the advancement of important technological innovations and massive geopolitical shifts in many regions of the world.”