Paper by Deborah Lupton, from the University of Sydney’s Department of Sociology and Social Policy. Abstract: “As part of the digital health phenomenon, a plethora of interactive digital platforms have been established in recent years to elicit lay people’s experiences of illness and healthcare. The function of these platforms, as expressed on the main pages of their websites, is to provide the tools and forums whereby patients and caregivers, and in some cases medical practitioners, can share their experiences with others, benefit from the support and knowledge of other contributors and contribute to large aggregated data archives as part of developing better medical treatments and services and conducting medical research.
However what may not always be readily apparent to the users of these platforms are the growing commercial uses by many of the platforms’ owners of the archives of the data they contribute. This article examines this phenomenon of what I term ‘the digital patient experience economy’. In so doing I discuss such aspects as prosumption, the phenomena of big data and metric assemblages, the discourse and ethic of sharing and the commercialisation of affective labour via such platforms. I argue that via these online platforms patients’ opinions and experiences may be expressed in more diverse and accessible forums than ever before, but simultaneously they have become exploited in novel ways.”
Human-Based Evolutionary Computing
Abstract of new paper by Jeffrey V. Nickerson on Human-Based Evolutionary Computing (in Handbook of Human Computation, P. Michelucci, ed., Springer, Forthcoming): “Evolution explains the way the natural world changes over time. It can also explain changes in the artificial world, such as the way ideas replicate, alter, and merge. This analogy has led to a family of related computer procedures called evolutionary algorithms. These algorithms are being used to produce product designs, art, and solutions to mathematical problems. While for the most part these algorithms are run on computers, they can also be performed by people. Such human-based evolutionary algorithms are useful when many different ideas, designs, or solutions need to be generated, and human cognition is called for.”
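To make the analogy concrete, here is a minimal genetic-algorithm sketch on a toy "one-max" problem (evolve a bitstring toward all 1s). This is an illustrative example only, not the paper's method; in the human-based variant Nickerson describes, the variation and selection steps below would be carried out by people proposing and rating ideas rather than by code.

```python
import random

def evolve(fitness, length=20, pop_size=30, generations=50, mutation_rate=0.05):
    """Minimal genetic algorithm over bitstrings.

    In a human-based evolutionary algorithm, the crossover/mutation and
    selection steps would be performed by human participants instead of
    these functions.
    """
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # selection: rank by fitness
        survivors = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, length)     # one-point crossover
            child = a[:cut] + b[cut:]
            # flip each bit with small probability (mutation)
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve(sum)  # fitness = number of 1-bits ("one-max" toy problem)
```

The same replicate-vary-select loop applies whether the "population" holds bitstrings, product designs, or stakeholder ideas.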
Anticipatory Governance and the Use of Nanotechnology in Cities
Abstract of New Paper in the Journal of Urban Technology: “Visions about the use of nanotechnologies in the city, including in the design and construction of built environments, suggest that these technologies could be critically important for solving urban sustainability problems. We argue that such visions often overlook two critical and interrelated elements. First, conjectures about future nano-enhanced cities tend to rely on flawed concepts of urban sustainability that underestimate the challenges presented by deeply-rooted paradigms of market economics, risk assessment, and the absorption of disruptive technologies. Second, opportunities for stakeholders such as city officials, non-governmental organizations, and citizens to consider the nature and distribution of the potential benefits and adverse effects of nano-enabled urban technologies are rarely triggered sufficiently early. Limitations in early engagement will lead to problems and missed opportunities in the use of nanotechnologies for urban sustainability. In this article, we critically explore ideas about the nano-enhanced city and its promises and limitations related to urban sustainability. On this base, we outline an agenda for engaged research to support anticipatory governance of nanotechnologies in cities.”
The Next Great Internet Disruption: Authority and Governance
An essay by David Bollier and John Clippinger as part of their ongoing work at ID3, the Institute for Data-Driven Design: “As the Internet and digital technologies have proliferated over the past twenty years, incumbent enterprises nearly always resist open network dynamics with fierce determination, a narrow ingenuity and resistance….But the inevitable rearguard actions to defend old forms are invariably overwhelmed by the new, network-based ones. The old business models, organizational structures, professional sinecures, cultural norms, etc., ultimately yield to open platforms.
When we look back on the past twenty years of Internet history, we can more fully appreciate the prescience of David P. Reed’s seminal 1999 paper on “Group Forming Networks” (GFNs). “Reed’s Law” posits that value in networks increases exponentially as interactions move from a broadcasting model that offers “best content” (in which value is described by n, the number of consumers) to a network of peer-to-peer transactions (where the network’s value is based on “most members” and mathematically described by n²). But by far the most valuable networks are based on those that facilitate group affiliations, Reed concluded. When users have tools for “free and responsible association for common purposes,” he found, the value of the network soars exponentially to 2ⁿ – a fantastically large number. This is the Group Forming Network. Reed predicted that “the dominant value in a typical network tends to shift from one category to another as the scale of the network increases.…”
What is really interesting about Reed’s analysis is that today’s world of GFNs, as embodied by Facebook, Twitter, Wikipedia and other Web 2.0 technologies, remains highly rudimentary. It is based on proprietary platforms (as opposed to open source, user-controlled platforms), and therefore provides only limited tools for members of groups to develop trust and confidence in each other. This suggests a huge, unmet opportunity to actualize greater value from open networks. Citing Francis Fukuyama’s book Trust, Reed points out that “there is a strong correlation between the prosperity of national economies and social capital, which [Fukuyama] defines culturally as the ease with which people in a particular culture can form new associations.”
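The three growth regimes Reed compares can be sketched in a few lines. This toy comparison ignores proportionality constants and treats 2ⁿ as the count of possible subgroups; it simply shows how quickly group-forming value dwarfs the broadcast and peer-to-peer models:

```python
# Value of a network with n members under the three models Reed compares:
#   broadcast (Sarnoff):     grows like n        ("best content")
#   peer-to-peer (Metcalfe): grows like n**2     ("most members")
#   group-forming (Reed):    grows like 2**n     (possible subgroups)
def sarnoff(n):
    return n

def metcalfe(n):
    return n ** 2

def reed(n):
    return 2 ** n

for n in (10, 20, 30):
    print(f"n={n:3}  broadcast={sarnoff(n):4}  p2p={metcalfe(n):6}  groups={reed(n):12}")
```

At n = 30 the group-forming term is already over a billion, which is why Reed argued the dominant source of value shifts as networks scale.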
Data-Driven Public Transport Planning
David Talbot in MIT Technology Review: “Researchers at IBM, using movement data collected from millions of cell-phone users in Ivory Coast in West Africa, have developed a new model for optimizing an urban transportation system….
While the results were preliminary, they point to the new ways that urban planners can use cell-phone data to design infrastructure, says Francesco Calabrese, a researcher at IBM’s research lab in Dublin, and a coauthor of a paper on the work. “This represents a new front with a potentially large impact on improving urban transportation systems,” he says. “People with cell phones can serve as sensors and be the building blocks of development efforts.”
The IBM work was done as part of a research challenge dubbed Data for Development, in which the telecom giant Orange released 2.5 billion call records from five million cell-phone users in Ivory Coast. The records were gathered between December 2011 and April 2012. The data release is the largest of its kind ever done. The records were cleaned to prevent anyone identifying the users, but they still include useful information about these users’ movements. The IBM paper is one of scores being aired later this week at a conference at MIT.”
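The basic move behind this kind of analysis is turning per-user call records into aggregate flows between locations (an origin-destination matrix). The sketch below uses an invented record format, not the actual D4D schema, purely to illustrate the idea of phones-as-sensors:

```python
from collections import Counter

# Hypothetical call-record rows: (user_id, timestamp, cell_tower_id).
# The real Data for Development records use their own schema; this only
# sketches the aggregation step.
records = [
    ("u1", 1, "A"), ("u1", 2, "B"), ("u1", 3, "B"), ("u1", 4, "C"),
    ("u2", 1, "B"), ("u2", 2, "A"),
]

def od_matrix(records):
    """Count trips between consecutive distinct towers for each user."""
    by_user = {}
    for user, ts, tower in sorted(records):  # order each user's records by time
        by_user.setdefault(user, []).append(tower)
    flows = Counter()
    for towers in by_user.values():
        for a, b in zip(towers, towers[1:]):
            if a != b:                       # ignore repeated pings at one tower
                flows[(a, b)] += 1
    return flows

flows = od_matrix(records)
```

A planner would then compare such flow counts against existing transit routes to spot under-served corridors.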
The Need for a Theory of Participation for 21st-Century Governance
A new paper by Prof. Vickie Edwards in the International Journal of Organization Theory and Behavior (IJOTB) concludes: “Research in the years to come should focus on how organizational forms are best adapted to integrate these participatory methods. Specifically, by examining organizational structures and how they integrate with the public in both successful and unsuccessful cases, scholars can gain a better understanding of appropriate organizational forms for stronger democracies at the local level. Such studies should also consider behavioral aspects and the views of both administrators and citizens who become involved in such efforts, as a lack of buy-in at any stage in the participatory process can cause the process to fail – something evidenced by the Great Society efforts in engagement. Personality factors may also play a role in the success or failure of engagement efforts, as such efforts may hinge on the public popularity of individual administrators who interface with citizens. Documenting efforts at participation and engagement through the use of case studies, survey methods, and social network analysis can aid practitioners in identifying best practices, as well as scholars as we seek to better understand each new piece of the participation puzzle”.
From Open Data to Information Justice
The Social Affordances of the Internet for Networked Individualism
Paper by NetLab (University of Toronto) scholars in the latest issue of the Journal of Computer-Mediated Communication: “We review the evidence from a number of surveys in which our NetLab has been involved about the extent to which the Internet is transforming or enhancing community. The studies show that the Internet is used for connectivity locally as well as globally, although the nature of its use varies in different countries. Internet use is adding on to other forms of communication, rather than replacing them. Internet use is reinforcing the pre-existing turn to societies in the developed world that are organized around networked individualism rather than group or local solidarities. The result has important implications for civic involvement.”
The Dangers of Surveillance
Paper by Neil M. Richards in Harvard Law Review. Abstract: “From the Fourth Amendment to George Orwell’s Nineteen Eighty-Four, our culture is full of warnings about state scrutiny of our lives. These warnings are commonplace, but they are rarely very specific. Other than the vague threat of an Orwellian dystopia, as a society we don’t really know why surveillance is bad, and why we should be wary of it. To the extent the answer has something to do with “privacy,” we lack an understanding of what “privacy” means in this context, and why it matters. Developments in government and corporate practices have made this problem more urgent. Although we have laws that protect us against government surveillance, secret government programs cannot be challenged until they are discovered.
… I propose a set of four principles that should guide the future development of surveillance law, allowing for a more appropriate balance between the costs and benefits of government surveillance. First, we must recognize that surveillance transcends the public-private divide. Even if we are ultimately more concerned with government surveillance, any solution must grapple with the complex relationships between government and corporate watchers. Second, we must recognize that secret surveillance is illegitimate, and prohibit the creation of any domestic surveillance programs whose existence is secret. Third, we should recognize that total surveillance is illegitimate and reject the idea that it is acceptable for the government to record all Internet activity without authorization. Fourth, we must recognize that surveillance is harmful. Surveillance menaces intellectual privacy and increases the risk of blackmail, coercion, and discrimination; accordingly, we must recognize surveillance as a harm in constitutional standing doctrine.
Policy Modeling through Collaboration and Simulation
New paper on “Bridging narrative scenario texts and formal policy modeling through conceptual policy modeling” in Artificial Intelligence and Law.
Abstract: “Engaging stakeholders in policy making and supporting policy development with advanced information and communication technologies including policy simulation is currently high on the agenda of research. In order to involve stakeholders in providing their input to policy modeling via online means, simple techniques need to be employed such as scenario technique. Scenarios enable stakeholders to express their views in narrative text. At the other end of policy development, a frequently used approach to policy modeling is agent-based simulation. So far, effective support to transform narrative text input to formal simulation statements is not widely available. In this paper, we present a novel approach to support the transformation of narrative texts via conceptual modeling into formal simulation models. The approach also stores provenance information which is conveyed via annotations of texts to the conceptual model and further on to the simulation model. This way, traceability of information is provided, which contributes to better understanding and transparency, and therewith enables stakeholders and policy modelers to return to the sources that informed the conceptual and simulation model.”
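The traceability idea in the abstract, where each formal simulation statement carries provenance links back through the conceptual model to the narrative source text, can be illustrated with a small data-structure sketch. The names and structure here are invented for illustration and are not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """A span of stakeholder narrative text (the provenance source)."""
    doc: str
    text: str

@dataclass
class Concept:
    """A conceptual-model element derived from one or more annotations."""
    name: str
    sources: list  # list[Annotation]

@dataclass
class SimRule:
    """A formal simulation statement derived from conceptual elements."""
    rule: str
    concepts: list  # list[Concept]

    def trace(self):
        """Follow provenance links back to the narrative snippets."""
        return [a.text for c in self.concepts for a in c.sources]

# Narrative text -> conceptual model -> simulation model, with links kept.
ann = Annotation("scenario_1.txt", "Commuters switch to buses when fares drop.")
concept = Concept("fare_sensitivity", sources=[ann])
rule = SimRule("if fare < threshold: agent.mode = 'bus'", concepts=[concept])
```

Calling `rule.trace()` returns the original stakeholder sentence, which is the kind of "return to the sources" transparency the authors describe.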