Who you are/where you live: do neighbourhood characteristics explain co-production?


Paper by Peter Thijssen and Wouter Van Dooren in the International Review of Administrative Sciences: “Co-production establishes an interactive relationship between citizens and public service providers. Successful co-production hence requires the engagement of citizens. Typically, individual characteristics such as age, gender, and income are used to explain why citizens co-produce. In contrast, neighbourhood-level variables receive less attention. Nevertheless, the co-production literature, as well as social capital and urban planning theory, provides good arguments why neighbourhood variables may be relevant. In this study, we examine the administrative records of citizen-initiated contacts in a reporting programme for problems in the public domain. This co-production programme is located in the district of Deurne in the city of Antwerp, Belgium. A multilevel analysis is used to simultaneously assess the impact of neighbourhood characteristics and individual variables. While the individual variables usually found to explain co-production are present in our case, we also find that neighbourhood characteristics significantly explain co-production. Thus, our findings suggest that participation in co-production activities is determined not only by who you are, but also by where you live.

Points for practitioners: In order to facilitate co-production and participation, the neighbourhood should be the first place to look. Co-production benefits may disproportionately accrue to strong citizens, but also to strong neighbourhoods. Social corrections should take both into account. More broadly, a good understanding of the neighbourhoods in the city is needed to grasp citizen behaviour. Place-based policies in the city should focus on the neighbourhood….(More)”
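
As an illustrative sketch of the kind of multilevel (random-intercept) analysis the abstract describes, the Python snippet below fits individual predictors as fixed effects and a per-neighbourhood random intercept using statsmodels. The data file and all variable names (reports, age, gender, income, neighbourhood_deprivation, neighbourhood) are hypothetical, not those of the study.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per resident, 'reports' = number of citizen-initiated
# contacts, individual covariates, and a neighbourhood identifier for the second level.
df = pd.read_csv("contacts.csv")

# Random-intercept model: individual characteristics enter as fixed effects ("who you are"),
# while a random intercept per neighbourhood captures contextual variation ("where you live").
model = smf.mixedlm(
    "reports ~ age + gender + income + neighbourhood_deprivation",
    data=df,
    groups=df["neighbourhood"],
)
result = model.fit()
print(result.summary())
```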

The tools of social change: A critique of techno-centric development and activism


Paper by Jan Servaes and Rolien Hoyng in New Media and Society: “Generally, the literatures on Information and Communication Technologies for Development (ICT4D) and on networked resistance are evolving in isolation from one another. This article aims to integrate these literatures in order to critically review differences and similarities in the techno-centric conceptions of agency and social change by political adversaries that are rooted in their socio-technical practices. We repurpose the critique of technological determinism to develop a multi-layered conception of agency that contains three interrelated dimensions: (1) “access” versus “skill” and the normative concept of inclusion; (2) fixed “system” versus “open-ended network” and savoir vivre; and (3) “institution” versus “extra-institutional network” and political efficacy. Building on our critique, we end by exploring the political possibilities at the intersections of conventional institutions or communities and emerging, extra-institutional networked formations…(More)”

Towards decision support for disclosing data: Closed or open data?


Article by Anneke Zuiderwijk and Marijn Janssen in Information Polity: “The disclosure of open government data is a complex activity that may create public value yet might also encounter risks, such as the misinterpretation and misuse of data. While politicians support data release and assume that the positive value of open data will dominate, many governmental organizations are reluctant to open their data, as they are afraid of the dark side. The objective of this paper is to provide a decision-making model that assists in trade-offs between the pros and cons of open data. Data disclosure is dependent on the type of data (e.g. its sensitivity, structure and quality) and the context (e.g. organizational policies, legislation and political influences). Based on the literature and fifteen in-depth interviews with public sector officials and data archivists, this paper identifies contextual and dataset-related variables which influence a trade-off. A decision-making model is presented capturing these trade-offs, and in this way providing guidance for weighing the creation of public value against the risks. The model can be used for decision-making to open or not to open data. It is likely that the decision regarding which data should be opened or closed will shift over time….(More)”

Democratic Rulemaking


New paper by John M. de Figueiredo and Edward Stiglitz for the Oxford Handbook of Law and Economics (forthcoming): “To what extent is agency rulemaking democratic? This paper examines the soundness and empirical support for the leading theories that purport to endow the administrative state with democratic legitimacy. We study the theories in light of two normative benchmarks: a “democratic” benchmark based on voter preferences, and a “republican” benchmark based on the preferences of elected representatives. We conclude that all of the proposed theories lack empirical support and many have substantial conceptual flaws; we point to directions for possible future research….(More)”

Public service coding: the BBC as an open software developer


Juan Mateos-Garcia at NESTA: “On Monday, the BBC published British, Bold, Creative, a paper where it put forward a vision for its future based on openness and collaboration with its audiences and the UK’s wider creative industries.

In this blog post, we focus on an area where the BBC is already using an open and collaborative model for innovation: software development.

The value of software

Although less visible to the public than its TV, radio and online content programming, the BBC’s software development activities may create value and drive innovation beyond the BBC, providing an example of how the corporation can put its “technology and digital capabilities at the service of the wider industry”.

Software is an important form of innovation investment that helps the BBC deliver new products and services, and become more efficient. One might expect that much of the software developed by the BBC would also be of value to other media and digital organisations. Such beneficial “spillovers” are encouraged by the BBC’s use of open source licensing, which enables other organisations to download its software for free, change it as they see fit, and share the results.

Current debates about the future of the BBC – including the questions about its role in influencing the future technology landscape in the Government’s Charter Review Consultation – need to be informed by robust evidence about how it develops software, and the impact that this has.

In this blog post, we use data from the world’s biggest collaborative software development platform, GitHub, to study the BBC as an open software developer.

GitHub gives organisations and individuals hosting space to store their projects (referred to as “repos”), and tools to coordinate development. This includes the option to “fork” (copy) other users’ software, change it and redistribute the improvements. Our key questions are:

  • How active is the BBC on GitHub?
  • How has its presence on GitHub changed over time?
  • What is the level of adoption (forking) of BBC projects on GitHub?
  • What types of open source projects is the BBC developing?
  • Where in the UK and in the rest of the world are the people interested in BBC projects based?

But before tackling these questions, it is important to address a question often raised in relation to open source software:

Why might an organisation like the BBC want to share its valuable code on a platform like GitHub?

There are several possible reasons:

  • Quality: Opening up a software project attracts help from other developers, making it better
  • Adoption: Releasing software openly can help turn it into a widely adopted standard
  • Signalling: It signals the organisation as an interesting place to work and partner with
  • Public value: Some organisations release their code openly with the explicit goal of creating public value

The webpage introducing TAL (Television Application Layer), a BBC project on GitHub, is a case in point: “Sharing TAL should make building applications on TV easier for others, helping to drive the uptake of this nascent technology. The BBC has a history of doing this and we are always looking at new ways to reach our audience.”…(More)”
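
As a rough sketch of how questions like the ones listed above can be approached with the public GitHub REST API, the snippet below lists an organisation's public repositories and summarises fork counts and primary languages. The “bbc” organisation handle and the choice of summaries are illustrative assumptions; the Nesta analysis itself may have used different tooling and data.

```python
import requests

def fetch_org_repos(org, per_page=100):
    """Fetch all public repositories of a GitHub organisation via the REST API (paginated)."""
    repos, page = [], 1
    while True:
        resp = requests.get(
            f"https://api.github.com/orgs/{org}/repos",
            params={"per_page": per_page, "page": page},
            headers={"Accept": "application/vnd.github+json"},
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            break
        repos.extend(batch)
        page += 1
    return repos

if __name__ == "__main__":
    # Unauthenticated requests are rate-limited; pass an auth token for larger pulls.
    repos = fetch_org_repos("bbc")  # "bbc" is the organisation handle assumed here
    print("Public repos:", len(repos))
    print("Total forks: ", sum(r["forks_count"] for r in repos))
    # A crude answer to "what types of projects": count repos by primary language.
    by_language = {}
    for r in repos:
        lang = r["language"] or "unknown"
        by_language[lang] = by_language.get(lang, 0) + 1
    print(sorted(by_language.items(), key=lambda kv: -kv[1])[:10])
```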

Policies for a fair re-use of data: Big Data and the application of anonymization techniques


Paper by Giuseppe D’Acquisto: “The legal framework on data protection in Europe sets a high standard with regard to the possible re-use of personal data. Principles like purpose limitation and data minimization challenge the emerging Big Data paradigm, where the “value” of data is linked to its somehow still unpredictable potential future uses. Nevertheless, the re-use of data is not impossible, once they are properly anonymized. The EU’s Article 29 Working Party published in 2014 an Opinion on the application of anonymization techniques, which can be implemented to enable potential re-use of previously collected data within a framework of safeguards for individuals. The paper reviews the main elements of the Opinion, with a view to the widespread adoption of anonymization policies enabling a fair re-use of data, and gives an overview of the legal and technical aspects related to anonymization, pointing out the many misconceptions on the issue…(More)”
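
The Opinion discussed in the paper groups techniques into families such as randomisation and generalisation and evaluates them against re-identification risks. Purely as a toy illustration of one concept it covers, k-anonymity via generalisation, the sketch below checks the smallest group size over a set of quasi-identifiers; the data and column names are invented for the example.

```python
import pandas as pd

def smallest_group_size(df, quasi_identifiers):
    """Size of the smallest group sharing the same quasi-identifier values.
    The table is k-anonymous (for these attributes) if this value is at least k."""
    return int(df.groupby(quasi_identifiers).size().min())

# Invented example: coarsen quasi-identifiers before any release.
records = pd.DataFrame({
    "age":       [34, 36, 52, 53, 34, 51],
    "postcode":  ["2100", "2100", "2018", "2018", "2100", "2018"],
    "diagnosis": ["A", "B", "A", "C", "B", "A"],
})
records["age_band"] = (records["age"] // 10) * 10  # generalisation: 34 -> 30
print(smallest_group_size(records, ["age_band", "postcode"]))  # prints 3, i.e. k = 3
```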

The Spectrum of Control: A Social Theory of the Smart City


Jathan Sadowski and Frank Pasquale at First Monday: “There is a certain allure to the idea that cities allow a person to both feel at home and like a stranger in the same place. That one can know the streets and shops, avenues and alleys, while also going days without being recognized. But as elites fill cities with “smart” technologies — turning them into platforms for the “Internet of Things” (IoT): sensors and computation embedded within physical objects that then connect, communicate, and/or transmit information with or between each other through the Internet — there is little escape from a seamless web of surveillance and power. This paper will outline a social theory of the “smart city” by developing our Deleuzian concept of the “spectrum of control.” We present two illustrative examples: biometric surveillance as a form of monitoring, and automated policing as a particularly brutal and exacting form of manipulation. We conclude by offering normative guidelines for governance of the pervasive surveillance and control mechanisms that constitute an emerging critical infrastructure of the “smart city.”…(More)”

A systematic review of open government data initiatives


Paper by J. Attard, F. Orlandi, S. Scerri, and S. Auer in Government Information Quarterly: “We conduct a systematic survey with the aim of assessing open government data initiatives, that is, any attempt, by a government or otherwise, to open data that is produced by a governmental entity. We describe the open government data life-cycle and we focus our discussion on publishing and consuming processes required within open government data initiatives. We cover current approaches undertaken for such initiatives, and classify them. A number of evaluations found within related literature are discussed, and from them we extract challenges and issues that hinder open government initiatives from reaching their full potential. In a bid to overcome these challenges, we also extract guidelines for publishing data and provide an integrated overview. This will enable stakeholders to start on a firm footing in a new open government data initiative. We also identify the impacts on the stakeholders involved in such initiatives….(More)”

Dissecting the Spirit of Gezi: Influence vs. Selection in the Occupy Gezi Movement


New study by Ceren Budak and Duncan J. Watts in Sociological Science: “Do social movements actively shape the opinions and attitudes of participants by bringing together diverse groups that subsequently influence one another? Ethnographic studies of the 2013 Gezi uprising seem to answer “yes,” pointing to solidarity among groups that were traditionally indifferent, or even hostile, to one another. We argue that two mechanisms with differing implications may generate this observed outcome: “influence” (change in attitude caused by interacting with other participants); and “selection” (individuals who participated in the movement were generally more supportive of other groups beforehand).

We tease out the relative importance of these mechanisms by constructing a panel of over 30,000 Twitter users and analyzing their support for the main Turkish opposition parties before, during, and after the movement. We find that although individuals changed in significant ways, becoming in general more supportive of the other opposition parties, those who participated in the movement were also significantly more supportive of the other parties all along. These findings suggest that both mechanisms were important, but that selection dominated. In addition to our substantive findings, our paper also makes a methodological contribution that we believe could be useful to studies of social movements and mass opinion change more generally. In contrast with traditional panel studies, which must be designed and implemented prior to the event of interest, our method relies on ex post panel construction, and hence can be used to study unanticipated or otherwise inaccessible events. We conclude that despite the well known limitations of social media, their “always on” nature and their widespread availability offer an important source of public opinion data….(More)”
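
As a rough, hypothetical illustration of what ex post panel construction can look like in code (not the authors' actual pipeline), the sketch below turns a flat table of timestamped user-level observations into a before/during/after panel and compares within-person averages; the column names, cutoff dates, and support measure are all assumptions.

```python
import pandas as pd

# Hypothetical input: one row per tweet, with a user id, a timestamp, and a numeric or
# boolean indicator of support for a given opposition party (however operationalised).
tweets = pd.read_csv("tweets.csv", parse_dates=["timestamp"])

def period(ts):
    """Assign each observation to a phase of the movement (illustrative cutoff dates)."""
    if ts < pd.Timestamp("2013-05-28"):
        return "before"
    if ts <= pd.Timestamp("2013-08-31"):
        return "during"
    return "after"

tweets["period"] = tweets["timestamp"].map(period)
tweets["supports_party"] = tweets["supports_party"].astype(float)

# Ex post panel: one row per user, one column per period, holding the user's average
# level of expressed support in that period.
panel = tweets.pivot_table(
    index="user_id", columns="period", values="supports_party", aggfunc="mean"
)

# Keep only users observed in all three periods so comparisons are within-person,
# which is what separates influence (change over time) from selection (initial levels).
panel = panel.dropna()
print(panel[["before", "during", "after"]].mean())
```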

On the morals of network research and beyond


Conspicuous Chatter: “…Discussions on ethics have become very popular in computer science lately — and to some extent I am glad about this. However, I think we should dispel three key fallacies.

The first one is that things we do not like (some may brand them “immoral”) happen because others do not think of the moral implications of their actions. In fact it is entirely possible that they do, and decide to act in a manner we do not like nonetheless. This could be out of conviction: those who built the surveillance equipment, those who argue against strong encryption, and also those who do the torture and the killing (harm) may have entirely self-righteous ways of justifying their actions to themselves and others. Others may simply be making a good buck — and there are plenty of examples of this in the links above.

The second fallacy is that ethics, and research ethics more specifically, comes down to a “common sense” variant of “do no harm” — and that is that. In fact ethics, as a philosophical discipline, is extremely deep, and there are plenty of entirely legitimate ways to argue that doing harm is perfectly fine. If the authors of the paper were a bit more sophisticated in their philosophy they could, for example, have made reference to the “doctrine of double effect” or to the free will of those who will bring actual harm to users, and therefore their moral responsibility. It seems that a key immoral aspect of this work was that the authors forgot to write that confusing section.

Finally, we should dispel, in conversations about research ethics, the myth that morality equals legality. The public review mentions “informed consent”, but in fact this is an extremely difficult notion — and legalistically it has been used to justify terrible things. The data protection variant of informed consent allows large internet companies, and telcos, to basically scoop up most users’ data because of some small print in lengthy terms and conditions. In fact it should probably be our responsibility to highlight the immorality of this state of affairs, before writing public reviews about the immorality of a hypothetical censorship detection system.

Thus, I would argue, if one is to make an ethical point relating to the values and risks of technology, they have to make it in the larger context of how technology is fielded and used, the politics around it, who has power, who makes the money, who does the torturing and the killing, and why. Technology lives within a big moral picture that a research community has a responsibility to comment on. Focusing moral attention on the microcosm of a specific hypothetical use case — just because it is the closest to our research community — misses the point, silently perpetuating a terrible state of moral affairs….(More)”