Sensor Law


Paper by Sandra Braman: “For over two decades, information policy-making for human society has been increasingly supplemented, supplanted, and/or superseded by machinic decision-making; it has been over three decades since legal decision-making was explicitly put in place to serve machinic rather than social systems; and over four decades since designers of the Internet took the position that they were serving non-human (machinic, or daemon) users in addition to humans. As the “Internet of Things” becomes more and more of a reality, these developments increasingly shape the nature of governance itself. This paper’s discussion of contemporary trends in these diverse modes of human-computer interaction at the system level — interactions between social systems and technological systems — introduces the changing nature of the law as a sociotechnical problem in itself. In such an environment, technological innovations are often also legal innovations, and legal developments require socio-technical analysis as well as social, legal, political, and cultural approaches.

Examples of areas in which sensors are already receiving legal attention are rife. A non-comprehensive listing includes privacy concerns, beginning but not ending with those raised by sensors embedded in phones and geolocation devices, which are the most widely discussed and the ones of which the public is most aware. Sensor issues arise in environmental law, health law, marine law, and intellectual property law, and they are raised by new technologies used for national security purposes, including confidence- and security-building measures intended for peacekeeping. They are raised by liability issues for objects ranging from cars to ovens. And sensor issues are at the core of concerns about “telemetric policing” as it comes into use not only in North America and Europe but also in societies such as Brazil.

Sensors are involved in every stage of legal processes, from identification of persons of interest to determination of judgments and their consequences. Their use significantly alters the historically developed distinction among the types of decision-making meant to come into use at different stages of the process, raising new questions about when, and how, human decision-making needs to dominate and when, and how, technological innovation might need to be shaped by the needs of social rather than human systems.

This paper will focus on the legal dimensions of sensors used in ubiquitous embedded computing….(More)”

Does Crowdsourcing Legislation Increase Political Legitimacy? The Case of Avoin Ministeriö in Finland


Paper by Henrik Serup Christensen, Maija Karjalainen and Laura Nurminen: “Crowdsourcing legislation gives ordinary citizens, rather than political and bureaucratic elites, the chance to cooperate to come up with innovative new policies. By increasing popular involvement, representative democracies hope to restock dwindling reserves of political legitimacy. However, it is still not clear how involvement in legislative decision making affects the attitudes of the participants. It is therefore of central concern to establish whether crowdsourcing can actually help restore political legitimacy by creating more positive attitudes toward the political system. This article contributes to this research agenda by examining the developments in attitudes among the users on the Finnish website Avoin Ministeriö (“Open Ministry”) which orchestrates crowdsourcing of legislation by providing online tools for deliberating ideas for citizens’ initiatives. The developments in attitudes are investigated with a two-stage survey of 421 respondents who answered questions concerning political and social attitudes, as well as political activities performed. The results suggest that while crowdsourcing legislation has so far not affected political legitimacy in a positive manner, it has the potential to do so….(More)”

Discovering the Language of Data: Personal Pattern Languages and the Social Construction of Meaning from Big Data


Paper in Interdisciplinary Science Reviews: “This paper attempts to address two issues relevant to the sense-making of Big Data. First, it presents a case study of how a large dataset can be transformed into both a visual language and, in effect, a ‘text’ that can be read and interpreted by human beings. The case study comes from direct observation of graduate students at the IIT Institute of Design who investigated task-switching behaviours, as documented by productivity software on a single user’s laptop and a smart phone. Through a series of experiments with the resulting dataset, the team effects a transformation of that data into a catalogue of visual primitives — a kind of iconic alphabet — that allow others to ‘read’ the data as a corpus and, more provocatively, suggest the formation of a personal pattern language. Second, this paper offers a model for human-technical collaboration in the sense-making of data, as demonstrated by this and other teams in the class. Current sense-making models tend to be data- and technology-centric, and increasingly presume data visualization as a primary point of entry of humans into Big Data systems. This alternative model proposes that meaningful interpretation of data emerges from a more elaborate interplay between algorithms, data and human beings….(More)”
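
As a concrete illustration of the “iconic alphabet” idea, here is a minimal sketch of that kind of encoding. It is not the IIT team's code; the application categories, glyph assignments, and sample log are invented for illustration:

```python
# Illustrative sketch: encode a task-switching log as an "iconic alphabet"
# so that a stretch of activity can be read as a string of glyphs.
# Categories, glyphs, and the sample log are invented, not the IIT team's.
from collections import Counter

GLYPHS = {
    "email": "E",
    "browser": "B",
    "writing": "W",
    "code": "C",
    "idle": ".",
}

def encode(events):
    """Map (timestamp, application_category) events to a glyph string."""
    return "".join(GLYPHS.get(category, "?") for _, category in events)

def frequent_patterns(text, length=2):
    """Count recurring glyph n-grams: candidate 'words' in a personal pattern language."""
    ngrams = (text[i:i + length] for i in range(len(text) - length + 1))
    return Counter(ngrams).most_common(3)

log = [("09:00", "email"), ("09:12", "browser"), ("09:15", "email"),
       ("09:40", "writing"), ("10:05", "email"), ("10:07", "browser")]
corpus = encode(log)                      # "EBEWEB"
print(corpus, frequent_patterns(corpus))  # recurring "EB" suggests a check-email-then-browse habit
```

Once the log is rendered as a string of glyphs, standard corpus tools (n-gram counts, concordances) can surface the recurring sequences that a reader might recognise as personal habits.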

 

Embracing Crowdsourcing


Paper by Jesse A. Sievers on “A Strategy for State and Local Governments Approaching ‘Whole Community’ Emergency Planning”: “Over the last century, state and local governments have been challenged to keep proactive emergency planning efforts ahead of after-the-disaster response efforts. After moving from decentralized to centralized planning efforts, the most recent policy has returned to the philosophy that a decentralized planning approach is the most effective way to plan for a disaster. In fact, under the Obama administration, a policy of using the “whole community” approach to emergency planning has been adopted. This approach, however, creates an obvious problem for state and local government practitioners already under pressure for funding, time, and the continuous need for higher and broader expertise—the problem of how to actually incorporate the whole community into emergency planning efforts. This article suggests one such approach, crowdsourcing, as an option for local governments. The crowdsourcer-problem-crowd-platform-solution (CPCPS) model is suggested as an initial framework for practitioners seeking a practical application and basic comprehension. The model, discussion, and additional examples in this essay provide a skeletal framework for state and local governments wishing to reach the whole community while under the constraints of time, budget, and technical expertise….(More)”

Methods to Protect and Secure “Big Data” May Be Unknowingly Corrupting Research


New paper by John M. Abowd and Ian M. Schmutte: “…As the government and private companies increase the amount of data made available for public use (e.g. Census data, employment surveys, medical data), efforts to protect privacy and confidentiality (through statistical disclosure limitation, or SDL) can often have misleading and compromising effects on economic research and analysis, particularly in cases where data properties are unclear to the end-user.

Data swapping, a particularly insidious SDL method frequently used by important data aggregators such as the Census Bureau and the National Center for Health Statistics, interferes with the results of empirical analysis in ways that few economists and other social scientists are aware of.

To encourage more transparency, the authors call for both government statistical agencies and the private sector (Amazon, Google, Microsoft, Netflix, Yahoo!, etc.) to release more information about the parameters used in SDL methods, and insist that journals and editors publishing such research require documentation of the author’s entire methodological process….(More)”
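
The distortions described above are easy to see in a toy simulation. The sketch below is not from the Abowd and Schmutte paper; the two areas, the income figures, and the 10% swap rate are all invented, but it shows how an analyst working from swapped microdata, without knowing the swap parameters, would understate differences between areas:

```python
# Toy illustration of statistical disclosure limitation by record swapping.
# Data and swap rate are invented; this is not the Abowd-Schmutte analysis,
# only a sketch of why swapped microdata can bias area-level estimates.
import random
import statistics

random.seed(42)

# Two hypothetical areas with genuinely different income levels.
area_a = [random.gauss(40_000, 5_000) for _ in range(500)]
area_b = [random.gauss(70_000, 8_000) for _ in range(500)]

def swap(a, b, rate):
    """Swap a random fraction of records between two areas (a crude SDL step)."""
    a, b = a[:], b[:]
    for i in random.sample(range(len(a)), int(len(a) * rate)):
        a[i], b[i] = b[i], a[i]
    return a, b

swapped_a, swapped_b = swap(area_a, area_b, rate=0.10)

print("true gap:    ", round(statistics.mean(area_b) - statistics.mean(area_a)))
print("released gap:", round(statistics.mean(swapped_b) - statistics.mean(swapped_a)))
# The released (swapped) data understates the between-area gap, and an analyst
# who does not know the swap rate cannot correct for the attenuation.
```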

An In-Depth Analysis of Open Data Portals as an Emerging Public E-Service


Paper by Martin Lnenicka: “Governments collect and produce large amounts of data. Increasingly, governments worldwide have started to implement open data initiatives and also launch open data portals to enable the release of these data in open and reusable formats. Therefore, a large number of open data repositories, catalogues and portals have been emerging in the world. The greater availability of interoperable and linkable open government data catalyzes secondary use of such data, so they can be used for building useful applications which leverage their value, allow insight, provide access to government services, and support transparency. The efficient development of successful open data portals makes it necessary to evaluate them systematically, in order to understand them better, assess the various types of value they generate, and identify the required improvements for increasing this value. Thus, the attention of this paper is directed particularly to the field of open data portals. The main aim of this paper is to compare selected open data portals on the national level using content analysis and to propose a new evaluation framework, which further improves the quality of these portals. It also establishes a set of considerations for involving businesses and citizens to create e-services and applications that leverage on the datasets available from these portals….(More)”
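
As a rough illustration of what such an evaluation framework might look like in practice, the sketch below scores portals as weighted sums of content-analysis indicators. The indicator names, weights, and example figures are hypothetical and are not taken from the paper:

```python
# Minimal sketch of an open data portal evaluation framework: score national
# portals on a few indicators gathered by content analysis. The indicators,
# weights, and example figures are hypothetical, not those used in the paper.

INDICATORS = {                      # weight per indicator
    "dataset_count": 0.20,          # breadth of published data (normalised)
    "machine_readable_share": 0.30, # share of datasets in open, reusable formats
    "api_available": 0.20,          # programmatic access for app builders
    "feedback_channel": 0.15,       # means for citizens/businesses to request data
    "metadata_completeness": 0.15,  # quality of dataset descriptions
}

def score(portal: dict) -> float:
    """Weighted sum of normalised indicator values, each expected in [0, 1]."""
    return sum(weight * portal.get(name, 0.0) for name, weight in INDICATORS.items())

portals = {
    "portal_x": {"dataset_count": 0.8, "machine_readable_share": 0.6,
                 "api_available": 1.0, "feedback_channel": 0.0,
                 "metadata_completeness": 0.7},
    "portal_y": {"dataset_count": 0.5, "machine_readable_share": 0.9,
                 "api_available": 0.0, "feedback_channel": 1.0,
                 "metadata_completeness": 0.8},
}

for name, indicators in sorted(portals.items(), key=lambda p: -score(p[1])):
    print(f"{name}: {score(indicators):.2f}")
```

Whatever the actual indicators, the point of such a framework is that the same rubric can be applied across national portals, making the comparison repeatable and the sources of a low score explicit.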

Secrecy versus openness: Internet security and the limits of open source and peer production


Dissertation by Andreas Schmidt: “Open source and peer production have been praised as organisational models that could change the world for the better. It is commonly asserted that almost any societal activity could benefit from distributed, bottom-up collaboration — by making societal interaction more open, more social, and more democratic. However, we also need to be mindful of the limits of these models. How could they function in environments hostile to openness? Security is a societal domain more prone to secrecy than any other, except perhaps for romantic love. In light of the destructive capacity of contemporary cyber attacks, how has the Internet survived without a comprehensive security infrastructure? Secrecy vs. openness describes the realities of Internet security production through the lenses of open source and peer production theories. The study offers a glimpse into the fascinating communities of technical experts, who played a pivotal role when the chips were down for the Internet after large-scale attacks. After an initial flirtation with openness in the early years, operational Internet security communities have put in place institutional mechanisms that have resulted in less open forms of social production…(More)”

Institutional isomorphism, policy networks, and the analytical depreciation of measurement indicators: The case of the EU e-government benchmarking


Paper by Cristiano Codagnone et al.: “This article discusses the socio-political dimension of measurement in the context of benchmarking e-government within the European Union’s Open Method of Coordination. It provides empirical evidence of how this has resulted in institutional isomorphism within the self-referential policy network community involved in the benchmarking process. It argues that the policy prominence retained by supply-side benchmarking of e-government has probably indirectly limited efforts made to measure and evaluate more tangible impacts. High scores in EU benchmarking have contributed to increasing the institutionally-perceived quality but not necessarily the real quality and utility of e-government services. The article concludes by outlining implications for policy and practical recommendations for filling the gaps identified in measurement and evaluation of e-government. It proposes a more comprehensive policy benchmarking framework, which aims to ensure a gradual improvement in measurement activities with indicators that reflect and follow the pace of change, align measurement activities to evaluation needs and, eventually, reduce measurement error….(More)”

How to Fight the Next Epidemic


Bill Gates in the New York Times: “The Ebola Crisis Was Terrible. But Next Time Could Be Much Worse….Much of the public discussion about the world’s response to Ebola has focused on whether the World Health Organization, the Centers for Disease Control and Prevention and other groups could have responded more effectively. These are worthwhile questions, but they miss the larger point. The problem isn’t so much that the system didn’t work well enough. The problem is that we hardly have a system at all.

To begin with, most poor countries, where a natural epidemic is most likely to start, have no systematic disease surveillance in place. Even once the Ebola crisis was recognized last year, there were no resources to effectively map where cases occurred, or to use people’s travel patterns to predict where the disease might go next….

Data is another crucial problem. During the Ebola epidemic, the database that tracks cases has not always been accurate. This is partly because the situation is so chaotic, but also because much of the case reporting has been done on paper and then sent to a central location for data entry….

I believe that we can solve this problem, just as we’ve solved many others — with ingenuity and innovation.

We need a global warning and response system for outbreaks. It would start with strengthening poor countries’ health systems. For example, when you build a clinic to deliver primary health care, you’re also creating part of the infrastructure for fighting epidemics. Trained health care workers not only deliver vaccines; they can also monitor disease patterns, serving as part of the early warning systems that will alert the world to potential outbreaks. Some of the personnel who were in Nigeria to fight polio were redeployed to work on Ebola — and that country was able to contain the disease very quickly.

We also need to invest in disease surveillance. We need a case database that is instantly accessible to the relevant organizations, with rules requiring countries to share their information. We need lists of trained personnel, from local leaders to global experts, prepared to deal with an epidemic immediately. … (More)”

Data democracy – increased supply of geospatial information and expanded participatory processes in the production of data


Paper by Max Craglia & Lea Shanley: “The global landscape in the supply, co-creation and use of geospatial data is changing very rapidly, with new satellites, sensors and mobile devices reconfiguring the traditional lines of demand and supply and the number of actors involved. In this paper we chart some of these technology-led developments and then focus on the opportunities they have created for the increased participation of the public in generating and contributing information for a wide range of uses, scientific and otherwise. Not all of this information is open or geospatial, but sufficiently large portions of it are to make it one of the most significant phenomena of the last decade. In fact, we argue that while satellites and sensors have exponentially increased the volumes of geospatial information available, the participation of the public is transformative because it expands the range of participants and stakeholders in society using and producing geospatial information, with opportunities for more direct participation in science, politics and social action…(View full text)”