When Ostrom Meets Blockchain: Exploring the Potentials of Blockchain for Commons Governance


Paper by David Rozas et al.: “Blockchain technologies have generated excitement, yet their potential to enable new forms of governance remains largely unexplored. Two confronting standpoints dominate the emergent debate around blockchain-based governance: discourses characterised by the presence of techno-determinist and market-driven values, which tend to ignore the complexity of social organisation; and critical accounts of such discourses which, whilst contributing to identifying limitations, consider the role of traditional centralised institutions as inherently necessary to enable democratic forms of governance. Therefore the question arises: can we build perspectives of blockchain-based governance that go beyond markets and states? In this article we draw on the Nobel laureate economist Elinor Ostrom’s principles for self-governance of communities to explore the transformative potential of blockchain.

We approach blockchain through the identification and conceptualisation of affordances that this technology may provide to communities. For each affordance, we carry out a detailed analysis situating each in the context of Ostrom’s principles, considering both the potentials of algorithmic governance and the importance of incorporating communities’ social practices. The relationships found between these affordances and Ostrom’s principles allow us to provide a perspective focussed on blockchain-based commons governance. By carrying out this analysis, we aim to expand the debate from one dominated by a culture of competition to one that promotes a culture of cooperation….(More)”.

Better “nowcasting” can reveal what weather is about to hit within 500 meters


MIT Technology Review: “Weather forecasting is impressively accurate given how changeable and chaotic Earth’s climate can be. It’s not unusual to get 10-day forecasts with a reasonable level of accuracy.

But there is still much to be done. One challenge for meteorologists is to improve their “nowcasting,” the ability to forecast weather in the next six hours or so at a spatial resolution of a square kilometer or less.

In areas where the weather can change rapidly, that is difficult. And there is much at stake. Agricultural activity is increasingly dependent on nowcasting, and the safety of many sporting events depends on it too. Then there is the risk that sudden rainfall could lead to flash flooding, a growing problem in many areas because of climate change and urbanization. That has implications for infrastructure, such as sewage management, and for safety, since this kind of flooding can kill.

So meteorologists would dearly love to have a better way to make their nowcasts.

Enter Blandine Bianchi from EPFL in Lausanne, Switzerland, and a few colleagues, who have developed a method for combining meteorological data from several sources to produce nowcasts with improved accuracy. Their work has the potential to change the utility of this kind of forecasting for everyone from farmers and gardeners to emergency services and sewage engineers.

Current forecasting is limited by the data and the scale on which it is gathered and processed. For example, satellite data has a spatial resolution of 50 to 100 km and allows the tracking and forecasting of large cloud cells over a time scale of six to nine hours. By contrast, radar data is updated every five minutes, with a spatial resolution of about a kilometer, and leads to predictions on the time scale of one to three hours. Another source of data is the microwave links used by telecommunications companies, which are degraded by rainfall….(More)”
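
To make the multi-source idea concrete, here is a minimal Python sketch of how rainfall estimates from the three sources described above might be blended, weighting each source by the forecast lead time at which it is most skilful. This is an illustration only, not the method of Bianchi and colleagues; the function, the weights and the example values are all assumptions chosen to mirror the source characteristics quoted above.

```python
# Illustrative sketch of multi-source precipitation nowcasting.
# NOT the algorithm of Bianchi et al.: the weights below are invented,
# chosen only to reflect the source characteristics described above.

def fuse_nowcast(radar_mm, satellite_mm, link_mm, lead_time_h):
    """Blend rainfall-rate estimates (mm/h) from three sources.

    radar_mm:     ~1 km resolution, 5-minute updates, skilful at 1-3 h
    satellite_mm: 50-100 km resolution, tracks cloud cells over 6-9 h
    link_mm:      rainfall inferred from attenuation of telecom microwave links
    lead_time_h:  forecast horizon in hours
    """
    if lead_time_h <= 1.0:
        # Very short range: radar and link attenuation dominate.
        weights = (0.5, 0.1, 0.4)
    elif lead_time_h <= 3.0:
        # Radar extrapolation is most skilful at 1-3 hours.
        weights = (0.6, 0.3, 0.1)
    else:
        # Beyond radar's useful horizon, satellite cloud tracking takes over.
        weights = (0.2, 0.7, 0.1)
    w_radar, w_sat, w_link = weights
    return w_radar * radar_mm + w_sat * satellite_mm + w_link * link_mm

# Example: a 2-hour-ahead estimate for one grid cell.
print(fuse_nowcast(radar_mm=4.2, satellite_mm=3.1, link_mm=5.0, lead_time_h=2.0))
```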

The Janus Face of the Liberal Information Order


Paper by Henry Farrell and Abraham L. Newman: “…Domestically, policy-makers and scholars argued that information openness, like economic openness, would go hand-in-glove with political liberalization and the spread of democratic values. This was perhaps, in part, an accident of timing: the Internet – which seemed to many to be inherently resistant to censorship – burgeoned shortly after the collapse of Communism in the Soviet Union and Eastern Europe. Politicians celebrated the dawn of a new era of open communication, while scholars began to argue that the spread of the Internet would lead to the spread of democracy (Diamond 2010; Shirky 2008).

A second wave of literature suggested that Internet-based social media had played a crucial role in spreading freedom in the Arab Spring (Howard 2010; Hussain and Howard 2013). There were some skeptics who highlighted the vexed relationship between open networks and the closed national politics of autocracies (Goldsmith and Wu 2006), or who pointed out that the Internet was nowhere near as censorship-resistant as early optimists had supposed (Deibert et al. 2008). Even these pessimists seemed to believe that the Internet could bolster liberalism in healthy democracies, although it would by no means necessarily prevail over tyranny.

The international liberal order for information, however, finds itself increasingly on shaky ground. Non-democratic regimes ranging from China to Saudi Arabia have created domestic technological infrastructures, which undermine and provide an alternative to the core principles of the regime (Boas 2006; Deibert 2008).

The European Union, while still generally supportive of open communication and free speech, has grown skeptical of the regime’s focus on unfettered economic access and has used privacy and anti-trust policy to challenge its most neo-liberal elements (Newman 2008). Non-state actors like WikiLeaks have relied on information openness as a channel of disruption and perhaps manipulation.

More troubling are the arguments of a new literature – that open information flows are less a harbinger of democracy than a vector of attack…

How can IR scholars make sense of this Janus-faced quality of information? In this brief memo, we argue that much of the existing work on information technology and information flows suffers from two key deficiencies.

First – there has been an unhelpful separation between two important debates about information flows and liberalism. One – primarily focused on the international level – concerned global governance of information networks, examining how states (especially the US) arrived at and justified their policy stances, and how power dynamics shaped the battles between liberal and illiberal states over what the relevant governance arrangements should be (Klein 2002; Singh 2008; Mueller 2009). …

This leads to the second problem – that research has failed to appreciate the dynamics of contestation over time…(More)”

What’s inside the black box of digital innovation?


George Atalla at Ernst & Young: “Analysis of the success or failure of government digital transformation projects tends to focus on the technology that has been introduced. Seldom discussed is the role played by organizational culture and by a government’s willingness to embrace new approaches and working practices. And yet factors such as the ability to transcend bureaucratic working styles and collaborate with external partners are just as vital to success as deploying the right IT…

The study, Inside the Black Box: Journey Mapping Digital Innovation in Government, used a range of qualitative research tools including rich pictures, journey maps and self-reporting questionnaires to tease out individual characteristics of team members, team sentiment, organizational governance and the role played by cultural factors. The approach was unique in that it captured the nuances of the process of digital innovation, rather than merely measuring inputs and outputs.

The aim of the study was to look inside the “black box” of digital transformation to find out what really goes on within the teams responsible for delivery. In every case, the implementation journey involved ups and downs, advances and setbacks, but there were always valuable lessons to learn. We have extracted the six key insights for governments, outlined below, to provide guidance for government and public sector leaders who are embarking on their own innovation journey…(More)”.

Technologies of International Relations


Book edited by Carolin Kaltofen, Madeline Carr and Michele Acuto: “This book examines the role of technology in the core voices for International Relations theory and how this has shaped the contemporary thinking of ‘IR’ across some of the discipline’s major texts. Through an interview format between different generations of IR scholars, the conversations of the book analyse the relationship between technology and concepts like power, security and global order. They explore to what extent ideas about the role and implications of technology help to understand the way IR has been framed and world politics are conceived of today. This innovative text will appeal to scholars in Politics and International Relations as well as STS, Human Geography and Anthropology….(More)”.

Security in Smart Cities: Models, Applications, and Challenges


Book edited by Aboul Ella Hassanien, Mohamed Elhoseny, Syed Hassan Ahmed and Amit Kumar Singh: “This book offers an essential guide to IoT security, smart cities and IoT applications. In addition, it presents an exhaustive review of the challenges of information security in smart and intelligent applications, especially in IoT and big data contexts. Highlighting the latest research on security in smart cities, it addresses essential models, applications, and challenges.

Written in plain and straightforward language, the book offers a self-contained resource for readers with no prior background in the field. Primarily intended for students in information security and IoT applications (including smart city systems and data heterogeneity), it will also greatly benefit academic researchers, IT professionals, policymakers and legislators. It is well suited as a reference book for both undergraduate and graduate courses on information security approaches, the Internet of Things, and real-world intelligent applications….(More)”.

The free flow of non-personal data


Joint statement by Vice-President Ansip and Commissioner Gabriel on the European Parliament’s vote on the new EU rules facilitating the free flow of non-personal data: “The European Parliament adopted today a Regulation on the free flow of non-personal data proposed by the European Commission in September 2017. …

We welcome today’s vote at the European Parliament. A digital economy and society cannot exist without data and this Regulation concludes another key pillar of the Digital Single Market. Only if data flows freely can Europe get the best from the opportunities offered by digital progress and technologies such as artificial intelligence and supercomputers.  

This Regulation does for non-personal data what the General Data Protection Regulation has already done for personal data: free and safe movement across the European Union. 

With its vote, the European Parliament has sent a clear signal to all businesses of Europe: it makes no difference where in the EU you store and process your data – data localisation requirements within the Member States are a thing of the past. 

The new rules will provide a major boost to the European data economy, as they open up potential for European start-ups and SMEs to create new services through cross-border data innovation. This could lead to a 4% – or €739 billion – higher EU GDP by 2020 alone. 

Together with the General Data Protection Regulation, the Regulation on the free flow of non-personal data will allow the EU to fully benefit from today’s and tomorrow’s data-based global economy.” 

Background

Since the Communication on the European Data Economy was adopted in January 2017 as part of the Digital Single Market strategy, the Commission has run a public online consultation, organised structured dialogues with Member States and undertaken several workshops with different stakeholders. These evidence-gathering initiatives have led to the publication of an impact assessment….

The Regulation on the free flow of non-personal data has no impact on the application of the General Data Protection Regulation (GDPR), as it does not cover personal data. However, the two Regulations will function together to enable the free flow of any data – personal and non-personal – thus creating a single European space for data. In the case of a mixed dataset, the GDPR provision guaranteeing the free flow of personal data will apply to the personal-data part of the set, and the free flow of non-personal data principle will apply to the non-personal part. …(More)”.
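
To illustrate the mixed-dataset rule described above with a toy example: each part of a dataset is routed to the regime that governs it. In this hypothetical Python sketch the field names and the PERSONAL_FIELDS set are invented for the demonstration; in practice, whether data is personal is a legal determination under the GDPR, not a simple lookup.

```python
# Toy illustration of the mixed-dataset rule: personal data falls under
# the GDPR's free-flow provision, non-personal data under the new
# Regulation. The classification below is a hypothetical stand-in for
# what is, in reality, a legal judgment.

PERSONAL_FIELDS = {"name", "email", "home_address"}  # assumed for the demo

def applicable_regime(field: str) -> str:
    """Name the regulation governing one field of a mixed dataset."""
    if field in PERSONAL_FIELDS:
        return "GDPR (free flow of personal data)"
    return "Regulation on the free flow of non-personal data"

for field in ["name", "email", "sensor_reading", "machine_id"]:
    print(f"{field}: {applicable_regime(field)}")
```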

Craft metrics to value co-production


Liz Richardson and Beth Perry at Nature: “Advocates of co-production encourage collaboration between professional researchers and those affected by that research, to ensure that the resulting science is relevant and useful. Opening up science beyond scientists is essential, particularly where problems are complex, solutions are uncertain and values are salient. For example, patients should have input into research on their conditions, and first-hand experience of local residents should shape research on environmental-health issues.

But what constitutes success on these terms? Without a better understanding of this, it is harder to incentivize co-production in research. A key way to support co-production is reconfiguring that much-derided feature of academic careers: metrics.

Current indicators of research output (such as paper counts or the h-index) conceptualize the value of research narrowly. They are already roundly criticized as poor measures of quality or usefulness. Less appreciated is the fact that these metrics also leave out the societal relevance of research and omit diverse approaches to creating knowledge about social problems.
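
For readers unfamiliar with the metric, the h-index mentioned above compresses an entire publication record into one number: the largest h such that the author has h papers with at least h citations each. A minimal Python sketch (illustrative only) shows both the computation and why critics call it narrow: records with very different shapes can score identically.

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that the author has h papers with >= h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still "pays for" its own rank
        else:
            break
    return h

# Two very different records, identical scores: the index says nothing about
# societal relevance, co-produced knowledge or applied local impact.
print(h_index([50, 40, 3, 2]))  # -> 3: two landmark papers barely register
print(h_index([3, 3, 3]))       # -> 3
```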

Peer review also has trouble assessing the value of research that sits at disciplinary boundaries or that addresses complex social challenges. It denies broader social accountability by giving scientists a monopoly on determining what is legitimate knowledge [1]. Relying on academic peer review as a means of valuing research can discourage broader engagement.

This privileges abstract and theoretical research over work that is localized and applied. For example, research on climate-change adaptation, conducted in the global south by researchers embedded in affected communities, can make real differences to people’s lives. Yet it is likely to be valued less highly by conventional evaluation than research that is generalized from afar and then published in a high-impact English-language journal….(More)”.

The law and ethics of big data analytics: A new role for international human rights in the search for global standards


David Nersessian at Business Horizons: “The Economist recently declared that digital information has overtaken oil as the world’s most valuable commodity. Big data technology is inherently global and borderless, yet little international consensus exists over what standards should govern its use. One source of global standards benefitting from considerable international consensus might be used to fill the gap: international human rights law.

This article considers the extent to which international human rights law operates as a legal or ethical constraint on global commercial use of big data technologies. By providing clear baseline standards that apply worldwide, human rights can help shape cultural norms—implemented as ethical practices and global policies and procedures—about what businesses should do with their information technologies. In this way, human rights could play a broad and important role in shaping business thinking about the proper handling of this increasingly valuable commodity in the modern global society…(More)”.

(In)Equalities and Social (In)Visibilities in the Digital Age


Intro by Inês Amaral, Maria João Barata and Vasco Almeida to the Special issue of Interações: “The influence of new technologies in the public and private spheres of society has given rise not to a reformulation but to a new social field, and it directly interferes with how we perceive the world and relate to it and to others. One should note that in Pierre Bourdieu’s (2001) theory, a field arises as a configuration of socially distributed relations.

Progressively, a universe of socialisation has emerged and consolidated: cyberspace. Although virtual, it exists and produces effects. It can be defined as the space boosted by the different digital communication platforms, and it assumes itself as an individual communication model, allowing the receiver to be simultaneously an emitter. A space of flows (Castells, 1996), cyberspace translates the social dimension of the Internet, enabling the diffusion of communication/information on a global scale. This causes an intense process of inclusion and exclusion of people in the network.

The reference to info-inclusive and info-excluded societies of the digital scenario is imperative when reflecting on the geography of the new socio-technological spaces. The dynamics of these territories are directly associated with the way social, demographic, economic and technological variables condition each other, revealing the potential for dissemination of information and knowledge through technologies.

In this special issue of the journal Interações, we propose a reflection on (In)Equalities and Social (In)Visibilities in the Digital Age. The articles in the volume present research results and/or theoretical reflections on the social visibilities and invisibilities created by dynamics of media and digital inclusion and exclusion, relations between the digital and inequalities in different geographical, social and professional contexts, digital literacy and vulnerable social groups, and the conditioning created by technology for the individual in social context, among others….(More)”