Data Pools: Wi-Fi Geolocation Spoofing


AH Projects: “DataPools is a Wi-Fi geolocation spoofing project that virtually relocates your phone to the latitudes and longitudes of Silicon Valley success. It includes a catalog and a SkyLift device with 12 pre-programmed locations. DataPools was produced for the Tropez summer art event in Berlin, in collaboration with Anastasia Kubrak.

DataPools catalog pool index

Weren’t invited to Jeff Bezos’s summer pool party? No problem. DataPools uses the SkyLift device to mimic the Wi-Fi network infrastructure at the homes of 12 top Silicon Valley CEOs, causing your phone to show up, approximately, at their pools. Because Wi-Fi spoofing affects the core geolocation services of iOS and Android smartphones, all apps on the phone, and the metadata they generate, will be located in the spoofed location…

Data Pools is a metaphor for a store of wealth that is private. The luxurious pools and mansions of Silicon Valley are financed by the mechanisms of economic surveillance and ownership of our personal information. Yet, the geographic locations of these premises are often concealed, hidden, and removed from open source databases. What if we could reverse this logic and plunge into the pools of ludicrous wealth, both virtually and physically? Could we apply the same methods of data extraction to highlight the ridiculous inequalities between CEOs and platform users?

Comparison of wealth distribution among top Silicon Valley CEOs

Data

Technically, DataPools uses a Wi-Fi microcontroller programmed with the BSSIDs and SSIDs of the target locations, all obtained from openly published information via web searches and wigle.net. This data is then programmed onto the firmware of the SkyLift device. One SkyLift device contains all 12 pool locations. However, improvements were made throughout the installation, and the updated firmware now uses one main location with multiple sub-locations to cover a larger area. This method was more effective at spoofing many phones across a large area and is ideal for installations….(More)”.
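To make the mechanism concrete, here is a minimal sketch in Python (using Scapy) of the general beacon-spoofing idea described above: broadcasting 802.11 beacon frames that advertise the BSSIDs and SSIDs of access points known to exist at a target location, so that nearby phones geolocate themselves there. This is not the SkyLift firmware, which runs on a Wi-Fi microcontroller; the interface name and access-point values below are placeholders, and running such a script requires a wireless card in monitor mode, root privileges, and appropriate authorization.

```python
# Minimal beacon-spoofing sketch (illustration only, not the SkyLift firmware).
# The BSSID/SSID pairs and interface name are placeholder assumptions.
from scapy.all import RadioTap, Dot11, Dot11Beacon, Dot11Elt, sendp

# Hypothetical access points "seen" at the spoofed location.
TARGET_APS = [
    ("aa:bb:cc:dd:ee:01", "PoolsideGuest"),
    ("aa:bb:cc:dd:ee:02", "HomeNet-5G"),
]

IFACE = "wlan0mon"  # wireless interface already switched to monitor mode


def beacon(bssid: str, ssid: str):
    """Build one 802.11 beacon frame advertising the given BSSID/SSID."""
    dot11 = Dot11(
        type=0, subtype=8,          # management frame, beacon subtype
        addr1="ff:ff:ff:ff:ff:ff",  # broadcast destination
        addr2=bssid, addr3=bssid,   # transmitter address / BSSID
    )
    return RadioTap() / dot11 / Dot11Beacon(cap="ESS") / Dot11Elt(ID="SSID", info=ssid.encode())


frames = [beacon(b, s) for b, s in TARGET_APS]
# Replay the beacons continuously so nearby phones "see" the spoofed networks
# and report the corresponding location to their geolocation services.
sendp(frames, iface=IFACE, inter=0.1, loop=1, verbose=False)
```

Covering one main location with multiple sub-locations, as the updated firmware does, amounts to cycling through a larger list of BSSID/SSID pairs observed around that area.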

Reconnecting citizens with EU decision-making is possible – and needs to happen now


Opinion piece by Anthony Zacharzewski: “Maybe it’s the Brexit effect, or perhaps the memories of the great recession are fading, but in poll after poll, Europe’s citizens are saying that they feel more European and are strongly supportive of EU membership. …

While sighs of relief can be heard from Schuman to Strasbourg after a decade in which the EU has bounced from crisis to crisis, the new Parliament and Commission will inherit a fragile and fractious Europe this year. One of their most important and immediate tasks will be to connect EU citizens more closely to the institutions and their decision making….

The new European Commission and Parliament have the chance to change that, by adopting an ambitious open government agenda that puts citizen participation in decision making at its heart.

There are three things on our wish list for doing this.

The first thing on our list is an EU-wide commitment to policy making “in the open.” Built on a renewed commitment to transparency, it would set a unified approach to consultation, as well as identifying major policy areas where citizen involvement is valuable and where citizens are likely to want to be involved. This could include issues such as migration and climate change. Member states, particularly those in the Open Government Partnership, already have a great deal of good practice that can help inform this, while the Open Government Network for Europe, which brings together civil society and government voices, is ready to help.

Secondly, the connection to civil society and citizens also needs to be made beyond the European level, supporting and making use of the rapidly growing networks of democratic innovation at the local level. We are seeing an increasing shift from citizen participation as a series of one-off events to citizen participation as part of the governing system, and as such, the European institutions need to listen to local conversations and support them with better information. Public Square, our own project run in partnership with mySociety and funded by Luminate, is a good example. It is working with local government and citizens to understand how meaningful citizen participation can become an everyday part of the way all local decision-making happens.

The last item on our wish list would be greater coherence between the institutions in Brussels and Strasbourg to better involve citizens. While the European Parliament, Commission and Council all have their different roles and prerogatives, without a co-ordinated approach, the attention and resources they have will be dissipated across multiple conversations. Most importantly, it will be harder to demonstrate to citizens that their contributions have made a difference….(More)”.

The Blockchain Game: A great new tool for your classroom


IBM Blockchain Blog: “Blockchain technology can be a game-changer for accounting, supply chain, banking, contract law, and many other fields. But it will only be useful if lots and lots of non-technical managers and leaders trust and adopt it. And right now, simply understanding what blockchain is can be difficult, even for the brightest in these fields. Enter The Blockchain Game, a hands-on exercise that explains blockchain’s core principles and serves as a launching pad for discussion of blockchain’s real-world applications.

In The Blockchain Game, students act as nodes and miners on a blockchain network for storing student grades at a university. Participants record the grade and course information, and then “build the block” by calculating a unique identifier (a hash) to secure the grade ledger; miners get rewarded for their work. As the game is played, the audience learns about hashes, private keys, and which uses are appropriate for a blockchain ledger.
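For readers who want to see what participants compute by hand, below is a minimal Python sketch of the hash-linking and mining ideas the game acts out. The grade records, field names, and toy proof-of-work difficulty are our own illustrative assumptions, not part of the official game materials.

```python
# Toy grade-ledger blockchain (illustration only; all values are made up).
import hashlib
import json


def block_hash(index, prev_hash, records, nonce=0):
    """Compute the SHA-256 identifier of a block of grade records."""
    payload = json.dumps(
        {"index": index, "prev": prev_hash, "records": records, "nonce": nonce},
        sort_keys=True,
    )
    return hashlib.sha256(payload.encode()).hexdigest()


def mine(index, prev_hash, records, difficulty=3):
    """Search for a nonce whose hash starts with `difficulty` zeros (toy proof of work)."""
    nonce = 0
    while True:
        h = block_hash(index, prev_hash, records, nonce)
        if h.startswith("0" * difficulty):
            return nonce, h
        nonce += 1


# Genesis block, then one block of (student, course, grade) entries.
chain = [{"index": 0, "prev": "0" * 64, "records": [], "hash": block_hash(0, "0" * 64, [])}]

records = [["alice", "CS101", "A"], ["bob", "CS101", "B+"]]
nonce, h = mine(1, chain[0]["hash"], records)
chain.append({"index": 1, "prev": chain[0]["hash"], "records": records, "nonce": nonce, "hash": h})

# Because each block embeds the previous block's hash, changing any earlier
# grade invalidates every hash that follows it.
print(chain[1]["hash"])
```

In the classroom, participants presumably perform a simplified calculation by hand rather than SHA-256, but the tamper-evidence property being demonstrated is the same.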

Basics of the Game

  • A hands-on simulation centering on a blockchain for academic scores, including a discussion at the end of the simulation on whether storing grades would be a good application for blockchain.
  • No computers. Participants are the computers and calculate blocks.
  • The game seeks to teach core concepts about a distributed ledger but can be modified for whichever use case the educator wishes to explore, such as smart contracts, supply chain applications, and others.
  • Additional elements can be added if instructors want to facilitate the game on a computer….(More)”.

Many Across the Globe Are Dissatisfied With How Democracy Is Working


Pew Research Center: “Anger at political elites, economic dissatisfaction and anxiety about rapid social changes have fueled political upheaval in regions around the world in recent years. Anti-establishment leaders, parties and movements have emerged on both the right and left of the political spectrum, in some cases challenging fundamental norms and institutions of liberal democracy. Organizations from Freedom House to the Economist Intelligence Unit to V-Dem have documented global declines in the health of democracy.

As previous Pew Research Center surveys have illustrated, ideas at the core of liberal democracy remain popular among global publics, but commitment to democracy can nonetheless be weak. Multiple factors contribute to this lack of commitment, including perceptions about how well democracy is functioning. And as findings from a new Pew Research Center survey show, views about the performance of democratic systems are decidedly negative in many nations. Across 27 countries polled, a median of 51% are dissatisfied with how democracy is working in their country; just 45% are satisfied.

Assessments of how well democracy is working vary considerably across nations. In Europe, for example, more than six-in-ten Swedes and Dutch are satisfied with the current state of democracy, while large majorities in Italy, Spain and Greece are dissatisfied.

To better understand the discontent many feel with democracy, we asked people in the 27 nations studied about a variety of economic, political, social and security issues. The results highlight some key areas of public frustration: most believe that elections bring little change, that politicians are corrupt and out of touch, and that courts do not treat people fairly. On the other hand, people are more positive about how well their countries protect free expression, provide economic opportunity and ensure public safety.

We also asked respondents about other topics, such as the state of the economy, immigration and attitudes toward major political parties. And in Europe, we included additional questions about immigrants and refugees, as well as opinions about the European Union….(More)”.

AI & Global Governance: Robots Will Not Only Wage Future Wars but also Future Peace


Daanish Masood & Martin Waehlisch at the United Nations University: “At the United Nations, we have been exploring completely different scenarios for AI: its potential to be used for the noble purposes of peace and security. This could revolutionize the way we prevent and solve conflicts globally.

Two of the most promising areas are Machine Learning and Natural Language Processing. Machine Learning involves computer algorithms detecting patterns from data to learn how to make predictions and recommendations. Natural Language Processing involves computers learning to understand human languages.

At the UN Secretariat, our chief concern is with how these emerging technologies can be deployed for the good of humanity to de-escalate violence and increase international stability.

This endeavor has admirable precedent. During the Cold War, computer scientists used multilayered simulations to predict the scale and potential outcome of the arms race between the East and the West.

Since then, governments and international agencies have increasingly used computational models and advanced Machine Learning to try to understand recurrent conflict patterns and forecast moments of state fragility.

But two things have transformed the scope for progress in this field.

The first is the sheer volume of data now available from what people say and do online. The second is the game-changing growth in computational capacity that allows us to crunch unprecedented, inconceivable quantities of data with relative speed and ease.

So how can this help the United Nations build peace? Three ways come to mind.

Firstly, overcoming cultural and language barriers. By teaching computers to understand human language and the nuances of dialects, not only can we better link up what people write on social media to local contexts of conflict, we can also more methodically follow what people say on radio and TV. As part of the UN’s early warning efforts, this can help us detect hate speech in a place where the potential for conflict is high. This is crucial because the UN often works in countries where internet coverage is low, and where the spoken languages may not be well understood by many of its international staff.

Natural Language Processing algorithms can help to track and improve understanding of local debates, which might well be blind spots for the international community. If we combine such methods with Machine Learning chatbots, the UN could conduct large-scale digital focus groups with thousands in real-time, enabling different demographic segments in a country to voice their views on, say, a proposed peace deal – instantly testing public support, and indicating the chances of sustainability.
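As a rough illustration of the kind of Natural Language Processing pipeline described here, the sketch below trains a simple supervised classifier to flag possibly hateful text in monitored transcripts or posts. The tiny labelled sample is invented for illustration; a real early-warning system would need carefully curated, dialect-aware training data and human review of anything it flags.

```python
# Toy hate-speech flagging sketch (illustrative data, not a deployed UN system).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data: (text, label) where 1 = hateful, 0 = not.
texts = [
    "they do not belong here and should be driven out",
    "the market reopens tomorrow after the holiday",
    "those people are vermin and deserve what is coming",
    "the peace talks resume next week in the capital",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score newly monitored text; high probabilities could be routed to analysts.
incoming = ["drive them out of the district before the vote"]
print(model.predict_proba(incoming)[:, 1])
```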

Secondly, anticipating the deeper drivers of conflict. We could combine new imaging techniques – whether satellites or drones – with automation. For instance, many parts of the world are experiencing severe groundwater withdrawal and water aquifer depletion. Water scarcity, in turn, drives conflicts and undermines stability in post-conflict environments, where violence around water access becomes more likely, along with large movements of people leaving newly arid areas.

One of the best predictors of water depletion is land subsidence or sinking, which can be measured by satellite and drone imagery. By combining these imaging techniques with Machine Learning, the UN can work in partnership with governments and local communities to anticipate future water conflicts and begin working proactively to reduce their likelihood.
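A hedged sketch of the satellite-plus-Machine-Learning pairing described above: subsidence measurements derived from imagery used, alongside other district-level indicators, as features in a model that anticipates groundwater stress. The data here is synthetic and the feature names are assumptions made purely for illustration.

```python
# Toy groundwater-stress prediction sketch (synthetic data, assumed features).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n = 200  # hypothetical districts

# Assumed features per district: annual land subsidence (cm), rainfall anomaly,
# and share of land under irrigation.
X = np.column_stack([
    rng.gamma(2.0, 1.5, n),    # subsidence_cm_per_year
    rng.normal(0.0, 1.0, n),   # rainfall_anomaly
    rng.uniform(0.0, 1.0, n),  # irrigated_share
])
# Synthetic target: a groundwater depletion index (higher = more stressed).
y = 0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(0.0, 0.2, n)

model = GradientBoostingRegressor().fit(X, y)

# Districts with the highest predicted stress could be prioritised for early,
# proactive work on water access before tensions escalate.
scores = model.predict(X)
print(np.argsort(scores)[-5:])
```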

Thirdly, advancing decision making. In the work of peace and security, it is surprising how many consequential decisions are still made solely on the basis of intuition.

Yet complex decisions often need to navigate conflicting goals and undiscovered options, against a landscape of limited information and political preference. This is where we can use Deep Learning, in which a network can absorb huge amounts of public data and test it against the real-world examples on which it is trained, while applying probabilistic modeling. This mathematical approach can help us to generate models of our uncertain, dynamic world with limited data.

With better data, we can eventually make better predictions to guide complex decisions. Future senior peace envoys charged with mediating a conflict would benefit from such advances to stress test elements of a peace agreement. Of course, human decision-making will remain crucial, but would be informed by more evidence-driven robust analytical tools….(More)”.

Introducing the Contractual Wheel of Data Collaboration


Blog by Andrew Young and Stefaan Verhulst: “Earlier this year we launched the Contracts for Data Collaboration (C4DC) initiative — an open collaborative with charter members from The GovLab, UN SDSN Thematic Research Network on Data and Statistics (TReNDS), University of Washington and the World Economic Forum. C4DC seeks to address the inefficiencies of developing contractual agreements for public-private data collaboration by informing and guiding those seeking to establish a data collaborative, and by developing and making available a shared repository of relevant contractual clauses taken from existing legal agreements. Today TReNDS published “Partnerships Founded on Trust,” a brief capturing some initial findings from the C4DC initiative.

The Contractual Wheel of Data Collaboration [beta] — Stefaan G. Verhulst and Andrew Young, The GovLab

As part of the C4DC effort, and to support Data Stewards in the private sector and decision-makers in the public and civil sectors seeking to establish Data Collaboratives, The GovLab developed the Contractual Wheel of Data Collaboration [beta]. The Wheel seeks to capture key elements involved in data collaboration while demystifying contracts and moving beyond the type of legalese that can create confusion and barriers to experimentation.

The Wheel was developed based on an assessment of existing legal agreements, engagement with The GovLab-facilitated Data Stewards Network, and analysis of the key elements of our Data Collaboratives Methodology. It features 22 legal considerations organized across 6 operational categories that can act as a checklist for the development of a legal agreement between parties participating in a Data Collaborative:…(More)”.

Trivialization and Public Opinion: Slogans, Substance, and Styles of Thought in the Age of Complexity


Book by Oldrich Bubak and Henry Jacek: “Centering on public discourse and its fundamental lapses, this book takes a unique look at key barriers to social and political advancement in the information age. Public discourse is replete with confident, easy-to-manage claims, intuitions, and other shortcuts; chief among these is trivialization, the tendency to distill multifaceted dilemmas into binary choices, neglect the big picture, gloss over alternatives, or filter reality through a lens of convenience, leaving little room for nuance and hence debate.

Far from superficial, such lapses are symptoms of deeper, intrinsically connected shortcomings that invite further attention. Focusing primarily on industrialized democracies, the authors take their readers on a transdisciplinary journey into the world of trivialization, engaging as they do so the intricate issues born of a modern environment both enabled and constrained by technology. Ultimately, the authors elaborate upon the emerging counterweights to conventional worldviews and the paradigmatic alternatives that promise to help open new avenues for progress….(More)”.

Drones to deliver medicines to 12m people in Ghana


Neil Munshi in the Financial Times: “The world’s largest drone delivery network, ferrying 150 different medicines and vaccines, as well as blood, to 2,000 clinics in remote parts of Ghana, is set to be announced on Wednesday.

The network represents a big expansion for the Silicon Valley start-up Zipline, which began delivering blood in Rwanda in 2016 using pilotless, preprogrammed aircraft. The move, along with a new agreement in Rwanda signed in December, takes the company beyond simple blood distribution to more complicated vaccine and plasma deliveries.

“What this is going to show is that you can reach every GPS co-ordinate, you can serve everybody,” said Keller Rinaudo, Zipline chief executive. “Every human in that region or country [can be] within a 15-25 minute delivery of any essential medical product — it’s a different way of thinking about universal coverage.”

Zipline will deliver vaccines for yellow fever, polio, diphtheria and tetanus, which are provided by the World Health Organisation’s Expanded Programme on Immunisation. The WHO will also use the company’s system for future mass immunisation programmes in Ghana.

Later this year, Zipline has plans to start operations in the US, in North Carolina, and in south-east Asia. The company said it will be able to serve 100m people within a year, up from the 22m that its projects in Ghana and Rwanda will cover.

In Ghana, Zipline said health workers will receive deliveries via a parachute drop within about 30 minutes of placing their orders by text message….(More)”.

Whose Commons? Data Protection as a Legal Limit of Open Science


Mark Phillips and Bartha M. Knoppers in the Journal of Law, Medicine and Ethics: “Open science has recently gained traction as establishment institutions have come on-side and thrown their weight behind the movement and behind initiatives aimed at the creation of information commons. At the same time, the movement’s traditional insistence on unrestricted dissemination and reuse of all information of scientific value has been challenged by the movement to strengthen protection of personal data. This article assesses tensions between open science and data protection, with a focus on the GDPR.

Powerful institutions across the globe have recently joined the ranks of those making substantive commitments to “open science.” For example, the European Commission and the NIH National Cancer Institute are supporting large-scale collaborations, such as the Cancer Genome Collaboratory, the European Open Science Cloud, and the Genomic Data Commons, with the aim of making giant stores of genomic and other data readily available for analysis by researchers. In the field of neuroscience, the Montreal Neurological Institute is midway through a novel five-year project through which it plans to adopt open science across the full spectrum of its research. The commitment is “to make publicly available all positive and negative data by the date of first publication, to open its biobank to registered researchers and, perhaps most significantly, to withdraw its support of patenting on any direct research outputs.” The resources and influence of these institutions seem to be tipping the scales, transforming open science from a longstanding aspirational ideal into an existing reality.

Although open science lacks any standard, accepted definition, one widely-cited model proposed by the Austria-based advocacy effort openscienceASAP describes it by reference to six principles: open methodology, open source, open data, open access, open peer review, and open educational resources. The overarching principle is “the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process.” This article adopts this principle as a working definition of open science, with a particular emphasis on open sharing of human data.

As noted above, many of the institutions committed to open science use the word “commons” to describe their initiatives, and the two concepts are closely related. “Medical information commons” refers to “a networked environment in which diverse sources of health, medical, and genomic information on large populations become widely shared resources.” Commentators explicitly link the success of information commons and progress in the research and clinical realms to open science-based design principles such as data access and transparent analysis (i.e., sharing of information about methods and other metadata together with medical or health data).

But what legal, as well as ethical and social, factors will ultimately shape the contours of open science? Should all restrictions be fought, or should some be allowed to persist, and if so, in what form? Given that a commons is not a free-for-all, in that its governing rules shape its outcomes, how might we tailor law and policy to channel open science toward fulfilling its highest aspirations, such as universalizing practical access to scientific knowledge and its benefits, and away from potential pitfalls? This article primarily concerns research data, although passing reference is also made to the terms under which academic publications are made available, which are subject to similar debates….(More)”.

The Importance of Data Access Regimes for Artificial Intelligence and Machine Learning


JRC Digital Economy Working Paper by Bertin Martens: “Digitization triggered a steep drop in the cost of information. The resulting data glut created a bottleneck because human cognitive capacity is unable to cope with large amounts of information. Artificial intelligence and machine learning (AI/ML) triggered a similar drop in the cost of machine-based decision-making and helps in overcoming this bottleneck. Substantial change in the relative price of resources puts pressure on ownership and access rights to these resources. This explains pressure on access rights to data. ML thrives on access to big and varied datasets. We discuss the implications of access regimes for the development of AI in its current form of ML. The economic characteristics of data (non-rivalry, economies of scale and scope) favour data aggregation in big datasets. Non-rivalry implies the need for exclusive rights in order to incentivise data production when it is costly. The balance between access and exclusion is at the centre of the debate on data regimes. We explore the economic implications of several modalities for access to data, ranging from exclusive monopolistic control to monopolistic competition and free access. Regulatory intervention may push the market beyond voluntary exchanges, either towards more openness or reduced access. This may generate private costs for firms and individuals. Society can choose to do so if the social benefits of this intervention outweigh the private costs.

We briefly discuss the main EU legal instruments that are relevant for data access and ownership, including the General Data Protection Regulation (GDPR) that defines the rights of data subjects with respect to their personal data and the Database Directive (DBD) that grants ownership rights to database producers. These two instruments leave a wide legal no-man’s land where data access is ruled by bilateral contracts and Technical Protection Measures that give exclusive control to de facto data holders, and by market forces that drive access, trade and pricing of data. The absence of exclusive rights might facilitate data sharing and access or it may result in a segmented data landscape where data aggregation for ML purposes is hard to achieve. It is unclear if incompletely specified ownership and access rights maximize the welfare of society and facilitate the development of AI/ML…(More)”