Satellites Predict a Cholera Outbreak Weeks in Advance


Sarah Derouin at Scientific American: “Orbiting satellites can warn us of bad weather and help us navigate to that new taco joint. Scientists are also using satellite data to solve a worldwide problem: predicting cholera outbreaks.

Cholera infects millions of people each year, leading to thousands of deaths. Often communities do not realize an epidemic is underway until infected individuals swarm hospitals. Advance warning of impending epidemics could help health workers prepare for the onslaught—stockpiling rehydration supplies, medicines and vaccines—which can save lives and quell the disease’s spread. Back in May 2017 a team of scientists used satellite information to assess whether an outbreak would occur in Yemen, and they ended up predicting an outbreak that spread across the country in June….

At the American Geophysical Union annual meeting in December, Jutla presented the group’s prediction model of cholera for Yemen. The team used a handful of satellites to monitor temperatures, water storage, precipitation and land around the country. By processing that information in algorithms they developed, the team predicted areas most at risk for an outbreak over the upcoming month.
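The pipeline described here—satellite observations of temperature, precipitation and water storage fed through algorithms to rank regional risk—can be sketched as a toy model. The variables, weights and thresholds below are illustrative assumptions for exposition, not the team’s actual algorithm:

```python
# Toy sketch of a satellite-driven cholera risk score. The weights and
# thresholds are assumptions for illustration, not the published model.

def risk_score(temp_c, precip_mm, water_storage_anomaly):
    """Combine environmental observations into a 0-1 risk score.

    Warmer temperatures, heavy rainfall and depleted water storage are
    treated as risk factors, mirroring the kinds of inputs the team
    monitored from orbit.
    """
    temp_factor = min(max((temp_c - 20) / 15, 0.0), 1.0)       # risk rises with heat
    rain_factor = min(precip_mm / 200, 1.0)                    # heavy monthly rainfall
    water_factor = min(max(-water_storage_anomaly, 0.0), 1.0)  # depleted storage
    return 0.4 * temp_factor + 0.35 * rain_factor + 0.25 * water_factor

# Hypothetical regions with one month of satellite-derived observations.
regions = {
    "region_a": risk_score(temp_c=32, precip_mm=180, water_storage_anomaly=-0.6),
    "region_b": risk_score(temp_c=18, precip_mm=20, water_storage_anomaly=0.2),
}
at_risk = [name for name, score in regions.items() if score > 0.5]
print(at_risk)  # ['region_a']
```

In the actual work, such a model was calibrated and validated against historical outbreak records from the Bengal Delta and parts of Africa; the sketch only conveys the shape of the pipeline—environmental observations in, ranked regional risk out.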

Weeks later an epidemic occurred that closely resembled what the model had predicted. “It was something we did not expect,” Jutla says, because they had built the algorithms—and calibrated and validated them—on data from the Bengal Delta in southern Asia as well as parts of Africa. They were unable to go into war-torn Yemen directly, however. For those reasons, the team had not informed Yemeni officials of the predicted June outbreak….(More).”

On democracy


Sophie in ‘t Veld (European Parliament) in a Special Issue of Internet Policy Review on Political micro-targeting edited by Balazs Bodo, Natali Helberger and Claes de Vreese: Democracy is valuable and vulnerable, which is reason enough to remain alert for new developments that can undermine it. In recent months, we have seen enough examples of the growing impact of personal data in campaigns and elections. It is important and urgent for us to publicly debate this development. It is easy to see why we should take action against extremist propaganda of hatemongers aiming to recruit young people for violent acts. But we euphemistically speak of ‘fake news’ when lies, half-truths, conspiracy theories, and sedition insidiously poison public opinion.

The literal meaning of democracy is ‘the power of the people’. ‘Power’ presupposes freedom. Freedom to choose and to decide. Freedom from coercion and pressure. Freedom from manipulation. ‘Power’ also presupposes knowledge. Knowledge of all facts, aspects, and options. And knowing how to balance them against each other. When freedom and knowledge are restricted, there can be no power.

In a democracy, every individual choice influences society as a whole. Therefore, the common interest is served with everyone’s ability to make their choices in complete freedom, and with complete knowledge.

The interests of parties and political candidates who compete for citizens’ votes may differ from that higher interest. They want citizens to see their political advertising, and only theirs, not that of their competitors. Not only do parties and candidates compete for the voter’s favour. They contend for the voter’s exclusive time and attention as well.

POLITICAL TARGETING

No laws dictate what kind of information a voter should rely on to make an informed choice. For lamb chops, toothpaste, mortgages or cars, by contrast, producers are required to state the origin and properties of what they sell. This enables consumers to make a responsible decision. Providing false information is illegal. All ingredients, properties, and risks have to be mentioned on the label.

Political communication, however, is protected by freedom of speech. Political parties are allowed to use all kinds of sales tricks.

And, of course, campaigns do their utmost and continuously test the limits of the socially acceptable….(More)”.

Big Data and medicine: a big deal?


V. Mayer-Schönberger and E. Ingelsson in the Journal of Internal Medicine: “Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks rather than conventional statistical methods resulting in systems that over time capture insights implicit in data, but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research.

Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy, but data management more generally. Consequently, we suggest a number of adjustments such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data’s role changes to a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and reassess our formal processes from career advancement to treatment approval….(More)”.

How Tenants Use Digital Mapping to Track Bad Landlords and Gentrification


Hannah Norman at Yes! Magazine: “When Teresa Salazar first encountered the notice posted to her front door—which offered tenants $10,000 to move out of their East Oakland, California, apartment building—she knew the place she called home was in jeopardy.

“All of us were surprised and afraid because it is not easy to move to some other place when the rents are so high,” Salazar said in a video produced by the Anti-Eviction Mapping Project. The project uses mapping as well as data analysis and digital storytelling as organizing tools for low-income tenants to combat eviction and displacement amid the Bay Area’s raging housing crisis.

The jarring move-out offer was left by the Bay Area Property Group, founded by landlord attorney Daniel Bornstein—known for holding landlord workshops on how to evict tenants. The property management firm buys and flips apartment buildings, Salazar said, driving gentrification in neighborhoods like hers. In fear of being displaced, Salazar and other tenants from her building met with counselors from Causa Justa :: Just Cause, a community legal services group. There, they learned about their rights under Oakland’s Just Cause of Eviction Ordinance. With this information, they successfully stood their ground and remained in their homes.

But not all Bay Area tenants are as fortunate as Salazar. Between 2005 and 2015, Oakland witnessed more than 32,402 unlawful detainers, or eviction proceedings, according to data obtained by AEMP through record requests. But AEMP hopes to change these statistics by arming tenants and housing advocates with map-based data to fight evictions and displacements and, ultimately, drive local and state policies on the issue. In addition to mapping, AEMP uses videos of tenants like Salazar to raise awareness of the human experience behind jaw-dropping statistics.

The project is part of a rising tide of social justice cartography, where maps are being harnessed for activism as the technology becomes more accessible….(More)”.

Artificial intelligence and smart cities


Essay by Michael Batty at Urban Analytics and City Sciences: “…The notion of the smart city of course conjures up images of an automated future. Much of our thinking about this future, certainly in the more popular press, is about everything ranging from the latest App on our smart phones to driverless cars, while somewhat deeper concerns are about efficiency gains due to the automation of services ranging from transit to the delivery of energy. There is no doubt that routine and repetitive processes – algorithms if you like – are improving at an exponential rate in terms of the data they can process and the speed of execution, faithfully following Moore’s Law.

Pattern recognition techniques that lie at the basis of machine learning are highly routinized iterative schemes where the pattern in question – be it a signature, a face, the environment around a driverless car and so on – is computed as an elaborate averaging procedure which takes a series of elements of the pattern and weights them in such a way that the pattern can be reproduced perfectly by the combinations of elements of the original pattern and the weights. This is in essence the way neural networks work. When one says that they ‘learn’ and that the current focus is on ‘deep learning’, all that is meant is that with complex patterns and environments, many layers of neurons (elements of the pattern) are defined and the iterative procedures are run until there is a convergence with the pattern that is to be explained. Such processes are iterative, additive and not much more than sophisticated averaging, but using machines that can operate virtually at the speed of light and thus process vast volumes of big data. When these kinds of algorithms can be run in real time, and many already can be, there is the prospect of many kinds of routine behaviour being displaced. It is in this sense that AI might usher in an era of truly disruptive processes. This, according to Brynjolfsson and McAfee, is beginning to happen as we reach the second half of the chess board.
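The iterative weighting Batty describes can be illustrated with a single artificial neuron trained by the delta rule. This is a deliberately minimal sketch—one neuron fitting a hand-made linear pattern—rather than the deep, multi-layer networks he refers to:

```python
# Minimal sketch of the iterative weighting described above: a single
# neuron repeatedly nudges its weights until its weighted combination
# of inputs reproduces the target pattern.

def train_neuron(samples, targets, lr=0.1, epochs=500):
    """Delta-rule training: for each example, move the weights a small
    step in the direction that reduces the prediction error."""
    n_features = len(samples[0])
    weights = [0.0] * n_features
    bias = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            pred = bias + sum(w * xi for w, xi in zip(weights, x))
            err = t - pred
            bias += lr * err
            weights = [w + lr * err * xi for w, xi in zip(weights, x)]
    return weights, bias

# Target pattern: t = 2*x + 1, which the neuron should converge to.
samples = [[0.0], [0.5], [1.0], [1.5]]
targets = [1.0, 2.0, 3.0, 4.0]
weights, bias = train_neuron(samples, targets)
print(round(weights[0], 2), round(bias, 2))  # converges to ~2.0 and ~1.0
```

The “learning” here is exactly the convergence Batty describes: nothing more than repeated, additive adjustment of weights until the combination reproduces the pattern.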

The real issue in terms of AI involves problems that are peculiarly human. Much of our work is highly routinized and many of our daily actions and decisions are based on relatively straightforward patterns of stimulus and response. The big questions involve the extent to which those of our behaviours which are not straightforward can be automated. In fact, although machines are able to beat human players in many board games and there is now the prospect of machines beating the very machines that were originally designed to play against humans, the real power of AI may well come from collaboratives of man and machine, working together, rather than ever more powerful machines working by themselves. In the last 10 years, some of my editorials have tracked what is happening in the real-time city – the smart city as it is popularly called – which has become key to many new initiatives in cities. In fact, cities – particularly big cities, world cities – have become the flavour of the month, but the focus has not been on their long-term evolution but on how we use them on a minute-by-minute to week-by-week basis.

Many of the patterns that define the smart city on these short-term cycles can be predicted using AI largely because they are highly routinized but even for highly routine patterns, there are limits on the extent to which we can explain them and reproduce them. Much advancement in AI within the smart city will come from automation of the routine, such as the use of energy, the delivery of location-based services, transit using information being fed to operators and travellers in real time and so on. I think we will see some quite impressive advances in these areas in the next decade and beyond. But the key issue in urban planning is not just this short term but the long term and it is here that the prospects for AI are more problematic….(More)”.

Toward Information Justice


Book by Jeffrey Alan Johnson: “…presents a theory of information justice that subsumes the question of control and relates it to other issues that influence just social outcomes. Data does not exist by nature. Bureaucratic societies must provide standardized inputs for governing algorithms, a problem that can be understood as one of legibility. This requires, though, converting what we know about social objects and actions into data, narrowing the many possible representations of the objects to a definitive one using a series of translations. Information thus exists within a nexus of problems, data, models, and actions that the social actors constructing the data bring to it.

This opens information to analysis from social and moral perspectives, while the scientistic view leaves us blind to the gains from such analysis—especially to the ways that embedded values and assumptions promote injustice. Toward Information Justice answers a key question for the 21st Century: how can an information-driven society be just?

Many of those concerned with the ethics of data focus on control over data, and argue that if data is only controlled by the right people then just outcomes will emerge. There are serious problems with this control metaparadigm, however, especially related to the initial creation of data and prerequisites for its use. This text is suitable for academics in the fields of information ethics, political theory, philosophy of technology, and science and technology studies, as well as policy professionals who rely on data to reach increasingly problematic conclusions about courses of action….(More)”.

Open Banking ‘revolution’ to challenge banks’ dominance


Shane Hickey at the Guardian: “This week sees the beginning of a quiet revolution in banking which some have championed as one of the greatest shake-ups in personal finance in years, while others have warned it could have serious implications for people’s private data.

It’s the start of a new series of rules concerning “open banking”, where customers will be able to share their personal financial information with companies other than their bank, opening up opportunities to get better deals on mortgages and overdrafts, and to compare insurance and broadband deals.

For eager enthusiasts, it will revolutionise banking, make the system more competitive, and give consumers access to the best products for them.

For the more sceptical, among them consumer groups, it could cause problems with the security of previously private data….

Financial data about how you spend your money, how often you are overdrawn and other details are currently held by your bank.

Under the new rules, the ownership of this data will essentially be transferred to the consumer, meaning that account holders will be able to give companies, other than their own bank, permission to access their details. Underlying this new regulation, which will spread across Europe this month, are EU rules that mean financial institutions must let customers share their data easily and securely. Extra measures are being taken in the UK to push through the changes, with the setting up of standards so data can be shared securely….

While consumers may feel empowered that their data is now in their own hands to do with as they please, with the potential to save money, it could also lead to unease, particularly when dealing with third parties that have brand names they might not recognise, says Shaw.

“One of the things to be mindful of is that consumers could find themselves in a complicated chain of providers. If you authorise one third party to access your money, and if there are potential losses, where does that fall?

“I think data regulators and financial regulators need to be really clear with consumers about how that is going to work. In order for consumers to really engage with this, they need to be confident that there are safeguards in place to protect them. There has been good progress on that.”…(More)”.

Who Owns Urban Mobility Data?


David Zipper at City Lab: “How, exactly, should policymakers respond to the rapid rise of new private mobility services such as ride-hailing, dockless shared bicycles, and microtransit? … The most likely solution is via a data exchange that anonymizes rider data and gives public experts (and perhaps academic and private ones too) the ability to answer policy questions.
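One way such an exchange can anonymize rider data while still answering policy questions is coarse aggregation with small-cell suppression. A minimal sketch, with illustrative zone names and an assumed suppression threshold of k = 5 trips (the article does not specify any particular mechanism):

```python
# Sketch of a privacy-preserving aggregation a data exchange might use:
# individual trips are rolled up to (zone, hour) counts, and cells with
# fewer than k trips are suppressed (a simple k-anonymity-style rule).

from collections import Counter

def aggregate_trips(trips, k=5):
    """Count trips per (origin_zone, hour), dropping cells below k."""
    counts = Counter((t["origin_zone"], t["hour"]) for t in trips)
    return {cell: n for cell, n in counts.items() if n >= k}

# Hypothetical trip records from a private mobility operator.
trips = (
    [{"origin_zone": "downtown", "hour": 8}] * 7
    + [{"origin_zone": "harbor", "hour": 8}] * 2   # too few trips: suppressed
)
print(aggregate_trips(trips))  # {('downtown', 8): 7}
```

The design choice is the familiar trade-off: coarser cells and higher thresholds protect individual riders better, but give planners less detail to work with.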

This idea is starting to catch on. The World Bank’s OpenTraffic project, founded in 2016, initially developed ways to aggregate traffic information derived from commercial fleets. A handful of private companies like Grab and Easy Taxi pledged their support when OpenTraffic launched. This fall, the project became part of SharedStreets, a collaboration between the National Association of City Transportation Officials (NACTO), the World Resources Institute, and the OECD’s International Transport Forum to pilot new ways of collecting and sharing a variety of public and private transport data. …(More).

Can Big Data Revolutionize International Human Rights Law?


Galit A. Sarfaty in the Journal of International Law: “International human rights efforts have been overly reliant on reactive tools and focused on treaty compliance, while often underemphasizing the prevention of human rights violations. I argue that data analytics can play an important role in refocusing the international human rights regime on its original goal of preventing human rights abuses, but it comes at a cost.

There are risks in advancing a data-driven approach to human rights, including the privileging of certain rights subject to quantitative measurement and the precipitation of further human rights abuses in the process of preventing other violations. Moreover, the increasing use of big data can ultimately privatize the international human rights regime by transforming the corporation into a primary gatekeeper of rights protection. Such unintended consequences need to be addressed in order to maximize the benefits and minimize the risks of using big data in this field….(More)”.

Using new data sources for policymaking


Technical report by the Joint Research Centre (JRC) of the European Commission: “… synthesises the results of our work on using new data sources for policy-making. It reflects a recent shift from more general considerations in the area of Big Data to a more dedicated investigation of Citizen Science, and it summarises the state of play. With this contribution, we start promoting Citizen Science as an integral component of public participation in policy in Europe.

The particular need to focus on the citizen dimension emerged due to (i) the increasing interest in the topic from policy Directorate-Generals (DGs) of the European Commission (EC); (ii) the considerable socio-economic impact policy making has on citizens’ lives and on society as a whole; and (iii) the clear potential of citizens’ contributions to increase the relevance of policy making and the effectiveness of policies when addressing societal challenges.

We explicitly concentrate on Citizen Science (or public participation in scientific research) as a way to engage people in practical work, and to develop a mutual understanding between the participants from civil society, research institutions and the public sector by working together on a topic that is of common interest.

Acknowledging this new priority, this report concentrates on the topic of Citizen Science and presents already ongoing collaborations and recent achievements. The presented work particularly addresses environment-related policies, Open Science and aspects of Better Regulation. We then introduce the six phases of the ‘cyclic value chain of Citizen Science’ as a concept to frame citizen engagement in science for policy. We use this structure in order to detail the benefits and challenges of existing approaches – building on the lessons we have learned so far from our own practical work and from knowledge exchanged with third parties. After outlining additional related policy areas, we sketch the future work that is required in order to overcome the identified challenges, and translate them into actions for ourselves and our partners.

Next steps include the following:

- Develop a robust methodology for data collection, analysis and use of Citizen Science for EU policy;

- Provide a platform as an enabling framework for applying this methodology to different policy areas, including the provision of best practices;

- Offer guidelines for policy DGs in order to promote the use of Citizen Science for policy in Europe;

- Experiment and evaluate possibilities of overarching methodologies for citizen engagement in science and policy, and their case specifics; and

- Continue to advance interoperability and knowledge sharing between currently disconnected communities of practice. …(More)”.