Responsible Data in Agriculture


Report by Lindsay Ferris and Zara Rahman for GODAN: “The agriculture sector is creating increasing amounts of data, from many different sources. From tractors equipped with GPS tracking, to open data released by government ministries, data is becoming ever more valuable, as agricultural business development and global food policy decisions are being made based upon data. But the sector is also home to severe resource inequality. The largest agricultural companies make billions of dollars per year, in comparison with subsistence farmers growing just enough to feed themselves, or smallholder farmers who grow enough to sell on a year-by-year basis. When it comes to data and technology, these differences in resources translate to stark power imbalances in data access and use. The best-resourced actors are able to delve into new technologies and make the most of the resulting insights, whereas others cannot take such risks or divert any of their limited resources. Access to and use of data has radically changed the business models and behaviour of some of those well-resourced actors, but in contrast, those with fewer resources are receiving the same, limited access to information that they always have.

In this paper, we have approached these issues from a responsible data perspective, drawing upon the experience of the Responsible Data community, which over the past three years has created tools, questions and resources to deal with the ethical, legal, privacy and security challenges that come from new uses of data in various sectors. This piece aims to provide a broad overview of some of the responsible data challenges facing these actors, focusing on the power imbalance between them and on how that inequality shapes behaviour in the agricultural data ecosystem. What are the concerns of those with limited resources in this new and rapidly changing data environment? And what are the ethical grey areas or uncertainties that we need to address in the future? As a first attempt to answer these questions, we spoke to 14 individuals with various perspectives on the sector to understand the challenges for them and for the people they work with. We also carried out desk research to dive deeper into these issues, and we provide here an analysis of our findings and responsible data challenges….(More)”

Infostorms. Why do we ‘like’? Explaining individual behavior on the social net.


Book by Vincent F. Hendricks and Pelle G. Hansen: “With points of departure in philosophy, logic, social psychology, economics, and choice and game theory, Infostorms shows how information may be used to improve the quality of personal decisions and group thinking, but also warns against the informational pitfalls which modern information technology may amplify: from science to reality culture, and to what it really is that makes you buy a book like this.

The information society is upon us. New technologies have given us back-pocket libraries, online discussion forums, blogs, crowd-based opinion aggregators, social media and breaking news wherever, whenever. But are we more enlightened and rational because of it?

Infostorms provides the nuts and bolts of how irrational group behaviour may get amplified by social media and information technology. If we could be collectively dense before, now we can do it at light speed and with potentially global reach. That’s how things go viral; that is how cyberbullying, rude comments online, opinion bubbles, status bubbles, political polarisation and a host of other everyday unpleasantries start. Infostorms tells the story of the mechanics of these phenomena. This will help you to avoid them if you want, or learn to start them if you must. It will allow you to stay sane in an insane world of information….(More)”

Research Handbook on Digital Transformations


Book edited by F. Xavier Olleros and Majlinda Zhegu: “The digital transition of the world economy is now entering a phase of broad and deep societal impact. While there is one overall transition, there are many different sectoral transformations, from health and legal services to tax reports and taxi rides, as well as a rising number of transversal trends and policy issues, from widespread precarious employment and privacy concerns to market monopoly and cybercrime. This Research Handbook offers a rich and interdisciplinary synthesis of some of the recent research on the digital transformations currently under way.

This comprehensive study contains chapters covering sectoral and transversal analyses, all of which are specially commissioned and include cutting-edge research. The contributions featured are global, spanning four continents and seven different countries, as well as interdisciplinary, including experts in economics, sociology, law, finance, urban planning and innovation management. The digital transformations discussed are fertile ground for researchers, as established laws and regulations, organizational structures, business models, value networks and workflow routines are contested and displaced by newer alternatives….(More)”

‘Homo sapiens is an obsolete algorithm’


Extract from Homo Deus: A Brief History of Tomorrow by Yuval Noah Harari: “There’s an emerging religion called Dataism, which venerates neither gods nor man – it worships data. From a Dataist perspective, we may interpret the entire human species as a single data-processing system, with individual humans serving as its chips. If so, we can also understand the whole of history as a process of improving the efficiency of this system, through four basic methods:

1. Increasing the number of processors. A city of 100,000 people has more computing power than a village of 1,000 people.

2. Increasing the variety of processors. Different processors may use diverse ways to calculate and analyse data. Using several kinds of processors in a single system may therefore increase its dynamism and creativity. A conversation between a peasant, a priest and a physician may produce novel ideas that would never emerge from a conversation between three hunter-gatherers.

3. Increasing the number of connections between processors. There is little point in increasing the mere number and variety of processors if they are poorly connected. A trade network linking ten cities is likely to result in many more economic, technological and social innovations than ten isolated cities.

4. Increasing the freedom of movement along existing connections. Connecting processors is hardly useful if data cannot flow freely. Just building roads between ten cities won’t be very useful if they are plagued by robbers, or if some autocratic despot doesn’t allow merchants and travellers to move as they wish.

These four methods often contradict one another. The greater the number and variety of processors, the harder it is to freely connect them. The construction of the sapiens data-processing system accordingly passed through four main stages, each of which was characterised by an emphasis on different methods.

The first stage began with the cognitive revolution, which made it possible to connect unlimited sapiens into a single data-processing network. This gave sapiens an advantage over all other human and animal species. Although there is a limit to the number of Neanderthals, chimpanzees or elephants you can connect to the same net, there is no limit to the number of sapiens.

Sapiens used their advantage in data processing to overrun the entire world. However, as they spread into different lands and climates they lost touch with one another, and underwent diverse cultural transformations. The result was an immense variety of human cultures, each with its own lifestyle, behaviour patterns and world view. Hence the first phase of history involved an increase in the number and variety of human processors, at the expense of connectivity: 20,000 years ago there were many more sapiens than 70,000 years ago, and sapiens in Europe processed information differently from sapiens in China. However, there were no connections between people in Europe and China, and it would have seemed utterly impossible that all sapiens may one day be part of a single data-processing web.

The second stage began with agriculture and continued until the invention of writing and money. Agriculture accelerated demographic growth, so the number of human processors rose sharply, while simultaneously enabling many more people to live together in the same place, thereby generating dense local networks that contained an unprecedented number of processors. In addition, agriculture created new incentives and opportunities for different networks to trade and communicate.

Nevertheless, during the second phase, centrifugal forces remained predominant. In the absence of writing and money, humans could not establish cities, kingdoms or empires. Humankind was still divided into innumerable little tribes, each with its own lifestyle and world view. Uniting the whole of humankind was not even a fantasy.

The third stage kicked off with the appearance of writing and money about 5,000 years ago, and lasted until the beginning of the scientific revolution. Thanks to writing and money, the gravitational field of human co-operation finally overpowered the centrifugal forces. Human groups bonded and merged to form cities and kingdoms. Political and commercial links between different cities and kingdoms also tightened. At least since the first millennium BC – when coinage, empires, and universal religions appeared – humans began to consciously dream about forging a single network that would encompass the entire globe.

This dream became a reality during the fourth and last stage of history, which began around 1492. Early modern explorers, conquerors and traders wove the first thin threads that encompassed the whole world. In the late modern period, these threads were made stronger and denser, so that the spider’s web of Columbus’s days became the steel and asphalt grid of the 21st century. Even more importantly, information was allowed to flow increasingly freely along this global grid. When Columbus first hooked up the Eurasian net to the American net, only a few bits of data could cross the ocean each year, running the gauntlet of cultural prejudices, strict censorship and political repression.

But as the years went by, the free market, the scientific community, the rule of law and the spread of democracy all helped to lift the barriers. We often imagine that democracy and the free market won because they were “good”. In truth, they won because they improved the global data-processing system.

So over the last 70,000 years humankind first spread out, then separated into distinct groups and finally merged again. Yet the process of unification did not take us back to the beginning. When the different human groups fused into the global village of today, each brought along its unique legacy of thoughts, tools and behaviours, which it collected and developed along the way. Our modern larders are now stuffed with Middle Eastern wheat, Andean potatoes, New Guinean sugar and Ethiopian coffee. Similarly, our language, religion, music and politics are replete with heirlooms from across the planet.

If humankind is indeed a single data-processing system, what is its output? Dataists would say that its output will be the creation of a new and even more efficient data-processing system, called the Internet-of-All-Things. Once this mission is accomplished, Homo sapiens will vanish….(More)”

The SAGE Handbook of Digital Journalism


Book edited by Tamara Witschge, C. W. Anderson, David Domingo, and Alfred Hermida: “The production and consumption of news in the digital era is blurring the boundaries between professionals, citizens and activists. Actors producing information are multiplying, but media companies still hold a central position. Journalism research faces important challenges to capture, examine, and understand the current news environment. The SAGE Handbook of Digital Journalism starts from the pressing need for a thorough and bold debate to redefine the assumptions of research in the changing field of journalism. The 38 chapters, written by a team of global experts, are organised into four key areas:

Section A: Changing Contexts

Section B: News Practices in the Digital Era

Section C: Conceptualizations of Journalism

Section D: Research Strategies

By addressing both institutional and non-institutional news production and providing ample attention to the question ‘who is a journalist?’ and the changing practices of news audiences in the digital era, this Handbook shapes the field and defines the roadmap for the research challenges that scholars will face in the coming decades….(More)”

Questioning Big Data: Crowdsourcing crisis data towards an inclusive humanitarian response


Femke Mulder, Julie Ferguson, Peter Groenewegen, Kees Boersma, and Jeroen Wolbers in Big Data and Society: “The aim of this paper is to critically explore whether crowdsourced Big Data enables an inclusive humanitarian response at times of crisis. We argue that all data, including Big Data, are socially constructed artefacts that reflect the contexts and processes of their creation. To support our argument, we qualitatively analysed the process of ‘Big Data making’ that occurred by way of crowdsourcing through open data platforms, in the context of two specific humanitarian crises, namely the 2010 earthquake in Haiti and the 2015 earthquake in Nepal. We show that the process of creating Big Data from local and global sources of knowledge entails the transformation of information as it moves from one distinct group of contributors to the next. The implication of this transformation is that locally based, affected people and often the original ‘crowd’ are excluded from the information flow, and from the interpretation process of crowdsourced crisis knowledge, as used by formal responding organizations, and are marginalized in their ability to benefit from Big Data in support of their own means. Our paper contributes a critical perspective to the debate on participatory Big Data, by explaining the processes of inclusion and exclusion during data making, towards more responsive humanitarian relief….(More)”.

Achieving Open Justice through Citizen Participation and Transparency


Book edited by Carlos E. Jiménez-Gómez and Mila Gascó-Hernández: “Open government initiatives have become a defining goal for public administrators around the world. However, progress is still necessary outside of the executive and legislative sectors.

Achieving Open Justice through Citizen Participation and Transparency is a pivotal reference source for the latest scholarly research on the implementation of open government within the judiciary field, emphasizing the effectiveness and accountability achieved through these actions. Highlighting the application of open government concepts in a global context, this book is ideally designed for public officials, researchers, professionals, and practitioners interested in the improvement of governance and democracy….(More)”

The Four-Dimensional Human


Book by Laurence Scott: “You are a four-dimensional human.

Each of us exists in three-dimensional, physical space. But, as a constellation of everyday digital phenomena rewires our lives, we are increasingly coaxed from the containment of our predigital selves into a wonderful and eerie fourth dimension, a world of ceaseless communication, instant information, and global connection.

Our portals to this new world have been wedged open, and the silhouette of a figure is slowly taking shape. But what does it feel like to be four-dimensional? How do digital technologies influence the rhythms of our thoughts, the style and tilt of our consciousness? What new sensitivities and sensibilities are emerging with our exposure to the delights, sorrows, and anxieties of a networked world? And how do we live in public with these recoded private lives?

Laurence Scott—hailed as a “New Generation Thinker” by the Arts and Humanities Research Council and the BBC—shows how this four-dimensional life is dramatically changing us by redefining our social lives and extending the limits of our presence in the world. Blending tech-philosophy with insights on everything from Seinfeld to the fall of Gaddafi, Scott stands with a rising generation of social critics hoping to understand our new reality. His virtuosic debut is a revelatory and original exploration of life in the digital age….(More)”

Why Zika, Malaria and Ebola should fear analytics


Frédéric Pivetta at Real Impact Analytics: “Big data is a hot business topic. It turns out to be an equally hot topic for the non-profit sector now that we know the vital role analytics can play in addressing public health issues and reaching sustainable development goals.

Big players like IBM just announced they will help fight Zika by analyzing social media, transportation and weather data, among other indicators. Telecom data takes it further by helping to predict the spread of disease, identifying isolated and fragile communities and prioritizing the actions of aid workers.

The power of telecom data

Human mobility contributes significantly to epidemic transmission into new regions. However, there are gaps in understanding human mobility due to the limited and often outdated data available from travel records. In some countries, these are collected by health officials in the hospitals or in occasional surveys.

Telecom data, constantly updated and covering a large portion of the population, is rich in terms of mobility insights. But there are other benefits:

  • it’s recorded automatically (in the Call Detail Records, or CDRs), so data collection and response bias are avoided.
  • it contains location and time information, which is great for understanding human mobility.
  • it contains info on connectivity between people, which helps in understanding social networks.
  • it contains info on phone spending, which allows tracking of socio-economic indicators.

Aggregated and anonymized, mobile telecom data fills the public data gap without raising privacy concerns. Mixing it with other public data sources results in a very precise and reliable view of human mobility patterns, which is key to preventing epidemic spread.
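As a minimal sketch of what “aggregated and anonymized” might look like in practice — assuming a toy CDR schema of (user, tower, hour) tuples; the field names, the night-tower home heuristic, and the small-cell suppression threshold are illustrative, not Real Impact Analytics’ actual pipeline:

```python
from collections import Counter, defaultdict

def home_towers(cdrs):
    """Assign each user a home location: the tower most active at night (22h-06h).

    cdrs is an iterable of (user_id, tower_id, hour) tuples - a deliberately
    simplified stand-in for real Call Detail Records.
    """
    night_counts = defaultdict(Counter)
    for user, tower, hour in cdrs:
        if hour >= 22 or hour < 6:
            night_counts[user][tower] += 1
    return {user: counts.most_common(1)[0][0]
            for user, counts in night_counts.items()}

def od_flows(cdrs, homes, min_count=5):
    """Aggregate into anonymised origin-destination flows.

    Counts unique travellers per (home, visited) pair and suppresses small
    cells (fewer than min_count travellers) so no individual can be singled
    out from the published aggregates.
    """
    travellers = defaultdict(set)
    for user, tower, hour in cdrs:
        home = homes.get(user)
        if home is not None and tower != home:
            travellers[(home, tower)].add(user)
    return {od: len(users) for od, users in travellers.items()
            if len(users) >= min_count}
```

Only the suppressed, aggregated flow counts would leave the telecom operator; the per-user records never do, which is what makes this kind of sharing privacy-preserving.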

Using telecom data to map epidemic risk flows

So how does it work? As in any other big data application, the challenge is to build the right predictive model, allowing decision-makers to take the most appropriate actions. In the case of epidemic transmission, the methodology typically includes five steps:

  • Identify mobility patterns relevant for each particular disease. For example, short-term trips for fast-spreading diseases like Ebola. Or overnight trips for diseases like Malaria, as it spreads by mosquitoes that are active only at night. Such patterns can be deduced from the CDRs: we can actually find the home location of each user by looking at the most active night tower, and then tracking calls to identify short or long-term trips. Aggregating data per origin-destination pairs is useful as we look at intercity or interregional transmission flows. And it protects the privacy of individuals, as no one can be singled out from the aggregated data.
  • Get data on epidemic incidence, typically from local organisations like national healthcare systems or, in case of emergency, from NGOs or dedicated emergency teams. This data should be aggregated at the same level of granularity as the CDRs.
  • Knowing how many travelers go from one place to another, for how long, and the disease incidence at origin and destination, build an epidemiological model that can account for the mode and speed of transmission of the particular disease.
  • With an import/export scoring model, map epidemic risk flows and flag areas that are at risk of becoming the new hotspots because of human travel.
  • On that basis, prioritize and monitor public health measures, focusing on restraining mobility to and from hotspots. Mapping risk also allows launching prevention campaigns in the right places and setting up the necessary infrastructure on time. Ultimately, the tool reduces public health risks and helps stem the epidemic.
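The import/export scoring step above can be sketched as a simple linear model — a hypothetical illustration under stated assumptions, not the production methodology: the risk imported into a destination is taken to be the traveller flow from each origin weighted by disease incidence at that origin.

```python
def import_risk(flows, incidence):
    """Toy import-risk score for each destination.

    flows maps (origin, destination) pairs to traveller counts;
    incidence maps a region to its disease incidence (e.g. cases per capita).
    Assumes risk scales linearly with both flow and origin incidence - an
    illustrative simplification of a real epidemiological model.
    """
    risk = {}
    for (origin, dest), travellers in flows.items():
        risk[dest] = risk.get(dest, 0.0) + travellers * incidence.get(origin, 0.0)
    return risk

def rank_hotspots(flows, incidence, top=3):
    """Flag the destinations most at risk of becoming new hotspots."""
    scores = import_risk(flows, incidence)
    return sorted(scores, key=scores.get, reverse=True)[:top]
```

Fed with the anonymised origin-destination flows and incidence data described earlier, the ranked output is what lets responders prioritise where to restrain mobility or launch prevention campaigns first.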

That kind of application works in a variety of epidemiological contexts, including Zika, Ebola, Malaria, Influenza or Tuberculosis. No doubt the global boom of mobile data will prove extraordinarily helpful in fighting these fierce enemies….(More)”

Effect of Government Data Openness on a Knowledge-based Economy


Jae-Nam Lee, Juyeon Ham and Byounggu Choi in Procedia Computer Science: “Many governments have recently begun to adopt the concept of open innovation. However, studies on the openness of government data and its effect on the global competitiveness have not received much attention. Therefore, this study aims to investigate the effects of government data openness on a knowledge-based economy at the government level. The proposed model was analyzed using secondary data collected from three different reports. The findings indicate that government data openness positively affects the formation of knowledge bases in a country and that the level of knowledge base of a country positively affects the global competitiveness of a country….(More)”