How Technology is Crowd-Sourcing the Fight Against Hunger


Beth Noveck at Media Planet: “There is more than enough food produced to feed everyone alive today. Yet access to nutritious food is a challenge everywhere and depends on getting every citizen involved, not just large organizations. Technology is helping to democratize and distribute the job of tackling the problem of hunger in America and around the world.

Real-time research

One of the hardest problems is the difficulty of gaining real-time insight into food prices and shortages. Enter technology. We no longer have to rely on professional inspectors slowly collecting information face-to-face. The UN World Food Programme, which provides food assistance to 80 million people each year, is working with Nielsen to conduct mobile phone surveys in 15 countries (with plans to expand to 30), asking people by voice and text what they are eating. Formerly blank maps are now filled in with information provided quickly and directly by the most affected people, making it easier to prioritize the allocation of resources.

Technology helps the information flow in both directions, enabling those in need to reach out, but also to become more effective at helping themselves. The Indian Ministry of Agriculture, in collaboration with Reuters Market Light, provides information services in nine Indian languages to 1.4 million registered farmers in 50,000 villages across 17 Indian states via text and voice messages.
Data to the people

New open data laws and policies that encourage more transparent publication of public information complement data collection and dissemination technologies such as phones and tablets. About 70 countries and hundreds of regions and cities have adopted open data policies, which guarantee that the information these public institutions collect be available for free use by the public. As a result, there are millions of open datasets now online on websites such as the Humanitarian Data Exchange, which hosts 4,000 datasets such as country-by-country stats on food prices and undernourishment around the world.
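What "free use by the public" looks like in practice can be sketched with a few lines of code. The snippet below summarizes a small food-price table of the kind the Humanitarian Data Exchange hosts; the column names and figures are invented for illustration, not taken from an actual HDX dataset:

```python
import csv
import io

# Illustrative rows in the spirit of an open food-price dataset;
# real HDX column names and values will differ.
SAMPLE = """country,commodity,year,avg_price_usd_per_kg
Kenya,Maize,2016,0.32
Kenya,Rice,2016,0.95
Malawi,Maize,2016,0.27
"""

def average_price(csv_text, commodity):
    # Average a commodity's price across every country in the file.
    rows = csv.DictReader(io.StringIO(csv_text))
    prices = [float(r["avg_price_usd_per_kg"]) for r in rows
              if r["commodity"] == commodity]
    return sum(prices) / len(prices) if prices else None

print(round(average_price(SAMPLE, "Maize"), 3))  # → 0.295
```

Once a dataset is openly published, an analysis like this needs nothing more than a standard library and a download link, which is precisely what lets small teams build on government data.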

Companies are compiling and sharing data to combat food insecurity, too. Anyone can dig into the data on the Global Open Data for Agriculture and Nutrition platform, a data collaborative where 300 private and public partners are sharing information.

Importantly, this vast quantity of open data is available to anyone, not only to governments. As a result, large and small entrepreneurs are able to create new apps and programs to combat food insecurity, such as Plantwise, which uses government data to offer a knowledge bank and run “plant clinics” that help farmers lose less of what they grow to pests. Google uses open government data to show people the location of farmers markets near their homes.

Students, too, can learn to play a role. For the second summer in a row, the Governance Lab at New York University, in partnership with the United States Department of Agriculture (USDA), mounted a two-week open data summer camp for 40 middle and high school students. The next generation of problem solvers is learning new data science skills by working on food safety and other projects using USDA open data.

Enhancing connection

Ultimately, technology enables greater communication and collaboration among the public, social service organizations, restaurants, farmers and other food producers who must work together to avoid food crises. The European Food Safety Authority in Italy has begun exploring how to use internet-based collaboration (often called citizen science or crowdsourcing) to get more people involved in food and feed risk assessment.

In the United States, 40 percent of the food produced is wasted, and yet 1 in 4 American children (and 1 in 6 adults) remain food insecure, according to the Rockefeller Foundation. Copia, a San Francisco-based smartphone app, facilitates donations and deliveries from those with excess food across six Bay Area cities. Zero Percent in Chicago similarly attacks the distribution problem by connecting restaurants to charities so they can donate their excess food. Full Harvest is a tech platform that facilitates the selling of surplus produce that otherwise would not have a market.

Mobilizing the world

Prize-backed challenges create the incentives for more people to collaborate online and get involved in the fight against hunger….(More)”

Crowdsource Europe wants people to write their own constitution


Deutsche Welle: “A public interest organization called Crowdsource Europe wants citizens to formulate their own constitution. If successful, the document could even replace the Lisbon treaty, the campaign’s organizers say.

“Crowdsource Europe is building a platform to work together with all Europeans to create a People’s Constitution, by the people, for the people,” the organizers said on their website. The goal is to create a document that captures the shared values and collective ideas for the future of Europe.

“We launched the project in May 2016. The motivation was to let the people of Europe decide what the EU should be about. Cooperation within Europe is important, but too often people don’t feel connected with the EU where technocrats decide from their ivory tower,” the project’s organizers Thomas de Groot, Mathijs Pontier and Melissa Koutouzis told DW.

De Groot, Pontier and Koutouzis, who are members of the Amsterdam Pirate Party, want to show the European Parliament that people can work together to shape a European future.

“In the current representative democracy, people have the ability to vote once every several years (five years in the case of the European Parliament). After that, the possibilities for participation are very limited. As a result, many people don’t feel represented, and many people don’t even make the effort to vote for the European Parliament,” they told DW.

Crowdsource Europe’s idea of “interactive democracy” helps bridge that gap. In this concept, everyone has the ability to propose ideas and discuss them….Writing a constitution, especially the way de Groot and his partners Pontier and Koutouzis envisage it, is very easy. Interested people can log on to the People’s Constitution website (https://peoplesconstitution.eu/). A “How-to” tab explains to users the ways in which they can enter their details and descriptions of laws they want added into the “constitution.”…

The idea of writing a people’s constitution for all of Europe was inspired by Iceland’s experiment in redrafting the document….(More)”

The Internet for farmers without Internet


Project Breakthrough: “Mobile Internet is rapidly becoming a primary source of knowledge for rural populations in developing countries. But not every one of the world’s 500 million smallholder farmers is connected to the Internet – which means they can struggle to solve daily agricultural challenges. With no way to access information on things like planting, growing and selling, farmers in Asia, Latin America and Africa simply cannot grow. Many live on less than a dollar a day and don’t have smartphones to ask Google what to do.

London-based startup WeFarm is the world’s first free peer-to-peer network that spreads crowdsourced knowledge via SMS messages, which only need simple mobile phones. Since launching in November 2015, its aim has been to give remote, offline farmers access to vital insights on topics such as crop diversification, tackling soil erosion or changing climatic conditions. Billing itself as ‘The internet for people without the internet’, WeFarm strongly believes in the power of grassroots information. That’s why it costs nothing.

“With WeFarm we want all farmers in the world to be able to search for and access the information they need to improve their livelihoods,” Kenny Ewan, CEO tells us. The seeds for his idea were planted after many years working with indigenous communities in Latin America while based in Peru. “To me it makes perfect sense to allow farmers to connect with other farmers in order to find solutions to their problems. These farmers are experts in agriculture, and they come up with low-cost, innovative solutions, that are easy to implement.”

Farmers send questions by SMS to a local WeFarm number. Then they are connected to a huge crowdsourcing platform. The network’s back-end uses machine-learning algorithms to match them to farmers with answers. This data creates a sort of Google for agriculture…(More)”
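WeFarm has not published its matching algorithm, but the core idea of routing an incoming question to farmers whose answering history covers similar topics can be sketched with a simple bag-of-words cosine similarity. The names, profiles and tokenizer below are invented for illustration, not WeFarm's actual system:

```python
from collections import Counter
import math

def tokenize(text):
    # Lowercase and split on non-letter characters; a production system
    # would add stemming and multilingual handling.
    return "".join(c if c.isalpha() else " " for c in text.lower()).split()

def cosine(a, b):
    # Cosine similarity between two bag-of-words Counters.
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    denom = (math.sqrt(sum(v * v for v in a.values()))
             * math.sqrt(sum(v * v for v in b.values())))
    return num / denom if denom else 0.0

def best_answerers(question, farmer_profiles, top_n=2):
    # Rank farmers by similarity between the incoming SMS question and
    # the topics they have answered before; drop zero-score matches.
    q = Counter(tokenize(question))
    scored = [(name, cosine(q, Counter(tokenize(history))))
              for name, history in farmer_profiles.items()]
    scored.sort(key=lambda pair: -pair[1])
    return [name for name, score in scored[:top_n] if score > 0]

profiles = {
    "amina": "soil erosion terraces mulching maize",
    "jose": "coffee rust fungicide shade trees",
    "wei": "rice paddy irrigation water management",
}
print(best_answerers("How do I stop soil erosion on my maize plot?", profiles))
# → ['amina']
```

Real systems would learn from feedback on which answers helped, but even this toy version shows why the accumulated question-and-answer data becomes "a sort of Google for agriculture".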

Bouchra Khalili: The Mapping Journey Project


MOMA (NYC): “This exhibition presents, in its entirety, Bouchra Khalili’s The Mapping Journey Project (2008–11), a series of videos that details the stories of eight individuals who have been forced by political and economic circumstances to travel illegally and whose covert journeys have taken them throughout the Mediterranean basin. Khalili (Moroccan-French, born 1975) encountered her subjects by chance in transit hubs across Europe, North Africa, and the Middle East. Following an initial meeting, the artist invited each person to narrate his or her journey and trace it in thick permanent marker on a geopolitical map of the region. The videos feature the subjects’ voices and their hands sketching their trajectories across the map, while their faces remain unseen.

The stories are presented on individual screens positioned throughout MoMA’s Donald B. and Catherine C. Marron Atrium. In this way, a complex network of migration is narrated by those who have experienced it, refusing the forms of representation and visibility demanded by systems of surveillance, international border control, and the news media. Shown together, the videos function as an alternative geopolitical map defined by the precarious lives of stateless people. Khalili’s work takes on the challenge of developing critical and ethical approaches to questions of citizenship, community, and political agency….(More)”

Artificial intelligence is hard to see


Kate Crawford and Meredith Whittaker on “Why we urgently need to measure AI’s societal impacts“: “How will artificial intelligence systems change the way we live? This is a tough question: on one hand, AI tools are producing compelling advances in complex tasks, with dramatic improvements in energy consumption, audio processing, and leukemia detection. There is extraordinary potential to do much more in the future. On the other hand, AI systems are already making problematic judgements that are producing significant social, cultural, and economic impacts in people’s everyday lives.

AI and decision-support systems are embedded in a wide array of social institutions, from influencing who is released from jail to shaping the news we see. For example, Facebook’s automated content editing system recently censored the Pulitzer-prize winning image of a nine-year-old girl fleeing napalm bombs during the Vietnam War. The girl is naked; to an image processing algorithm, this might appear as a simple violation of the policy against child nudity. But to human eyes, Nick Ut’s photograph, “The Terror of War”, means much more: it is an iconic portrait of the indiscriminate horror of conflict, and it has an assured place in the history of photography and international politics. The removal of the image caused an international outcry before Facebook backed down and restored it. “What they do by removing such images, no matter what good intentions, is to redact our shared history,” said the Prime Minister of Norway, Erna Solberg.

It’s easy to forget that these high-profile instances are actually the easy cases. As Tarleton Gillespie has observed, content reviews of Facebook images are occurring thousands of times per day, and rarely is there a Pulitzer prize to help determine lasting significance. Some of these reviews include human teams, and some do not. In this case, there is also considerable ambiguity about where the automated process ended and the human review began, which is part of the problem. And Facebook is just one player in a complex ecology of algorithmically-supplemented determinations, with little external monitoring to see how decisions are made or what the effects might be.

The ‘Terror of War’ case, then, is the tip of the iceberg: a rare visible instance that points to a much larger mass of unseen automated and semi-automated decisions. The concern is that most of these ‘weak AI’ systems are making decisions that don’t garner such attention. They are embedded at the back-end of systems, working at the seams of multiple data sets, with no consumer-facing interface. Their operations are mainly unknown and unseen, and their impacts take enormous effort to detect.

Sometimes AI techniques get it right, and sometimes they get it wrong. Only rarely will those errors be seen by the public: like the Vietnam war photograph, or when an AI ‘beauty contest’ held this month was called out for being racist for selecting white women as the winners. We can dismiss this latter case as a problem of training data — they simply need a more diverse selection of faces to train their algorithm with, and now that 600,000 people have sent in their selfies, they certainly have better means to do so. But while a beauty contest might seem like a bad joke, or just a really good trick to get people to give up their photos to build a large training data set, it points to a much bigger set of problems. AI and decision-support systems are reaching into everyday life: determining who will be on a predictive policing ‘heat list’, who will be hired or promoted, which students will be recruited to universities, or seeking to predict at birth who will become a criminal by the age of 18. So the stakes are high…(More)”

“Big Data Europe” addresses societal challenges with data technologies


Press Release: “Across society, from health to agriculture and transport, from energy to climate change and security, practitioners in every discipline recognise the potential of the enormous amounts of data being created every day. The challenge is to capture, manage and process that information to derive meaningful results and make a difference to people’s lives. The Big Data Europe project has just released the first public version of its open source platform designed to do just that. In 7 pilot studies, it is helping to solve societal challenges by putting cutting edge technology in the hands of experts in fields other than IT.

Although many crucial big data technologies are freely available as open source software, they are often difficult for non-experts to integrate and deploy. Big Data Europe solves that problem by providing a package that can readily be installed locally or at any scale in a cloud infrastructure by a systems administrator, and configured via a simple user interface. Tools like Apache Hadoop, Apache Spark, Apache Flink and many others can be instantiated easily….

The tools included in the platform were selected after a process of requirements-gathering across the seven societal challenges identified by the European Commission (Health, Food, Energy, Transport, Climate, Social Sciences and Security). Tasks like message passing are handled using Kafka and Flume, storage by Hive and Cassandra, and publishing through GeoTriples. The platform uses the Docker system to make it easy to add new tools and, again, for them to operate at a scale limited only by the computing infrastructure….

The platform can be downloaded from GitHub.
See also the installation instructions, Getting Started and video.”

The Ethics of Influence: Government in the Age of Behavioral Science


New book by Cass R. Sunstein: “In recent years, ‘Nudge Units’ or ‘Behavioral Insights Teams’ have been created in the United States, the United Kingdom, Germany, and other nations. All over the world, public officials are using the behavioral sciences to protect the environment, promote employment and economic growth, reduce poverty, and increase national security. In this book, Cass R. Sunstein, the eminent legal scholar and best-selling co-author of Nudge (2008), breaks new ground with a deep yet highly readable investigation into the ethical issues surrounding nudges, choice architecture, and mandates, addressing such issues as welfare, autonomy, self-government, dignity, manipulation, and the constraints and responsibilities of an ethical state. Complementing the ethical discussion, The Ethics of Influence: Government in the Age of Behavioral Science contains a wealth of new data on people’s attitudes towards a broad range of nudges, choice architecture, and mandates…(More)”

Trust in Government


First issue of the Government Oxford Review focusing on trust (or lack of trust) in government:

“In 2016, governments are in the firing line. Their populations suspect them of accelerating globalisation for the benefit of the few, letting trade drive away jobs, and encouraging immigration so as to provide cheaper labour and to fill skills-gaps without having to invest in training. As a result, the ‘anti-government’, ‘anti-expert’, ‘anti-immigration’ movements are rapidly gathering support. The Brexit campaign in the United Kingdom, the Presidential run of Donald Trump in the United States, and the Five Star movement in Italy are but three examples.” Dean Ngaire Woods

Our contributors have shed an interesting, and innovative, light on this issue. McKinsey’s Andrew Grant and Bjarne Corydon discuss the importance of transparency and accountability of government, while Elizabeth Linos, from the Behavioural Insights Team in North America, and Princeton’s Eldar Shafir discuss how behavioural science can be utilised to implement better policy, and Geoff Mulgan, CEO at Nesta, provides insights into how harnessing technology can bring about increased collective intelligence.

The Conference Addendum features panel summaries from the 2016 Challenges of Government Conference, written by our MPP and DPhil in Public Policy students.

Ideas to help civil servants understand the opportunities of data


At Gov.UK: “Back in April we set out our plan for the discovery phase for what we are now calling “data science literacy”. We explained that we were going to undertake user research with civil servants to understand how they use data. The discovery phase has helped clarify the focus of this work, and we have now begun to develop options for a data science literacy service for government.

Discovery has helped us understand what we really mean when we say ‘data literacy’. For one person it can be a basic understanding of statistics, but to someone else it might mean knowledge of new data science approaches. But on the basis of our exploration, we have started to use the term “data science literacy” to mean the ability to understand how new data science techniques and approaches can be applied in real world contexts in the civil service, and to distinguish it from a broader definition of ‘data literacy’….

In the spirit of openness and transparency we are making this long list of ideas available here:

Data science driven apps

One way in which civil servants could come to understand the opportunities of data science would be to experience products and services which are driven by data science in their everyday roles. This could be something like having a recommendation engine for actions provided to them on the basis of information already held on the customer.

Sharing knowledge across government

A key user need from our user research was to understand how others had undertaken data science projects in government. This could be supported by something like a series of videos / podcasts created by civil servants, setting out case studies and approaches to data science in government. Alternatively, we could have a regularly organised speaker series where data science projects across government are presented alongside outside speakers.

Support for using data science in departments

Users in departments need to understand and experience data science projects in government so that they can undertake their own. Potentially this could be achieved through policy, analytical and data science colleagues working in multidisciplinary teams. Colleagues could also be supported by tools of differing levels of complexity ranging from a simple infographic showing at a high level the types of data available in a department to an online tool which diagnoses which approach people should take for a data science project on the basis of their aims and the data available to them.

In practice training

Users could learn more about how to use data science in their jobs by attending more formal training courses. These could take the form of something like an off-site, week-long training course where they experience the stages of undertaking a data science project (similar to the DWP Digital Academy). An alternative model could be to allocate one day a week to work on a project with departmental importance with a data scientist (similar to the Data Science Accelerator Programme for analysts).


Cross-government support for collaboration

For those users who have responsibility for leading on data science transformation in their departments there is also a need to collaborate with others in similar roles. This could be achieved through interventions such as a day-long unconference to discuss anything related to data science, and using online tools such as Google Groups, Slack, Yammer, Trello etc. We also tested the idea of a collaborative online resource where data science leads and others can contribute content and learning materials / approaches.

This is by no means an exhaustive list of potential ways to encourage data science thinking by policy and delivery colleagues across government. We hope this list is of interest to others in the field and we will update in the next six months about the transition of this project to Alpha….(More)”

‘Homo sapiens is an obsolete algorithm’


Extract from Homo Deus: A Brief History of Tomorrow by Yuval Noah Harari: “There’s an emerging market called Dataism, which venerates neither gods nor man – it worships data. From a Dataist perspective, we may interpret the entire human species as a single data-processing system, with individual humans serving as its chips. If so, we can also understand the whole of history as a process of improving the efficiency of this system, through four basic methods:

1. Increasing the number of processors. A city of 100,000 people has more computing power than a village of 1,000 people.

2. Increasing the variety of processors. Different processors may use diverse ways to calculate and analyse data. Using several kinds of processors in a single system may therefore increase its dynamism and creativity. A conversation between a peasant, a priest and a physician may produce novel ideas that would never emerge from a conversation between three hunter-gatherers.

3. Increasing the number of connections between processors. There is little point in increasing the mere number and variety of processors if they are poorly connected. A trade network linking ten cities is likely to result in many more economic, technological and social innovations than ten isolated cities.

4. Increasing the freedom of movement along existing connections. Connecting processors is hardly useful if data cannot flow freely. Just building roads between ten cities won’t be very useful if they are plagued by robbers, or if some autocratic despot doesn’t allow merchants and travellers to move as they wish.

These four methods often contradict one another. The greater the number and variety of processors, the harder it is to freely connect them. The construction of the sapiens data-processing system accordingly passed through four main stages, each of which was characterised by an emphasis on different methods.

The first stage began with the cognitive revolution, which made it possible to connect unlimited sapiens into a single data-processing network. This gave sapiens an advantage over all other human and animal species. Although there is a limit to the number of Neanderthals, chimpanzees or elephants you can connect to the same net, there is no limit to the number of sapiens.

Sapiens used their advantage in data processing to overrun the entire world. However, as they spread into different lands and climates they lost touch with one another, and underwent diverse cultural transformations. The result was an immense variety of human cultures, each with its own lifestyle, behaviour patterns and world view. Hence the first phase of history involved an increase in the number and variety of human processors, at the expense of connectivity: 20,000 years ago there were many more sapiens than 70,000 years ago, and sapiens in Europe processed information differently from sapiens in China. However, there were no connections between people in Europe and China, and it would have seemed utterly impossible that all sapiens may one day be part of a single data-processing web.

The second stage began with agriculture and continued until the invention of writing and money. Agriculture accelerated demographic growth, so the number of human processors rose sharply, while simultaneously enabling many more people to live together in the same place, thereby generating dense local networks that contained an unprecedented number of processors. In addition, agriculture created new incentives and opportunities for different networks to trade and communicate.

Nevertheless, during the second phase, centrifugal forces remained predominant. In the absence of writing and money, humans could not establish cities, kingdoms or empires. Humankind was still divided into innumerable little tribes, each with its own lifestyle and world view. Uniting the whole of humankind was not even a fantasy.

The third stage kicked off with the appearance of writing and money about 5,000 years ago, and lasted until the beginning of the scientific revolution. Thanks to writing and money, the gravitational field of human co-operation finally overpowered the centrifugal forces. Human groups bonded and merged to form cities and kingdoms. Political and commercial links between different cities and kingdoms also tightened. At least since the first millennium BC – when coinage, empires, and universal religions appeared – humans began to consciously dream about forging a single network that would encompass the entire globe.

This dream became a reality during the fourth and last stage of history, which began around 1492. Early modern explorers, conquerors and traders wove the first thin threads that encompassed the whole world. In the late modern period, these threads were made stronger and denser, so that the spider’s web of Columbus’s days became the steel and asphalt grid of the 21st century. Even more importantly, information was allowed to flow increasingly freely along this global grid. When Columbus first hooked up the Eurasian net to the American net, only a few bits of data could cross the ocean each year, running the gauntlet of cultural prejudices, strict censorship and political repression.

But as the years went by, the free market, the scientific community, the rule of law and the spread of democracy all helped to lift the barriers. We often imagine that democracy and the free market won because they were “good”. In truth, they won because they improved the global data-processing system.

So over the last 70,000 years humankind first spread out, then separated into distinct groups and finally merged again. Yet the process of unification did not take us back to the beginning. When the different human groups fused into the global village of today, each brought along its unique legacy of thoughts, tools and behaviours, which it collected and developed along the way. Our modern larders are now stuffed with Middle Eastern wheat, Andean potatoes, New Guinean sugar and Ethiopian coffee. Similarly, our language, religion, music and politics are replete with heirlooms from across the planet.

If humankind is indeed a single data-processing system, what is its output? Dataists would say that its output will be the creation of a new and even more efficient data-processing system, called the Internet-of-All-Things. Once this mission is accomplished, Homo sapiens will vanish….(More)”