Massive Citizen Science Effort Seeks to Survey the Entire Great Barrier Reef


Jessica Wynne Lockhart at Smithsonian: “In August, marine biologists Johnny Gaskell and Peter Mumby and a team of researchers boarded a boat headed into unknown waters off the coast of Australia. For 14 long hours, they ploughed over 200 nautical miles, a Google Maps cache as their only guide. Just before dawn, they arrived at their destination: a previously uncharted blue hole—a cavernous opening descending through the seafloor.

After the rough night, Mumby was rewarded with something he hadn’t seen in his 30-year career. The reef surrounding the blue hole had nearly 100 percent healthy coral cover. Such a find is rare in the Great Barrier Reef, where coral bleaching events in 2016 and 2017 led to headlines proclaiming the reef “dead.”

“It made me think, ‘this is the story that people need to hear,’” Mumby says.

The expedition from Daydream Island off the coast of Queensland was a pilot program to test the methodology for the Great Reef Census, a citizen science project headed by Andy Ridley, founder of the annual conservation event Earth Hour. His latest organization, Citizens of the Great Barrier Reef, has set the ambitious goal of surveying the entire 1,400-mile-long reef system in 2020…(More)”.

Data gaps threaten achievement of development goals in Africa


Sara Jerving at Devex: “Data gaps across the African continent threaten to hinder the achievement of the Sustainable Development Goals and the African Union’s Agenda 2063, according to the Mo Ibrahim Foundation’s first governance report released on Tuesday.

The report, “Agendas 2063 & 2030: Is Africa On Track?“, based on an analysis of the foundation’s Ibrahim Index of African Governance, found that since the adoption of both of these agendas, the availability of public data in Africa has declined. Among data focused on social outcomes, there has been a notable decline in the availability of education, population, and vital statistics, such as birth and death records, which allow citizens to access public services.

The index, on which the report is based, is the most comprehensive dataset on African governance, drawing on ten years of data covering all 54 African nations. An updated index is released every two years….

The main challenge in the production of quality, timely data, according to the report, is a lack of funding and a lack of independence for national statistical offices.

Only one country, Mauritius, had a perfect score in terms of independence of its national statistics office – meaning that its office can collect the data it chooses, publish without approval from other arms of the government, and is sufficiently funded. Fifteen African nations scored zero in terms of the independence of their offices….(More)”.

Contracting for Personal Data


Paper by Kevin E. Davis and Florencia Marotta-Wurgler: “Is contracting for the collection, use, and transfer of data like contracting for the sale of a horse or a car or licensing a piece of software? Many are concerned that conventional principles of contract law are inadequate when some consumers may not know or misperceive the full consequences of their transactions. Such concerns have led to proposals for reform that deviate significantly from general rules of contract law. However, the merits of these proposals rest in part on testable empirical claims.

We explore some of these claims using a hand-collected data set of privacy policies that dictate the terms of the collection, use, transfer, and security of personal data. We explore the extent to which those terms differ across markets before and after the adoption of the General Data Protection Regulation (GDPR). We find that compliance with the GDPR varies across markets in intuitive ways, indicating that firms take advantage of the flexibility offered by a contractual approach even when they must also comply with mandatory rules. We also compare terms offered to more and less sophisticated subjects to see whether firms may exploit information barriers by offering less favorable terms to more vulnerable subjects….(More)”.

How to ensure that your data science is inclusive


Blog by Samhir Vasdev: “As a new generation of data scientists emerges in Africa, they will encounter relatively little trusted, accurate, and accessible data upon which to apply their skills. It’s time to acknowledge the limitations of the data sources upon which data science relies, particularly in lower-income countries.

The potential of data science to support, measure, and amplify sustainable development is undeniable. As public, private, and civic institutions around the world recognize the role that data science can play in advancing their growth, an increasingly robust array of efforts has emerged to foster data science in lower-income countries.

This phenomenon is particularly salient in Sub-Saharan Africa. There, foundations are investing millions into building data literacy and data science skills across the continent. Multilaterals and national governments are pioneering new investments in data science, artificial intelligence, and smart cities. Private and public donors are establishing data science centers to build cohorts of local, indigenous data science talent. Local universities are launching graduate-level data science courses.

Despite this progress, amid the hype surrounding data science rests an unpopular and inconvenient truth: As a new generation of data scientists emerges in Africa, they will encounter relatively little trusted, accurate, and accessible data that they can use for data science.

We hear promises of how data science can help teachers tailor curricula according to students’ performances, but many school systems don’t collect or track that performance data with enough accuracy and timeliness to perform those data science–enabled tweaks. We believe that data science can help us catch disease outbreaks early, but health care facilities often lack the specific data, like patient origin or digitized information, that is needed to discern those insights.

These fundamental data gaps invite the question: Precisely what data would we perform data science on to achieve sustainable development?…(More)”.

Future Government 2030+: Policy Implications and Recommendations


European Commission: “This report provides follow-up insights into the policy implications and offers a set of 57 recommendations, organised in nine policy areas. These stem from a process based on interviews with 20 stakeholders. The recommendations include a series of policy options and actions that could be implemented at different levels of governance systems.

The Future of Government project started in autumn 2017 as a research project of the Joint Research Centre in collaboration with the Directorate-General for Communications Networks, Content and Technology. It explored how we can rethink the social contract according to the needs of today’s society, what elements need to be adjusted to deliver value and good to people and society, what values we need to improve society, and how we can obtain a new sense of responsibility.

Following “The Future of Government 2030+: A Citizen-Centric Perspective on New Government Models“ report, published on 6 March, the present report provides follow-up insights into the policy implications and offers a set of 54 recommendations, organised in nine policy areas.

The recommendations of this report include a series of policy options and actions that could be implemented at different levels of governance systems. Most importantly, they include essential elements to help us build our future actions on digital government and address foundational governance challenges of the modern online world (e.g., regulation of AI) in the following nine axes:

  1. Democracy and power relations: creating clear strategies towards full adoption of open government
  2. Participatory culture and deliberation: skilled and equipped public administration and allocation of resources to include citizens in decision-making
  3. Political trust: new participatory governance mechanisms to raise citizens’ trust
  4. Regulation: regulation on technology should follow discussion on values with full observance of fundamental rights
  5. Public-Private relationship: better synergies between public and private sectors, collaboration with young social entrepreneurs to face forthcoming challenges
  6. Public services: modular and adaptable public services, support Member States in ensuring equal access to technology
  7. Education and literacy: increase digital data literacy, critical thinking and education reforms in accordance with the needs of job markets
  8. Big data and artificial intelligence: ensure ethical use of technology, focus on technologies’ public value, explore ways to use technology for more efficient policy-making
  9. Redesign and new skills for public administration: constant re-evaluation of public servants’ skills, foresight development, modernisation of recruitment processes, more agile forms of working.

As these recommendations show, collaboration is needed across different policy fields, and they should be acted upon as an integrated package. The majority of recommendations are intended for EU policymakers, but their implementation could be more effective if carried out at lower levels of governance, e.g. local, regional or even national. (Read full text)… (More).

Supporting priority setting in science using research funding landscapes


Report by the Research on Research Institute: “In this working paper, we describe how to map research funding landscapes in order to support research funders in setting priorities. Based on data on scientific publications, a funding landscape highlights the research fields that are supported by different funders. The funding landscape described here has been created using data from the Dimensions database. It is presented using a freely available web-based tool that provides an interactive visualization of the landscape. We demonstrate the use of the tool through a case study in which we analyze funding of mental health research…(More)”.
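
The landscape described here comes down to a simple aggregation step: counting publications per funder and research field. A minimal sketch of that step in Python, using invented records and column names rather than the Dimensions data or the RoRI tool itself, might look like this:

```python
# Minimal illustrative sketch: build a toy funding landscape by counting
# publications per (funder, field) pair. Records and labels are invented;
# a real analysis would draw on a bibliometric source such as Dimensions.
from collections import Counter

publications = [
    {"funder": "Funder A", "field": "Mental health"},
    {"funder": "Funder A", "field": "Neuroscience"},
    {"funder": "Funder B", "field": "Mental health"},
    {"funder": "Funder B", "field": "Mental health"},
]

# Each cell of this funder-by-field matrix is one point in the landscape.
landscape = Counter((p["funder"], p["field"]) for p in publications)

for (funder, field), n in sorted(landscape.items()):
    print(f"{funder} | {field} | {n} publication(s)")
```

In practice such counts would then feed an interactive visualization like the web-based tool the paper describes, rather than a console printout.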

Why policy networks don’t work (the way we think they do)


Blog by James Georgalakis: “Is it who you know or what you know? The literature on evidence uptake and the role of communities of experts mobilised at times of crisis convinced me that a useful approach would be to map the social network that emerged around the UK-led mission to Sierra Leone so it could be quantitatively analysed. Despite the well-deserved plaudits for my colleagues at IDS and their partners at the London School of Hygiene and Tropical Medicine, the UK Department for International Development (DFID), the Wellcome Trust and elsewhere, I was curious to know why they had still met real resistance to some of their policy advice. This included the provision of home care kits for victims of the virus who could not access government- or NGO-run Ebola Treatment Units (ETUs).

It seemed unlikely these challenges were related to poor communications. The timely provision of accessible research knowledge by the Ebola Response Anthropology Platform has been one of the most celebrated aspects of the mobilisation of anthropological expertise. This approach is now being replicated in the current Ebola response in the Democratic Republic of Congo (DRC).  Perhaps the answer was in the network itself. This was certainly indicated by some of the accounts of the crisis by those directly involved.

Social network analysis

I started by identifying the most important-looking policy interactions that took place between March 2014, prior to the UK assuming leadership of the international response in Sierra Leone, and mid-2016, when West Africa was finally declared Ebola-free. They had to be central to the efforts to coordinate the UK response and harness the use of evidence. I then looked for documents related to these events, a mixture of committee minutes, reports and correspondence, that could confirm who was an active participant in each. This analysis of secondary sources related to eight separate policy processes and produced a list of 129 individuals. However, I later removed a large UK conference that took place in early 2016 at which learning from the crisis was shared. It appeared that most delegates had no significant involvement in giving policy advice during the crisis. This reduced the network to 77….(More)”.
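
Purely to illustrate the kind of two-mode mapping described here (people linked to the policy events they appear in, then projected onto a person-to-person network), a minimal sketch follows; the names, events, and the networkx-based projection are assumptions for the example, not the author's data or exact method.

```python
# Hedged sketch of a two-mode (person-by-event) network and its projection.
# Participation records are invented for illustration.
import networkx as nx
from networkx.algorithms import bipartite

participation = [
    ("Researcher A", "Committee meeting 1"),
    ("Researcher B", "Committee meeting 1"),
    ("Researcher B", "Coordination group 2"),
    ("Official C", "Coordination group 2"),
]

B = nx.Graph()
people = {person for person, _ in participation}
events = {event for _, event in participation}
B.add_nodes_from(people, bipartite=0)   # mode 1: individuals
B.add_nodes_from(events, bipartite=1)   # mode 2: policy processes
B.add_edges_from(participation)

# Project onto people: two individuals are linked if they took part in the
# same policy process; edge weights count how many processes they shared.
G = bipartite.weighted_projected_graph(B, people)

# Simple centrality scores hint at who sat closest to the centre of the response.
for person, score in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(person, round(score, 2))
```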

Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations


Paper by Margot E. Kaminski and Gianclaudio Malgieri: “Policy-makers, scholars, and commentators are increasingly concerned with the risks of using profiling algorithms and automated decision-making. The EU’s General Data Protection Regulation (GDPR) has tried to address these concerns through an array of regulatory tools. As one of us has argued, the GDPR combines individual rights with systemic governance, towards algorithmic accountability. The individual tools are largely geared towards individual “legibility”: making the decision-making system understandable to an individual invoking her rights. The systemic governance tools, instead, focus on bringing expertise and oversight into the system as a whole, and rely on the tactics of “collaborative governance,” that is, the use of public-private partnerships towards these goals. How these two approaches to transparency and accountability interact remains a largely unexplored question, with much of the legal literature focusing instead on whether there is an individual right to explanation.

The GDPR contains an array of systemic accountability tools. Of these tools, impact assessments (Art. 35) have recently received particular attention on both sides of the Atlantic, as a means of implementing algorithmic accountability at early stages of design, development, and training. The aim of this paper is to address how a Data Protection Impact Assessment (DPIA) links the two faces of the GDPR’s approach to algorithmic accountability: individual rights and systemic collaborative governance. We address the relationship between DPIAs and individual transparency rights. We propose, too, that impact assessments link the GDPR’s two methods of governing algorithmic decision-making by both providing systemic governance and serving as an important “suitable safeguard” (Art. 22) of individual rights….(More)”.

Five Ethical Principles for Humanitarian Innovation


Peter Batali, Ajoma Christopher & Katie Drew in the Stanford Social Innovation Review: “…Based on this experience, UNHCR and CTEN developed a pragmatic, refugee-led, “good enough” approach to experimentation in humanitarian contexts. We believe a wide range of organizations, including grassroots community organizations and big-tech multinationals, can apply this approach to ensure that the people they aim to help hold the reins of the experimentation process.

1. Collaborate Authentically and Build Intentional Partnerships

Resource and information asymmetries are inherent in the humanitarian system. Refugees have long been constructed as “victims” in humanitarian response, waiting for “salvation” from heroic humanitarians. Researcher Matthew Zagor describes this construct as follows: “The genuine refugee … is the passive, coerced, patient refugee, the one waiting in the queue—the victim, anticipating our redemptive touch, defined by the very passivity which in our gaze both dehumanizes them, in that they lack all autonomy in our eyes, and romanticizes them as worthy in their potentiality.”

Such power dynamics make authentic collaboration challenging….

2. Avoid Technocratic Language

Communication can divide us or bring us together. Using exclusive or “expert” terminology (terms like “ideation,” “accelerator,” and “design thinking”) or language that reinforces power dynamics or assigns an outsider role (such as “experimenting on”) can alienate community participants. Organizations should aim to use inclusive language that everyone understands, as well as set a positive and realistic tone. Communication should focus on the need to co-develop solutions with the community, and the role that testing or trying something new can play….

3. Don’t Assume Caution Is Best

Research tells us that we feel more regret over actions that lead to negative outcomes than we do over inactions that lead to the same or worse outcomes. As a result, we tend to perceive and weigh action and inaction unequally. So while humanitarian organizations frequently consider the implications of our actions and the possible negative outcome for communities, we don’t always consider the implications of doing nothing. Is it ethical to continue an activity that we know isn’t as effective as it could be, when testing small and learning fast could reap real benefits? In some cases, taking a risk might, in fact, be the least risky path of action. We need to always ask ourselves, “Is it really ethical to do nothing?”…

4. Choose Experiment Participants Based on Values

Many humanitarian efforts identify participants based on their societal role, vulnerability, or other selection criteria. However, these methods often lead to challenges related to incentivization—the need to provide things like tea, transportation, or cash payments to keep participants engaged. Organizations should instead consider identifying participants who demonstrate the values they hope to promote—such as collaboration, transparency, inclusivity, or curiosity. These community members are well-poised to promote inclusivity, model positive behaviors, and engage participants across the diversity of your community….

5. Monitor Community Feedback and Adapt

While most humanitarian agencies know they need to listen and adapt after establishing communication channels, the process remains notoriously challenging. One reason is that community members don’t always share their feedback on experimentation formally; feedback sometimes comes from informal channels or even rumors. Yet consistent, real-time feedback is essential to experimentation. Listening is the pressure valve in humanitarian experimentation; it allows organizations to adjust or stop an experiment if the community flags a negative outcome….(More)”.

Why data from companies should be a common good


Paula Forteza at apolitical: “Better planning of public transport, protecting fish from intensive fishing, and reducing the number of people killed in car accidents: for these and many other public policies, data is essential.

Data applications are diverse, and their origins are equally numerous. But data is not exclusively owned by the public sector. Data can be produced by private actors such as mobile phone operators, marine traffic systems, or inter-connected cars, to give just a few examples.

The awareness around the potential of private data is increasing, as the proliferation of data partnerships between companies, governments, and local authorities shows. However, these partnerships represent only a very small fraction of what could be done.

The opening of public data, meaning that public data is made freely available to everyone, has been conducted on a wide scale in the last 10 years, pioneered by the US and UK and soon followed by France and many other countries. In 2015, France took a first step, as the government introduced the Digital Republic Bill, which made public data open by default and introduced the concept of public interest data. Due to a broad definition and low enforcement, the opening of private sector data is, nevertheless, still lagging behind.

The main arguments for opening private data are that it will allow better public decision-making and it could trigger a new way to regulate Big Tech. There is, indeed, a strong economic case for data sharing, because data is a non-rival good: the value of data does not diminish when shared. On the contrary, new uses can be designed and data can be enriched by aggregation, which could improve innovation for start-ups….

Why Europe needs a private data act

Data hardly knows any boundaries.

Some states are opening up, as France did in 2015 by creating a framework for “public interest data,” but the absence of a common international legal framework for private data sharing is a major obstacle to its development. To scale up, a European Private Data Act is needed.

This framework must acknowledge the legitimate interest of the private companies that collect and control data. Data can be their main source of income or one they wish to develop, and this must be respected. Trade secrecy has to be protected too: data sharing is not open data.

Data can be shared with a limited and identified number of partners, and it does not always have to be free. Yet private interest must be aligned with the public good. The European Convention on Human Rights and the European Charter of Fundamental Rights acknowledge that some legitimate and proportional limitations can be set on the freedom of enterprise, which gives everyone the right to pursue their own profitable business.

The “Private Data Act” should contain several fundamental data sharing principles in line with those proposed by the European Commission in 2018: proportionality, “do no harm”, full respect of the GDPR, etc. It should also include guidelines on which data to share, how to assess the public interest, and in which cases data should be opened for free or how pricing should be set.

Two methods can be considered:

  • Defining high-value datasets, as has been done for public data in the recent Open Data Directive, in areas like mobile communications, banking, transport, etc. This method is strong but not flexible enough.
  • Alternatively, governments might define certain “public interest projects”. In doing so, governments could get access to specific data that is seen as a prerequisite to achieving the project. For example, understanding why there is increasing mortality among bees requires various data sources: concrete data on bee mortality from beekeepers, data on crops and pesticide use from farmers, weather data, etc. This method is more flexible and ensures that only the data needed for the project is shared.

Going ahead on open data and data sharing should be a priority for the upcoming European Commission and Parliament. Margrethe Vestager has been renewed as Competition Commissioner and Vice-President of the Commission, and she has already mentioned the opportunity to define access to data for newcomers in the digital market.

Public interest data is a new topic on the EU agenda and will probably become crucial in the near future….(More)”.