The Case for Better Governance of Children’s Data: A Manifesto


Report by Jasmina Byrne, Emma Day and Linda Raftree: “Every child is different, with unique identities, and their capacities and circumstances evolve over their lifecycles. Children are more vulnerable than adults and are less able to understand the long-term implications of consenting to the collection of their data. For these reasons, children’s data deserve to be treated differently.

While responsible data use can underpin many benefits for children, ensuring that children are protected, empowered and granted control of their data is still a challenge.

To maximise the benefits of data use for children and to protect them from harm requires a new model of data governance that is fitting for the 21st century.

UNICEF has worked with 17 global experts to develop a Manifesto that articulates a vision for a better approach to children’s data.

This Manifesto includes key action points and a call for a governance model purposefully designed to deliver on the needs and rights of children. It is the first step in ensuring that children’s rights are given due weight in data governance legal frameworks and processes as they evolve around the world….(More)”

Ethiopia’s blockchain deal is a watershed moment – for the technology, and for Africa


Iwa Salami at The Conversation: “At the launch of bitcoin in 2009, the potential of the underlying technology, the blockchain, was not fully appreciated.

What has not been fully exploited are the unique features of blockchain technology that can improve the lives of people and businesses. One is that it is open-source software: its source code is legally and freely available to end-users, who can use it to create new products and services. Another significant feature is that it is decentralised, democratising the operation of the services built on it. Control of those services isn’t in the hands of an individual or a single entity but is shared by all those connected to the network.

In addition, it enables peer-to-peer interaction between those connected to the network. This is key, as it allows parties to transact directly without using intermediaries or third parties. Finally, it has built-in security: data stored on it is immutable and cannot easily be changed, and new data can be added only after it is verified by everyone in the network.
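The immutability the article describes comes from chaining cryptographic hashes: each block commits to the hash of the block before it, so altering any earlier record invalidates everything that follows. As a rough illustration only (not the design of any particular blockchain, and with simplified block and verification logic), a minimal Python sketch:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's data together with the previous block's hash,
    # so altering any earlier block invalidates every later one.
    payload = json.dumps({"data": block["data"], "prev": block["prev"]},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain, data):
    # Link the new block to the hash of the current chain tip.
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain):
    # Any node can independently re-check the whole chain.
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev"] != expected_prev or block["hash"] != block_hash(block):
            return False
    return True

chain = []
append(chain, "student-id-0001 issued")
append(chain, "student-id-0002 issued")
assert verify(chain)

chain[0]["data"] = "tampered record"  # any edit breaks verification
assert not verify(chain)
```

In a real network this verification is performed by many independent nodes, which is what removes the need for a trusted intermediary.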

Unfortunately, bitcoin, the project that introduced blockchain technology, has hogged the limelight, diverting attention from the technology’s underlying potential benefits….

But this is slowly changing.

A few companies have begun showcasing blockchain capabilities to various African countries. Unlike most other cryptocurrency blockchains, which focus on private-sector use in developed regions such as Europe and North America, their approach has been to target governments and public institutions in the developing world.

In April the Ethiopian government confirmed that it had signed a deal to create a national database of student and teacher IDs using a decentralised digital identity solution. The deal involves providing IDs for 5 million students across 3,500 schools which will be used to store educational records.

This is the largest blockchain deal ever to be signed by a government and has been making waves in the crypto-asset industry.

I believe that the deal marks a watershed moment for the use of blockchain and the crypto-asset industry, and for African economies because it offers the promise of blockchain being used for real socio-economic change. The deal means that blockchain technology will be used to provide digital identity to millions of Ethiopians. Digital identity – missing in most African countries – is the first step to real financial inclusion, which in turn has been shown to carry a host of benefits….(More)”.

Reimagining data responsibility: 10 new approaches toward a culture of trust in re-using data to address critical public needs


Commentary by Stefaan Verhulst in Data & Policy: “Data and data science offer tremendous potential to address some of our most intractable public problems (including the Covid-19 pandemic). At the same time, recent years have shown some of the risks of existing and emerging technologies. An updated framework is required to balance potential and risk, and to ensure that data is used responsibly. Data responsibility is not itself a new concept. However, amid a rapidly changing technology landscape, it has become increasingly clear that the concept may need updating in order to keep up with new trends such as big data, open data, the Internet of Things, artificial intelligence, and machine learning. This paper seeks to outline 10 approaches and innovations for data responsibility in the 21st century….

10 New Approaches for Data Responsibility (Stefaan Verhulst)

Each of these is described at greater length in the paper, and illustrated with examples from around the world. Put together, they add up to a framework or outline for policy makers, scholars, and activists who seek to harness the potential of data to solve complex social problems and advance the public good. Needless to say, the 10 approaches outlined here represent just a start. We envision this paper more as an exercise in agenda-setting than a comprehensive survey…(More)”.

Tech for disabled people is booming around the world. So where’s the funding?


Article by Devi Lockwood: “Erick Ponce works in a government communications department in northern Ecuador. The 26-year-old happens to be deaf — a disability he has had since childhood. Communicating fluidly with his non-signing colleagues at work, and in public spaces like the supermarket, has been a lifelong challenge. 

In 2017, Ponce became one of the first users of an experimental app called SpeakLiz, developed by an Ecuadorian startup called Talov. It transforms written text to sound, transcribes spoken words, and can alert a deaf or hard-of-hearing person to sounds like that of an ambulance, motorcycles, music, or a crying baby. 

Once he began using SpeakLiz, Ponce’s coworkers — and his family — were able to understand him more easily. “You cannot imagine what it feels like to speak with your son after 20 years,” his father told the app’s engineers. Now a part of the Talov team, Ponce demos new products to make them better before they hit the market. 

The startup has launched two subscription apps on iOS and Android: SpeakLiz, in 2017, for the hearing impaired, and Vision, in 2019, for the visually impaired. Talov’s founders, Hugo Jácome and Carlos Obando, have been working on the apps for over five years. 

SpeakLiz and Vision are, by many measures, successful. Their software is used by more than 7,000 people in 81 countries and is available in 35 languages. The founders won an award from MIT Technology Review and a contest organized by the History Channel. Talov was named among the top 100 most innovative startups in Latin America in 2019. 

But the startup is still struggling. Venture capitalists aren’t knocking on its door. Jácome and Obando sold some of their possessions to raise enough money to launch, and the team has next to no funding to continue expanding.

Although the last few years have seen significant advances in technology and innovation for disabled people, critics say the market is undervalued….(More)”.

Lobbying in the 21st Century: Transparency, Integrity and Access


OECD Report: “Lobbying, as a way to influence and inform governments, has been part of democracy for at least two centuries, and remains a legitimate tool for influencing public policies. However, it carries risks of undue influence. Lobbying in the 21st century has also become increasingly complex, including new tools for influencing government, such as social media, and a wide range of actors, such as NGOs, think tanks and foreign governments. This report takes stock of the progress that countries have made in implementing the OECD Principles for Transparency and Integrity in Lobbying. It reflects on new challenges and risks related to the many ways special interest groups attempt to influence public policies, and reviews tools adopted by governments to effectively safeguard impartiality and fairness in the public decision-making process….(More)”.

Do You See What I See? Capabilities and Limits of Automated Multimedia Content Analysis



Report by Dhanaraj Thakur and Emma Llansó: “The ever-increasing amount of user-generated content online has led, in recent years, to an expansion in research and investment in automated content analysis tools. Scrutiny of automated content analysis has accelerated during the COVID-19 pandemic, as social networking services have placed a greater reliance on these tools due to concerns about health risks to their moderation staff from in-person work. At the same time, there are important policy debates around the world about how to improve content moderation while protecting free expression and privacy. In order to advance these debates, we need to understand the potential role of automated content analysis tools.

This paper explains the capabilities and limitations of tools for analyzing online multimedia content and highlights the potential risks of using these tools at scale without accounting for their limitations. It focuses on two main categories of tools: matching models and computer prediction models. Matching models include cryptographic and perceptual hashing, which compare user-generated content with existing and known content. Predictive models (including computer vision and computer audition) are machine learning techniques that aim to identify characteristics of new or previously unknown content….(More)”.
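The distinction between the two matching approaches the report names can be made concrete. Cryptographic hash matching flags only exact, bit-for-bit copies of known content; perceptual hashing, by contrast, is designed to tolerate small edits. A hedged Python sketch of the cryptographic case (the set of known hashes is invented for illustration):

```python
import hashlib

# A hypothetical database of hashes of known, previously identified content.
known_hashes = {hashlib.sha256(b"known harmful file").hexdigest()}

def matches_known(content: bytes) -> bool:
    # Cryptographic hash matching: an exact-copy check, nothing more.
    return hashlib.sha256(content).hexdigest() in known_hashes

assert matches_known(b"known harmful file")

# A single changed byte yields a completely different digest, so the match
# fails -- this brittleness is why services also deploy perceptual hashes,
# which compare content by visual or audio similarity rather than raw bytes.
assert not matches_known(b"known harmful file!")
```

This is precisely the limitation the report explores: exact matching is reliable but easy to evade, while similarity-based approaches trade that reliability for robustness to modification.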

The Filing Cabinet


Essay by Craig Robertson: “The filing cabinet was critical to the information infrastructure of the 20th century. Like most infrastructure, it was usually overlooked….The subject of this essay emerged by chance. I was researching the history of the U.S. passport, and had spent weeks at the National Archives, struggling through thousands of reels of unindexed microfilm records of 19th-century diplomatic correspondence; then I arrived at the records for 1906. That year, the State Department adopted a numerical filing system. Suddenly, every American diplomatic office began using the same number for passport correspondence, with decimal numbers subdividing issues and cases. Rather than scrolling through microfilm images of bound pages organized chronologically, I could go straight to passport-relevant information that had been gathered in one place.

I soon discovered that I had Elihu Root to thank for making my research easier. A lawyer whose clients included Andrew Carnegie, Root became secretary of state in 1905. But not long after he arrived, the prominent corporate lawyer described himself as “a man trying to conduct the business of a large metropolitan law-firm in the office of a village squire.”  The department’s record-keeping practices contributed to his frustration. As was then common in American offices, clerks used press books or copybooks to store incoming and outgoing correspondence in chronologically ordered bound volumes with limited indexing. For Root, the breaking point came when a request for a handful of letters resulted in several bulky volumes appearing on his desk. His response was swift: he demanded that a vertical filing system be adopted; soon the department was using a numerical subject-based filing system housed in filing cabinets. 

The shift from bound volumes to filing systems is a milestone in the history of classification; the contemporaneous shift to vertical filing cabinets is a milestone in the history of storage….(More)”.

Side-Stepping Safeguards, Data Journalists Are Doing Science Now


Article by Irineo Cabreros: “News stories are increasingly told through data. Witness the Covid-19 time series that decorate the homepages of every major news outlet; the red and blue heat maps of polling predictions that dominate the runup to elections; the splashy, interactive plots that dance across the screen.

As a statistician who handles data for a living, I welcome this change. News now speaks my favorite language, and the general public is developing a healthy appetite for data, too.

But many major news outlets are no longer just visualizing data, they are analyzing it in ever more sophisticated ways. For example, at the height of the second wave of Covid-19 cases in the United States, The New York Times ran a piece declaring that surging case numbers were not attributable to increased testing rates, despite President Trump’s claims to the contrary. The thrust of The Times’ argument was summarized by a series of plots that showed the actual rise in Covid-19 cases far outpacing what would be expected from increased testing alone. These weren’t simple visualizations; they involved assumptions and mathematical computations, and they provided the cornerstone for the article’s conclusion. The plots themselves weren’t sourced from an academic study (although the author on the byline of the piece is a computer science Ph.D. student); they were produced through “an analysis by The New York Times.”

The Times article was by no means an anomaly. News outlets have asserted, on the basis of in-house data analyses, that Covid-19 has killed nearly half a million more people than official records report; that Black and minority populations are overrepresented in the Covid-19 death toll; and that social distancing will usually outperform attempted quarantine. That last item, produced by The Washington Post and buoyed by in-house computer simulations, was the most read article in the history of the publication’s website, according to Washington Post media reporter Paul Farhi.

In my mind, a fine line has been crossed. Gone are the days when science journalism was like sports journalism, where the action was watched from the press box and simply conveyed. News outlets have stepped onto the field. They are doing the science themselves….(More)”.

The Conference on the Future of Europe—an Experiment in Citizens’ Participation


Stefan Lehne at Carnegie Europe: “If the future of Europe is to be decided at the Conference on the Future of Europe, we should be extremely worried.

Clearly, this has been one of the least fortunate EU projects of recent years. Conceived by French President Emmanuel Macron in 2019 as a response to the rise of populism, the conference fell victim, first to the pandemic and then to institutional squabbling over who should lead it, resulting in a delay of an entire year.

The setup of the conference emerging from months of institutional infighting is strangely schizophrenic.

On the one hand, it offers a forum for interinstitutional negotiations, where representatives of the European Parliament demanding deeper integration will confront a number of governments staunchly opposed to transferring further powers to the EU. 

On the other, the conference provides for an innovative experiment in citizens’ participation. A multilingual interactive website—futureu.europa.eu—offers citizens the opportunity to share and discuss ideas and to organize events. Citizens’ panels made up of randomly selected people from across the EU will discuss various policy areas and feed their findings into the debate of the conference’s plenary….

In the first three weeks 11,000 people participated in the digital platform, sharing more than 2,500 ideas on various aspects of the EU’s work.

A closer look reveals that many of the participants are engaged citizens and activists who use the website as just another format to propagate their demands. The platform thus offers a diverse and colorful discussion forum, but is unlikely to render a representative picture of the views of the average citizen.

This is precisely the objective of the citizens’ panels: an attempt to introduce an element of deliberative democracy into EU politics.

Deliberative assemblies have in recent decades become a prominent feature of democratic life in many countries. They work best at the local level, where participants understand each other well and are already roughly familiar with the issue at stake.

But they have also been employed at the national level, such as the citizens’ assembly preparing the referendum on abortion in Ireland or the citizens’ convention on climate in France.

The European Commission has rich experience, having held more than 1,800 citizens’ consultations, but apart from a single rather symbolic experiment in 2018, a genuine citizens’ panel based on sortition has never been attempted at the European level.

Deliberative democracy is all about initiating an open discussion, carefully weighing the evidence, and thus allowing convergence toward a broadly shared agreement. Given the language barriers and the highly diverse cultural background of European citizens, this is difficult to accomplish at the EU level.

Also, many of the subject areas of the conference, ranging from climate to economic and social policy, are technically complex. It is clear that a great deal of expert advice and time will be necessary to enable citizens to engage in meaningful deliberation on these topics.

Unfortunately, the limited timeframe and the insufficient resources of the conference—financing depends on contributions from the institutions—make it doubtful that the citizens’ panels will be conducted in a sufficiently serious manner.

There is also—as is so often the case with citizens’ assemblies—the crucial question of the follow-up. In the case of the conference, the recommendations of the panels, together with the content of the digital platform and the outcome of events in the member states, will feed into the discussions of the plenary….(More)”

Theories of Change


Book by Karen Wendt: “Today, it has become strikingly obvious that companies no longer operate in an environment where risk, return and volatility alone describe the business environment. Business now has to deal with volatility plus uncertainty, complexity and ambiguity (VUCA): that requires new qualities, competencies and frameworks, and it demands a new mindset for investment, funding and financing in a VUCA environment. This book builds on a new megatrend beyond resilience, called anti-fragility. We have had the black swan (the financial crisis) and the red swan (COVID); the Bank for International Settlements is preparing for regenerative capitalism and blockchain-based analysis of financial streams, and is aiming to prevent the “Green Swan” – the climate crisis – from leading to the next lockdown. In the light of the UN’s 17 Sustainable Development Goals, what is required is Theories of Change.

Written by experts working in the fields of sustainable finance, impact investing, development finance, carbon divesting, innovation, scaling finance, impact entrepreneurship, social stock exchanges, alternative currencies, Initial Coin Offerings (ICOs), ledger technologies, civil action, co-creation, impact management, deep learning and transformation leadership, the book begins by analysing existing Theories of Change frameworks from various disciplines and creating a new integrated model – the meta-framework. In turn, it presents insights on creating and using Theories of Change to redirect investment capital to sustainable companies while implementing the Sustainable Development Goals and the Paris Climate Agreement. Further, it discusses the perspective of planetary boundaries as defined by the Stockholm Resilience Institute, and investigates various aspects of systems, organizations, entrepreneurship, investment and finance that are closely tied to the mission ingrained in the Theory of Change. As it demonstrates, solutions that ensure the parity of profit, people and planet through dynamic change can effectively address the needs of entrepreneurs and business. By exploring these concepts and their application, the book helps create and shape new markets and opportunities….(More)”.