10 Examples of Successful African e-Government Digital Services


Article by Wayan Vota: “African countries are implementing a diverse range of e-Government services, aiming to improve service delivery, enhance efficiency, and promote transparency. For example, common e-Government services in African countries include:

  • Online Government Portals: African countries are increasingly offering online services such as e-taxation, e-payment, and e-billing through online government portals, which allow citizens to access public services more efficiently and provide governments with prompt feedback on service quality.
  • Digital Identity Initiatives: Many African countries are working on digital identity initiatives to improve service delivery, including the introduction of national IDs with biometric data components to generate documents and provide services automatically, reducing paperwork and enhancing efficiency.
  • G2G, G2B, and G2C Activities: e-Government services delivered to different groups – Government-to-Government (G2G), Government-to-Business (G2B), and Government-to-Citizen (G2C) – focus on activities such as electoral processes, staff payroll payments, healthcare management systems, support for small businesses, and transparent procurement procedures…

Successful e-Government initiatives in African countries have significantly improved government services and citizen engagement. These examples are part of a broader trend in Africa towards leveraging digital technologies to improve governance and public administration, with many countries making significant implementation progress…(More)”.

Do disappearing data repositories pose a threat to open science and the scholarly record?


Article by Dorothea Strecker, Heinz Pampel, Rouven Schabinger and Nina Leonie Weisweiler: “Research data repositories, such as Zenodo or the UK Data Archive, are specialised information infrastructures that focus on the curation and dissemination of research data. One of repositories’ main tasks is maintaining their collections long-term (see, for example, the TRUST Principles, or the requirements of the certification organization CoreTrustSeal). Long-term preservation is also a prerequisite for several data practices that are getting increasing attention, such as data reuse and data citation.

For data to remain usable, the infrastructures that host them also have to be kept operational. However, the long-term operation of research data repositories is challenging, and sometimes, for varying reasons and despite best efforts, they are shut down…

In a recent study we therefore set out to take an infrastructure perspective on the long-term preservation of research data by investigating repositories across disciplines and types that were shut down. We also tried to estimate the impact of repository shutdown on data availability…

We found that repository shutdown was not rare: 6.2% of all repositories listed in re3data were shut down. Since the launch of the registry in 2012, at least one repository has been shut down each year (see Fig. 1). The median age of a repository at shutdown was 12 years…(More)”.
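
As a rough illustration of how such headline figures are derived from registry metadata, the sketch below computes the shutdown share and the median age at shutdown from a handful of made-up repository records (the field names and values are hypothetical and do not reflect re3data’s actual schema):

```python
# Toy registry records: field names and values are hypothetical,
# not re3data's actual metadata schema.
from statistics import median

repositories = [
    {"name": "Repo A", "founded": 2003, "shutdown": 2015},
    {"name": "Repo B", "founded": 2010, "shutdown": None},  # still operating
    {"name": "Repo C", "founded": 2001, "shutdown": 2013},
    {"name": "Repo D", "founded": 2012, "shutdown": None},
]

closed = [r for r in repositories if r["shutdown"] is not None]

# The study reports 6.2% of re3data repositories shut down, with a median
# age at shutdown of 12 years; these toy numbers will of course differ.
share_shut_down = len(closed) / len(repositories)
ages_at_shutdown = [r["shutdown"] - r["founded"] for r in closed]

print(f"share shut down: {share_shut_down:.1%}")
print(f"median age at shutdown: {median(ages_at_shutdown)} years")
```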

The global reach of the EU’s approach to digital transformation


Report by the European Parliament’s Think Tank: “The EU’s approach to digital transformation is rooted in protecting fundamental rights, sustainability, ethics and fairness. With this human-centric vision of the digital economy and society, the EU seeks to empower citizens and businesses, regardless of their size. In the EU’s view, the internet should remain open, fair, inclusive and focused on people. Digital technologies should work for citizens and help them to engage in society. Companies should be able to compete on equal terms, and consumers should be confident that their rights are respected.

The European Commission has recently published a number of strategies and action plans that outline the EU’s vision for the digital future and set concrete targets for achieving it. The Commission has also proposed several digital regulations, including the Artificial Intelligence Act, the Digital Services Act and the Digital Markets Act. These regulations are intended to ensure a safe online environment and fair and open digital markets, strengthen Europe’s competitiveness, improve algorithmic transparency and give citizens better control over how they share their personal data. Although some of these regulations have not yet been adopted, and others have been in force for only a short time, they are expected to have an impact not only in the EU but also beyond its borders. For instance, several regulations target businesses – regardless of where they are based – that offer services to EU citizens or businesses. In addition, through the phenomenon known as ‘the Brussels effect’, these rules may influence tech business practices and national legislation around the world.

The EU is an active participant in developing global digital cooperation and global governance frameworks for specific areas. Various international organisations are developing instruments to ensure that people and businesses can take advantage of artificial intelligence’s benefits and limit its negative consequences. In these global negotiations, the EU promotes respect for fundamental rights and freedoms, as well as compatibility with EU law…(More)”.

How Much of the World Is It Possible to Model?


Article by Dan Rockmore: “…Modelling, in general, is now routine. We model everything, from elections to economics, from the climate to the coronavirus. Like model cars, model airplanes, and model trains, mathematical models aren’t the real thing—they’re simplified representations that get the salient parts right. Like fashion models, model citizens, and model children, they’re also idealized versions of reality. But idealization and abstraction can be forms of strength. In an old mathematical-modelling joke, a group of experts is hired to improve milk production on a dairy farm. One of them, a physicist, suggests, “Consider a spherical cow.” Cows aren’t spheres any more than brains are jiggly sponges, but the point of modelling—in some ways, the joy of it—is to see how far you can get by using only general scientific principles, translated into mathematics, to describe messy reality.

To be successful, a model needs to replicate the known while generalizing into the unknown. This means that, as more becomes known, a model has to be improved to stay relevant. Sometimes new developments in math or computing enable progress. In other cases, modellers have to look at reality in a fresh way. For centuries, a predilection for perfect circles, mixed with a bit of religious dogma, produced models that described the motion of the sun, moon, and planets in an Earth-centered universe; these models worked, to some degree, but never perfectly. Eventually, more data, combined with more expansive thinking, ushered in a better model—a heliocentric solar system based on elliptical orbits. This model, in turn, helped kick-start the development of calculus, reveal the law of gravitational attraction, and fill out our map of the solar system. New knowledge pushes models forward, and better models help us learn.

Predictions about the universe are scientifically interesting. But it’s when models make predictions about worldly matters that people really pay attention. We anxiously await the outputs of models run by the Weather Channel, the Fed, and fivethirtyeight.com. Models of the stock market guide how our pension funds are invested; models of consumer demand drive production schedules; models of energy use determine when power is generated and where it flows. Insurers model our fates and charge us commensurately. Advertisers (and propagandists) rely on A.I. models that deliver targeted information (or disinformation) based on predictions of our reactions.

But it’s easy to get carried away…(More)”.

Missing Evidence: Tracking Academic Data Use around the World


World Bank Report: “Data-driven research on a country is key to producing evidence-based public policies. Yet little is known about where data-driven research is lacking and how it could be expanded. This paper proposes a method for tracking academic data use by country of subject, applying natural language processing to open-access research papers. The model’s predictions produce country estimates of the number of articles using data that are highly correlated with a human-coded approach, with a correlation of 0.99. Analyzing more than 1 million academic articles, the paper finds that the number of articles on a country is strongly correlated with its gross domestic product per capita, population, and the quality of its national statistical system. The paper identifies data sources that are strongly associated with data-driven research and finds that availability of subnational data appears to be particularly important. Finally, the paper classifies countries into groups based on whether they could most benefit from increasing their supply of or demand for data. The findings show that the former applies to many low- and lower-middle-income countries, while the latter applies to many upper-middle- and high-income countries…(More)”.
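
The paper’s full pipeline is not reproduced here, but a minimal sketch of the general approach – classifying abstracts as data-driven with a simple NLP model, counting predicted data-using articles per country of subject, and validating those counts against human coding – might look like the following. The texts, labels, and counts are all hypothetical, and TF-IDF plus logistic regression merely stands in for the paper’s actual model:

```python
# Illustrative sketch only: texts, labels, and counts are hypothetical,
# and TF-IDF + logistic regression stands in for the paper's actual NLP model.
from collections import Counter

from scipy.stats import pearsonr
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical labelled abstracts: 1 = uses data about a country, 0 = does not
train_texts = [
    "We analyse household survey microdata from Kenya to estimate poverty rates.",
    "This essay reviews the history of political thought in France.",
    "Using census data for Brazil, we model internal migration flows.",
    "A theoretical discussion of sovereignty with no empirical component.",
]
train_labels = [1, 0, 1, 0]

vectorizer = TfidfVectorizer(ngram_range=(1, 2))
clf = LogisticRegression().fit(vectorizer.fit_transform(train_texts), train_labels)

def count_data_articles(articles):
    """Count predicted data-using articles per country of subject."""
    counts = Counter()
    for country, abstract in articles:
        if clf.predict(vectorizer.transform([abstract]))[0] == 1:
            counts[country] += 1
    return counts

new_articles = [
    ("Kenya", "We estimate crop yields in Kenya from satellite and survey data."),
    ("France", "An intellectual history of the Annales school."),
]
print(count_data_articles(new_articles))

# Validation step: correlate per-country model counts with human-coded counts
# (the paper reports a correlation of 0.99; these numbers are made up).
model_counts = [120, 45, 300, 18, 77]
human_counts = [118, 50, 290, 20, 80]
r, _ = pearsonr(model_counts, human_counts)
print(f"correlation with human-coded counts: r = {r:.2f}")
```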

The City of Today is a Dying Thing: In Search of the Cities of Tomorrow


Book by Des Fitzgerald: “Cities are bad for us: polluted, noisy and fundamentally unnatural. We need green space, not concrete. Trees, not tower blocks. So goes the argument. But is it true? What would the city of the future look like if we tried to build a better life from the ground up? And would anyone want to live there?

Here, Des Fitzgerald takes us on an urgent, unforgettable journey into the future of urban life, from shimmering edifices in the Arizona desert to forest-bathing in deepest Wales, and from rats in mazes to neuroscientific studies of the effects of our surroundings. Along the way, he reveals the deep-lying and often controversial roots of today’s green city movement, and offers an argument for celebrating our cities as they are – in all their raucous, constructed and artificial glory…(More)”.

Are we entering a “Data Winter”?


Article by Stefaan G. Verhulst: “In an era where data drives decision-making, the accessibility of data for public interest purposes has never been more crucial. Whether shaping public policy, responding to disasters, or empowering research, data plays a pivotal role in our understanding of complex social, environmental, and economic issues. In 2015, I introduced the concept of Data Collaboratives to advance new and innovative partnerships between the public and private sectors that could make data more accessible for public interest purposes. More recently, I have been advocating for a reimagined approach to data stewardship to make data collaboration more systematic, agile, sustainable, and responsible.

We may be entering a “Data Winter”

Despite many advances toward data stewardship (especially during Covid-19), and despite the creation of several important data collaboratives (e.g., the Industry Data for Society Partnership), the project of opening access to data is proving increasingly challenging. Indeed, unless we step up our efforts in 2024, we may be entering a prolonged data winter — analogous to previous Artificial Intelligence winters, marked by reduced funding and interest in AI research, in which data assets that could be leveraged for the common good are instead frozen and immobilized. Recent developments, such as a decline in access to social media data for research and the growing privatization of climate data, along with a decrease in open data policy activity, signify a worrying trend. This blog takes stock of these developments and, building on some recent expert commentary, raises a number of concerns about the current state of data accessibility and its implications for the public interest. We conclude by calling for a new Decade of Data — one marked by a reinvigorated commitment to open data and data reuse for the public interest…(More)”.

The world needs an International Decade for Data–or risk splintering into AI ‘haves’ and ‘have-nots,’ UN researchers warn


Article by Tshilidzi Marwala and David Passarelli: “The rapid rise in data-driven technologies is shaping how many of us live – from the biometric data collected by our smartwatches and the artificial intelligence (AI) tools and models changing how we work, to the social media algorithms that seem to know more about our content preferences than we do. Greater amounts of data are affecting all aspects of our lives and, indeed, society at large.

This explosion in data risks creating new inequalities, equipping a new set of “haves” who benefit from the power of data while excluding, or even harming, a set of “have-nots” – and splitting the international community into “data-poor” and “data-rich” worlds.

We know that data, when harnessed correctly, can be a powerful tool for sustainable development. Intelligent and innovative use of data can support public health systems, improve our understanding of climate change and biodiversity loss, anticipate crises, and tackle deep-rooted structural injustices such as racism and economic inequality.

However, the vast quantity of data is fueling an unregulated Wild West. Instead of simply issuing more warnings, governments must work toward good governance of data on a global scale. Due to the rapid pace of technological innovation, policies intended to protect society will inevitably fall behind. We need to be more ambitious.

To begin with, governments must ensure that the benefits derived from data are equitably distributed by establishing global ground rules for data collection, sharing, taxation, and re-use. This includes dealing with synthetic data and cross-border data flows…(More)”.

Avoiding the News


Book by Benjamin Toff, Ruth Palmer, and Rasmus Kleis Nielsen: “A small but growing number of people in many countries consistently avoid the news. They feel they do not have time for it, believe it is not worth the effort, find it irrelevant or emotionally draining, or do not trust the media, among other reasons. Why and how do people circumvent news? Which groups are more and less reluctant to follow the news? In what ways is news avoidance a problem—for individuals, for the news industry, for society—and how can it be addressed?

This groundbreaking book explains why and how so many people consume little or no news despite unprecedented abundance and ease of access. Drawing on interviews in Spain, the United Kingdom, and the United States as well as extensive survey data, Avoiding the News examines how people who tune out traditional media get information and explores their “folk theories” about how news organizations work. The authors argue that news avoidance is about not only content but also identity, ideologies, and infrastructures: who people are, what they believe, and how news does or does not fit into their everyday lives. Because news avoidance is most common among disadvantaged groups, it threatens to exacerbate existing inequalities by tilting mainstream journalism even further toward privileged audiences. Ultimately, this book shows, persuading news-averse audiences of the value of journalism is not simply a matter of adjusting coverage but requires a deeper, more empathetic understanding of people’s relationships with news across social, political, and technological boundaries…(More)”.

The New Knowledge


Book by Blayne Haggart and Natasha Tusikov: “From the global geopolitical arena to the smart city, control over knowledge—particularly over data and intellectual property—has become a key battleground for the exercise of economic and political power. For companies and governments alike, control over knowledge—what scholar Susan Strange calls the knowledge structure—has become a goal unto itself.

The rising dominance of the knowledge structure is leading to a massive redistribution of power, including from individuals to companies and states. Strong intellectual property rights have concentrated economic benefits in a smaller number of hands, while the “internet of things” is reshaping basic notions of property, ownership, and control. In the scramble to create and control data and intellectual property, governments and companies alike are engaging in ever more surveillance.

The New Knowledge is a guide to and analysis of these changes, and of the emerging phenomenon of the knowledge-driven society. It highlights how the pursuit of the control over knowledge has become its own ideology, with its own set of experts drawn from those with the ability to collect and manipulate digital data. Haggart and Tusikov propose a workable path forward—knowledge decommodification—to ensure that our new knowledge is not treated simply as a commodity to be bought and sold, but as a way to meet the needs of the individuals and communities that create this knowledge in the first place…(More)”.