Big data proves mobility is not gender-neutral


Blog by Ellin Ivarsson, Aiga Stokenberg and Juan Ignacio Fulponi: “All over the world, there is growing evidence showing that women and men travel differently. While there are many reasons behind this, one key factor is the persistence of traditional gender norms and roles that translate into different household responsibilities, different work schedules, and, ultimately, different mobility needs. Greater overall risk aversion and sensitivity to safety issues also play an important role in how women get around. Yet gender often remains an afterthought in the transport sector, meaning most policies or infrastructure investment plans are not designed to take into account the specific mobility needs of women.

The good news is that big data can help change that. In a recent study, the World Bank Transport team combined several data sources to analyze how women travel around the Buenos Aires Metropolitan Area (AMBA), including mobile phone signal data, congestion data from Waze, public transport smart card data, and data from a survey of over 20,300 car and motorcycle users conducted by the team in early 2022.

Our research revealed that, on average, women in AMBA travel less often than men, travel shorter distances, and tend to engage in more complex trips with multiple stops and purposes. On average, 65 percent of the trips made by women are shorter than 5 kilometers, compared to 60 percent among men. Also, women’s hourly travel patterns are different, with 10 percent more trips than men during the mid-day off-peak hour, mostly originating in central AMBA. This reflects the larger burden of household responsibilities faced by women – such as picking children up from school – and the fact that women tend to work more irregular hours…(More)”. See also Gender gaps in urban mobility.

China’s new AI rules protect people — and the Communist Party’s power


Article by Johanna M. Costigan: “In April, in an effort to regulate rapidly advancing artificial intelligence technologies, China’s internet watchdog introduced draft rules on generative AI. They cover a wide range of issues, from how training data is handled to how users interact with generative AI such as chatbots.

Under the new regulations, companies are ultimately responsible for the “legality” of the data they use to train AI models. Additionally, generative AI providers must not share personal data without permission, and must guarantee the “veracity, accuracy, objectivity, and diversity” of their pre-training data. 

These strict requirements by the Cyberspace Administration of China (CAC) for AI service providers could benefit Chinese users, granting them greater protections from private companies than many of their global peers. Article 11 of the regulations, for instance, prohibits providers from “conducting profiling” on the basis of information gained from users. Any Instagram user who has received targeted ads after their smartphone tracked their activity would stand to benefit from this additional level of privacy.  

Another example is Article 10 — it requires providers to employ “appropriate measures to prevent users from excessive reliance on generated content,” which could help prevent addiction to new technologies and increase user safety in the long run. As companion chatbots such as Replika become more popular, companies should be responsible for managing software to ensure safe use. While some view social chatbots as a cure for loneliness, depression, and social anxiety, they also present real risks to users who become reliant on them…(More)”.

AI-assisted diplomatic decision-making during crises—Challenges and opportunities


Article by Neeti Pokhriyal and Till Koebe: “Recent academic studies have demonstrated the efficacy of employing or integrating “non-traditional” data (e.g., social media, satellite imagery) for situational awareness tasks…

Despite these successes, we identify four critical challenges unique to the area of diplomacy that need to be considered within the growing AI and diplomacy community going forward:

1. First, decisions during crises are almost always taken using limited or incomplete information. There may be deliberate misuse and obfuscation of data/signals between different parties involved. At the start of a crisis, information is usually limited and potentially biased, especially along socioeconomic and rural-urban lines as crises are known to exacerbate the vulnerabilities already existing in the populations. This requires AI tools to quantify and visualize calibrated uncertainty in their outputs in an appropriate manner.

2. Second, in many cases, human lives and livelihoods are at stake. Therefore, any forecast, reasoning, or recommendation provided by AI assistance needs to be explainable and transparent for authorized users, but also secure against unauthorized access as diplomatic information is often highly sensitive. The question of accountability in case of misleading AI assistance needs to be addressed beforehand.

3. Third, in complex situations with high stakes but limited information, cultural differences and value-laden judgment driven by personal experiences play a central role in diplomatic decision-making. This calls for the use of learning techniques that can incorporate domain knowledge and experience.

4. Fourth, diplomatic interests during crises are often multifaceted, resulting in deep mistrust in and strategic misuse of information. Social media data, when used for consular tasks, has been shown to be susceptible to various dis- and misinformation campaigns, some by the public, others by state actors for strategic manipulation…(More)”.

What do data portals do? Tracing the politics of online devices for making data public


Paper by Jonathan Gray: “The past decade has seen the rise of “data portals” as online devices for making data public. They have been accorded a prominent status in political speeches, policy documents, and official communications as sites of innovation, transparency, accountability, and participation. Drawing on research on data portals around the world, data portal software, and associated infrastructures, this paper explores three approaches for studying the social life of data portals as technopolitical devices: (a) interface analysis, (b) software analysis, and (c) metadata analysis. These three approaches contribute to the study of the social lives of data portals as dynamic, heterogeneous, and contested sites of public sector datafication. They are intended to contribute to critically assessing how participation around public sector datafication is invited and organized with portals, as well as to rethinking and recomposing them…(More)”.
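
For readers who want to experiment with the paper’s “metadata analysis” approach on a live portal, the sketch below pulls catalog metadata programmatically and produces simple descriptive counts. It is a minimal illustration only, not the author’s method: it assumes the portal runs CKAN (the open-source software behind many government data portals) and exposes the standard `package_search` action, and the base URL is a placeholder.

```python
# Minimal sketch of metadata analysis for a data portal, assuming a CKAN-backed
# portal that exposes the standard Action API. The base URL is hypothetical.
from collections import Counter

import requests

PORTAL_URL = "https://example-portal.gov"  # placeholder CKAN instance


def fetch_datasets(base_url, page_size=100):
    """Page through the portal's catalog via the CKAN Action API."""
    datasets, start = [], 0
    while True:
        resp = requests.get(
            f"{base_url}/api/3/action/package_search",
            params={"rows": page_size, "start": start},
            timeout=30,
        )
        resp.raise_for_status()
        result = resp.json()["result"]
        datasets.extend(result["results"])
        start += page_size
        if start >= result["count"]:
            return datasets


def summarize(datasets):
    """Basic metadata analysis: who publishes, and in which file formats."""
    publishers = Counter(
        (d.get("organization") or {}).get("title", "unknown") for d in datasets
    )
    formats = Counter(
        (r.get("format") or "unknown").upper()
        for d in datasets
        for r in d.get("resources", [])
    )
    return publishers, formats


if __name__ == "__main__":
    records = fetch_datasets(PORTAL_URL)
    publishers, formats = summarize(records)
    print(f"{len(records)} datasets from {len(publishers)} publishers")
    print("Most common resource formats:", formats.most_common(5))
```

Counting publishers, formats, and similar catalog fields in this way is one concrete entry point into the questions the paper raises about how portals organize and invite participation around public sector data.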

As the Quantity of Data Explodes, Quality Matters


Article by Katherine Barrett and Richard Greene: “With advances in technology, governments across the world are increasingly using data to help inform their decision making. This has been one of the most important byproducts of the use of open data, which is “a philosophy – and increasingly a set of policies – that promotes transparency, accountability and value creation by making government data available to all,” according to the Organisation for Economic Co-operation and Development (OECD).

But as data has become ever more important to governments, the quality of that data has become an increasingly serious issue. A number of nations, including the United States, are taking steps to deal with it. For example, according to a study from Deloitte, “The Dutch government is raising the bar to enable better data quality and governance across the public sector.” In the same report, a case study about Finland states that “data needs to be shared at the right time and in the right way. It is also important to improve the quality and usability of government data to achieve the right goals.” And the United Kingdom has developed its Government Data Quality Hub to help public sector organizations “better identify their data challenges and opportunities and effectively plan targeted improvements.”

Our personal experience is with U.S. state and local governments, and in that arena the road toward higher-quality data is a long and difficult one, particularly as the sheer quantity of data has grown exponentially. As things stand, based on our ongoing research into performance audits, it is clear that issues with data are impediments to the smooth functioning of state and local governments…(More)”.

Digital Equity 2.0: How to Close the Data Divide


Report by Gillian Diebold: “For the last decade, closing the digital divide, or the gap between those subscribing to broadband and those not subscribing, has been a top priority for policymakers. But high-speed Internet and computing device access are no longer the only barriers to fully participating in and benefiting from the digital economy. Data is also increasingly essential, including in health care, financial services, and education. As with the digital divide, a gap has emerged between the data haves and the data have-nots, and this gap has introduced a new set of inequities: the data divide.

Policymakers have put a great deal of effort into closing the digital divide, and there is now near-universal acceptance of the notion that obtaining widespread Internet access generates social and economic benefits. But closing the data divide has received little attention. Moreover, efforts to improve data collection are typically overshadowed by privacy advocates’ warnings against collecting any data. In fact, unlike the digital divide, many ignore the data divide or argue that the way to close it is to collect vastly less data. But without substantial efforts to increase data representation and access, certain individuals and communities will be left behind in an increasingly data-driven world.

This report describes the multipronged efforts needed to address digital inequity. For the digital divide, policymakers have expanded digital connectivity, increased digital literacy, and improved access to digital devices. For the data divide, policymakers should similarly take a holistic approach, including by balancing privacy and data innovation, increasing data collection efforts across a wide array of fronts, enhancing access to data, improving data quality, and improving data analytics efforts. Applying lessons from the digital divide to this new challenge will help policymakers design effective and efficient policy and create a more equitable and effective data economy for all Americans…(More)”.

International Data Governance – Pathways to Progress


Press Release: “In May 2023, the United Nations System Chief Executives Board for Coordination endorsed International Data Governance – Pathways to Progress, developed through the High-level Committee on Programmes (HLCP), which approved the paper at its 45th session in March 2023. International Data Governance – Pathways to Progress and its addenda were developed by the HLCP Working Group on International Data Governance…(More)”. (See Annex 1: Mapping and Comparing Data Governance Frameworks).

Machines of mind: The case for an AI-powered productivity boom


Report by Martin Neil Baily, Erik Brynjolfsson, and Anton Korinek: “Large language models such as ChatGPT are emerging as powerful tools that not only make workers more productive but also increase the rate of innovation, laying the foundation for a significant acceleration in economic growth. As a general-purpose technology, AI will impact a wide array of industries, prompting investments in new skills, transforming business processes, and altering the nature of work. However, official statistics will only partially capture the boost in productivity because the output of knowledge workers is difficult to measure. The rapid advances can have great benefits but may also lead to significant risks, so it is crucial to ensure that we steer progress in a direction that benefits all of society…(More)”.

Data portability and interoperability: A primer on two policy tools for regulation of digitized industries


Article by Sukhi Gulati-Gilbert and Robert Seamans: “…In this article we describe two other tools, data portability and interoperability, that may be particularly useful in technology-enabled sectors. Data portability allows users to move data from one company to another, helping to reduce switching costs and providing rival firms with access to valuable customer data. Interoperability allows two or more technical systems to exchange data interactively. Due to its interactive nature, interoperability can help prevent lock-in to a specific platform by allowing users to connect across platforms. Data portability and interoperability share some similarities; in addition to potential pro-competitive benefits, the tools promote values of openness, transparency, and consumer choice.

After providing an overview of these topics, we describe the tradeoffs involved in implementing data portability and interoperability. While these policy tools offer great promise, in practice there can be many challenges in determining how to fund and design an implementation that is secure and intuitive and accomplishes the intended result. These challenges require that policymakers think carefully about the initial implementation of data portability and interoperability. Finally, to better show how data portability and interoperability can increase competition in an industry, we discuss how they could be applied in the banking and social media sectors. These are just two examples of how data portability and interoperability policy could be applied to many different industries facing increased digitization. Our definitions and examples should be helpful to those interested in understanding the tradeoffs involved in using these tools to promote competition and innovation in the U.S. economy…(More)”. See also: Data to Go: The Value of Data Portability as a Means to Data Liquidity.
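
To make the portability idea more concrete, here is a small, purely illustrative sketch of the mechanics: one service exports a user’s records in a documented, machine-readable format, and another service validates and imports them. The schema, field names, and functions are hypothetical and are not drawn from the article or from any existing standard.

```python
# Illustrative sketch of data portability: exporting a user's data in a
# documented, machine-readable format that another provider could import.
# The schema and field names here are hypothetical, not a real standard.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class Post:
    created_at: str
    text: str


@dataclass
class PortableArchive:
    schema_version: str
    exported_at: str
    user_id: str
    posts: list


def export_user_data(user_id, posts):
    """Bundle a user's data into a portable archive (the 'export' side)."""
    archive = PortableArchive(
        schema_version="1.0",
        exported_at=datetime.now(timezone.utc).isoformat(),
        user_id=user_id,
        posts=[asdict(p) for p in posts],
    )
    return json.dumps(asdict(archive), indent=2)


def import_user_data(raw_json):
    """Validate and load an archive at the receiving service (the 'import' side)."""
    data = json.loads(raw_json)
    assert data["schema_version"] == "1.0", "unsupported schema version"
    return data["user_id"], [Post(**p) for p in data["posts"]]


if __name__ == "__main__":
    exported = export_user_data("u123", [Post("2023-06-01T12:00:00Z", "hello")])
    user, posts = import_user_data(exported)
    print(user, len(posts))
```

The sketch also hints at why implementation is hard in practice: portability only lowers switching costs if the exporting and importing services agree on, and maintain, a common schema, which is exactly the kind of funding and design question the authors flag.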

Regulating Cross-Border Data Flows


Book by Bryan Mercurio and Ronald Yu: “Data is now one of the world’s most valuable resources, if not the most valuable. The adoption of data-driven applications across economic sectors has made data and the flow of data so pervasive that it has become integral to everything we as members of society do – from managing our finances to operating businesses to powering the apps we use every day. For this reason, governing cross-border data flows is inherently difficult given the ubiquity and value of data, and the impact government policies can have on national competitiveness, business attractiveness, and personal rights. The challenge for governments is to address in a coherent manner the broad range of data-related issues in the context of a global data-driven economy.

This book engages with the unexplored topic of why and how governments should develop a coherent and consistent strategic framework for regulating cross-border data flows. The objective is to fill a very significant gap in the legal and policy setting by considering multiple perspectives in order to assist in the development of a jurisdiction’s coherent and strategic policy framework…(More)”.