Case report by Giampiero Giacomello and Oltion Preka: “In an increasingly technology-dependent world, it is not surprising that STEM (Science, Technology, Engineering, and Mathematics) graduates are in high demand. This state of affairs, however, has led the public to overlook the fact that not only are computing and artificial intelligence naturally interdisciplinary, but a huge portion of generated data comes from human–computer interactions and is therefore social in character and nature. Hence, social science practitioners should be in demand too, but this does not seem to be the case. One reason for this situation is that political and social science departments worldwide tend to remain in their “comfort zone” and view their disciplines quite traditionally, and by doing so they cut themselves off from many of today’s positions. The authors believed these conditions could and should be changed, and over a few years created a specifically tailored course for students in Political Science. This paper examines the experience of the last year of that program, which, after several tweaks and adjustments, is now fully operational. The results and the students’ appreciation are quite remarkable. Hence the authors considered the experience worth sharing, so that colleagues in social and political science departments may feel encouraged to follow and replicate such an example….(More)”
How Data Can Map and Make Racial Inequality More Visible (If Done Responsibly)
Reflection Document by The GovLab: “Racism is a systemic issue that pervades every aspect of life in the United States and around the world. In recent months, its corrosive influence has been made starkly visible, especially on Black people. Many people are hurting. Their rage and suffering stem from centuries of exclusion and from being subject to repeated bias and violence. Across the country, there have been protests decrying racial injustice. Activists have called upon the government to condemn bigotry and racism, to act against injustice, to address systemic and growing inequality.
Institutions need to take meaningful action to address such demands. Though racism is not experienced in the same way by all communities of color, policymakers must respond to the anxieties and apprehensions of Black people as well as those of communities of color more generally. This work will require institutions and individuals to reflect on how they may be complicit in perpetuating structural and systemic inequalities and harm, and to ask better questions about the inequities that exist in society (laid bare both in recent acts of violence and in racial disadvantages in health outcomes during the ongoing COVID-19 crisis). This work is necessary but unlikely to be easy. As Rashida Richardson, Director of Policy Research at the AI Now Institute at NYU, notes:
“Social and political stratifications also persist and worsen because they are embedded into our social and legal systems and structures. Thus, it is difficult for most people to see and understand how bias and inequalities have been automated or operationalized over time.”
We believe progress can be made, at least in part, through responsible data access and analysis, including increased availability of (disaggregated) data through data collaboration. Of course, data is only one part of the overall picture, and we make no claims that data alone can solve such deeply entrenched problems. Nonetheless, data can have an impact by making inequalities resulting from racism more quantifiable and inaction less excusable.
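To make the quantification point concrete, here is a minimal sketch in Python, using synthetic figures and hypothetical column names rather than any data from the report, of how disaggregating a single outcome metric by race can surface a disparity that the aggregate rate conceals.

```python
# Minimal sketch (synthetic, hypothetical data): disaggregating an outcome
# metric by race to quantify a disparity hidden by the aggregate rate.
import pandas as pd

# One row per person, with a binary outcome (e.g., an adverse health outcome).
df = pd.DataFrame({
    "race":    ["Black", "Black", "White", "White", "White", "Black", "White", "Black"],
    "outcome": [1, 0, 0, 0, 1, 1, 0, 0],
})

overall_rate = df["outcome"].mean()               # aggregate rate only
by_group = df.groupby("race")["outcome"].mean()   # disaggregated rates

# Disparity ratio relative to a reference group (here, White).
disparity_ratio = by_group / by_group["White"]

print(f"Overall rate: {overall_rate:.2f}")
print(by_group.round(2))
print(disparity_ratio.round(2))
```

The same pattern applies to any dimension of disaggregation, provided the underlying data are collected, shared, and used responsibly.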
…Prioritizing any of these topics will also require increased community engagement and participatory agenda setting. Likewise, we are deeply conscious that data can have a negative as well as positive impact and that technology can perpetuate racism when designed and implemented without the input and participation of minority communities and organizations. While our report here focuses on the promise of data, we need to remain aware of the potential to weaponize data against vulnerable and already disenfranchised communities. In addition, (hidden) biases in data collected and used in AI algorithms, as well as in a host of other areas across the data life cycle, will only exacerbate racial inequalities if not addressed….(More)”
ALSO: The piece is supplemented by a crowdsourced listing of Data-Driven Efforts to Address Racial Inequality.
Fear of a Black and Brown Internet: Policing Online Activism
Paper by Sahar F. Aziz and Khaled A. Beydoun: “Virtual surveillance is the modern extension of established policing models that tie dissident Muslim advocacy to terror suspicion and Black activism to political subversion. Countering Violent Extremism (“CVE”) and Black Identity Extremism (“BIE”) programs that specifically target Muslim and Black populations are shifting from on the ground to online.
Law enforcement exploits social media platforms — where activism and advocacy are robust — to monitor and crack down on activists. In short, the new policing is the old policing, but it is stealthily morphing and moving onto virtual platforms where activism is fluidly unfolding in real time. This Article examines how the law’s failure to keep up with technological advancements in social media poses serious risks to the ability of minority communities to mobilize against racial and religious injustice….(More)”.
Sector-Specific (Data-) Access Regimes of Competitors
Paper by Jörg Hoffmann: “The expected economic and social benefits of data access and sharing are enormous. And yet, particularly in the B2B context, the sharing of privately held data between companies has not taken off at an efficient scale. This has already led to the adoption of sector-specific data governance and access regimes. Two such regimes are enshrined in the PSD2, which introduced an access-to-account rule and a data portability rule for specific account information for third-party payment providers.
This paper analyses these sector-specific access and portability regimes and identifies regulatory shortcomings that should be addressed and that can serve as guidance for further data access regulation. It first develops regulatory guidelines built around the multiple regulatory dimensions of data and the potential adverse effects that may be created by overly broad data access regimes.
In this regard the paper assesses the role of factual data exclusivity for undertakings’ data-driven innovation incentives, the role of industrial-policy-driven market regulation within the principle of a free market economy, the impact of data sharing on consumer sovereignty and choice, and ultimately data-induced distortions of competition. It develops these findings by drawing on basic IP and information economics, the EU competition law case law pertaining to refusal-to-supply cases, the rise of ‘surveillance capitalism’, and current competition policy considerations regarding the envisioned preventive competition control regime tackling data-rich ‘undertakings of paramount importance for competition across markets’ in Germany. This is then followed by an analysis of the PSD2 access and portability regimes in light of the regulatory principles….(More)”.
The Long Shadow Of The Future
Steven Weber and Nils Gilman at Noema: “We’re living through a real-time natural experiment on a global scale. The differential performance of countries, cities and regions in the face of the COVID-19 pandemic is a live test of the effectiveness, capacity and legitimacy of governments, leaders and social contracts.
The progression of the initial outbreak in different countries followed three main patterns. Countries like Singapore and Taiwan represented Pattern A, where (despite many connections to the original source of the outbreak in China) vigilant government action effectively cut off community transmission, keeping total cases and deaths low. China and South Korea represented Pattern B: an initial uncontrolled outbreak followed by draconian government interventions that succeeded in getting at least the first wave of the outbreak under control.
Pattern C is represented by countries like Italy and Iran, where waiting too long to lock down populations led to a short-term exponential growth of new cases that overwhelmed the healthcare system and resulted in a large number of deaths. In the United States, the lack of effective and universally applied social isolation mechanisms, as well as a fragmented healthcare system and a significant delay in rolling out mass virus testing, led to a replication of Pattern C, at least in densely populated places like New York City and Chicago.
Despite the Chinese and Americans blaming each other and crediting their own political system for successful responses, the course of the virus didn’t score easy political points on either side of the new Cold War. Regime type isn’t correlated with outcomes. Authoritarian and democratic countries are included in each of the three patterns of responses: authoritarian China and democratic South Korea had effective responses to a dramatic breakout; authoritarian Singapore and democratic Taiwan both managed to quarantine and contain the virus; authoritarian Iran and democratic Italy both experienced catastrophe.
It’s generally a mistake to make long-term forecasts in the midst of a hurricane, but some outlines of lasting shifts are emerging. First, a government or society’s capacity for technical competence in executing plans matters more than ideology or structure. The most effective arrangements for dealing with the pandemic have been found in countries that combine a participatory public culture of information sharing with operational experts competently executing decisions. Second, hyper-individualist views of privacy and other forms of risk are likely to be submerged as countries move to restrict personal freedoms and use personal data to manage public and aggregated social risks. Third, countries that are able to successfully take a longer view of planning and risk management will be at a significant advantage….(More)”.
Innovative Citizen Participation and New Democratic Institutions
Report by the OECD: “Public authorities from all levels of government increasingly turn to Citizens’ Assemblies, Juries, Panels and other representative deliberative processes to tackle complex policy problems ranging from climate change to infrastructure investment decisions. They convene groups of people representing a wide cross-section of society for at least one full day – and often much longer – to learn, deliberate, and develop collective recommendations that consider the complexities and compromises required for solving multifaceted public issues.
This “deliberative wave” has been building since the 1980s, gaining momentum since around 2010. This report has gathered close to 300 representative deliberative practices to explore trends in such processes, identify different models, and analyse the trade-offs among different design choices as well as the benefits and limits of public deliberation.
It includes Good Practice Principles for Deliberative Processes for Public Decision Making, based on comparative empirical evidence gathered by the OECD in collaboration with leading practitioners from government, civil society, and academia. Finally, the report explores the reasons and routes for embedding deliberative activities into public institutions to give citizens a more permanent and meaningful role in shaping the policies affecting their lives….(More)”.
Using Algorithms to Address Trade-Offs Inherent in Predicting Recidivism
Paper by Jennifer L. Skeem and Christopher Lowenkamp: “Although risk assessment has increasingly been used as a tool to help reform the criminal justice system, some stakeholders are adamantly opposed to using algorithms. The principal concern is that any benefits achieved by safely reducing rates of incarceration will be offset by costs to racial justice claimed to be inherent in the algorithms themselves. But fairness tradeoffs are inherent to the task of predicting recidivism, whether the prediction is made by an algorithm or a human.
Based on a matched sample of 67,784 Black and White federal supervisees assessed with the Post Conviction Risk Assessment (PCRA), we compare how three alternative strategies for “debiasing” algorithms affect these tradeoffs, using arrest for a violent crime as the criterion. These candidate algorithms all strongly predict violent re-offending (AUCs = .71–.72), but vary in their association with race (r = .00–.21) and shift tradeoffs between balance in positive predictive value and false positive rates. Providing algorithms with access to race (rather than omitting race or ‘blinding’ its effects) can maximize calibration and minimize imbalanced error rates. Implications for policymakers with value preferences for efficiency vs. equity are discussed…(More)”.
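To illustrate the trade-offs the abstract refers to, here is a minimal sketch in Python using synthetic data (not the PCRA sample): it computes AUC, positive predictive value, and false positive rate separately for two hypothetical groups — the quantities whose balance the candidate “debiasing” strategies shift.

```python
# Minimal sketch with synthetic data: group-wise AUC, positive predictive
# value (PPV) and false positive rate (FPR) for a binary risk classification.
# Illustrative only; it does not use or approximate PCRA data.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
n = 10_000
group = rng.choice(["A", "B"], size=n)                 # hypothetical groups
score = rng.uniform(size=n)                            # hypothetical risk scores
# Synthetic outcomes loosely correlated with the risk score.
reoffend = (rng.uniform(size=n) < 0.2 + 0.4 * score).astype(int)
flagged = (score >= 0.7).astype(int)                   # example high-risk cut-off

rows = []
for g in ["A", "B"]:
    m = group == g
    tn, fp, fn, tp = confusion_matrix(reoffend[m], flagged[m]).ravel()
    rows.append({
        "group": g,
        "AUC": roc_auc_score(reoffend[m], score[m]),
        "PPV": tp / (tp + fp),   # of those flagged, share who re-offend
        "FPR": fp / (fp + tn),   # share of non-re-offenders who were flagged
    })

print(pd.DataFrame(rows).round(3))
```

Comparing PPV and FPR across groups for the same cut-off makes the fairness trade-off concrete: equalizing one metric across groups generally unbalances the other when base rates differ.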
EU Company Data: State of the Union 2020
Report by OpenCorporates: “… on access to company data in the EU. It’s completely revised, with more detail on the impact that the lack of access to this critical dataset has – on business, on innovation, on democracy, and society.
The results, however, are still not great:
- Average score is low: The average score across the EU in terms of access to company data is just 40 out of 100. This is better than the average score eight years ago, which was just 23 out of 100, but still very low.
- Some major economies score badly: Some of the EU’s major economies continue to score very badly indeed, with Germany, for example, scoring just 15/100, Italy 10/100, and Spain 0/100.
- EU policies undermined: The report identifies 15 areas where the lack of open company data frustrates, impedes or otherwise has a negative impact on EU policy.
- Inequalities widened: The report also identifies how inequalities are further widened by poor access to this critical dataset, and how the recovery from COVID-19 will be hampered by it too.
On the plus side, the report also identifies the EU Open Data & PSI Directive passed last year as potentially game-changing – but only if it is implemented fully, and there are significant doubts as to whether this will happen….(More)”
Technical Excellence and Scale
Cory Doctorow at EFF: “In America, we hope that businesses will grow by inventing amazing things that people love – rather than through deep-pocketed catch-and-kill programs in which every competitor is bought and tamed before it can grow to become a threat. We want vibrant, competitive, innovative markets where companies vie to create the best products. Growth solely through merger-and-acquisition helps create a world in which new firms compete to be bought up and absorbed into the dominant players, and customers who grow dissatisfied with a product or service and switch to a “rival” find that they’re still patronizing the same company—just another division.
To put it bluntly: we want companies that are good at making things as well as buying things.
This isn’t the whole story, though.
Small companies with successful products can become victims of their own success. As they are overwhelmed by eager new customers, they are strained beyond their technical and financial limits – for example, they may be unable to buy server hardware fast enough, and unable to lash that hardware together in efficient ways that let them scale up to meet demand.
When we look at the once small, once beloved companies that are now mere divisions of large, widely mistrusted ones—Instagram and Facebook; YouTube and Google; Skype and Microsoft; Dark Sky and Apple—we can’t help but notice that they are running at unimaginable scale, and moreover, they’re running incredibly well.
These services were once plagued with outages, buffering delays, overcapacity errors, slowdowns, and a host of other evils of scale. Today, they run so well that outages are newsworthy events.
There’s a reason for that: big tech companies are really good at being big. Whatever you think of Amazon, you can’t dispute that it gets a lot of parcels from A to B with remarkably few bobbles. Google’s search results arrive in milliseconds, Instagram photos load as fast as you can scroll them, and even Skype is far more reliable than in the pre-Microsoft days. These services have far more users than they ever did as independents, and yet, they are performing better than they did in those early days.
Can we really say that this is merely “buying things” and not also “making things?” Isn’t this innovation? Isn’t this technical accomplishment? It is. Does that mean big = innovative? It does not….(More)”.
Individualism During Crises: Big Data Analytics of Collective Actions amid COVID-19
Paper by Bo Bian et al: “Collective actions, such as charitable crowdfunding and social distancing, are useful for alleviating the negative impact of the COVID-19 pandemic. However, engagement in these actions across the U.S. is “consistently inconsistent” and is frequently linked to individualism in the press. We present the first evidence on how individualism shapes online and offline collective actions during a crisis through big data analytics. Following studies in economic history, we leverage GIS techniques to construct a U.S. county-level individualism measure that traces the time each county spent on the American frontier between 1790 and 1890. We then use high-dimensional fixed-effect models, text mining, geo-distributed big data computing, and a novel identification strategy based on migrations to analyze GoFundMe fundraising activities as well as county- and individual-level social distancing compliance.
Our analysis uncovers several insights. First, higher individualism reduces both online donations and social distancing during the COVID-19 pandemic. An interquartile increase in individualism reduces COVID-related charitable campaigns and funding by 48% and offsets the effect of state lockdown orders on social distancing by 41%. Second, government interventions, such as stimulus checks, can potentially mitigate the negative effect of individualism on charitable crowdfunding. Third, the individualism effect may be partly driven by a failure to internalize the externality of collective actions: we find stronger results in counties where social distancing generates higher externalities (those with higher population densities or more seniors). Our research is the first to uncover the potential downsides of individualism during crises. It also highlights the importance of big data-driven, culture-aware policymaking….(More)”.
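As a rough illustration of how an “interquartile increase” effect of this kind can be computed, the sketch below (Python, synthetic data and hypothetical variable names, with a simple log-linear fixed-effects specification standing in for the paper’s high-dimensional models) translates a regression coefficient on individualism into a percentage change per interquartile-range move.

```python
# Minimal sketch (synthetic data, hypothetical variable names): a log-linear
# model with fixed effects, and the coefficient on individualism expressed
# as an effect of an interquartile-range (IQR) increase.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2_000
# One row per county-week: an individualism index, state and week identifiers.
df = pd.DataFrame({
    "individualism": rng.normal(size=n),
    "state": rng.choice(["CA", "TX", "NY", "FL"], size=n),
    "week": rng.integers(1, 11, size=n),
})
# Synthetic donation counts, lower where individualism is higher.
df["donations"] = rng.poisson(lam=np.exp(1.0 - 0.3 * df["individualism"]))

# Log-linear model with state and week fixed effects (dummy variables).
res = smf.ols("np.log1p(donations) ~ individualism + C(state) + C(week)",
              data=df).fit()

# Percentage change implied by moving individualism from its 25th to its
# 75th percentile.
iqr = df["individualism"].quantile(0.75) - df["individualism"].quantile(0.25)
pct_change = np.expm1(res.params["individualism"] * iqr)
print(f"Implied change per IQR increase in individualism: {pct_change:+.1%}")
```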