A New Model for Industry-Academic Partnerships


Working Paper by Gary King and Nathaniel Persily: “The mission of the academic social sciences is to understand and ameliorate society’s greatest challenges. The data held by private companies hold vast potential to further this mission. Yet, because of their interaction with highly politicized issues, customer privacy, proprietary content, and the differing goals of firms and academics, these data are often inaccessible to university researchers.

We propose here a new model for industry-academic partnerships that addresses these problems via a novel organizational structure: Respected scholars form a commission which, as a trusted third party, receives access to all relevant firm information and systems, and then recruits independent academics to do research in specific areas following standard peer review protocols organized and funded by nonprofit foundations.

We also report on a partnership we helped forge under this model to make data available about the extremely visible and highly politicized issues surrounding the impact of social media on elections and democracy. In our partnership, Facebook will provide privacy-preserving data and access; seven major politically and substantively diverse nonprofit foundations will fund the research; and the Social Science Research Council will oversee the peer review process for funding and data access….(More)”.

Democracy is in danger when the census undercounts vulnerable populations


Emily Klancher Merchant at The Conversation: “The 2020 U.S. Census is still two years away, but experts and civil rights groups are already disputing the results. At issue is whether the census will fulfill the Census Bureau’s mandate to “count everyone once, only once, and in the right place.”

The task is hardly as simple as it seems and has serious political consequences. Recent changes to the 2020 census, such as asking about citizenship status, will make populations already vulnerable to undercounting even more likely to be missed. These vulnerable populations include the young, poor, nonwhite, non-English-speaking, foreign-born and transient.

An accurate count is critical to the functioning of the U.S. government. Census data determine how the power and resources of the federal government are distributed across the 50 states. This includes seats in the House, votes in the Electoral College and funds for federal programs. Census data also guide the drawing of congressional and other voting districts and the enforcement of civil and voting rights laws.

Places where large numbers of people go uncounted get less than their fair share of political representation and federal resources. When specific racial and ethnic groups are undercounted, it is harder to identify and rectify violations of their civil rights. My research on the international history of demography demonstrates that the question of how to equitably count the population is not new, nor is it unique to the United States. The experience of the United States and other countries may hold important lessons as the Census Bureau finalizes its plans for the 2020 count.

Let’s take a look at that history….

In 1790, the United States became the first country to take a regular census. Following World War II, the U.S. government began to promote census-taking in other countries. U.S. leaders believed data about the size and location of populations throughout the Western Hemisphere could help the government plan defense. What’s more, U.S. businesses could also use the data to identify potential markets and labor forces in nearby countries.

The U.S. government began investing in a program called the Census of the Americas. Through this program, the State Department provided financial support and the Census Bureau provided technical assistance to Western Hemisphere countries taking censuses in 1950.

United Nations demographers also viewed the Census of the Americas as an opportunity. Data that were standardized across countries could serve as the basis for projections of world population growth and the calculation of social and economic indicators. They also hoped that censuses would provide useful information to newly established governments. The U.N. turned the Census of the Americas into a global affair, recommending that “all Member States planning population censuses about 1950 use comparable schedules so far as possible.” Since 1960, the U.N. has sponsored a World Census Program every 10 years. The 2020 World Census Program will be the seventh round….

Not all countries went along with the program. For example, Lebanon’s Christian rulers feared that a census would show Christians to be a minority, undermining the legitimacy of their government. However, for the 65 sovereign countries taking censuses between 1945 and 1954, leaders faced the same question the U.S. faces today: How can we make sure that everyone has an equal chance of being counted?…(More)”.

Algorithmic Sovereignty


Thesis by Denis Roio: “This thesis describes a practice-based research journey across various projects dealing with the design of algorithms, in order to highlight the governance implications of the design choices made in them. The research provides answers and documents methodologies to address the urgent need for more awareness of decisions made by algorithms about the social and economic context in which we live. Algorithms constitute a foundational basis across different fields of study: policy making, governance, art and technology. The ability to understand what is inscribed in such algorithms, what the consequences of their execution are, and what agency is left for the living world is crucial. Yet there is a lack of interdisciplinary and practice-based literature, while specialised treatises are too narrow to relate to the broader context in which algorithms are enacted.

This thesis advances the awareness of algorithms and related aspects of sovereignty through a series of projects documented as participatory action research. One of the projects described, Devuan, leads to the realisation of a new, worldwide-renowned operating system. Another project, “sup”, consists of a minimalist approach to mission-critical software and literate programming to enhance the security and reliability of applications. Another project, D-CENT, consisted of a three-year path of cutting-edge research funded by the EU Commission on the emerging dynamics of participatory democracy connected to the technologies adopted by citizen organizations.

My original contribution to knowledge lies in the role that the research underpinning these projects plays in gaining a better understanding of the sociopolitical aspects connected to the design and management of algorithms. It suggests that we can improve the design and regulation of future public, private and common spaces, which are increasingly governed by algorithms, by understanding not only their economic and legal implications, but also the connections between design choices and the sociopolitical context for their development and execution….(More)”.

How Democracy Can Survive Big Data


Colin Koopman in The New York Times: “…The challenge of designing ethics into data technologies is formidable. This is in part because it requires overcoming a century-long ethos of data science: Develop first, question later. Datafication first, regulation afterward. A glimpse at the history of data science shows as much.

The techniques that Cambridge Analytica uses to produce its psychometric profiles are the cutting edge of data-driven methodologies first devised 100 years ago. The science of personality research was born in 1917. That year, in the midst of America’s fevered entry into war, Robert Sessions Woodworth of Columbia University created the Personal Data Sheet, a questionnaire that promised to assess the personalities of Army recruits. The war ended before Woodworth’s psychological instrument was ready for deployment, but the Army had envisioned its use according to the precedent set by the intelligence tests it had been administering to new recruits under the direction of Robert Yerkes, a professor of psychology at Harvard at the time. The data these tests could produce would help decide who should go to the fronts, who was fit to lead and who should stay well behind the lines.

The stakes of those wartime decisions were particularly stark, but the aftermath of those psychometric instruments is even more unsettling. As the century progressed, such tests — I.Q. tests, college placement exams, predictive behavioral assessments — would affect the lives of millions of Americans. Schoolchildren who may have once or twice acted out in such a way as to prompt a psychometric evaluation could find themselves labeled, setting them on an inescapable track through the education system.

Researchers like Woodworth and Yerkes (or their Stanford colleague Lewis Terman, who formalized the first SAT) did not anticipate the deep consequences of their work; they were too busy pursuing the great intellectual challenges of their day, much like Mr. Zuckerberg in his pursuit of the next great social media platform. Or like Cambridge Analytica’s Christopher Wylie, the twentysomething data scientist who helped build psychometric profiles of two-thirds of all Americans by leveraging personal information gained through uninformed consent. All of these researchers were, quite understandably, obsessed with the great data science challenges of their generation. Their failure to consider the consequences of their pursuits, however, is not so much their fault as it is our collective failing.

For the past 100 years we have been chasing visions of data with a singular passion. Many of the best minds of each new generation have devoted themselves to delivering on the inspired data science promises of their day: intelligence testing, building the computer, cracking the genetic code, creating the internet, and now this. We have in the course of a single century built an entire society, economy and culture that runs on information. Yet we have hardly begun to engineer data ethics appropriate for our extraordinary information carnival. If we do not do so soon, data will drive democracy, and we may well lose our chance to do anything about it….(More)”.

The People vs. Democracy: Why Our Freedom Is in Danger and How to Save It


Book by Yascha Mounk: “The world is in turmoil. From India to Turkey and from Poland to the United States, authoritarian populists have seized power. As a result, Yascha Mounk shows, democracy itself may now be at risk.

Two core components of liberal democracy—individual rights and the popular will—are increasingly at war with each other. As the role of money in politics soared and important issues were taken out of public contestation, a system of “rights without democracy” took hold. Populists who rail against this say they want to return power to the people. But in practice they create something just as bad: a system of “democracy without rights.”

The consequence, Mounk shows in The People vs. Democracy, is that trust in politics is dwindling. Citizens are falling out of love with their political system. Democracy is wilting away. Drawing on vivid stories and original research, Mounk identifies three key drivers of voters’ discontent: stagnating living standards, fears of multiethnic democracy, and the rise of social media. To reverse the trend, politicians need to enact radical reforms that benefit the many, not the few.

The People vs. Democracy is the first book to go beyond a mere description of the rise of populism. In plain language, it describes both how we got here and where we need to go. For those unwilling to give up on either individual rights or the popular will, Mounk shows, there is little time to waste: this may be our last chance to save democracy….(More)”

Psychographics: the behavioural analysis that helped Cambridge Analytica know voters’ minds


Michael Wade at The Conversation: “Much of the discussion has been on how Cambridge Analytica was able to obtain data on more than 50m Facebook users – and how it allegedly failed to delete this data when told to do so. But there is also the matter of what Cambridge Analytica actually did with the data. In fact the data crunching company’s approach represents a step change in how analytics can today be used as a tool to generate insights – and to exert influence.

For example, pollsters have long used segmentation to target particular groups of voters, such as through categorising audiences by gender, age, income, education and family size. Segments can also be created around political affiliation or purchase preferences. The data analytics machine that presidential candidate Hillary Clinton used in her 2016 campaign – named Ada after the 19th-century mathematician and early computing pioneer – used state-of-the-art segmentation techniques to target groups of eligible voters in the same way that Barack Obama had done four years previously.

Cambridge Analytica was contracted to the Trump campaign and provided an entirely new weapon for the election machine. While it also used demographic segments to identify groups of voters, as Clinton’s campaign had, Cambridge Analytica also segmented using psychographics. As definitions of class, education, employment, age and so on, demographics are informational. Psychographics are behavioural – a means to segment by personality.

This makes a lot of sense. It’s obvious that two people with the same demographic profile (for example, white, middle-aged, employed, married men) can have markedly different personalities and opinions. We also know that adapting a message to a person’s personality – whether they are open, introverted, argumentative, and so on – goes a long way to help getting that message across….

There have traditionally been two routes to ascertaining someone’s personality. You can either get to know them really well – usually over an extended time. Or you can get them to take a personality test and ask them to share it with you. Neither of these methods is realistically open to pollsters. Cambridge Analytica found a third way, with the assistance of two University of Cambridge academics.

The first, Aleksandr Kogan, sold them access to 270,000 personality tests completed by Facebook users through an online app he had created for research purposes. Providing the data to Cambridge Analytica was, it seems, against Facebook’s internal code of conduct, but only now in March 2018 has Kogan been banned by Facebook from the platform. In addition, Kogan’s data also came with a bonus: he had reportedly collected Facebook data from the test-takers’ friends – and, at an average of 200 friends per person, that added up to some 50m people.

However, these 50m people had not all taken personality tests. This is where the second Cambridge academic, Michal Kosinski, came in. Kosinski – who is said to believe that micro-targeting based on online data could strengthen democracy – had figured out a way to reverse engineer a personality profile from Facebook activity such as likes. Whether you choose to like pictures of sunsets, puppies or people apparently says a lot about your personality. So much, in fact, that on the basis of 300 likes, Kosinski’s model is able to predict someone’s personality profile with the same accuracy as a spouse….(More)”
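The excerpt above describes, in outline, the kind of model Kosinski’s research popularised: a regression fitted on a users-by-likes matrix to estimate personality-trait scores. As a rough, hedged illustration only — this is not Kosinski’s code, data or feature set, and every variable name, dimension and number below is a synthetic stand-in — a minimal sketch of the approach might look like this:

```python
# Illustrative sketch only: predicting a personality trait from page "likes".
# All data here is synthetic; it stands in for the general technique described
# in the article, not for any real model or dataset.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_users, n_pages = 5000, 300                    # hypothetical sizes
likes = rng.integers(0, 2, size=(n_users, n_pages))  # 1 = user liked the page

# Synthetic "ground truth": assume a trait score (e.g. openness) is a noisy
# linear function of a small subset of likes -- purely for demonstration.
true_weights = rng.normal(0, 1, n_pages) * (rng.random(n_pages) < 0.1)
openness = likes @ true_weights + rng.normal(0, 1, n_users)

X_train, X_test, y_train, y_test = train_test_split(
    likes, openness, test_size=0.2, random_state=0)

# Regularised linear regression from likes to the trait score.
model = Ridge(alpha=1.0).fit(X_train, y_train)
print(f"R^2 on held-out users: {model.score(X_test, y_test):.2f}")
```

In practice, a pipeline of this kind would be trained per trait (one model for each of the Big Five, for instance) and applied to users who never took a questionnaire, which is the step that made like-based profiling attractive to campaigns.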

Co-creation in Urban Governance: From Inclusion to Innovation


Paper by Dorthe Hedensted Lund: “This article sets out to establish what we mean by the recent buzzword ‘co-creation’ and what practical application this concept entails for democracy in urban governance, both in theory and practice.

The rise of the concept points to a shift in how public participation is understood. Whereas from the 1970s onwards the discussions surrounding participation centred on rights and power, following Sherry Arnstein, participation conceptualised as co-creation instead focuses on including diverse forms of knowledge in urban processes in order to create innovative solutions to complex problems.

Consequently, democratic legitimacy now relies to a much greater extent on output, rather than input legitimacy. Rather than provision of inclusive spaces for democratic debate and empowerment of the deprived, which have been the goals of numerous urban participatory efforts in the past, it is the ability to solve complex problems that has become the main criterion for the evaluation of co-creation. Furthermore, conceptualising participation as co-creation has consequences for the roles available to both citizens and public administrators in urban processes, which has implications for urban governance. An explicit debate, both in academia and in practice, about the normative content and implications of conceptualising participation as co-creation is therefore salient and necessary….(More).

Democracy Without Participation: A New Politics for a Disengaged Era


Phil Parvin at Res Publica: “Changing patterns of political participation observed by political scientists over the past half-century undermine traditional democratic theory and practice. The vast majority of democratic theory, and deliberative democratic theory in particular, either implicitly or explicitly assumes the need for widespread citizen participation. It requires that all citizens possess the opportunity to participate and also that they take up this opportunity. But empirical evidence gathered over the past half-century strongly suggests that many citizens do not have a meaningful opportunity to participate in the ways that many democratic theorists require, and do not participate in anything like the numbers that they believe are necessary.

This paper outlines some of the profound changes that have been experienced by liberal democratic states in the 20th and early 21st centuries, changes which are still ongoing, and which have resulted in declines in citizen participation and trust, the marginalisation of citizens from democratic life, and the entrenchment of social and economic inequalities which have damaged democracy. The paper challenges the conventional wisdom in rejecting the idea that the future of democracy lies in encouraging more widespread participation.

The paper takes seriously the failure of the strategies adopted by many states to increase participation, especially among the poor, and suggests that instead of requiring more of citizens, we should in fact be requiring less of them. Instead of seeking to encourage more citizen participation, we should acknowledge that citizens will probably not participate in the volume, or in the ways, many democratic theorists would like, and that therefore we need an alternative approach: a regime which can continue to produce democratic outcomes, and which satisfies the requirements of political equality, in the absence of widespread participation by citizens….(More)”.

Anthology on Democratic Innovation


Report by Democracy Lab: “Democratic systems are in a phase of systemic transition: from the post-war understanding of what democracy is – and how it works – towards a different, deeper democracy. Given the numerous challenges democracies face, we need to ask how to make democracies more resilient and to explore what the next steps towards a new form of democracy could be. It seems unlikely that today’s challenges, such as the destruction of our ecosystem or structural inequality, can be solved with the paradigms, structures and processes that helped produce them.

Democratic systems need to be able to shape an increasingly complex world and respond to the socio-economic, cultural, technological, and ecological transformation processes that societies are going through. Public discourse about the future of democracy often solely focuses on democratic reforms in order to improve existing structures and processes within the parameters of postwar democracy.

Many ideas and experiments thus aim at improving the “status quo of politics”. From citizens’ assemblies to digital tools for deliberation and participation, there is an abundance of ideas and tools that could help update our democratic systems. In his book “Realizing Democracy”, Harvard scholar Roberto Mangabeira Unger adds a new element to this “update” with his idea of radical reform. In his words, “reform is radical when it addresses and changes the basic arrangements of a society; its formative structure of its institutions and enacted beliefs; it is reform because it deals with one discrete part of this structure at a time.” According to Unger, societies must work on both the radical and the incremental level of political reform. In addition to changes at policy level, societies must be willing to reflect on what would make a difference and open up to a more fundamental perspective and self-reflection on why democracy is needed, and how its structures can be rebuilt within the boundaries of the ecosystem….

The Anthology on Democratic Innovation presents a selection of the projects and ideas discussed during the Conference. It gives decision-makers, academia, journalists and civil society a glimpse into the vast array of ideas that are “already out there” in order to improve liberal democracies and make them fit for the 21st century….(More)”.

Could the open government movement shut the door on Freedom of Information?


Article in The Conversation: “For democracy to work, citizens need to know what their government is doing. Then they can hold government officials and institutions accountable.

Over the last 50 years, Freedom of Information – or FOI – laws have been one of the most useful methods for citizens to learn what government is doing. These state and federal laws give people the power to request, and get, government documents. From everyday citizens to journalists, FOI laws have proven a powerful way to uncover the often-secret workings of government.

But a potential threat is emerging – from an unexpected place – to FOI laws.

We are scholars of government administration, ethics and transparency. And our research leads us to believe that while FOI laws have always faced many challenges, including resistance, evasion, and poor implementation and enforcement, the last decade has brought a different kind of challenge in the form of a new approach to transparency….

The open government movement could help FOI implementation. Government information posted online, which is a core goal of open government advocates, can reduce the number of FOI requests. Open government initiatives can explicitly promote FOI by encouraging the passage of FOI laws, offering more training for officials who fill FOI requests, and developing technologies to make it easier to process and track FOI requests.

On the other hand, the relationship between open government and FOI may not always be positive in practice.

First, as with all kinds of public policy issues, resources – both money and political attention – are inherently scarce. Government officials now have to divide their attention between FOI and other open government initiatives. And funders now have to divide their financial resources between FOI and other open government initiatives.

Second, the open government reform movement as well as the FOI movement have long depended on nonprofit advocacy groups – from the National Freedom of Information Coalition and its state affiliates to the Sunlight Foundation – to obtain and disseminate government information. This means that the financial stability of those nonprofit groups is crucial. But their efforts, as they grow, may each only get a shrinking portion of the total amount of grant money available. Freedominfo.org, a website for gathering and comparing information on FOI laws around the world, had to suspend its operations in 2017 due to resources drying up.

We believe that priorities among government officials and good government advocates may also shift away from FOI. At a time when open data is “hot,” FOI programs could get squeezed as a result of this competition. Further, by allowing governments to claim credit for more politically convenient reforms such as online data portals, the open government agenda may create a false sense of transparency – there’s a lot more government information that isn’t available in those portals.

This criticism was leveled recently against Kenya, whose government launched a high-profile open data portal for publishing data on government performance and activities in 2011, yet delayed passage of an FOI law until 2016.

Similarly, in the United Kingdom, one government minister said in 2012, “I’d like to make Freedom of Information redundant, by pushing out so much data that people won’t have to ask for it.”…(More)”