The Third Pillar: How Markets and the State Leave the Community Behind


Book by Raghuram Rajan: “….In The Third Pillar he offers up a magnificent big-picture framework for understanding how these three forces–the state, markets, and our communities–interact, why things begin to break down, and how we can find our way back to a more secure and stable plane. 

The “third pillar” of the title is the community we live in. Economists all too often understand their field as the relationship between markets and the state, and they leave squishy social issues for other people. That’s not just myopic, Rajan argues; it’s dangerous. All economics is actually socioeconomics – all markets are embedded in a web of human relations, values and norms. As he shows, throughout history, technological phase shifts have ripped the market out of those old webs and led to violent backlashes, and to what we now call populism. Eventually, a new equilibrium is reached, but it can be ugly and messy, especially if done wrong. 

Right now, we’re doing it wrong. As markets scale up, the state scales up with them, concentrating economic and political power in flourishing central hubs and leaving the periphery to decompose, figuratively and even literally. Instead, Rajan offers a way to rethink the relationship between the market and civil society and argues for a return to strengthening and empowering local communities as an antidote to growing despair and unrest. Rajan is not a doctrinaire conservative, so his ultimate argument, that decision-making has to be devolved to the grass roots or our democracy will continue to wither, is sure to be provocative. But even setting aside its solutions, The Third Pillar is a masterpiece of explication, a book that will be a classic of its kind for offering a wise, authoritative and humane explanation of the forces that have wrought such a sea change in our lives….(More)”.

Facebook will open its data up to academics to see how it impacts elections


MIT Technology Review: “More than 60 researchers from 30 institutions will get access to Facebook user data to study its impact on elections and democracy, and how it’s used by advertisers and publishers.

A vast trove: Facebook will let academics see which websites its users linked to from January 2017 to February 2019. Notably, that means they won’t be able to look at the platform’s impact on the US presidential election in 2016, or on the Brexit referendum in the UK in the same year.

Despite this slightly glaring omission, it’s still hard to wrap your head around the scale of the data that will be shared, given that Facebook is used by 1.6 billion people every day. That’s more people than live in all of China, the most populous country on Earth. It will be one of the largest data sets on human behavior online to ever be released.

The process: Facebook didn’t pick the researchers. They were chosen by the Social Science Research Council, a US nonprofit. Facebook has been working on this project for over a year, as it tries to balance research interests against user privacy and confidentiality.

Privacy: In a blog post, Facebook said it will use a number of statistical techniques to make sure the data set can’t be used to identify individuals. Researchers will be able to access it only via a secure portal that uses a VPN and two-factor authentication, and there will be limits on the number of queries they can each run….(More)”.

Politics and Technology in the Post-Truth Era


Book edited by Anna Visvizi and Miltiadis D. Lytras: “Advances in information and communication technology (ICT) have directly impacted the way in which politics operates today. Bringing together research on Europe, the US, South America, the Middle East, Asia and Africa, this book examines the relationship between ICT and politics from a global perspective.

Technological innovations such as big data, data mining, sentiment analysis, cognitive computing, artificial intelligence, virtual reality, augmented reality, social media and blockchain technology are reshaping the way ICT intersects with politics, and in this collection contributors examine these developments, demonstrating their impact on the political landscape. Chapters examine topics such as cyberwarfare and propaganda, the post-Soviet space, Snowden, US national security, e-government, GDPR, democratization in Africa and internet freedom.


Providing an overview of new research on the emerging relationship between the promise and potential inherent in ICT and its impact on politics, this edited collection will prove an invaluable text for students, researchers and practitioners working in the fields of Politics, International Relations and Computer Science…(More)”.

2018 Global Go To Think Tank Index Report


Report by James G. McGann: “The Think Tanks and Civil Societies Program (TTCSP) of the Lauder Institute at the University of Pennsylvania conducts research on the role policy institutes play in governments and civil societies around the world. Often referred to as the “think tanks’ think tank,” TTCSP examines the evolving role and character of public policy research organizations. Over the last 27 years, the TTCSP has developed and led a series of global initiatives that have helped bridge the gap between knowledge and policy in critical policy areas such as international peace and security, globalization and governance, international economics, environmental issues, information and society, poverty alleviation, and healthcare and global health. These international collaborative efforts are designed to establish regional and international networks of policy institutes and communities that improve policy making while strengthening democratic institutions and civil societies around the world.

The TTCSP works with leading scholars and practitioners from think tanks and universities in a variety of collaborative efforts and programs, and produces the annual Global Go To Think Tank Index that ranks the world’s leading think tanks in a variety of categories. This is achieved with the help of a panel of over 1,796 peer institutions and experts from the print and electronic media, academia, public and private donor institutions, and governments around the world. We have strong relationships with leading think tanks around the world, and our annual Think Tank Index is used by academics, journalists, donors and the public to locate and connect with the leading centers of public policy research around the world. Our goal is to increase the profile and performance of think tanks and raise public awareness of the important role think tanks play in governments and civil societies around the globe….(More)”.

Institutions as Social Theory


Blogpost by Titus Alexander: “The natural sciences comprise a set of institutions and methods designed to improve our understanding of the physical world. One of the most powerful things science does is to produce theories – models of reality – that are used by others to change the world. The benefits of using science are so great that societies have created many channels to develop and use research to improve the human condition.

Social scientists also seek to improve the human condition. However, the channels from research to application are often weak, and most social research is buried in academic papers and books. Some will inform policy via think tanks, civil servants or pressure groups, but practitioners and politicians often prefer their own judgement and prejudices, using research only when it suits them. But a working example – the institution as the method – has more influence than a research paper. The evidence is tangible, like an experiment in natural science, and includes all the complexities of real life. It demonstrates its reliability over time and provides proof of what works.

Reflexivity is key to social science

In the physical sciences the investigator is separate from the subject of investigation and she or he has no influence on what they observe. Generally, theories in the human sciences cannot provide this kind of detached explanation, because societies are reflexive. When we study human behaviour we also influence it. People change what they do in response to being studied. They use theories to change their own behaviour or the behaviour of others. Many scholars and practitioners have explored reflexivity, including Albert Bandura, Pierre Bourdieu and the financier George Soros. Anthony Giddens called it the ‘double hermeneutic’.

The fact that society is reflexive is the key to effective social science. Like scientists, societies create systematic detachment to increase objectivity in decision-making, through advisers, boards, regulators, opinion polls and so on. Peer-reviewed social science research is a form of detachment, but it is often so detached as to be irrelevant….(More)”.

The Think-Tank Dilemma


Blog by Yoichi Funabashi: “Without the high-quality research that independent think tanks provide, there can be no effective policymaking, nor even a credible basis for debating major issues. Insofar as funding challenges, foreign influence-peddling, and populist attacks on truth pose a threat to such institutions, they threaten democracy itself….

The Brookings Institution in Washington, DC – perhaps the world’s top think tank – is under scrutiny for receiving six-figure donations from Chinese telecommunications giant Huawei, which many consider to be a security threat. And since the barbaric murder of Saudi journalist Jamal Khashoggi last October, many other Washington-based think tanks have come under pressure to stop accepting donations from Saudi Arabia.

These recent controversies have given rise to a narrative that Washington-based think tanks are facing a funding crisis. In fact, traditional think tanks are confronting three major challenges that have put them in a uniquely difficult situation. Not only are they facing increased competition from for-profit think tanks such as the McKinsey Global Institute and the Eurasia Group; they also must negotiate rising geopolitical tensions, especially between the United States and China. And complicating matters further, many citizens, goaded by populist harangues, have become dismissive of “experts” and the fact-based analyses that think tanks produce (or at least should produce).

With respect to the first challenge, Daniel Drezner of Tufts University argues in The Ideas Industry: How Pessimists, Partisans, and Plutocrats are Transforming the Marketplace of Ideas that for-profit think tanks have engaged in thought leadership by operating as platforms for provocative thinkers who push big ideas. Whereas many non-profit think tanks – as well as universities and non-governmental organizations – remain “old-fashioned” in their approach to data, their for-profit counterparts thrive by finding the one statistic that captures public attention in the digital age. Given their access to both public and proprietary information, for-profit think tanks are also able to maximize the potential of big data in ways that traditional think tanks cannot.

Moreover, with the space for balanced foreign-policy arguments narrowing, think tanks are at risk of becoming tools of geopolitical statecraft. This is especially true now that US-China relations are deteriorating and becoming more ideologically tinged.

Over time, foreign governments of all stripes have cleverly sought to influence policymaking not only in Washington, but also in London, Brussels, Berlin, and elsewhere, by becoming significant donors to think tanks. Governments realize that the well-connected think tanks that act as “power brokers” vis-à-vis the political establishment have been facing fundraising challenges since the 2008 financial crisis. In some cases, locally based think tanks have even been accused of becoming fronts for foreign authoritarian governments….(More)”.


Index: Open Data


By Alexandra Shaw, Michelle Winowatan, Andrew Young, and Stefaan Verhulst

The Living Library Index – inspired by the Harper’s Index – provides important statistics and highlights global trends in governance innovation. This installment focuses on open data and was originally published in 2018.

Value and Impact

  • The projected year by which all 28+ EU member countries will have a fully operating open data portal: 2020

  • Projected size of the European open data market in 2020, after expected growth of 36.9% between 2016 and 2020: EUR 75.7 billion

Public Views on and Use of Open Government Data

  • Percentage of Americans who do not trust the federal government or social media sites to protect their data: Approximately 50%

  • Key findings from The Economist Intelligence Unit report on Open Government Data Demand:

    • Percentage of respondents who say the key reason why governments open up their data is to create greater trust between the government and citizens: 70%

    • Percentage of respondents who say OGD plays an important role in improving lives of citizens: 78%

    • Percentage of respondents who say OGD helps with daily decision making especially for transportation, education, environment: 53%

    • Percentage of respondents who cite lack of awareness about OGD and its potential use and benefits as the greatest barrier to usage: 50%

    • Percentage of respondents who say they lack access to usable and relevant data: 31%

    • Percentage of respondents who think they don’t have sufficient technical skills to use open government data: 25%

    • Percentage of respondents who feel the number of OGD apps available is insufficient, indicating an opportunity for app developers: 20%

    • Percentage of respondents who say OGD has the potential to generate economic value and new business opportunity: 61%

    • Percentage of respondents who say they don’t trust governments to keep data safe, protected, and anonymized: 19%

Efforts and Involvement

  • Time that has passed since open government advocates convened to create a set of principles for open government data – the meeting that started the open government data movement: 10 years

  • Participants in the Open Government Partnership today: 79 countries and 20 subnational governments

  • Percentage of “open data readiness” in Europe according to European Data Portal: 72%

    • Open data readiness consists of four indicators: presence of policy, national coordination, licensing norms, and use of data.

  • Number of U.S. cities with Open Data portals: 27

  • Number of governments who have adopted the International Open Data Charter: 62

  • Number of non-state organizations endorsing the International Open Data Charter: 57

  • Number of countries analyzed by the Open Data Index: 94

  • Number of Latin American countries that do not have open data portals as of 2017: 4 total – Belize, Guatemala, Honduras and Nicaragua

  • Number of cities participating in the Open Data Census: 39

Demand for Open Data

  • Open data demand measured by frequency of open government data use according to The Economist Intelligence Unit report:

    • Australia

      • Monthly: 15% of respondents

      • Quarterly: 22% of respondents

      • Annually: 10% of respondents

    • Finland

      • Monthly: 28% of respondents

      • Quarterly: 18% of respondents

      • Annually: 20% of respondents

    • France

      • Monthly: 27% of respondents

      • Quarterly: 17% of respondents

      • Annually: 19% of respondents

    • India

      • Monthly: 29% of respondents

      • Quarterly: 20% of respondents

      • Annually: 10% of respondents

    • Singapore

      • Monthly: 28% of respondents

      • Quarterly: 15% of respondents

      • Annually: 17% of respondents 

    • UK

      • Monthly: 23% of respondents

      • Quarterly: 21% of respondents

      • Annually: 15% of respondents

    • US

      • Monthly: 16% of respondents

      • Quarterly: 15% of respondents

      • Annually: 20% of respondents

  • Number of FOIA requests received in the US for fiscal year 2017: 818,271

  • Number of FOIA requests processed in the US for fiscal year 2017: 823,222

  • Distribution of FOIA requests in 2017 among the top 5 agencies with the highest number of requests:

    • DHS: 45%

    • DOJ: 10%

    • NARA: 7%

    • DOD: 7%

    • HHS: 4%

Examining Datasets

  • Country with highest index score according to ODB Leaders Edition: Canada (76 out of 100)

  • Country with lowest index score according to ODB Leaders Edition: Sierra Leone (22 out of 100)

  • Share of datasets that are open in the top 30 governments according to ODB Leaders Edition: Fewer than 1 in 5

  • Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition: 19%

  • Average percentage of datasets that are open in the top 30 open data governments according to ODB Leaders Edition by sector/subject:

    • Budget: 30%

    • Companies: 13%

    • Contracts: 27%

    • Crime: 17%

    • Education: 13%

    • Elections: 17%

    • Environment: 20%

    • Health: 17%

    • Land: 7%

    • Legislation: 13%

    • Maps: 20%

    • Spending: 13%

    • Statistics: 27%

    • Trade: 23%

    • Transport: 30%

  • Percentage of countries that release data on government spending according to ODB Leaders Edition: 13%

  • Percentage of government data that is updated at regular intervals according to ODB Leaders Edition: 74%

  • Number of datasets available through:

  • Percentage of datasets classed as “open” in the 94 places worldwide analyzed by the Open Data Index: 11%

  • Percentage of open datasets in the Caribbean, according to Open Data Census: 7%

  • Number of companies whose data is available through OpenCorporates: 158,589,950

City Open Data

  • New York City

  • Singapore

    • Number of datasets published in Singapore: 1,480

    • Percentage of datasets with standardized format: 35%

    • Percentage of datasets made as raw as possible: 25%

  • Barcelona

    • Number of datasets published in Barcelona: 443

    • Open data demand in Barcelona measured by:

      • Number of unique sessions in the month of September 2018: 5,401

    • Quality of datasets published in Barcelona according to Tim Berners-Lee’s 5-star Open Data scheme: 3 stars

  • London

    • Number of datasets published in London: 762

    • Number of data requests since October 2014: 325

  • Bandung

    • Number of datasets published in Bandung: 1,417

  • Buenos Aires

    • Number of datasets published in Buenos Aires: 216

  • Dubai

    • Number of datasets published in Dubai: 267

  • Melbourne

    • Number of datasets published in Melbourne: 199

Sources

  • About OGP, Open Government Partnership. 2018.  

Seven design principles for using blockchain for social impact


Stefaan Verhulst at Apolitical: “2018 will probably be remembered as the bust of the blockchain hype. Yet even as cryptocurrencies continue to sink in value and popular interest, the potential of using blockchain technologies to achieve social ends remains important to consider but poorly understood.

In 2019, business will continue to explore blockchain for sectors as disparate as finance, agriculture, logistics and healthcare. Policymakers and social innovators should also leverage 2019 to become more sophisticated about blockchain’s real promise, limitations and current practice.

In a recent report I prepared with Andrew Young, with the support of the Rockefeller Foundation, we looked at the potential risks and challenges of using blockchain for social change — or “Blockchan.ge.” A number of implementations and platforms are already demonstrating potential social impact.

The technology is now being used to address issues as varied as homelessness in New York City, the Rohingya crisis in Myanmar and government corruption around the world.

In an illustration of the breadth of current experimentation, Stanford’s Center for Social Innovation recently analysed and mapped nearly 200 organisations and projects trying to create positive social change using blockchain. Likewise, the GovLab is developing a mapping of blockchange implementations across regions and topic areas; it currently contains 60 entries.

All these examples provide impressive — and hopeful — proof of concept. Yet despite the very clear potential of blockchain, there has been little systematic analysis. For what types of social impact is it best suited? Under what conditions is it most likely to lead to real social change? What challenges does blockchain face, what risks does it pose and how should these be confronted and mitigated?

These are just some of the questions our report, which builds its analysis on 10 case studies assembled through original research, seeks to address.

While the report is focused on identity management, it contains a number of lessons and insights that are applicable more generally to the subject of blockchange.

In particular, it contains seven design principles that can guide individuals or organisations considering the use of blockchain for social impact. We call these the Genesis principles, and they are outlined at the end of this article…(More)”.

Distributed, privacy-enhancing technologies in the 2017 Catalan referendum on independence: New tactics and models of participatory democracy


M. Poblet at First Monday: “This paper examines new civic engagement practices unfolding during the 2017 referendum on independence in Catalonia. These practices constitute one of the first signs of some emerging trends in the use of the Internet for civic and political action: the adoption of horizontal, distributed, and privacy-enhancing technologies that rely on P2P networks and advanced cryptographic tools. In this regard, the case of the 2017 Catalan referendum, framed within conflicting political dynamics, can be considered a first of its kind in participatory democracy. The case also offers an opportunity to reflect on an interesting paradox that twenty-first century activism will face: the more it will rely on privacy-friendly, secured, and encrypted networks, the more open, inclusive, ethical, and transparent it will need to be….(More)”.

To Reduce Privacy Risks, the Census Plans to Report Less Accurate Data


Mark Hansen at the New York Times: “When the Census Bureau gathered data in 2010, it made two promises. The form would be “quick and easy,” it said. And “your answers are protected by law.”

But mathematical breakthroughs, easy access to more powerful computing, and widespread availability of large and varied public data sets have made the bureau reconsider whether the protection it offers Americans is strong enough. To preserve confidentiality, the bureau’s directors have determined they need to adopt a “formal privacy” approach, one that adds uncertainty to census data before it is published and achieves privacy assurances that are provable mathematically.

The census has always added some uncertainty to its data, but a key innovation of this new framework, known as “differential privacy,” is a numerical value describing how much privacy loss a person will experience. It determines the amount of randomness — “noise” — that needs to be added to a data set before it is released, and sets up a balancing act between accuracy and privacy. Too much noise would mean the data would not be accurate enough to be useful — in redistricting, in enforcing the Voting Rights Act or in conducting academic research. But too little, and someone’s personal data could be revealed.
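
To make that balancing act concrete, here is a minimal sketch of the classic Laplace mechanism, the textbook way of adding calibrated noise to a count. It is an illustration only, not the Census Bureau’s actual algorithm, and the block population in it is hypothetical. A count of people has sensitivity 1 (adding or removing one person changes it by at most 1), so noise drawn from a Laplace distribution with scale 1/ε satisfies ε-differential privacy; a smaller ε, the privacy-loss value described above, means more noise and stronger privacy.

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a differentially private count via the Laplace mechanism.

    Adding or removing one person changes a simple count by at most
    `sensitivity` (here 1), so Laplace noise with scale sensitivity/epsilon
    gives epsilon-differential privacy for this single query.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Illustration only: a hypothetical census-block population of 47 people.
true_block_population = 47
for eps in (0.1, 1.0, 10.0):
    noisy = laplace_count(true_block_population, eps)
    print(f"epsilon = {eps:>4}: noisy count = {noisy:6.1f}")
```

Running the loop a few times shows the trade-off described above: at ε = 0.1 the noise scale is 10, so the reported count for a small block is routinely off by ten or more people, while at ε = 10 the scale is 0.1 and the count rarely moves by more than a fraction of a person.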

On Thursday, the bureau will announce the trade-off it has chosen for data publications from the 2018 End-to-End Census Test it conducted in Rhode Island, the only dress rehearsal before the actual census in 2020. The bureau has decided to enforce stronger privacy protections than companies like Apple or Google had when they each first took up differential privacy….

In presentation materials for Thursday’s announcement, special attention is paid to lessening any problems with redistricting: the potential complications of using noisy counts of voting-age people to draw district lines. (By contrast, in 2000 and 2010 the swapping mechanism produced exact counts of potential voters down to the block level.)

The Census Bureau has been an early adopter of differential privacy. Still, instituting the framework on such a large scale is not an easy task, and even some of the big technology firms have had difficulties. For example, shortly after Apple’s announcement in 2016 that it would use differential privacy for data collected from its macOS and iOS operating systems, it was revealed that the actual privacy loss of its systems was much higher than advertised.

Some scholars question the bureau’s abandonment of techniques like swapping in favor of differential privacy. Steven Ruggles, Regents Professor of history and population studies at the University of Minnesota, has relied on census data for decades. Through the Integrated Public Use Microdata Series, he and his team have regularized census data dating to 1850, providing consistency between questionnaires as the forms have changed, and enabling researchers to analyze data across years.

“All of the sudden, Title 13 gets equated with differential privacy — it’s not,” he said, adding that if you make a guess about someone’s identity from looking at census data, you are probably wrong. “That has been regarded in the past as protection of privacy. They want to make it so that you can’t even guess.”

“There is a trade-off between usability and risk,” he added. “I am concerned they may go far too far on privileging an absolutist standard of risk.”

In a working paper published Friday, he said that with the number of private services offering personal data, a prospective hacker would have little incentive to turn to public data such as the census “in an attempt to uncover uncertain, imprecise and outdated information about a particular individual.”…(More)”.