
Stefaan Verhulst

Paper by Mila Gasco-Hernandez and Jose Ramon Gil-Garcia: “Previous studies have infrequently addressed the dynamic interactions among social, technical, and organizational variables in open government data initiatives. In addition, organization-level models have neglected to explain the role of management in decision-making processes about technology and data. This article contributes to addressing this gap in the literature by analyzing the complex relationships between open government data characteristics and the organizations and institutions in which they are embedded.

We systematically compare the open data inception and implementation processes, as well as their main results, in three Spanish local governments (Gava and Rubi in Catalonia and Gijon in Asturias) by using a model that combines the technology enactment framework with some specific constructs and relationships from the process model of computing change. Our resulting model is able to identify and explain the significant role of management in shaping and mediating different interactions, but also acknowledges the importance of organizational level variables and the context in which the open data initiative is taking place…(More)”.

The Role of Management in Open Data Initiatives in Local Governments: Opening the Organizational Black Box

Lessons from the new World Bank Global Report: “Almost daily, headlines in the world’s leading newspapers are full of examples of public sector failures: public money is mismanaged or outright misused; civil servants are not motivated or are poorly trained; government agencies fail to coordinate with each other; and as a result, citizens are either deprived of quality public services, or must go through a bureaucratic maze to access them.

These public-sector challenges are often present even in the world’s most developed countries. They are of course further exacerbated by lower levels of development.

So what hope do low- and middle-income countries have of making their public sectors function more effectively? Is this a futile enterprise altogether?

We believe it is not. Our new Global Report, Improving Public Sector Performance Through Innovation and Inter-Agency Coordination, argues that positive change is possible in many low- and middle-income countries. The report collects 15 inspiring country cases of reform and shows that such change does not necessarily require huge financial investment or complex IT systems. What seems to be required, instead, are five interconnected drivers of success:

  • Political leadership is needed because few, if any, of the innovations are a purely technocratic exercise.  Leaders need to find ways to collaborate with a wide range of internal and external stakeholders to overcome inherent opposition.
  • Institutional capacity building of existing bodies is a common element across many of the 15 cases. For reforms to endure, one ultimately needs to create sustainable institutions.
  • Incentives matter, both at the institutional level (e.g., through government-wide policy, creating systems and structures that shape institutional objectives, and program monitoring systems) as well as at the level of civil servants (e.g., through performance targets and reward systems).
  • Increased transparency can help deliver change in public sector performance by breaking down government silos and ensuring inter-agency information-sharing, and publishing or disseminating performance information.  Transparency can also be a powerful driver for changing incentives.
  • Technology, while not a panacea, is present in two-thirds of the featured cases. The reformers applied relevant, even basic, IT tools and know-how to their specific functional requirements and did not over-design their efforts. Furthermore, the technology application is rarely a stand-alone solution; rather, it is accompanied by policies and procedures to change behavior….(More)”.
The five drivers for improving public sector performance

Book edited by Torin Monahan and David Murakami Wood: “Surveillance is everywhere: in workplaces monitoring the performance of employees, social media sites tracking clicks and uploads, financial institutions logging transactions, advertisers amassing fine-grained data on customers, and security agencies siphoning up everyone’s telecommunications activities. Surveillance practices, although often hidden, have come to define the way modern institutions operate. Because of the growing awareness of the central role of surveillance in shaping power relations and knowledge across social and cultural contexts, scholars from many different academic disciplines have been drawn to “surveillance studies,” which in recent years has solidified as a major field of study.

Torin Monahan and David Murakami Wood’s Surveillance Studies is a broad-ranging reader that provides a comprehensive overview of the dynamic field. In fifteen sections, the book features selections from key historical and theoretical texts, samples of the best empirical research done on surveillance, introductions to debates about privacy and power, and cutting-edge treatments of art, film, and literature. While the disciplinary perspectives and foci of scholars in surveillance studies may be diverse, there is coherence and agreement about core concepts, ideas, and texts. This reader outlines these core dimensions and highlights various differences and tensions. In addition to a thorough introduction that maps the development of the field, the volume offers helpful editorial remarks for each section and brief prologues that frame the included excerpts. …(More)”.

Surveillance Studies: A Reader

Paper by H.G. (Haiko) van der Voort et al.: “Big data promises to transform public decision-making for the better by making it more responsive to actual needs and policy effects. However, much recent work on big data in public decision-making assumes a rational view of decision-making, which has been much criticized in the public administration debate.

In this paper, we apply this view, and a more political one, to the context of big data and offer a qualitative study. We question the impact of big data on decision-making, realizing that big data – including its new methods and functions – must inevitably encounter existing political and managerial institutions. By studying two illustrative cases of big data use processes, we explore how these two worlds meet. Specifically, we look at the interaction between data analysts and decision makers.

In this we distinguish between a rational view and a political view, and between an information logic and a decision logic. We find that big data provides ample opportunities for both analysts and decision makers to do a better job, but this doesn’t necessarily imply better decision-making, because big data also provides opportunities for actors to pursue their own interests. Big data enables both data analysts and decision makers to act as autonomous agents rather than as links in a functional chain. Therefore, big data’s impact cannot be interpreted only in terms of its functional promise; it must also be acknowledged as a phenomenon set to impact our policymaking institutions, including their legitimacy….(More)”.

Rationality and politics of algorithms. Will the promise of big data survive the dynamics of public decision making?

Natasha Rausch at Bloomberg: “At Houston’s City Hall last week, Mayor Sylvester Turner gathered with company CEOs, university professors, police officers, politicians and local judges to discuss a $6 billion problem they all have in common: the 2020 census.

City officials and business leaders are worried about people like 21-year-old Ana Espinoza, a U.S. citizen by birth who lives with undocumented relatives. Espinoza has no intention of answering the census because she worries it could expose her family and get them deported….

Getting an accurate count has broad economic implications across the city, said Laura Murillo, chief executive officer of the Hispanic Chamber. “For everyone, the census is important. It doesn’t matter if you’re a Republican or Democrat, black or white or green.”…

For growing businesses, the census is crucial for understanding the population they’re serving in different regions. Enterprise Rent-A-Car used the 2010 census to help diversify the company’s employee base. The data prompted Enterprise to staff a new location in Houston with Spanish-speaking employees to better serve area customers, said the company’s human resources manager Phil Dyson.

“It’s been one of our top locations,” he said.

Doing the Math

Texas stands to lose at least $1,161 in federal funding each year for each person not counted, according to a March report by Andrew Reamer, a research professor at the George Washington Institute of Public Policy. Multiplied by the estimated 506,000 unauthorized immigrants who live in the nation’s fourth-largest city, that puts at stake about $6 billion for Houston over the 10 years the census applies.

That’s just for programs such as Medicare and Medicaid. The potential loss is even larger when grants are taken into account for items like highways and community development, he said…(More)”.
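The arithmetic behind the headline figure can be checked in a few lines. This is a rough sketch using only the numbers quoted in the article, and it assumes the $1,161 loss per uncounted person is an annual figure, which is what makes the ten-year total come out to roughly $6 billion:

```python
# Figures from the article; the per-person loss is assumed to be annual.
funding_lost_per_person_per_year = 1_161  # Reamer's estimate, in USD
uncounted_population = 506_000            # estimated unauthorized immigrants in Houston
census_cycle_years = 10                   # a census count applies for a decade

total_at_stake = (funding_lost_per_person_per_year
                  * uncounted_population
                  * census_cycle_years)
print(f"${total_at_stake:,}")  # $5,874,660,000 -- the article's "about $6 billion"
```

As the article notes, this covers only programs such as Medicare and Medicaid, so the true exposure would be larger once grants for highways and community development are counted.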

Houston’s $6 Billion Census Problem: Frightened Immigrants

Paper by Björn Bartling, Ernst Fehr, David Huffman and Nick Netzer: “Trust affects almost all human relationships – in families, organizations, markets and politics. However, identifying the conditions under which trust, defined as people’s beliefs in the trustworthiness of others, has a causal effect on the efficiency of human interactions has proven to be difficult. We show experimentally and theoretically that trust indeed has a causal effect. The duration of the effect depends, however, on whether initial trust variations are supported by multiple equilibria.

We study a repeated principal-agent game with multiple equilibria and document empirically that an efficient equilibrium is selected if principals believe that agents are trustworthy, while players coordinate on an inefficient equilibrium if principals believe that agents are untrustworthy. Yet, if we change the institutional environment such that there is a unique equilibrium, initial variations in trust have short-run effects only. Moreover, if we weaken contract enforcement in the latter environment, exogenous variations in trust do not even have a short-run effect. The institutional environment thus appears to be key for whether trust has causal effects and whether the effects are transient or persistent…(More)”.

The causal effect of trust

Kelsey Sutton at Adweek: “As Hurricane Michael approached the Florida Panhandle, the Florida Division of Emergency Management tapped a tech company for help.

Over the past year, Florida’s DEM has worked closely with GasBuddy, a Boston-based app that uses crowdsourced data to identify fuel prices and inform first responders and the public about fuel availability or power outages at gas stations during storms. Since Hurricane Irma in 2017, GasBuddy and DEM have worked together to survey affected areas, helping Florida first responders identify how best to respond to petroleum shortages. With help from the location intelligence company Cuebiq, GasBuddy also provides estimated wait times at gas stations during emergencies.

DEM first noticed GasBuddy’s potential in 2016, when the app was collecting and providing data about fuel availability following a pipeline leak.

“DEM staff recognized how useful such information would be to Florida during any potential future disasters, and reached out to GasBuddy staff to begin a relationship,” a spokesperson for the Florida State Emergency Operations Center explained….

Stefaan Verhulst, co-founder and chief research and development officer at the Governance Laboratory at New York University, advocates for private corporations to partner with public institutions and NGOs. Private data collected by corporations is richer, more granular and more up-to-date than data collected through traditional social science methods, making that data useful for noncorporate purposes like research, Verhulst said. “Those characteristics are extremely valuable if you are trying to understand how society works,” Verhulst said….(More)”.

How Big Tech Is Working With Nonprofits and Governments to Turn Data Into Solutions During Disasters

Declaration: “…The 40th International Conference of Data Protection and Privacy Commissioners considers that any creation, development and use of artificial intelligence systems shall fully respect human rights, particularly the rights to the protection of personal data and to privacy, as well as human dignity, non-discrimination and fundamental values, and shall provide solutions to allow individuals to maintain control and understanding of artificial intelligence systems.

The Conference therefore endorses the following guiding principles, as its core values to preserve human rights in the development of artificial intelligence:

  1. Artificial intelligence and machine learning technologies should be designed, developed and used in respect of fundamental human rights and in accordance with the fairness principle, in particular by:
     • considering individuals’ reasonable expectations by ensuring that the use of artificial intelligence systems remains consistent with their original purposes, and that the data are used in a way that is not incompatible with the original purpose of their collection,
     • taking into consideration not only the impact that the use of artificial intelligence may have on the individual, but also the collective impact on groups and on society at large,
     • ensuring that artificial intelligence systems are developed in a way that facilitates human development and does not obstruct or endanger it, thus recognizing the need for delineation and boundaries on certain uses,…(More)
Declaration on Ethics and Data Protection in Artificial Intelligence

Douglas Yeung at Scientific American: “The conversation about unconscious bias in artificial intelligence often focuses on algorithms that unintentionally cause disproportionate harm to entire swaths of society—those that wrongly predict black defendants will commit future crimes, for example, or facial-recognition technologies developed mainly by using photos of white men that do a poor job of identifying women and people with darker skin.

But the problem could run much deeper than that. Society should be on guard for another twist: the possibility that nefarious actors could seek to attack artificial intelligence systems by deliberately introducing bias into them, smuggled inside the data that helps those systems learn. This could introduce a worrisome new dimension to cyberattacks, disinformation campaigns or the proliferation of fake news.

According to a U.S. government study on big data and privacy, biased algorithms could make it easier to mask discriminatory lending, hiring or other unsavory business practices. Algorithms could be designed to take advantage of seemingly innocuous factors that can be discriminatory. Employing existing techniques, but with biased data or algorithms, could make it easier to hide nefarious intent. Commercial data brokers collect and hold onto all kinds of information, such as online browsing or shopping habits, that could be used in this way.

Biased data could also serve as bait. Corporations could release biased data with the hope competitors would use it to train artificial intelligence algorithms, causing competitors to diminish the quality of their own products and consumer confidence in them.

Algorithmic bias attacks could also be used to more easily advance ideological agendas. If hate groups or political advocacy organizations want to target or exclude people on the basis of race, gender, religion or other characteristics, biased algorithms could give them either the justification or more advanced means to directly do so. Biased data also could come into play in redistricting efforts that entrench racial segregation (“redlining”) or restrict voting rights.

Finally, national security threats from foreign actors could use deliberate bias attacks to destabilize societies by undermining government legitimacy or sharpening public polarization. This would fit naturally with tactics that reportedly seek to exploit ideological divides by creating social media posts and buying online ads designed to inflame racial tensions….(More)”.

When AI Misjudgment Is Not an Accident

Paper by Carla Hamida and Amanda Landi: “Recently, Facebook creator Mark Zuckerberg was on trial for the misuse of personal data. In 2013, the National Security Agency was exposed by Edward Snowden for invading the privacy of inhabitants of the United States by examining personal data. We see in the news examples, like the two just described, of government agencies and private companies being less than truthful about their use of our data. A related issue is that these same government agencies and private companies do not share their own data, and this creates the openness of data problem.

Government, academics, and citizens can all play a role in making data more open. At present, several non-profit organizations research data openness, such as the Open Data Charter, the Global Open Data Index, and the Open Data Barometer. These organizations use different methods of measuring the openness of data, which leads us to ask what open data means, how its openness can be measured, who decides how open data should be, and to what extent society is affected by the availability, or lack of availability, of data. In this paper, we explore these questions through an examination of two of the non-profit organizations that study the open data problem extensively….(More)”.

The Lack of Decentralization of Data: Barriers, Exclusivity, and Monopoly in Open Data
