When the Rule of Law Is Not Working


A conversation with Karl Sigmund at Edge: “…Now, I’m getting back to evolutionary game theory, the theory of evolution of cooperation and the social contract, and how the social contract can be subverted by corruption. That’s what interests me most currently. Of course, that is not a new story. I believe it explains a lot of what I see happening in my field and in related fields. The ideas that survive are the ideas that are fruitful in the sense of quickly producing a lot of publications, and that’s not necessarily correlated with these ideas being important to advancing science.

Corruption is a wicked problem, wicked in the technical sense of sociology, and it’s not something that will go away. You can reduce it, but as soon as you stop your efforts, it comes back again. Of course, there are many sides to corruption, but everybody seems now to agree that it is a very important problem. In fact, there was a Gallup poll recently in which people were asked what the number one problem in today’s world is. You would think it would be climate change or overpopulation, but it turned out the majority said “corruption.” So, it’s a problem that is affecting us deeply.

There are so many different types of corruption, but the official definition is “a misuse of public trust for private means.” And this need not be by state officials; it could also be by CEOs, or by managers of non-governmental organizations, or by a soccer referee, for that matter. It is always the misuse of public trust for private means, which of course takes many different forms; for instance, you have something called pork barreling, which is a wonderful expression in the United States, or embezzlement of funds, and so on.

I am mostly interested in the effect of bribery upon the judicial system. If trust in contracts breaks down, then the economy breaks down, because trust is at the root of the economy. There are staggering statistics illustrating that the economic welfare of a state is closely related to its corruption perception index. Every year, statistics about corruption are published by organizations such as Transparency International and other non-governmental organizations. It is truly astonishing how closely the gradient across countries in corruption levels aligns with the gradient in welfare, in household income and things like this.

The paralyzing effect of this type of corruption upon the economy is something that is extremely interesting. Lots of economists are now turning their interest to that, which is new. In the 1970s, the Nobel Prize-winning economist Gunnar Myrdal observed that corruption was practically taboo as a research topic among economists. That has certainly changed in the decades since. It has become a very interesting topic for students of law, economics, and sociology, and for historians, of course, because corruption has always been with us. This is now a booming field, and I would like to approach it with evolutionary game theory.

Evolutionary game theory has a long tradition, and I have witnessed its development practically from the beginning. Some of the most important pioneers were Robert Axelrod and John Maynard Smith. In particular, Axelrod, who in the early 1980s wrote a truly seminal book called The Evolution of Cooperation, built on the iterated prisoner’s dilemma. He showed that there is a way out of the social dilemma, one based on reciprocity. This surprised economists, particularly game theoreticians. He showed that by viewing social dilemmas in the context of a population where people learn from each other, where social learning imitates whatever type of behavior is currently the most successful, you can place them in a context where cooperative strategies based on reciprocation, like tit for tat, can evolve….(More)”.
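The mechanism Axelrod studied can be sketched in a few lines of code. The payoff values below are the standard textbook ones for the prisoner’s dilemma, not numbers from the interview; the point is simply that a reciprocal strategy like tit for tat sustains cooperation over repeated play, while mutual defection locks in the worst joint outcome.

```python
# Standard prisoner's dilemma payoffs: (my move, their move) -> my payoff.
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

def play(strat_a, strat_b, rounds=10):
    """Total payoffs for two strategies over repeated rounds."""
    hist_a, hist_b = [], []  # moves made so far by a and by b
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a(hist_b), strat_b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (30, 30): sustained cooperation
print(play(always_defect, always_defect))  # (10, 10): mutual defection
```

Over ten rounds, two reciprocators each earn 30 while two defectors each earn only 10, which is the quantitative heart of Axelrod’s result.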

Here’s What the USMCA Does for Data Innovation


Joshua New at the Center for Data Innovation: “…the Trump administration announced the United States-Mexico-Canada Agreement (USMCA), the trade deal it intends to replace NAFTA with. The parties—Canada, Mexico, and the United States—still have to adopt the deal, and if they do, they will enjoy several welcome provisions that can give a boost to data-driven innovation in all three countries.

First, USMCA is the first trade agreement in the world to promote the publication of open government data. Article 19.18 of the agreement officially recognizes that “facilitating public access to and use of government information fosters economic and social development, competitiveness, and innovation.” Though the deal does not require parties to publish open government data, to the extent they choose to publish this data, it directs them to adhere to best practices for open data, including ensuring it is in open, machine-readable formats. Additionally, the deal directs parties to try to cooperate and identify ways they can expand access to and the use of government data, particularly for the purposes of creating economic opportunity for small and medium-sized businesses. While this is a welcome provision, the United States still needs legislation to ensure that publishing open data becomes an official responsibility of federal government agencies.

Second, Article 19.11 of USMCA prevents parties from restricting “the cross-border transfer of information, including personal information, by electronic means if this activity is for the conduct of the business of a covered person.” Additionally, Article 19.12 prevents parties from requiring people or firms “to use or locate computing facilities in that Party’s territory as a condition for conducting business in that territory.” In effect, these provisions prevent parties from enacting protectionist data localization requirements that inhibit the flow of data across borders. This is important because many countries have disingenuously argued for data localization requirements on the grounds that they protect citizens from privacy or security harms, when the location of data has no bearing on either; the real aim is to prop up their domestic data-driven industries….(More)”.

A Doctor’s Prescription: Data May Finally Be Good for Your Health


Interview by Art Kleiner: “In 2015, Robert Wachter published The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, a skeptical account of digitization in hospitals. Despite the promise offered by the digital transformation of healthcare, electronic health records had not delivered better care and greater efficiency. The cumbersome design, legacy procedures, and resistance from staff were frustrating everyone — administrators, nurses, consultants, and patients. Costs continued to rise, and preventable medical mistakes were not spotted. One patient at Wachter’s own hospital, one of the nation’s finest, was given 39 times the correct dose of antibiotics by an automated system that nobody questioned. The teenager survived, but it was clear that there needed to be a new approach to the management and use of data.

Wachter has for decades considered the delivery of healthcare through a lens focused on patient safety and quality. In 1996, he coauthored a paper in the New England Journal of Medicine that coined the term hospitalist in describing and promoting a new way of managing patients in hospitals: having one doctor — the hospitalist — “own” the patient journey from admission to discharge. The primary goal was to improve outcomes and save lives. Wachter argued it would also reduce costs and increase efficiency, making the business case for better healthcare. And he was right. Today there are more than 50,000 hospitalists, and it took just two years from the article’s publication to have the first data proving his point. In 2016, Wachter was named chair of the Department of Medicine at the University of California, San Francisco (UCSF), where he has worked since 1990.

Today, Wachter is, to paraphrase the title of a recent talk, less grumpy than he used to be about health tech. The hope part of his book’s title has materialized in some areas faster than he predicted. AI’s advances in imaging are already helping make the detection of cancers more accurate. As data collection has become better systematized, big technology firms such as Google, Amazon, and Apple are entering (in Google’s case, reentering) the field and having more success focusing their problem-solving skills on healthcare issues. In his San Francisco office, Wachter sat down with strategy+business to discuss why the healthcare system may finally be about to change….

Systems for Fresh Thinking

S+B: The changes you appreciate seem to have less to do with technological design and more to do with people getting used to the new systems, building their own variations, and making them work.
WACHTER: The original electronic health record was just a platform play to get the data in digital form. It didn’t do anything particularly helpful in terms of helping physicians make better decisions or connecting one kind of doctor with another. But it was a start.

I remember that when we were starting to develop our electronic health record at UCSF, 12 or 13 years ago, I hired a physician who is now in charge of our health computer system. I said to him, “We don’t have our electronic health record in yet, but I’m pretty sure we will in seven or eight years. What will your job be when that’s done?” I actually thought once the system was fully implemented, we’d be done with the need to innovate and evolve in health IT. That, of course, was asinine.

S+B: That’s like saying to an auto mechanic, “What will your job be when we have automatic transmissions?”
WACHTER: Right, but even more so, because many of us saw electronic health records as the be-all and end-all of digitally facilitated medicine. But putting in the electronic health record is just step one of 10. Then you need to start connecting all the pieces, then add analytics that make sense of the data and make predictions, and then build tools and apps that fit into the workflow and change the way you work.

One of my biggest epiphanies was this: When you digitize, in any industry, nobody is clever enough to actually change anything. All they know how to do is digitize the old practice. You only start seeing real progress when smart people come in, begin using the new system, and say, “Why the hell do we do it that way?” And then you start thinking freshly about the work. That’s when you have a chance to reimagine the work in a digital environment…(More)”.

Text Analysis Systems Mine Workplace Emails to Measure Staff Sentiments


Alan Rothman at LLRX: “…For all of these good, bad or indifferent workplaces, a key question is whether any of management’s actions to engage the staff and listen to their concerns ever resulted in improved working conditions and higher levels of job satisfaction.

The answer is most often “yes.” Just having a say in, and some sense of control over, our jobs and workflows can indeed have a demonstrable impact on morale, camaraderie and the bottom line. The Hawthorne Effect, also termed the “observer effect,” was first documented during studies in the 1920s and 1930s, when the management of a factory made improvements to the lighting and work schedules. In turn, worker satisfaction and productivity temporarily increased. This was not so much because there was more light, but rather because the workers sensed that management was paying attention to, and then acting upon, their concerns. The workers perceived they were no longer just cogs in a machine.

Perhaps, too, the Hawthorne Effect is in some ways the workplace equivalent of Heisenberg’s uncertainty principle in physics. To vastly oversimplify this slippery concept, the mere act of measuring a subatomic particle inevitably disturbs it.¹

Giving the processes of observation, analysis and change at the enterprise level a modern (but non-quantum) spin is a fascinating new article in the September 2018 issue of The Atlantic entitled What Your Boss Could Learn by Reading the Whole Company’s Emails, by Frank Partnoy. I highly recommend a click-through and full read if you have an opportunity. I will summarize and annotate it, and then, considering my own thorough lack of understanding of the basics of y=f(x), pose some of my own physics-free questions….

Today the text analytics business, like the work done by KeenCorp, is thriving. It has long been established as the processing behind email spam filters. Now it is finding other applications, including monitoring corporate reputations on social media and other sites.²

The finance industry is another growth sector, as investment banks and hedge funds scan a wide variety of information sources to locate “slight changes in language” that may point towards pending increases or decreases in share prices. Financial research providers are using artificial intelligence to mine “insights” from their own selections of news and analytical sources.

But is this technology effective?

In a paper entitled Lazy Prices, by Lauren Cohen (Harvard Business School and NBER), Christopher Malloy (Harvard Business School and NBER), and Quoc Nguyen (University of Illinois at Chicago), in a draft dated February 22, 2018, the researchers found that a company’s share price can measurably decline after the firm “subtly changes” the “descriptions of certain risks” in its reporting. Their example is NetApp’s 2010 annual report. Algorithms can detect such changes more quickly and effectively than humans. The company subsequently clarified in its 2011 annual report its “failure to comply” with reporting requirements in 2010. A highly skilled stock analyst “might have missed that phrase,” but once again it was captured by the researchers’ algorithms.
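As an illustration of this kind of change detection (a toy sketch, not the Lazy Prices authors’ actual method), successive filings can be compared with a bag-of-words cosine similarity, flagging sections whose wording drifts between years. The filing snippets below are invented:

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Bag-of-words cosine similarity between two text snippets."""
    ca, cb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca)
    norm = math.sqrt(sum(v * v for v in ca.values())) * \
           math.sqrt(sum(v * v for v in cb.values()))
    return dot / norm if norm else 0.0

# Invented risk-factor snippets standing in for two annual reports.
risk_2010 = "we face risks related to compliance with export regulations"
risk_2011 = "we failed to comply with export regulations and face penalties"

# A score well below 1.0 flags the section for closer human review.
print(round(cosine_similarity(risk_2010, risk_2011), 2))  # 0.63
```

Real systems use far richer representations, but the principle is the same: an algorithm never gets bored comparing this year’s boilerplate to last year’s.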

In the hands of a “skeptical investor,” this information might well have led them to question the differences between the 2010 and 2011 annual reports and, in turn, saved them a great deal of money. This detection was an early signal of a looming decline in NetApp’s stock. Half a year after the 2011 report’s publication, it was reported that the Syrian government had bought the company’s equipment and “used that equipment to spy on its citizens,” causing further declines.

Now text analytics is being aimed at a new target: the content of employees’ communications. Although it has been found that workers have no expectation of privacy in their workplaces, some companies remain reluctant to mine employee communications because of privacy concerns. Even so, companies are finding it increasingly difficult to resist the “urge to mine employee information,” especially as text analysis systems continue to improve.

Among the evolving enterprise applications is the use of these tools by human resources departments to assess overall employee morale. For example, Vibe is an app that scans through communications on Slack, a widely used enterprise platform. Vibe’s algorithm measures the positive and negative emotions of a work team and reports them in real time….(More)”.
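Vibe’s actual algorithm is proprietary; a crude lexicon-based sketch of the general idea looks like this, with invented word lists and messages:

```python
# Invented mini-lexicons; a real system would use far richer models.
POSITIVE = {"great", "thanks", "love", "awesome", "good"}
NEGATIVE = {"blocked", "frustrated", "late", "bad", "worried"}

def message_sentiment(text):
    """Score one message in [-1, 1] from lexicon hits."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def team_vibe(messages):
    """Average per-message sentiment for a team channel."""
    return sum(message_sentiment(m) for m in messages) / len(messages)

messages = [
    "Thanks, the demo went great!",
    "I'm blocked on the release and a bit worried.",
    "Good progress today.",
]
print(round(team_vibe(messages), 2))  # 0.33
```

Aggregating scores over a channel and a time window, rather than reading individual messages, is what lets such tools claim to measure morale rather than surveil people, though the privacy concerns above still apply.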

Renovating democracy from the bottom up


Nathan Gardels at the Washington Post: “The participatory power of social media is a game changer for governance. It levels the playing field among amateurs and experts, peers and authorities, and even challenges the legitimacy of representative government. Its arrival coincides with and reinforces the widespread distrust of elites across the Western world, ripening the historical moment for direct democracy.

For the first time, an Internet-based movement has come to power in a major country, Italy, under the slogan “Participate, don’t delegate!” All of the Five Star Movement’s parliamentarians, who rule the country in a coalition with the far-right League party, were nominated and elected to stand for office online. And they have appointed the world’s first minister for direct democracy, Riccardo Fraccaro.

In Rome this week, he explained the participatory agenda of Italy’s ruling coalition government to The WorldPost at a meeting of the Global Forum on Modern Direct Democracy. “Citizens must be granted the same possibility to actively intervene in the process of managing and administrating public goods as normally carried out by their elected representatives,” he enthused. “What we have witnessed in our democracy is a drift toward ‘partyocracy,’ in which a restricted circle of policymakers have been so fully empowered with decision-making capacity that they could virtually ignore and bypass the public will. The mere election of a representative every so many years is no longer sufficient to prevent this from happening. That is why our government will take the next step forward in order to innovate and enhance our democracy.”

Fraccaro went on: “Referenda, public petitions and the citizens’ ballot initiative are nothing other than the direct means available for the citizenry to submit laws that political parties are not willing to propose or to reject rules approved by political parties that are not welcome by the people. Our aim, therefore, is to establish the principles and practices of direct democracy alongside the system of representative government in order to give real, authentic sovereignty to the citizens.”

At the Rome forum, Deputy Prime Minister Luigi di Maio, a Five Star member, railed against the technocrats and banks he says are trying to frustrate the will of the people. He promised forthcoming changes in the Italian constitution to follow through on Fraccaro’s call for citizen-initiated propositions that will go to the public ballot if the legislature does not act on them.

The program that has so far emerged out of the government’s participatory agenda is a mixed bag. It includes everything from anti-immigrant and anti-vaccine policies to the expansion of digital networks and planting more trees. In a move that has unsettled the European Union authorities as well as Italy’s non-partisan, indirectly-elected president, the governing coalition last week proposed both a tax cut and the provision of a universal basic income — despite the fact that Italy’s long-term debt is already 130 percent of GDP.

The Italian experiment warrants close attention as a harbinger of things to come elsewhere. It reveals a paradox for governance in this digital age: the more participation there is, the greater the need for the counterbalance of impartial mediating practices and institutions that can process the cacophony of voices, sort out the deluge of contested information, dispense with magical thinking and negotiate fair trade-offs among the welter of conflicting interests….(More)”.

Open Data, Grey Data, and Stewardship: Universities at the Privacy Frontier.


Paper by Christine L. Borgman: “As universities recognize the inherent value in the data they collect and hold, they encounter unforeseen challenges in stewarding those data in ways that balance accountability, transparency, and protection of privacy, academic freedom, and intellectual property. Two parallel developments in academic data collection are converging: (1) open access requirements, whereby researchers must provide access to their data as a condition of obtaining grant funding or publishing results in journals; and (2) the vast accumulation of “grey data” about individuals in their daily activities of research, teaching, learning, services, and administration.

The boundaries between research and grey data are blurring, making it more difficult to assess the risks and responsibilities associated with any data collection. Many sets of data, both research and grey, fall outside privacy regulations such as HIPAA and FERPA, and outside rules governing personally identifiable information (PII). Universities are exploiting these data for research, learning analytics, faculty evaluation, strategic decisions, and other sensitive matters. Commercial entities are besieging universities with requests for access to data or for partnerships to mine them. The privacy frontier facing research universities spans open access practices, uses and misuses of data, public records requests, cyber risk, and curating data for privacy protection. This Article explores the competing values inherent in data stewardship and makes recommendations for practice by drawing on the pioneering work of the University of California in privacy and information security, data governance, and cyber risk….(More)”.

The latest tools for sexual assault victims: Smartphone apps and software


Peter Holley at the Washington Post:  “…For much of the past decade, dozens of apps and websites have been created to help survivors of sexual assault electronically record and report such crimes. They are designed to assist an enormous pool of potential victims. The Rape, Abuse & Incest National Network reports that more than 11 percent of all college students — both graduate and undergraduate — experience rape or sexual assault through physical force, violence or incapacitation. Despite the prevalence of such incidents, less than 10 percent of victims on college campuses report their assaults, according to the National Sexual Violence Resource Center.

The apps range from electronic reporting tools such as JDoe to legal guides that provide victims with access to law enforcement and crisis counseling. Others help victims save and share relevant medical information in case of an assault. The app Uask includes a “panic button” that connects users with 911 or allows them to send emergency messages to people with their location.

Since its debut in 2015, Callisto’s software has been adopted by 12 college campuses — including Stanford, the University of Oregon and St. John’s University — and made available to more than 160,000 students, according to the company. Sexual assault survivors who visit Callisto are six times as likely to report, and 15 percent of those survivors have matched with another victim of the same assailant, the company claims.
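Callisto’s production system relies on cryptographic escrow, but the core matching idea behind those statistics can be sketched simply: normalize and hash an assailant identifier, and reveal a match only when a second survivor names the same person. The class and identifiers below are hypothetical, not Callisto’s API:

```python
import hashlib
from collections import defaultdict

class MatchingEscrow:
    """Hold reports keyed by a hashed assailant identifier and reveal
    a match only once two or more survivors name the same person.
    (A real system would use cryptographic escrow, not a plain hash.)"""

    def __init__(self):
        self.entries = defaultdict(list)  # digest -> list of reporter ids

    def submit(self, reporter_id, assailant_identifier):
        normalized = assailant_identifier.strip().lower()
        digest = hashlib.sha256(normalized.encode()).hexdigest()
        self.entries[digest].append(reporter_id)
        matched = self.entries[digest]
        return list(matched) if len(matched) >= 2 else None

escrow = MatchingEscrow()
print(escrow.submit("survivor-a", "facebook.com/assailant-profile"))    # None
print(escrow.submit("survivor-b", " Facebook.com/Assailant-Profile "))  # ['survivor-a', 'survivor-b']
```

The design choice is that no single report is exposed on its own; information is released only when the matching condition is met, which is what lowers the cost of being the first to come forward.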

Peter Cappelli, a professor of management at the Wharton School and director of Wharton’s Center for Human Resources, told NPR that he sees potential problems with survivors “crowdsourcing” their decision to report assaults.

“I don’t think we want to have a standard where the decisions are crowdsourced,” he said. “I think what you want is to tell people [that] the criteria [for whether or not to report] are policy related, not personally related, and you should bring forward anything that fits the criteria, not [based on] whether you feel enough other people have made the complaint or not. We want to sometimes encourage people to do things they might feel uncomfortable about.”…(More)”.

The secret data collected by dockless bikes is helping cities map your movement


Lime is able to collect this information because its bikes, like all those in dockless bike-share programs, are built to operate without fixed stations or corrals. …In the 18 months or so since dockless bike-share arrived in the US, the service has spread to at least 88 American cities. (On the provider side, at least 10 companies have jumped into the business; Lime is one of the largest.) Some of those cities now have more than a year of data related to the programs, and they’ve started gleaning insights and catering to the increased number of cyclists on their streets.

South Bend is one of those leaders. It asked Lime to share data when operations kicked off in June 2017. At first, Lime provided the information in spreadsheets, but in early 2018 the startup launched a browser-based dashboard where cities could see aggregate statistics for their residents, such as how many of them rented bikes, how many trips they took, and how far and long they rode. Lime also added heat maps that reveal where most rides occur within a city and a tool for downloading data that shows individual trips without identifying the riders. Corcoran can glance at his dashboard and see, for example, that people in South Bend have taken 340,000 rides, traveled 158,000 miles, and spent more than 7 million minutes on Lime bikes since the company started service. He can also see there are 700 Lime bikes active in the city, down from an all-time high of 1,200 during the University of Notre Dame’s 2017 football season….(More)”.

Direct Democracy and Political Engagement of the Marginalized


Dissertation by Jeong Hyun Kim: “…examines direct democracy’s implications for political equality by focusing on how it influences and modifies political attitudes and behaviors of marginalized groups. Using cases and data from Sweden, Switzerland, and the United States, I provide a comprehensive, global examination of how direct democratic institutions affect political participation, especially of political minority or marginalized groups.

In the first paper, I examine whether the practice of direct democracy supports women’s political participation. I theorize that the use of direct democracy enhances women’s sense of political efficacy, thereby promoting their participation in the political process. I test this argument by leveraging a quasi-experiment in Sweden from 1921 to 1944, wherein the use of direct democratic institutions was determined by a population threshold. Findings from a regression discontinuity analysis lend strong support for the positive effect of direct democracy on women’s political participation. Using web documents of minutes from direct democratic meetings, I further show that women’s participation in direct democracy is positively associated with their subsequent participation in parliamentary elections.
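The regression discontinuity logic described above can be sketched as a comparison of mean outcomes just below and just above the assignment threshold. The municipalities and turnout figures below are invented, not the dissertation’s data:

```python
def rd_estimate(data, cutoff, bandwidth):
    """Difference in mean outcome just above vs. just below the cutoff."""
    below = [y for x, y in data if cutoff - bandwidth <= x < cutoff]
    above = [y for x, y in data if cutoff <= x <= cutoff + bandwidth]
    return sum(above) / len(above) - sum(below) / len(below)

# (population, women's turnout %) for hypothetical municipalities; those
# at or above the 1,000-resident cutoff get direct democratic institutions.
data = [(920, 40), (950, 41), (980, 42), (1000, 47), (1030, 48), (1060, 49)]
print(rd_estimate(data, cutoff=1000, bandwidth=80))  # 7.0
```

Because municipalities just on either side of an arbitrary population cutoff are otherwise comparable, a jump in the outcome at the cutoff can be read as the causal effect of the institution; real analyses fit local regressions on each side rather than raw means.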

The second paper expands on the first paper by examining an individual-level mechanism linking experience with direct democracy and feelings of political efficacy. Using panel survey data from Switzerland, I examine the relationship between individuals’ exposure to direct democracy and the gender gap in political efficacy. I find that direct democracy increases women’s sense of political efficacy, while it has no significant effect on men. This finding confirms that the opportunity for direct legislation leads women to feel more efficacious in politics, suggesting its further implications for the gender gap in political engagement.

In the third and final paper, I examine how direct democratic votes targeting ethnic minorities influence political mobilization of minority groups. I theorize that targeted popular votes intensify the general public’s hostility towards minority groups, thereby enhancing group members’ perceptions of being stigmatized. Consequently, this creates a greater incentive for minorities to actively engage in politics. Using survey data from the United States, combined with information about state-level direct democracy, I find that direct democratic votes targeting the rights of immigrants lead to greater political activism among ethnic minorities with immigrant backgrounds. These studies contribute to the extant study of women and minority politics by illuminating new mechanisms underlying the mobilization of women and minorities and by clarifying the causal effect of the type of government on political equality….(More)”.

What Can Satellite Imagery Tell Us About Obesity in Cities?


Emily Matchar at Smithsonian: “About 40 percent of American adults are obese, defined as having a body mass index (BMI) over 30. But obesity is not evenly distributed around the country. Some cities and states have far more obese residents than others. Why? Genetics, stress, income levels and access to healthy foods all play a role. But increasingly, researchers are looking at the built environment—our cities—to understand why people are fatter in some places than in others.

New research from the University of Washington attempts to take this approach one step further by using satellite data to examine cityscapes. By using the satellite images in conjunction with obesity data, they hope to uncover which urban features might influence a city’s obesity rate.

The researchers used a deep learning network to analyze about 150,000 high-resolution satellite images of four cities: Los Angeles, Memphis, San Antonio and Seattle. The cities were selected from states with high obesity rates (Texas and Tennessee) and low obesity rates (California and Washington). The network extracted features of the built environment: crosswalks, parks, gyms, bus stops, fast food restaurants—anything that might be relevant to health.

“If there’s no sidewalk you’re less likely to go out walking,” says Elaine Nsoesie, a professor of global health at the University of Washington who led the research.

The team’s algorithm could then see what features were more or less common in areas with greater and lesser rates of obesity. Some findings were predictable: more parks, gyms and green spaces were correlated with lower obesity rates. Others were surprising: more pet stores equaled thinner residents (“a high density of pet stores could indicate high pet ownership, which could influence how often people go to parks and take walks around the neighborhood,” the team hypothesized).
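The last step of such an analysis, relating extracted feature densities to obesity prevalence, comes down to computing correlations. A toy sketch with invented numbers (not the UW team’s data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical areas: park density per square km vs. obesity prevalence (%).
parks_per_km2 = [0.5, 1.0, 1.5, 2.0, 3.0]
obesity_rate = [34.0, 31.0, 29.0, 27.0, 24.0]
print(round(pearson(parks_per_km2, obesity_rate), 2))  # -0.99
```

A strong negative coefficient like this is only an association, of course; the researchers’ hypothesizing about pet stores shows how much interpretation such correlations still require.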

A paper on the results was recently published in the journal JAMA Network Open….(More)”.