Algorithmic regulation: A critical interrogation


Karen Yeung in Regulation & Governance: “Innovations in networked digital communications technologies, including the rise of “Big Data,” ubiquitous computing, and cloud storage systems, may be giving rise to a new system of social ordering known as algorithmic regulation. Algorithmic regulation refers to decision-making systems that regulate a domain of activity in order to manage risk or alter behavior through continual computational generation of knowledge, by systematically collecting data (in real time, on a continuous basis) emitted directly from numerous dynamic components pertaining to the regulated environment, in order to identify and, if necessary, automatically refine (or prompt refinement of) the system’s operations to attain a pre-specified goal. This study provides a descriptive analysis of algorithmic regulation, classifying these decision-making systems as either reactive or pre-emptive, and offers a taxonomy that identifies eight different forms of algorithmic regulation based on their configuration at each of the three stages of the cybernetic process: notably, at the level of standard setting (adaptive vs. fixed behavioral standards), information-gathering and monitoring (historic data vs. predictions based on inferred data), and at the level of sanction and behavioral change (automatic execution vs. recommender systems). It maps the contours of several emerging debates surrounding algorithmic regulation, drawing upon insights from regulatory governance studies, legal critiques, surveillance studies, and critical data studies to highlight various concerns about the legitimacy of algorithmic regulation….(More)”.
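The eight forms follow directly from the three binary design choices the abstract describes; a small, purely illustrative enumeration (the labels below paraphrase the abstract and are not Yeung's own notation) makes the combinatorics explicit:

```python
from itertools import product

# Hypothetical labels for the three cybernetic stages described in the abstract;
# this is an illustrative reading of the taxonomy, not the author's notation.
standard_setting = ["fixed standard", "adaptive standard"]
monitoring = ["historic data", "inferred/predictive data"]
behaviour_change = ["automatic execution", "recommendation only"]

# Three binary design choices yield the 2 x 2 x 2 = 8 forms of algorithmic regulation.
for i, config in enumerate(product(standard_setting, monitoring, behaviour_change), start=1):
    print(f"Form {i}: " + " | ".join(config))
```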

Journal tries crowdsourcing peer reviews, sees excellent results


Chris Lee at Ars Technica: “Peer review is supposed to act as a sanity check on science. A few learned scientists take a look at your work, and if it withstands their objective and entirely neutral scrutiny, a journal will happily publish your work. As those links indicate, however, there are some issues with peer review as it is currently practiced. Recently, Benjamin List, a researcher and journal editor in Germany, and his graduate assistant, Denis Höfler, have come up with a genius idea for improving matters: something called selected crowd-sourced peer review….

My central point: peer review is burdensome and sometimes barely functional. So how do we improve it? The main way is to experiment with different approaches to the reviewing process, which many journals have tried, albeit with limited success. Post-publication peer review, when scientists look over papers after they’ve been published, is also an option but depends on community engagement.

But if your paper is uninteresting, no one will comment on it after it is published. Pre-publication peer review is the only moment where we can be certain that someone will read the paper.

So, List (an editor for Synlett) and Höfler recruited 100 referees. For their trial, a forum-style commenting system was set up that allowed referees to comment anonymously not only on submitted papers but also on each other’s comments. To provide a comparison, the papers that went through this process also went through the traditional peer review process. The authors and editors compared the comments and (subjectively) evaluated the pros and cons. The 100-person crowd of researchers was deemed the more effective of the two.

The editors found that it took a bit more time to read and collate all the comments into a reviewers’ report, but the overall process was still faster, which the authors loved. Typically, it took the crowd just a few days to complete its review, which compares very nicely to the usual four to six weeks of the traditional route (I’ve had papers languish for six months in peer review). And, perhaps most importantly, the responses were more substantive and useful than those from the typical two-to-four-person review.

So far, List has not published the trial results formally. Despite that, Synlett is moving to the new system for all its papers.

Why does crowdsourcing work?

Here we get back to something more editorial. I’d suggest that traditional peer review has a physical analog: noise. Noise is not just a constant background that must be overcome; it is also generated by the very process that creates a signal. The difference lies in how the amplitude of the noise grows compared to the amplitude of the signal. For very low-amplitude signals, all you measure is noise, while for very high-intensity signals, the noise is vanishingly small compared to the signal, even though it’s large compared to the noise of a low-amplitude signal.

Our esteemed peers, I would argue, are somewhat random in their response, but weighted toward objectivity. Using this admittedly inappropriate physics model — where the useful signal grows in proportion to the number of reviewers while the noise grows only as its square root — a review conducted by four reviewers can be expected (on average) to contain two responses that are, basically, noise. By contrast, a review by 100 reviewers may only have 10 responses that are noise: a substantial improvement. So, adding the responses of a large number of peers together should produce a better picture of a scientific paper’s strengths and weaknesses.
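That intuition can be made concrete with a toy simulation: if each referee's verdict is modelled as the paper's true quality plus independent random noise, the spread of the averaged verdict shrinks roughly as one over the square root of the number of reviewers. A minimal sketch, with all figures invented purely for illustration:

```python
import random
import statistics

def verdict_spread(true_quality, n_reviewers, noise_sd=1.0, trials=5_000):
    """Spread (standard deviation) of the averaged score from n noisy reviewers."""
    averaged = [
        statistics.fmean(random.gauss(true_quality, noise_sd) for _ in range(n_reviewers))
        for _ in range(trials)
    ]
    return statistics.stdev(averaged)

true_quality = 7.0  # arbitrary "real" merit of the paper
for n in (4, 100):
    spread = verdict_spread(true_quality, n)
    # Theory: noise in the mean scales as noise_sd / sqrt(n)
    print(f"{n:>3} reviewers: spread ≈ {spread:.2f} (theory ≈ {1.0 / n ** 0.5:.2f})")
```

On this toy model, going from a four-person panel to a 100-person crowd cuts the random component of the verdict by roughly a factor of five.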

Didn’t I just say that reviewers are overloaded? Doesn’t it seem that this will make the problem worse?

Well, no, as it turns out. When this approach was tested (with consent) on papers submitted to Synlett, it was discovered that review times went way down—from weeks to days. And authors reported getting more useful comments from their reviewers….(More)”.

Free Speech and Transparency in a Digital Era


Russell L. Weaver at IMODEV: “Governmental openness and transparency is inextricably intertwined with freedom of expression. In order to scrutinize government, the people must have access to information regarding the functioning of government. As the U.S. Supreme Court has noted, “It is inherent in the nature of the political process that voters must be free to obtain information from diverse sources in order to determine how to cast their votes”. As one commentator noted, “Citizens need to understand what their government is doing in their name.”

Despite the need for transparency, the U.S. government has frequently functioned opaquely. For example, even though the U.S. Supreme Court is a fundamental component of the U.S. constitutional system, confirmation hearings for U.S. Supreme Court justices were held in secret for decades. That changed about a hundred years ago when the U.S. Senate broke with tradition and began holding confirmation hearings in public. The results of this openness have been both interesting and enlightening: the U.S. citizenry has become much more interested and involved in the confirmation process, galvanizing and campaigning both for and against proposed nominees.

In 1946, Congress decided to open up the administrative process as well. For more than a century, administrative agencies had not been required to notify the public of proposed actions, or to allow the public to have input on the policy choices reflected in proposed rules and regulations. That changed when Congress adopted the federal Administrative Procedure Act (APA). For the creation of so-called “informal rules,” the APA required agencies to publish a NOPR (notice of proposed rulemaking) in the Federal Register, thereby providing the public with notice of the proposed rule. Congress required that the NOPR provide the public with various types of information, including “(1) a statement of the time, place, and nature of public rule making proceedings; (2) reference to the legal authority under which the rule is proposed; and (3) either the terms or substance of the proposed rule or a description of the subjects and issues involved.” In addition to allowing interested parties the opportunity to comment on NOPRs, and requiring agencies to “consider” those comments, the APA also required agencies to issue a “concise general statement” of the “basis and purpose” of any final rule that they issue. As with the U.S. Supreme Court’s confirmation processes, the APA’s rulemaking procedures led to greater citizen involvement in the rulemaking process. The APA also promoted openness by requiring administrative agencies to voluntarily disclose various types of internal information to the public, including “interpretative rules and statements of policy.”

Congress supplemented the APA in the 1960s when it enacted the federal Freedom of Information Act (FOIA). FOIA gave individuals and corporations a right of access to government-held information. As a “disclosure” statute, FOIA specifically provides that an agency, “upon any request for records which reasonably describes such records and is made in accordance with published rules stating the time, place, fees (if any), and procedures to be followed, shall make the records promptly available to any person.” Agencies are required to decide within twenty days whether to comply with a request, though the time limit can be tolled under certain circumstances. Although FOIA is a disclosure statute, it does not require disclosure of all governmental documents. In addition to FOIA, Congress also enacted the Federal Advisory Committee Act (FACA), the Government in the Sunshine Act, and amendments to FOIA, all of which were designed to enhance governmental openness and transparency. In addition, many state legislatures have adopted their own open records provisions that are similar to FOIA.

Despite these movements towards openness, advancements in speech technology have forced governments to become much more open and transparent than they have ever been. Some of this openness has been intentional, as governmental entities have used new speech technologies to communicate with the citizenry and enhance its understanding of governmental operations. However, some of this openness has taken place despite governmental resistance. The net effect is that free speech, and changes in communications technologies, have produced a society that is much more open and transparent. This article examines the relationship between free speech, the new technologies, and governmental openness and transparency….(More)”.

Courts Disrupted


A new Resource Bulletin by the Joint Technology Committee (JTC): “The concept of disruptive innovation made its debut more than 20 years ago in a Harvard Business Review article. Researchers Clayton M. Christensen and Joseph L. Bower observed that established organizations may invest in retaining current customers but often fail to make the technological investments that future customers will expect. That opens the way for low-cost competitive alternatives to enter the marketplace, addressing the needs of unserved and under-served populations. Lower-cost alternatives over time can be enhanced, gain acceptance in well-served populations, and sometimes ultimately displace traditional products or services. This should be a cautionary tale for court managers. What would happen if the people took their business elsewhere? Is that even possible? What would be the implications to both the public and the courts? Should court leaders concern themselves with this possibility?

While disruptive innovation theory is both revered and reviled, it provides a perspective that can help court managers anticipate and respond to significant change. Like large businesses with proprietary offerings, courts have a unique customer base. Until recently, those customers had no option other than to accept whatever level of service the courts provided, at whatever cost, or simply to leave their legal needs unaddressed. Innovations such as non-JD legal service providers, online dispute resolution (ODR), and unbundled legal services are circumventing some traditional court processes, providing more timely and cost-effective outcomes. While there is no consensus in the court community on the potential impact to courts (whether they are in danger of “going out of business”), there are compelling reasons for court managers to be aware of and leverage the concept of disruptive innovation.

As technology dramatically changes the way routine transactions are handled in other industries, courts can also embrace innovation as one way to enhance the public’s experience. Doing so may help courts “disrupt” themselves, making justice available to a wider audience at a lower cost while preserving fairness, neutrality, and transparency in the judicial process….(More).”

Community Digital Storytelling for Collective Intelligence: towards a Storytelling Cycle of Trust


Sarah Copeland and Aldo de Moor in AI & SOCIETY: “Digital storytelling has become a popular method for curating community, organisational, and individual narratives. Since its beginnings over 20 years ago, projects have sprung up across the globe, where authentic voice is found in the narration of lived experiences. Contributing to a Collective Intelligence for the Common Good, the authors of this paper ask how shared stories can bring impetus to community groups to help identify what they seek to change, and how digital storytelling can be effectively implemented in community partnership projects to enable authentic voices to be carried to other stakeholders in society. The Community Digital Storytelling (CDST) method is introduced as a means for addressing community-of-place issues. There are five stages to this method: preparation, story telling, story digitisation, digital story sense-making, and digital story sharing. Additionally, a Storytelling Cycle of Trust framework is proposed. We identify four trust dimensions as being imperative foundations in implementing community digital media interventions for the common good: legitimacy, authenticity, synergy, and commons. This framework is concerned with increasing the impact that everyday stories can have on society; it is an engine driving prolonged storytelling. From this perspective, we consider the ability to scale up the scope and benefit of stories in civic contexts. To illustrate this framework, we use experiences from the CDST workshop in northern Britain and compare this with a social innovation project in the southern Netherlands….(More)”.

The Tech Revolution That’s Changing How We Measure Poverty


Alvin Etang Ndip at the World Bank: “The world has an ambitious goal to end extreme poverty by 2030. But, without good poverty data, it is impossible to know whether we are making progress, or whether programs and policies are reaching those who are the most in need.

Countries, often in partnership with the World Bank Group and other agencies, measure poverty and wellbeing using household surveys that help give policymakers a sense of who the poor are, where they live, and what is holding back their progress. Once a paper-and-pencil exercise, household data collection is now being revolutionized by technology, and the World Bank is tapping into this potential to produce more and better poverty data….

“Technology can be harnessed in three different ways,” says Utz Pape, an economist with the World Bank. “It can help improve data quality of existing surveys, it can help to increase the frequency of data collection to complement traditional household surveys, and can also open up new avenues of data collection methods to improve our understanding of people’s behaviors.”

As technology is changing the field of data collection, researchers are continuing to find new ways to build on the power of mobile phones and tablets.

The World Bank’s Pulse of South Sudan initiative, for example, takes tablet-based data collection a step further. In addition to conducting the household survey, the enumerators also record a short, personalized testimonial with the people they are interviewing, revealing a first-person account of the situation on the ground. Such testimonials allow users to put a human face on data and statistics, giving a fuller picture of the country’s experience.

Real-time data through mobile phones

At the same time, more and more countries are generating real-time data through high-frequency surveys, capitalizing on the proliferation of mobile phones around the world. The World Bank’s Listening to Africa (L2A) initiative has piloted the use of mobile phones to regularly collect information on living conditions. The approach combines face-to-face surveys with follow-up mobile phone interviews to collect data that makes it possible to monitor well-being.

The initiative hands out mobile phones and solar chargers to all respondents. To minimize the risk of people dropping out, the respondents are given credit top-ups to stay in the program. From monitoring health care facilities in Tanzania to collecting data on frequency of power outages in Togo, the initiative has been rolled out in six countries and has been used to collect data on a wide range of areas. …

Technology-driven data collection efforts haven’t been restricted to the Africa region alone. In fact, the approach was piloted early in Peru and Honduras with the Listening 2 LAC program. In Europe and Central Asia, the World Bank has rolled out the Listening to Tajikistan program, which was designed to monitor the impact of the Russian economic slowdown in 2014 and 2015. Initially a six-month pilot, the initiative has now been in operation for 29 months, and a partnership with UNICEF and JICA has ensured that data collection can continue for the next 12 months. Given the volume of data, the team is currently working to create a multidimensional fragility index, where one can monitor a set of well-being indicators – ranging from food security to quality jobs and public services – on a monthly basis…
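The article does not describe how the fragility index is actually constructed; as a minimal sketch under stated assumptions (the indicator names, bounds, and weights below are all invented), a monthly composite of this kind could be computed by normalising each indicator and taking a weighted average:

```python
# Illustrative only: indicator names, bounds, and weights are invented, and a real
# index would need validated thresholds and proper treatment of missing data.
INDICATORS = {
    # name: (worst_value, best_value, weight)
    "food_insecurity_rate": (1.0, 0.0, 0.4),   # share of households, lower is better
    "employment_rate":      (0.0, 1.0, 0.3),   # higher is better
    "access_to_services":   (0.0, 1.0, 0.3),   # higher is better
}

def fragility_index(monthly_values: dict) -> float:
    """Weighted average of min-max normalised indicators; 0 = least fragile, 1 = most."""
    score = 0.0
    for name, (worst, best, weight) in INDICATORS.items():
        value = monthly_values[name]
        normalised = (value - best) / (worst - best)   # 0 when at 'best', 1 when at 'worst'
        normalised = min(max(normalised, 0.0), 1.0)    # clamp out-of-range readings
        score += weight * normalised
    return score

print(fragility_index({"food_insecurity_rate": 0.35,
                       "employment_rate": 0.55,
                       "access_to_services": 0.6}))
```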

There are other initiatives as well: in Mexico, the World Bank and its partners are using satellite imagery and survey data to estimate how many people live below the poverty line down to the municipal level, while for the Somali High Frequency Survey, satellite images guide data collectors in picking a representative sample. However, despite the innovation, these initiatives are not intended to replace traditional household surveys, which still form the backbone of measuring poverty. When better integrated, they can prove to be a formidable set of tools for data collection to provide the best evidence possible to policymakers….(More)”
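No methodological detail is given on how satellite images guide sample selection; one common survey technique is to draw enumeration areas with probability proportional to their estimated population, which imagery-derived estimates could feed. A rough, hypothetical sketch (all area names and figures invented):

```python
import random

# Hypothetical enumeration areas with population estimates derived from satellite imagery.
areas = {"EA-01": 1200, "EA-02": 450, "EA-03": 3100, "EA-04": 800, "EA-05": 2200}

def pps_sample(area_populations: dict, n_draws: int, seed: int = 42) -> list:
    """Draw enumeration areas with probability proportional to estimated size (with replacement)."""
    rng = random.Random(seed)
    names = list(area_populations)
    weights = [area_populations[name] for name in names]
    return rng.choices(names, weights=weights, k=n_draws)

print(pps_sample(areas, n_draws=3))
```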

Is it too late to build a better world?


Keith Burnett at Campaign for Social Science: “The greatest challenge we face is to use our intellects to guide our actions in making the world a better place for us and our fellow human beings.

This is no easy task and its history is littered with false dawns and doctrines. You would have to be blind to the lessons of the past to fail to appreciate the awful impact that delusional ideas have had on mankind. Some of the worst are those meant to save us.

There are some who take this as a warning against intervention at all, who say it can never be done and shouldn’t even be attempted. That the forces of nature blow winds in society that we can never tame. That we are bound to suffer like a small ship in a stormy sea.

They might be right, but it would be the utmost dereliction of academia to give up on this quest. And in any case, I don’t believe it is true. These forces may be there, but there is much we can do, a lot of it deeply practical to make the journey more comfortable and so we even end up in the right port.

Of course, there are those who believe we academics simply don’t care. That scholarship is happiest at a distance from messy, contradictory humanity and prefers to remain in its detached world of conferences and publications. That we are content to analyse rather than heal.

Well I can tell you that my social sciences colleagues at Sheffield are not content in an ivory tower and they never have been. They feel the challenges of our world as keenly as any. And they know if we ever needed understanding, and a vision of what society could be, we need it now.

I am confident they are not alone and, as a scientist all my life, it has become apparent to me that, to translate insights into change, we must frequently overcome barriers of perception and culture, of politics and prejudice. Our great challenges are not only technical but matters of education and economics. Our barriers are those of opportunity, power and purpose.

If we want solutions to reach those who desperately need them, we must understand how to take the word and make it flesh. Ideas alone are not enough, they come to life through people. They need money, armies of changed opinion.

If we don’t do this work, the risk is truly terrible – that the armies and the power, the public opinion and votes, will be led by ignorance and profit. As the ancient Greeks knew, a demos could only function when citizens could grasp the consequences of their choices.

Perhaps we had forgotten; thought ‘it can’t happen here’? If so, this year has been a stark reminder of why we dare not be complacent. For who would deny the great political lessons we are almost choking on as we see Brexit evolve from fringe populist movement to a force that is shaking us to pieces? Who will have failed to understand, in the frustrations of Trump, the value of a constitution designed to protect citizens against the ravages of a tyrant?

Why do the social sciences matter? Just look around us. Who would deny the need for new ways to organise our industry and our economy as real incomes fade? Who would deny that we need a society which is able to sensibly regulate against the depredations of the unscrupulous landlord?

Who would deny the need to understand how to better educate and train our youth?

We are engaged in a battle for society, and the fronts are many and difficult. Can we hope to build a society that will look after the stranger in its midst? Is social justice a chimera?

Is there anything to be done?

To this we answer, yes. But we must do more than study; we must find the gears which will ensure that what we discover can be absorbed by a society that needs to act with understanding…(More)”

E-residency and blockchain


Clare Sullivan and Eric Burger in Computer Law & Security Review: “In December 2014, Estonia became the first nation to open its digital borders to enable anyone, anywhere in the world to apply to become an e-Resident. Estonian e-Residency is essentially a commercial initiative. The e-ID issued to Estonian e-Residents enables commercial activities with the public and private sectors. It does not provide citizenship in its traditional sense, and the e-ID provided to e-Residents is not a travel document. However, in many ways it is an international ‘passport’ to the virtual world. E-Residency is a profound change and the recent announcement that the Estonian government is now partnering with Bitnation to offer a public notary service to Estonian e-Residents based on blockchain technology is of significance. The application of blockchain to e-Residency has the potential to fundamentally change the way identity information is controlled and authenticated. This paper examines the legal, policy, and technical implications of this development….(More)”.
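The paper focuses on the legal and policy questions rather than the mechanics, and Bitnation's actual service is not described here; still, the general idea behind blockchain-based notarisation — committing a document's cryptographic hash to an append-only, hash-chained record so that its existence and integrity can later be verified without exposing the document itself — can be sketched as a toy model (all class and field names are hypothetical):

```python
import hashlib
import json
import time

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class ToyNotaryChain:
    """Illustrative append-only log; a real blockchain adds consensus, signatures, and replication."""
    def __init__(self):
        self.blocks = []

    def notarise(self, document: bytes) -> dict:
        block = {
            "index": len(self.blocks),
            "timestamp": time.time(),
            "document_hash": sha256(document),   # only the hash is stored, not the document
            "previous_hash": self.blocks[-1]["block_hash"] if self.blocks else "0" * 64,
        }
        block["block_hash"] = sha256(json.dumps(block, sort_keys=True).encode())
        self.blocks.append(block)
        return block

    def verify(self, document: bytes) -> bool:
        return any(b["document_hash"] == sha256(document) for b in self.blocks)

chain = ToyNotaryChain()
chain.notarise(b"articles of association for an e-Resident's company")
print(chain.verify(b"articles of association for an e-Resident's company"))  # True
print(chain.verify(b"a tampered version of the document"))                   # False
```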

 

The Politics of Evidence: From evidence-based policy to the good governance of evidence


(Open Access) Book by Justin Parkhurst: “There has been an enormous increase in interest in the use of evidence for public policymaking, but the vast majority of work on the subject has failed to engage with the political nature of decision making and how this influences the ways in which evidence will be used (or misused) within political arenas. This book provides new insights into the nature of political bias with regard to evidence and critically considers what an ‘improved’ use of evidence would look like from a policymaking perspective.

Part I describes the great potential for evidence to help achieve social goals, as well as the challenges raised by the political nature of policymaking. It explores the concern of evidence advocates that political interests drive the misuse or manipulation of evidence, as well as counter-concerns of critical policy scholars about how appeals to ‘evidence-based policy’ can depoliticise political debates. Both concerns reflect forms of bias – the first representing technical bias, whereby evidence use violates principles of scientific best practice, and the second representing issue bias in how appeals to evidence can shift political debates to particular questions or marginalise policy-relevant social concerns.

Part II then draws on the fields of policy studies and cognitive psychology to understand the origins and mechanisms of both forms of bias in relation to political interests and values. It illustrates how such biases are not only common, but can be much more predictable once we recognise their origins and manifestations in policy arenas.

Finally, Part III discusses ways to move forward for those seeking to improve the use of evidence in public policymaking. It explores what constitutes ‘good evidence for policy’, as well as the ‘good use of evidence’ within policy processes, and considers how to build evidence-advisory institutions that embed key principles of both scientific good practice and democratic representation. Taken as a whole, the approach promoted is termed the ‘good governance of evidence’ – a concept that represents the use of rigorous, systematic and technically valid pieces of evidence within decision-making processes that are representative of, and accountable to, populations served…(More)”

Mastercard’s Big Data For Good Initiative: Data Philanthropy On The Front Lines


Interview by Randy Bean of Shamina Singh: Much has been written about big data initiatives and the efforts of market leaders to derive critical business insights faster. Less has been written about initiatives by some of these same firms to apply big data and analytics to a different set of issues, which are not solely focused on revenue growth or bottom line profitability. While the focus of most writing has been on the use of data for competitive advantage, a small set of companies has been undertaking, with much less fanfare, a range of initiatives designed to ensure that data can be applied not just for corporate good, but also for social good.

One such firm is Mastercard, which describes itself as a technology company in the payments industry, which connects buyers and sellers in 210 countries and territories across the globe. In 2013 Mastercard launched the Mastercard Center for Inclusive Growth, which operates as an independent subsidiary of Mastercard and is focused on the application of data to a range of issues for social benefit….

In testimony before the Senate Committee on Foreign Relations on May 4, 2017, Mastercard Vice Chairman Walt Macnee, who serves as the Chairman of the Center for Inclusive Growth, addressed issues of private sector engagement. Macnee noted, “The private sector and public sector can each serve as a force for good independently; however when the public and private sectors work together, they unlock the potential to achieve even more.” Macnee further commented, “We will continue to leverage our technology, data, and know-how in an effort to solve many of the world’s most pressing problems. It is the right thing to do, and it is also good for business.”…

Central to the mission of the Mastercard Center is the notion of “data philanthropy”. This term encompasses notions of data collaboration and data sharing and is at the heart of the initiatives that the Center is undertaking. The three cornerstones of the Center’s mandate are:

  • Sharing Data Insights – This is achieved through the concept of “data grants”, which entails granting access to proprietary insights in support of social initiatives in a way that fully protects consumer privacy.
  • Data Knowledge – The Mastercard Center undertakes collaborations with not-for-profit and governmental organizations on a range of initiatives. One such effort was in collaboration with the Obama White House’s Data-Driven Justice Initiative, by which data was used to help advance criminal justice reform. This initiative was then able, through the use of insights provided by Mastercard, to demonstrate the impact crime has on merchant locations and local job opportunities in Baltimore.
  • Leveraging Expertise – Similarly, the Mastercard Center has collaborated with private organizations such as DataKind, which undertakes data science initiatives for social good.

Just this past month, the Mastercard Center released initial findings from its Data Exploration: Neighborhood Crime and Local Business initiative. This effort was focused on ways in which Mastercard’s proprietary insights could be combined with public data on commercial robberies to help understand the potential relationships between criminal activity and business closings. A preliminary analysis showed a spike in commercial robberies followed by an increase in bar and nightclub closings. These analyses help community and business leaders understand factors that can impact business success.

Late last year, Ms. Singh issued A Call to Action on Data Philanthropy, in which she challenges her industry peers to look at ways in which they can make a difference — “I urge colleagues at other companies to review their data assets to see how they may be leveraged for the benefit of society.” She concludes, “the sheer abundance of data available today offers an unprecedented opportunity to transform the world for good.”….(More)