Algorithms can deliver public services, too


Diane Coyle in the Financial Times: “As economists have been pointing out for a while, the combination of always-online smartphones and matching algorithms (of the kind pioneered by Nobel Prize-winning economist Alvin Roth and others) reduces the transaction costs involved in economic exchanges. As Ronald Coase argued, transaction costs, because they limit the scope of exchanges in the market, help explain why companies exist. Where those costs are high, it is more efficient to keep the activities inside an organisation. The relevant costs are informational. But in dense urban markets the new technologies make it easy and fast to match up the two sides of a desired exchange, such as a would-be passenger and a would-be driver for a specific journey. This can expand the market (as well as substituting for existing forms of transport such as taxis and buses). It becomes viable to serve previously under-served areas, or for people to make journeys they previously could not have afforded. In principle all parties (customers, drivers and platform) can share the benefits.

Making more efficient use of assets such as cars is nice, but the economic gains come from matching demand with supply in contexts where there are no or few economies of scale — such as conveying a passenger or a patient from A to B, or providing a night’s accommodation in a specific place.

Rather than being seen as a threat to public services, the new technologies should be taken as a compelling opportunity to deliver inherently unscalable services efficiently, especially given the fiscal squeeze so many authorities are facing. Public transport is one opportunity. How else could cash-strapped transportation authorities even hope to provide a universal service on less busy routes? It is hard to see why they should not make contractual arrangements with private providers. Nor is there any good economic reason they could not adopt the algorithmic matching model themselves, although the organisational effort might defeat many.

However, in ageing societies the big prize will be organising other services such as adult social care this way. These are inherently person-to-person and there are few economies of scale. The financial pressures on governments in delivering care are only going to grow. Adopting a more efficient model for matching demand and supply is so obvious a possible solution that pilot schemes are under way in several cities — both public sector-led and private start-ups. In fact, if public authorities do not try new models, the private sector will certainly fill the gap….(More)”.
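Coyle's point is economic rather than computational, but a toy sketch may help make "algorithmic matching" concrete. The example below is purely illustrative: the names, coordinates and the greedy nearest-driver rule are my own assumptions rather than anything described in the article, and real platforms use far richer optimisation and pricing.

```python
# Toy illustration of demand-supply matching on a ride platform:
# pair each waiting passenger with the nearest still-available driver.
# All names and coordinates are invented for the example.
from math import dist

passengers = {"P1": (0.0, 0.0), "P2": (2.0, 1.0), "P3": (5.0, 5.0)}
drivers = {"D1": (0.5, 0.5), "D2": (4.0, 4.5), "D3": (2.5, 0.5)}

def greedy_match(passengers, drivers):
    """Assign each passenger the closest unmatched driver (greedy, not optimal)."""
    available = dict(drivers)
    matches = {}
    for pid, p_loc in passengers.items():
        if not available:
            break  # more demand than supply
        did = min(available, key=lambda d: dist(p_loc, available[d]))
        matches[pid] = did
        del available[did]
    return matches

print(greedy_match(passengers, drivers))  # {'P1': 'D1', 'P2': 'D3', 'P3': 'D2'}
```

Doing this continuously, at city scale and in real time, is what collapses the transaction cost of finding the other side of the exchange.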

Normative Challenges of Identification in the Internet of Things: Privacy, Profiling, Discrimination, and the GDPR


Paper by Sandra Wachter: “In the Internet of Things (IoT), identification and access control technologies provide essential infrastructure to link data between a user’s devices with unique identities, and provide seamless and linked up services. At the same time, profiling methods based on linked records can reveal unexpected details about users’ identity and private life, which can conflict with privacy rights and lead to economic, social, and other forms of discriminatory treatment. A balance must be struck between identification and access control required for the IoT to function and user rights to privacy and identity. Striking this balance is not an easy task because of weaknesses in cybersecurity and anonymisation techniques.

The EU General Data Protection Regulation (GDPR), set to come into force in May 2018, may provide essential guidance to achieve a fair balance between the interests of IoT providers and users. Through a review of academic and policy literature, this paper maps the inherent tension between privacy and identifiability in the IoT.

It focuses on four challenges: (1) profiling, inference, and discrimination; (2) control and context-sensitive sharing of identity; (3) consent and uncertainty; and (4) honesty, trust, and transparency. The paper will then examine the extent to which several standards defined in the GDPR will provide meaningful protection for privacy and control over identity for users of IoT. The paper concludes that in order to minimise the privacy impact of the conflicts between data protection principles and identification in the IoT, GDPR standards urgently require further specification and implementation into the design and deployment of IoT technologies….(More)”.

How the Index Card Cataloged the World


Daniela Blei in the Atlantic: “…The index card was a product of the Enlightenment, conceived by one of its towering figures: Carl Linnaeus, the Swedish botanist, physician, and the father of modern taxonomy. But like all information systems, the index card had unexpected political implications, too: It helped set the stage for categorizing people, and for the prejudice and violence that comes along with such classification….

In 1780, two years after Linnaeus’s death, Vienna’s Court Library introduced a card catalog, the first of its kind. Describing all the books on the library’s shelves in one ordered system, it relied on a simple, flexible tool: paper slips. Around the same time that the library catalog appeared, says Krajewski, Europeans adopted banknotes as a universal medium of exchange. He believes this wasn’t a historical coincidence. Banknotes, like bibliographical slips of paper and the books they referred to, were material, representational, and mobile. Perhaps Linnaeus took the same mental leap from “free-floating banknotes” to “little paper slips” (or vice versa). Sweden’s great botanist was also a participant in an emerging capitalist economy.

Linnaeus never grasped the full potential of his paper technology. Born of necessity, his paper slips were “idiosyncratic,” say Charmantier and Müller-Wille. “There is no sign he ever tried to rationalize or advertise the new practice.” Like his taxonomical system, paper slips were both an idea and a method, designed to bring order to the chaos of the world.

The passion for classification, a hallmark of the Enlightenment, also had a dark side. From nature’s variety came an abiding preoccupation with the differences between people. As soon as anthropologists applied Linnaeus’s taxonomical system to humans, the category of race, together with the ideology of racism, was born.

It’s fitting, then, that the index card would have a checkered history. To take one example, the FBI’s J. Edgar Hoover used skills he burnished as a cataloger at the Library of Congress to assemble his notorious “Editorial Card Index.” By 1920, he had cataloged 200,000 subversive individuals and organizations in detailed, cross-referenced entries. Nazi ideologues compiled a deadlier index-card database to classify 500,000 Jewish Germans according to racial and genetic background. Other regimes have employed similar methods, relying on the index card’s simplicity and versatility to catalog enemies real and imagined.

The act of organizing information—even notes about plants—is never neutral or objective. Anyone who has used index cards to plan a project, plot a story, or study for an exam knows that hierarchies are inevitable. Forty years ago, Michel Foucault observed in a footnote that, curiously, historians had neglected the invention of the index card. The book was Discipline and Punish, which explores the relationship between knowledge and power. The index card was a turning point, Foucault believed, in the relationship between power and technology. Like the categories they cataloged, Linnaeus’s paper slips belong to the history of politics as much as the history of science….(More)”.

Behind the Screen: the Syrian Virtual Resistance


Billie Jeanne Brownlee at Cyber Orient: “Six years have gone by since the political upheaval that swept through many Middle East and North African (MENA) countries began. Syria was caught in the grip of this revolutionary moment, one that drove the country from a peaceful popular mobilisation to a deadly fratricidal civil war with no apparent way out.

This paper provides an alternative approach to the study of the root causes of the Syrian uprising by examining the impact that the development of new media had in reconstructing forms of collective action and social mobilisation in pre-revolutionary Syria.

By providing evidence of a number of significant initiatives, campaigns and acts of contentious politics that occurred between 2000 and 2011, this paper shows that scholarly work on Syria prior to 2011 did not give sufficient theoretical and empirical consideration to the expressions of dissent and resilience developing in its cyberspace, or to the informal and hybrid forms of civic engagement they produced….(More)”.

The nation state goes virtual


Tom Symons at Nesta’s Predictions for 2018: “As the world changes, people expect their governments and public services to do so too. When it’s easy to play computer games with someone on the other side of the world, or set up a company bank account in five minutes, there is an expectation that paying taxes, applying for services or voting should be just as easy….

To add to this, large political upheavals such as Brexit and the election of Donald Trump have left some people feeling alienated from their national identity. Since the UK voted to leave the EU, demand for Irish passports has increased by 50 per cent, a sign that people feel dissatisfied by the constraints of geographically determined citizenship when they can no longer relate to their national identity.

In response, some governments see these changes as an opportunity to reconceptualise what we mean by a nation state.

The e-Residency offer

The primary actor in this disruption is Estonia, which leads the world in digital government. In 2015 they introduced an e-Residency, allowing anyone anywhere in the world to receive a government-issued digital identity. The e-Residency gives people access to digital public services and the ability to register and run online businesses from the country, in exactly the same way as someone born in Estonia. As of November 2017, over 27,000 people have applied to be Estonian e-Residents, and they have established over 4,200 companies. Estonia aims to have ten million virtual residents by 2025….

While Estonia is a sovereign nation using technology to redefine itself, there are movements taking advantage of decentralising technologies in a bid to do away with the nation state altogether. Bitnation is a blockchain-based technology which enables people to create and join virtual nations. This allows people to agree their own social contracts between one another, using smart contract technology, removing the need for governments as an administrator or mediator. Since it began in 2014, it has been offering traditional government services, such as notaries, dispute resolution, marriages and voting systems, without the need for a middleman.
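As a rough sketch of one such middleman-free service, the snippet below mimics what a blockchain notary does at its simplest: record a document's cryptographic fingerprint so that anyone can later check the document existed and has not changed. Everything here (the in-memory ledger, the field names) is an illustrative stand-in, not Bitnation's actual implementation.

```python
# Minimal stand-in for a blockchain notary service: store only a document's
# hash so a third party can verify it later without a central registrar.
# Purely illustrative; a real system would anchor the hash on a public chain.
import hashlib
import time

ledger = []  # stand-in for an append-only public ledger

def notarise(document: bytes) -> dict:
    """Record the document's SHA-256 fingerprint and a timestamp."""
    entry = {"sha256": hashlib.sha256(document).hexdigest(), "timestamp": time.time()}
    ledger.append(entry)
    return entry

def verify(document: bytes) -> bool:
    """Check whether this exact document was previously notarised."""
    digest = hashlib.sha256(document).hexdigest()
    return any(entry["sha256"] == digest for entry in ledger)

notarise(b"We, the undersigned, agree to ...")
print(verify(b"We, the undersigned, agree to ..."))   # True
print(verify(b"We, the undersigned, agree to more"))  # False: any change breaks the proof
```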

As of November 2017, there are over 10,000 Bitnation citizens. …

As citizens, we may be able to educate our children in Finland, access healthcare from South Korea and run our businesses in New Zealand, all without having to leave the comfort of our homes. Governments may see this as a means of financial sustainability in the longer term, generating income by selling such services to a global population instead of relying on centralised taxation systems levied on a geographic population.

Such a model has been described as ‘nation-as-a-service’, and could mean countries offering different tiers of citizenship, with taxes based on the number of services used, or tier of citizenship chosen. This could also mean multiple citizenships, including of city-states, as well as nations….

This is the moment for governments to start taking the full implications of the digital age seriously. From electronic IDs and data management through to seamless access to services, citizens will only demand better digital services. Countries such as Azerbaijan are already developing their own versions of the e-Residency. Large internet platforms such as Amazon are gearing up to replace entire government functions. If governments don’t grasp the nettle, they may find themselves left behind by technology and other countries which won’t wait around for them….(More)”.

Proliferation of Open Government Initiatives and Systems


Book edited by Ayse Kok: “As is true in most aspects of daily life, the expansion of government in the modern era has included a move to a technologically-based system. A method of evaluation for such online governing systems is necessary for effective political management worldwide.

Proliferation of Open Government Initiatives and Systems is an essential scholarly publication that analyzes open government data initiatives to evaluate the impact and value of such structures. Featuring coverage on a broad range of topics including collaborative governance, civic responsibility, and public financial management, this publication is geared toward academicians and researchers seeking current, relevant research on the evaluation of open government data initiatives….(More)”.

Civic Technology: Open Data and Citizen Volunteers as a Resource for North Carolina Local Governments


Report by John B. Stephens: “Civic technology is an emergent area of practice where IT experts and citizens without specialized IT skills volunteer their time using government-provided open data to improve government services or otherwise create public benefit. Civic tech, as it is often referred to, draws on longer-standing practices, particularly e-government and civic engagement. It is also a new form of citizen–government co-production, building on the trend of greater government transparency.

This report is designed to help North Carolina local government leaders:

  • Define civic technology practices and describe North Carolina civic tech resources
  • Highlight accomplishments and ongoing projects in civic tech (in North Carolina and beyond)
  • Identify opportunities and challenges for North Carolina local governments in civic tech
  • Provide a set of resources for education and involvement in civic tech….(More)”.

Some new nonprofits take off, others flop – and nobody knows why


In the Conversation: “Although entrepreneurially minded donors would benefit from expert guidance on how to spot the outfits most likely to succeed, there’s been virtually no research on this question. As a scholar focusing on how new nonprofits are founded and develop, I believe two shortcomings in how research regarding nonprofit startups gets done create this gap. And I believe that this problem stands in the way of figuring out why some new nonprofits thrive while others do not.

An obsession with success

Plenty of people like the notion of nonprofit entrepreneurship. This is not surprising, as much of what is said and written about this subject celebrates and spotlights success stories, and it’s human nature to revere achievement.

Success stories can certainly boost interest and excitement among nonprofit donors and other stakeholders. But what leads some nonprofit entrepreneurs to succeed and others to fail or just muddle along?

Studying only the successes is what academics call "selection bias" – paying attention to only one of many subsets rather than taking stock of them all. The Swedish entrepreneurship researcher Per Davidsson has eloquently illustrated why selection bias is a big deal with this paraphrased example:

Imagine that researchers want to investigate and isolate the factors that make gamblers successful. If they study only the gamblers who win all the time, they would reach the obviously false conclusion that gambling is always profitable…
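Davidsson's example lends itself to a quick simulation. The sketch below is my own illustration of the same point (the win probability, stake and sample size are arbitrary): every simulated gambler plays a losing game, yet a study that looks only at the winners concludes that gambling pays.

```python
# Selection bias in miniature: a losing game looks profitable
# if we only examine the gamblers who happened to come out ahead.
import random

random.seed(42)

def play(rounds=100):
    """Win $1 with probability 0.45 each round, otherwise lose $1."""
    return sum(1 if random.random() < 0.45 else -1 for _ in range(rounds))

results = [play() for _ in range(10_000)]
winners = [r for r in results if r > 0]

print(f"Average outcome, all gamblers: {sum(results) / len(results):+.1f}")  # about -10
print(f"Average outcome, winners only: {sum(winners) / len(winners):+.1f}")  # well above 0
```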

Snapshot studies

A second shortcoming is the inclination to rely on research methods that investigate new nonprofits and the donors who fund them at a single point in time. One way that happens is that researchers rely on surveys that capture a nonprofit’s activities and accomplishments at a single juncture rather than over the course of months, years or decades.

But new nonprofits don’t just emerge out of nowhere. Creating and developing a thriving new nonprofit is a process, as I have explained in Nonprofit Management & Leadership, an academic journal.

Thus, distilling what makes new ventures prosper requires studying how they – and other nonprofits – evolve. By their very nature, snapshot studies inevitably fail to explain how and why such changes take place.

Further, nonprofit studies seldom capture the earliest phases of new nonprofit activity, even though entrepreneurship and organizational researchers have argued that what happens during these nascent stages shapes organizations and has a long-lasting influence on them.

Consequently, those few scholars who do try to look beyond fleeting snapshots into startup dynamics resort to asking founders or other key staff, board members and additional stakeholders about earlier events and actions long after they took place. But that approach is prone to another kind of error, memory distortion….(More)”.

Research reveals de-identified patient data can be re-identified


Vanessa Teague, Chris Culnane and Ben Rubinstein in PhysOrg: “In August 2016, Australia’s federal Department of Health published medical billing records of about 2.9 million Australians online. These records came from the Medicare Benefits Scheme (MBS) and the Pharmaceutical Benefits Scheme (PBS), containing 1 billion lines of historical health data from the records of around 10 per cent of the population.

These longitudinal records were de-identified, a process intended to prevent a person’s identity from being connected with information, and were made public on the government’s open data website as part of its policy on accessible public data.

We found that patients can be re-identified, without decryption, through a process of linking the unencrypted parts of the record with known information about the individual.

Our findings replicate those of similar studies of other de-identified datasets:

  • A few mundane facts taken together often suffice to isolate an individual.
  • Some patients can be identified by name from publicly available information.
  • Decreasing the precision of the data, or perturbing it statistically, makes re-identification gradually harder at a substantial cost to utility.

The first step is examining a patient’s uniqueness according to medical procedures such as childbirth. Some individuals are unique given public information, and many patients are unique given a few basic facts, such as year of birth or the date a baby was delivered….

The second step is examining uniqueness according to the characteristics of commercial datasets we know of but cannot access directly. There are high uniqueness rates that would allow linking with a commercial pharmaceutical dataset, and with the billing data available to a bank. This means that ordinary people, not just the prominent ones, may be easily re-identifiable by their bank or insurance company…
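The uniqueness test described above is conceptually simple. The sketch below is a generic illustration (the records and field names are invented, and this is not the authors' code or the MBS/PBS data): it measures what fraction of records are unique on a chosen combination of quasi-identifiers, which is exactly what makes linkage to outside information possible.

```python
# How uniqueness on a few quasi-identifiers enables re-identification:
# count the fraction of records whose combination of fields is unique.
# Records and fields are invented for illustration.
from collections import Counter

records = [
    {"yob": 1975, "childbirth_date": "2014-03-02", "state": "VIC"},
    {"yob": 1975, "childbirth_date": "2014-03-02", "state": "NSW"},
    {"yob": 1982, "childbirth_date": "2013-11-19", "state": "VIC"},
    {"yob": 1990, "childbirth_date": "2015-07-30", "state": "QLD"},
]

def unique_fraction(records, keys):
    """Fraction of records that are unique on the given combination of fields."""
    combos = Counter(tuple(r[k] for k in keys) for r in records)
    return sum(1 for r in records if combos[tuple(r[k] for k in keys)] == 1) / len(records)

print(unique_fraction(records, ["yob"]))                              # 0.5
print(unique_fraction(records, ["yob", "childbirth_date", "state"]))  # 1.0
```

Anyone holding even one such record from another source (a bank, an insurer, a pharmacy) can then look up the matching row in the published dataset.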

These de-identification methods were bound to fail, because they were trying to achieve two inconsistent aims: the protection of individual privacy and publication of detailed individual records. De-identification is very unlikely to work for other rich datasets in the government’s care, like census data, tax records, mental health records, penal information and Centrelink data.

While the ambition of making more data more easily available to facilitate research, innovation and sound public policy is a good one, there is an important technical and procedural problem to solve: there is no good solution for publishing sensitive complex individual records that protects privacy without substantially degrading the usefulness of the data.

Some data can be safely published online, such as information about government, aggregations of large collections of material, or data that is differentially private. For sensitive, complex data about individuals, a much more controlled release in a secure research environment is a better solution. The Productivity Commission recommends a “trusted user” model, and techniques like dynamic consent also give patients greater control and visibility over their personal information….(More).
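By way of contrast with de-identification, here is a minimal sketch of what a differentially private release can look like: publishing only an aggregate with calibrated random noise added. The Laplace mechanism shown is standard, but the parameters and the NumPy-based implementation are illustrative choices of mine, not something prescribed in the article.

```python
# Laplace mechanism: publish a noisy count rather than individual records.
# With sensitivity 1 (one person changes the count by at most 1) and privacy
# budget epsilon, Laplace noise of scale 1/epsilon gives epsilon-differential privacy.
import numpy as np

def dp_count(true_count: int, epsilon: float = 1.0, sensitivity: float = 1.0) -> float:
    """Return the count plus Laplace(0, sensitivity/epsilon) noise."""
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# e.g. releasing how many patients had a particular procedure:
print(round(dp_count(true_count=1234, epsilon=1.0)))  # close to 1234, but randomised
```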

The Annual Review of Social Partnerships


(Open access) book edited by May Seitanidi and Verena Bitzer: “…written for and by cross-sector social partnership (CSSP) academics and practitioners focusing on nonprofit, business, and public sectors, who view collaboration as key to solving social problems such as climate change, economic inequality, poverty, or biodiversity loss and environmental degradation. Published by an independent group of academics and practitioners since 2006, the ARSP bridges academic theory and practice with ideas about promoting the social good, covering a wide range of subjects and geographies surrounding the interactions between nonprofit, business, and public sectors. Its aim is to inform, to share, to inspire, to educate, and to train. Building a global community of experts on CSSPs, be they from academia or practice, is the inherent motivation of the ARSP. The ARSP offers new directions for research, presents funded research projects, and provides published papers in a compilation, allowing researchers to familiarize themselves with the latest work in this field. The ARSP also captures and presents insights on partnerships from practitioners, enabling its readership to learn from the hands-on experiences and observations of those who work with and for partnerships….(More)”. Issues of the ARSP can be downloaded here.