Computers have an unlikely origin story: the 1890 census


David Lindsey Roberts at FastCompany: “The U.S. Constitution requires that a population count be conducted at the beginning of every decade.

This census has always been charged with political significance and continues to be. That’s clear from the controversy over the conduct of the upcoming 2020 census.

But it’s less widely known how important the census has been in developing the U.S. computer industry, a story that I tell in my new book, Republic of Numbers: Unexpected Stories of Mathematical Americans Through History....

The only use of the census clearly specified in the Constitution is to allocate seats in the House of Representatives. More populous states get more seats.

A minimalist interpretation of the census mission would require reporting only the overall population of each state. But the census has never confined itself to this.

A complicating factor emerged right at the beginning, with the Constitution’s distinction between “free persons” and “three-fifths of all other persons.” This was the Founding Fathers’ infamous mealy-mouthed compromise between those states with a large number of enslaved persons and those states where relatively few lived.

The first census, in 1790, also made nonconstitutionally mandated distinctions by age and sex. In subsequent decades, many other personal attributes were probed as well: occupational status, marital status, educational status, place of birth, and so on….

John Shaw Billings, a physician assigned to assist the Census Office with compiling health statistics, had closely observed the immense tabulation efforts required to deal with the raw data of 1880. He expressed his concerns to a young mechanical engineer assisting with the census, Herman Hollerith, a recent graduate of the Columbia School of Mines.

On September 23, 1884, the U.S. Patent Office recorded a submission from the 24-year-old Hollerith, titled “Art of Compiling Statistics.”

By progressively improving the ideas of this initial submission, Hollerith would decisively win an 1889 competition to improve the processing of the 1890 census.

The technological solutions devised by Hollerith involved a suite of mechanical and electrical devices….After his census success, Hollerith went into business selling this technology. The company he founded would, after he retired, become International Business Machines—IBM. IBM led the way in perfecting card technology for recording and tabulating large sets of data for a variety of purposes….(More)”

Linked Democracy: Foundations, Tools, and Applications


Book edited by Marta Poblet, Pompeu Casanovas and Víctor Rodríguez-Doncel: “This open access book shows the factors linking information flow, social intelligence, rights management and modelling with epistemic democracy, offering licensed linked data along with information about the rights involved. This model of democracy for the web of data brings new challenges for the social organisation of knowledge, collective innovation, and the coordination of actions. Licensed linked data, licensed linguistic linked data, right expression languages, semantic web regulatory models, electronic institutions, artificial socio-cognitive systems are examples of regulatory and institutional design (regulations by design). The web has been massively populated with both data and services, and semantically structured data, the linked data cloud, facilitates and fosters human-machine interaction. Linked data aims to create ecosystems to make it possible to browse, discover, exploit and reuse data sets for applications. Rights Expression Languages semi-automatically regulate the use and reuse of content…(More)”.

AI script finds bias in movies before production starts


Springwise: “The GD-IQ (Geena Davis Inclusion Quotient) Spellcheck for Bias analysis tool reviews film and television scripts for equality and diversity. Geena Davis, the founder of the Geena Davis Institute on Gender in Media, recently announced a yearlong pilot programme with Walt Disney Studios. The Spellcheck for Bias tool will be used throughout the studio’s development process.

Funded by Google, the GD-IQ uses audio-visual processing technologies from the University of Southern California Viterbi School of Engineering together with Google’s machine learning capabilities. 

The tool’s analysis reveals the percentages of representation and dialogue broken down into categories of gender, race, LGBTQIA and disability representation. The analysis also highlights non-gender-identified speaking characters, roles whose casting could help improve equality and diversity. 

Designed to help identify unconscious bias before it becomes a publicly consumed piece of media, the tool also ranks the sophistication of the characters’ vocabulary and their relative level of power within the story.

The first study of film and television representation using the GD-IQ examined the top 200 grossing, non-animated films of 2014 and 2015. Unsurprisingly, the more diverse and equal a film’s characters were, the more money the film earned. …(More)”.
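The kind of percentage breakdown described above can be illustrated with a minimal sketch. This is a hypothetical tally over hand-annotated dialogue lines, not the actual GD-IQ pipeline, which relies on audio-visual processing and machine learning; the data and field names are invented:

```python
from collections import Counter

def dialogue_share(lines):
    """Tally dialogue lines per speaker attribute and return percentages."""
    counts = Counter(attr for _, attr in lines)
    total = sum(counts.values())
    return {attr: round(100 * n / total, 1) for attr, n in counts.items()}

# Hypothetical annotated script: (line of dialogue, speaker's gender tag).
script = [
    ("We have to move now.", "male"),
    ("Agreed. Take the left flank.", "female"),
    ("Copy that.", "male"),
    ("On my signal.", "male"),
]

print(dialogue_share(script))  # {'male': 75.0, 'female': 25.0}
```

The same tally could be run per attribute (race, disability, and so on) by annotating each line with additional tags.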

CROWD4EMS: a Crowdsourcing Platform for Gathering and Geolocating Social Media Content in Disaster Response


Paper by Ravi Shankar et al: “Increased access to mobile phone devices and social media networks has changed the way people report and respond to disasters. Community-driven initiatives such as Stand By Task Force (SBTF) or GISCorps have shown great potential by crowdsourcing the acquisition, analysis, and geolocation of social media data for disaster responders. These initiatives face two main challenges: (1) most social media content, such as photos and videos, is not geolocated, which prevents the information from being used by emergency responders, and (2) they lack tools to manage volunteers’ contributions and aggregate them in order to ensure high-quality and reliable results. This paper illustrates the use of a crowdsourcing platform that combines automatic methods for gathering information from social media with crowdsourcing techniques in order to manage and aggregate volunteers’ contributions. High-precision geolocation is achieved by combining data mining techniques for estimating the location of photos and videos from social media with crowdsourcing for the validation and/or improvement of the estimated location.

The evaluation of the proposed approach is carried out using data related to the 2016 Amatrice earthquake, drawn from Flickr, Twitter and YouTube. A common data set is analyzed and geolocated both by volunteers using the proposed platform and by a group of experts. Data quality and reliability are assessed by comparing the volunteers’ results with the experts’. The final results are shown in a web map service providing a global view of the information social media provided about the Amatrice earthquake…(More)”.
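As a rough illustration of the aggregation step, one robust way to combine several location estimates for the same photo (the automatic estimate plus volunteer corrections) is a per-coordinate median, which tolerates a stray bad submission better than a plain average. The sketch below uses invented data and is not the paper’s actual aggregation method:

```python
from statistics import median

def aggregate_locations(points):
    """Combine several (lat, lon) estimates for one photo into a single fix.

    Takes the median of latitudes and longitudes separately, so one
    outlying submission barely shifts the result.
    """
    lats = [lat for lat, _ in points]
    lons = [lon for _, lon in points]
    return (median(lats), median(lons))

# Hypothetical submissions for one photo of the Amatrice area:
estimates = [
    (42.628, 13.290),   # automatic estimate from data mining
    (42.630, 13.292),   # volunteer 1
    (42.629, 13.291),   # volunteer 2
    (42.900, 13.100),   # volunteer 3 (outlier)
]

print(aggregate_locations(estimates))
```

A production platform would also weight submissions by volunteer reliability, but the median already shows why aggregating many contributions beats trusting any single one.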

Official Statistics 4.0: Verified Facts for People in the 21st Century


Book by Walter J. Radermacher: “This book explores official statistics and their social function in modern societies. Digitisation and globalisation are creating completely new opportunities and risks, a context in which facts (can) play an enormously important part if they are produced with a quality that makes them credible and purpose-specific. In order for this to actually happen, official statistics must continue to actively pursue the modernisation of their working methods. This book is not about the technical and methodological challenges associated with digitisation and globalisation; rather, it focuses on statistical sociology, which scientifically deals with the peculiarities and pitfalls of governing-by-numbers, and assigns statistics a suitable position in the future informational ecosystem. Further, the book provides a comprehensive overview of modern issues in official statistics, embodied in a historical and conceptual framework that endows it with different and innovative perspectives. Central to this work is the quality of statistical information provided by official statistics. The implementation of the UN Sustainable Development Goals in the form of indicators is another driving force in the search for answers, and is addressed here….(More)”.

World stumbling zombie-like into a digital welfare dystopia, warns UN human rights expert


UN Press Release: “A UN human rights expert has expressed concerns about the emergence of the “digital welfare state”, saying that all too often the real motives behind such programs are to slash welfare spending, set up intrusive government surveillance systems and generate profits for private corporate interests.

“As humankind moves, perhaps inexorably, towards the digital welfare future it needs to alter course significantly and rapidly to avoid stumbling zombie-like into a digital welfare dystopia,” the Special Rapporteur on extreme poverty and human rights, Philip Alston, says in a report to be presented to the General Assembly on Friday.

The digital welfare state is commonly presented as an altruistic and noble enterprise designed to ensure that citizens benefit from new technologies, experience more efficient government, and enjoy higher levels of well-being. But, Alston said, the digitization of welfare systems has very often been used to promote deep reductions in the overall welfare budget, a narrowing of the beneficiary pool, the elimination of some services, the introduction of demanding and intrusive forms of conditionality, the pursuit of behavioural modification goals, the imposition of stronger sanctions regimes, and a complete reversal of the traditional notion that the state should be accountable to the individual….(More)”.

A Strong Democracy Is a Digital Democracy


Audrey Tang in the New York Times: “Democracy improves as more people participate. And digital technology remains one of the best ways to improve participation — as long as the focus is on finding common ground and creating consensus, not division.

These are lessons Taiwan has taken to heart in recent years, with the government and the tech community partnering to create online platforms and other digital initiatives that allow everyday citizens to propose and express their opinion on policy reforms. Today, Taiwan is crowdsourcing democracy to create a more responsive government.

Fittingly, this movement, which today aims to increase government transparency, was born in a moment of national outrage over a lack of openness and accountability in politics.

On March 18, 2014, hundreds of young activists, most of them college students, occupied Taiwan’s legislature to express their profound opposition to a new trade pact with Beijing then under consideration, as well as the secretive manner in which it was being pushed through Parliament by the Kuomintang, the ruling party.

Catalyzing what came to be known as the Sunflower Movement, the protesters demanded that the pact be scrapped and that the government institute a more transparent ratification process.

The occupation, which drew widespread public support, ended a little more than three weeks later, after the government promised greater legislative oversight of the trade pact. (To this day, the pact has yet to be approved by Taiwan’s legislature.) A poll released after the occupation, however, showed that 76 percent of the nation remained dissatisfied with the Kuomintang government, illustrating the crisis of trust caused by the trade deal dispute.

To heal this rift and communicate better with everyday citizens, the administration reached out to a group of civic-minded hackers and coders, known as g0v (pronounced “gov-zero”), who had been seeking to improve government transparency through the creation of open-source tools. The organization had come to the attention of the government during the Sunflower occupation, when g0v hackers had worked closely with the protesters.

In December 2014, Jaclyn Tsai, a government minister focused on digital technology, attended a g0v-sponsored hackathon and proposed the establishment of a neutral platform where various online communities could exchange policy ideas.

Several contributors from g0v responded by partnering with the government to start the vTaiwan platform in 2015. VTaiwan (which stands for “virtual Taiwan”) brings together representatives from the public, private and social sectors to debate policy solutions to problems primarily related to the digital economy. Since it began, vTaiwan has tackled 30 issues, relying on a mix of online debate and face-to-face discussions with stakeholders. Though the government is not obligated to follow vTaiwan’s recommendations (a policy that may soon change), the group’s work often leads to concrete action….(More)”.

The Economics of Artificial Intelligence


Book edited by Ajay Agrawal, Joshua Gans and Avi Goldfarb: “Advances in artificial intelligence (AI) highlight the potential of this technology to affect productivity, growth, inequality, market power, innovation, and employment. This volume seeks to set the agenda for economic research on the impact of AI.

It covers four broad themes: AI as a general purpose technology; the relationships between AI, growth, jobs, and inequality; regulatory responses to changes brought on by AI; and the effects of AI on the way economic research is conducted. It explores the economic influence of machine learning, the branch of computational statistics that has driven much of the recent excitement around AI, as well as the economic impact of robotics and automation and the potential economic consequences of a still-hypothetical artificial general intelligence. The volume provides frameworks for understanding the economic impact of AI and identifies a number of open research questions…. (More)”

Data gaps threaten achievement of development goals in Africa


Sara Jerving at Devex: “Data gaps across the African continent threaten to hinder the achievement of the Sustainable Development Goals and the African Union’s Agenda 2063, according to the Mo Ibrahim Foundation’s first governance report released on Tuesday.

The report, “Agendas 2063 & 2030: Is Africa On Track?,” based on an analysis of the foundation’s Ibrahim Index of African Governance, found that since the adoption of both of these agendas, the availability of public data in Africa has declined. Among data focused on social outcomes, there has been a notable decline in education, population and vital statistics, such as birth and death records, which citizens need in order to access public services.

The index, on which the report is based, is the most comprehensive dataset on African governance, drawing on ten years of data covering all 54 African nations. An updated index is released every two years….

The main challenge in the production of quality, timely data, according to the report, is a lack of funding and lack of independence of the national statistical offices.

Only one country, Mauritius, had a perfect score for the independence of its national statistics office – meaning that the office can collect the data it chooses, publish without approval from other arms of government, and is sufficiently funded. Fifteen African nations scored zero on the independence of their offices….(More)”.

The Nature of Moral Motivation


Edge Interview with Patricia S. Churchland: “Although we have made tremendous progress in understanding many details of the brain, there are huge gaps in our knowledge. What’s relevant to me, as somebody who’s interested in the nature of moral behavior, is how little we understand about the nature of reasoning, or if I may use a different expression, problem solving. I don’t know what reasoning is. For a long time, people seemed to think it was completely separate from emotion, but we know that can’t be true.

The nature of problem solving is something that is still very much in the pioneering stages in neuroscience. It’s a place where neuroscience and psychology can cooperate to get interesting experimental paradigms so that we can attack the question: How is it that, out of all these constraints and factors, a reasonable decision can be made? That’s a tough one. It will require us to find good experimental paradigms and new techniques….(More)”.