Plato and the Nerd: The Creative Partnership of Humans and Technology


MIT Press: “In this book, Edward Ashford Lee makes a bold claim: that the creators of digital technology have an unsurpassed medium for creativity. Technology has advanced to the point where progress seems limited not by physical constraints but by the human imagination. Writing for both literate technologists and numerate humanists, Lee makes a case for engineering—creating technology—as a deeply intellectual and fundamentally creative process. Explaining why digital technology has been so transformative and so liberating, Lee argues that the real power of technology stems from its partnership with humans.

Lee explores the ways that engineers use models and abstraction to build inventive artificial worlds and to give us things that we never dreamed of—for example, the ability to carry in our pockets everything humans have ever published. But he also attempts to counter the runaway enthusiasm of some technology boosters who claim everything in the physical world is a computation—that even such complex phenomena as human cognition are software operating on digital data. Lee argues that the evidence for this is weak, and the likelihood that nature has limited itself to processes that conform to today’s notion of digital computation is remote.

Lee goes on to argue that artificial intelligence’s goal of reproducing human cognitive functions in computers vastly underestimates the potential of computers. In his view, technology is coevolving with humans. It augments our cognitive and physical capabilities while we nurture, develop, and propagate the technology itself. Complementarity is more likely than competition….(More)”.

The Role of Evidence in Politics: Motivated Reasoning and Persuasion among Politicians


Martin Baekgaard et al in British Journal of Political Science: “Does evidence help politicians make informed decisions even if it is at odds with their prior beliefs? And does providing more evidence increase the likelihood that politicians will be enlightened by the information? Based on the literature on motivated political reasoning and the theory about affective tipping points, this article hypothesizes that politicians tend to reject evidence that contradicts their prior attitudes, but that increasing the amount of evidence will reduce the impact of prior attitudes and strengthen their ability to interpret the information correctly. These hypotheses are examined using randomized survey experiments with responses from 954 Danish politicians, and results from this sample are compared to responses from similar survey experiments with Danish citizens. The experimental findings strongly support the hypothesis that politicians are biased by prior attitudes when interpreting information. However, in contrast to expectations, the findings show that the impact of prior attitudes increases when more evidence is provided….(More)”.

Nudging and Boosting: Steering or Empowering Good Decisions


Ralph Hertwig and Till Grüne-Yanoff in Perspectives on Psychological Science: “In recent years, policy makers worldwide have begun to acknowledge the potential value of insights from psychology and behavioral economics into how people make decisions. These insights can inform the design of nonregulatory and nonmonetary policy interventions—as well as more traditional fiscal and coercive measures. To date, much of the discussion of behaviorally informed approaches has emphasized “nudges,” that is, interventions designed to steer people in a particular direction while preserving their freedom of choice. Yet behavioral science also provides support for a distinct kind of nonfiscal and noncoercive intervention, namely, “boosts.” The objective of boosts is to foster people’s competence to make their own choices—that is, to exercise their own agency. Building on this distinction, we further elaborate on how boosts are conceptually distinct from nudges: The two kinds of interventions differ with respect to (a) their immediate intervention targets, (b) their roots in different research programs, (c) the causal pathways through which they affect behavior, (d) their assumptions about human cognitive architecture, (e) the reversibility of their effects, (f) their programmatic ambitions, and (g) their normative implications. We discuss each of these dimensions, provide an initial taxonomy of boosts, and address some possible misconceptions….(More)”.

Crowdsourcing citizen science: exploring the tensions between paid professionals and users


Jamie Woodcock et al in the Journal of Peer Production: “This paper explores the relationship between paid labour and unpaid users within the Zooniverse, a crowdsourced citizen science platform. The platform brings together a crowd of users to categorise data for use in scientific projects. It was initially established by a small group of academics for a single astronomy project, but has now grown into a multi-project platform that has engaged over 1.3 million users so far. The growth has introduced different dynamics to the platform as it has incorporated a greater number of scientists, developers, links with organisations, and funding arrangements—each bringing additional pressures and complications. The relationships between paid/professional and unpaid/citizen labour have become increasingly complicated with the rapid expansion of the Zooniverse. The paper draws on empirical data from an ongoing research project that has access to both users and paid professionals on the platform. There is the potential, through growing peer-to-peer capacity, for the boundaries between professional and citizen scientists to become significantly blurred. The findings of the paper, therefore, address important questions about the combinations of paid and unpaid labour, the involvement of a crowd in citizen science, and the contradictions this entails for an online platform. These are considered specifically from the viewpoint of the users and, therefore, form a new contribution to the theoretical understanding of crowdsourcing in practice….(More)”.

Journal tries crowdsourcing peer reviews, sees excellent results


Chris Lee at ArsTechnica: “Peer review is supposed to act as a sanity check on science. A few learned scientists take a look at your work, and if it withstands their objective and entirely neutral scrutiny, a journal will happily publish your work. In practice, however, there are some issues with peer review as it is currently practiced. Recently, Benjamin List, a researcher and journal editor in Germany, and his graduate assistant, Denis Höfler, have come up with a genius idea for improving matters: something called selected crowd-sourced peer review….

My central point: peer review is burdensome and sometimes barely functional. So how do we improve it? The main way is to experiment with different approaches to the reviewing process, which many journals have tried, albeit with limited success. Post-publication peer review, when scientists look over papers after they’ve been published, is also an option but depends on community engagement.

But if your paper is uninteresting, no one will comment on it after it is published. Pre-publication peer review is the only moment where we can be certain that someone will read the paper.

So, List (an editor for Synlett) and Höfler recruited 100 referees. For their trial, a forum-style commenting system was set up that allowed referees to comment anonymously on submitted papers and on each other’s comments. To provide a comparison, the papers that went through this process also went through the traditional peer review process. The authors and editors compared comments and (subjectively) evaluated the pros and cons. The 100-person crowd of researchers was deemed the more effective of the two.

The editors found that it took a bit more time to read and collate all the comments into a reviewers’ report. But the overall process was still faster, which the authors loved. Typically, it took the crowd just a few days to complete their review, which compares very nicely to the usual four to six weeks of the traditional route (I’ve had papers languish for six months in peer review). And, perhaps most important, the responses were more substantive and useful compared to the typical two-to-four-person review.

So far, List has not published the trial results formally. Despite that, Synlett is moving to the new system for all its papers.

Why does crowdsourcing work?

Here we get back to something more editorial. I’d suggest that there is a physical analog to traditional peer review, called noise. Noise is not just a constant background that must be overcome. Noise is also generated by the very process that creates a signal. The difference is how the amplitude of noise grows compared to the amplitude of signal. For very low-amplitude signals, all you measure is noise, while for very high-intensity signals, the noise is vanishingly small compared to the signal, even though it’s huge compared to the noise of the low-amplitude signal.
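
To pin the analogy down (my gloss, not the article's): in the standard shot-noise model, noise amplitude grows as the square root of the signal, so

\[
N(S) \propto \sqrt{S}
\quad\Longrightarrow\quad
\mathrm{SNR}(S) = \frac{S}{N(S)} \propto \sqrt{S}
\]

A signal 100 times stronger then carries noise 10 times larger in absolute terms, yet its signal-to-noise ratio is 10 times better: huge next to the weak signal's noise, vanishingly small next to its own signal.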

Our esteemed peers, I would argue, are somewhat random in their response, but weighted toward objectivity. Using this inappropriate physics model, a review conducted by four reviewers can be expected (on average) to contain two responses that are, basically, noise. By contrast, a review by 100 reviewers may only have 10 responses that are noise. Overall, a substantial improvement. So, adding the responses of a large number of peers together should produce a better picture of a scientific paper’s strengths and weaknesses.
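
Here is a minimal Monte Carlo sketch of that arithmetic (my illustration, not the article's experiment: it assumes unbiased reviewers with independent Gaussian noise, in the spirit of the admittedly "inappropriate physics model" above, and the quality scale, noise level, and function name are invented for the demo):

    import random
    import statistics

    def typical_panel_error(n_reviewers, true_quality=7.0, noise_sd=2.0, trials=10_000):
        """Mean absolute gap between a panel's average score and the true
        quality, over many simulated panels of unbiased-but-noisy reviewers."""
        errors = []
        for _ in range(trials):
            scores = [random.gauss(true_quality, noise_sd) for _ in range(n_reviewers)]
            errors.append(abs(statistics.fmean(scores) - true_quality))
        return statistics.fmean(errors)

    for n in (4, 100):
        print(f"{n:>3} reviewers: typical error of the averaged score ~ {typical_panel_error(n):.2f}")
    # Averaging independent noise shrinks the error roughly like 1/sqrt(N):
    # expect ~0.80 for 4 reviewers and ~0.16 for 100 (up to sampling wobble).

Under these assumptions the 100-reviewer average lands about sqrt(100/4) = 5 times closer to the truth, which is the sense in which pooling many noisy peers "should produce a better picture."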

Didn’t I just say that reviewers are overloaded? Doesn’t it seem that this will make the problem worse?

Well, no, as it turns out. When this approach was tested (with consent) on papers submitted to Synlett, it was discovered that review times went way down—from weeks to days. And authors reported getting more useful comments from their reviewers….(More)”.

Is it too late to build a better world?


Keith Burnett at Campaign for Social Science: “The greatest challenge we face is to use our intellects to guide our actions in making the world a better place for us and our fellow human beings.

This is no easy task and its history is littered with false dawns and doctrines. You would have to be blind to the lessons of the past to fail to appreciate the awful impact that delusional ideas have had on mankind. Some of the worst are those meant to save us.

There are some who take this as a warning against intervention at all, who say it can never be done and shouldn’t even be attempted. That the forces of nature blow winds in society that we can never tame. That we are bound to suffer like a small ship in a stormy sea.

They might be right, but it would be the utmost dereliction of academia to give up on this quest. And in any case, I don’t believe it is true. These forces may be there, but there is much we can do, a lot of it deeply practical, to make the journey more comfortable and to ensure we end up in the right port.

Of course, there are those who believe we academics simply don’t care. That scholarship is happiest at a distance from messy, contradictory humanity and prefers its detached world of conferences and publications. That we are content to analyse rather than heal.

Well I can tell you that my social sciences colleagues at Sheffield are not content in an ivory tower and they never have been. They feel the challenges of our world as keenly as any. And they know if we ever needed understanding, and a vision of what society could be, we need it now.

I am confident they are not alone and, as a scientist all my life, it has become apparent to me that, to translate insights into change, we must frequently overcome barriers of perception and culture, of politics and prejudice. Our great challenges are not only technical but matters of education and economics. Our barriers are those of opportunity, power and purpose.

If we want solutions to reach those who desperately need them, we must understand how to take the word and make it flesh. Ideas alone are not enough; they come to life through people. They need money and armies of changed opinion.

If we don’t do this work, the risk is truly terrible – that the armies and the power, the public opinion and votes, will be led by ignorance and profit. As the ancient Greeks knew, a demos could only function when citizens could grasp the consequences of their choices.

Perhaps we had forgotten; thought ‘it can’t happen here’? If so, this year has been a stark reminder of why we dare not be complacent. For who would deny the great political lessons we are almost choking on as we see Brexit evolve from fringe populist movement to a force that is shaking us to pieces? Who will have failed to understand, in the frustrations of Trump, the value of a constitution designed to protect citizens against the ravages of a tyrant?

Why do the social sciences matter? Just look around us. Who would deny the need for new ways to organise our industry and our economy as real incomes fade? Who would deny that we need a society which is able to sensibly regulate against the depredations of the unscrupulous landlord?

Who would deny the need to understand how to better educate and train our youth?

We are engaged in a battle for society, and the fronts are many and difficult. Can we hope to build a society that will look after the stranger in its midst? Is social justice a chimera?

Is there anything to be done?

To this we answer, yes. But we must do more than study; we must find the gears which will ensure what we discover can be absorbed by a society that needs to act with understanding…(More)”

Mastercard’s Big Data For Good Initiative: Data Philanthropy On The Front Lines


Interview by Randy Bean of Shamina Singh: Much has been written about big data initiatives and the efforts of market leaders to derive critical business insights faster. Less has been written about initiatives by some of these same firms to apply big data and analytics to a different set of issues, which are not solely focused on revenue growth or bottom line profitability. While the focus of most writing has been on the use of data for competitive advantage, a small set of companies has been undertaking, with much less fanfare, a range of initiatives designed to ensure that data can be applied not just for corporate good, but also for social good.

One such firm is Mastercard, which describes itself as a technology company in the payments industry that connects buyers and sellers in 210 countries and territories across the globe. In 2013 Mastercard launched the Mastercard Center for Inclusive Growth, which operates as an independent subsidiary of Mastercard and is focused on the application of data to a range of issues for social benefit….

In testimony before the Senate Foreign Relations Committee on May 4, 2017, Mastercard Vice Chairman Walt Macnee, who serves as the Chairman of the Center for Inclusive Growth, addressed issues of private sector engagement. Macnee noted, “The private sector and public sector can each serve as a force for good independently; however when the public and private sectors work together, they unlock the potential to achieve even more.” Macnee further commented, “We will continue to leverage our technology, data, and know-how in an effort to solve many of the world’s most pressing problems. It is the right thing to do, and it is also good for business.”…

Central to the mission of the Mastercard Center is the notion of “data philanthropy”. This term encompasses notions of data collaboration and data sharing and is at the heart of the initiatives that the Center is undertaking. The three cornerstones of the Center’s mandate are:

  • Sharing Data Insights – This is achieved through the concept of “data grants”, which entails granting access to proprietary insights in support of social initiatives in a way that fully protects consumer privacy.
  • Data Knowledge – The Mastercard Center undertakes collaborations with not-for-profit and governmental organizations on a range of initiatives. One such effort was in collaboration with the Obama White House’s Data-Driven Justice Initiative, by which data was used to help advance criminal justice reform. This initiative was then able, through the use of insights provided by Mastercard, to demonstrate the impact crime has on merchant locations and local job opportunities in Baltimore.
  • Leveraging Expertise – Similarly, the Mastercard Center has collaborated with private organizations such as DataKind, which undertakes data science initiatives for social good.

Just this past month, the Mastercard Center released initial findings from its Data Exploration: Neighborhood Crime and Local Business initiative. This effort was focused on ways in which Mastercard’s proprietary insights could be combined with public data on commercial robberies to help understand the potential relationships between criminal activity and business closings. A preliminary analysis showed a spike in commercial robberies followed by an increase in bar and nightclub closings. These analyses help community and business leaders understand factors that can impact business success.

Late last year, Ms. Singh issued A Call to Action on Data Philanthropy, in which she challenges her industry peers to look at ways in which they can make a difference — “I urge colleagues at other companies to review their data assets to see how they may be leveraged for the benefit of society.” She concludes, “the sheer abundance of data available today offers an unprecedented opportunity to transform the world for good.”….(More)

Citizen science volunteers driven by desire to learn


UoP News: “People who give up their time for online volunteering are mainly motivated by a desire to learn, a new study has found.

The research surveyed volunteers on ‘citizen science’ projects and suggests that this type of volunteering could be used to increase general knowledge of science within society.

The study, led by Dr Joe Cox from the Department of Economics and Finance, discovered that an appetite to learn more about the subject was the number one driver for online volunteers, followed by being part of a community. It also revealed that many volunteers are motivated by a desire for escapism.

Online volunteering and crowdsourcing projects typically involve input from large numbers of contributors working individually but towards a common goal. This study surveyed 2,000 people who volunteer for ‘citizen science’ projects hosted by Zooniverse, a collection of research projects that rely on volunteers to help scientists with the challenge of interpreting massive amounts of data….

Dr Cox said: “What was interesting was that characteristics such as age, gender and level of education had no correlation with the amount of time people give up and the length of time they stay on a project. These participants were relatively highly educated compared with the rest of the population, but those with the highest levels of education do not appear to contribute the most effort and information towards these projects.”

The study noticed pronounced changes in how people are motivated at different stages of the volunteer process. While a desire to learn is the most important motivation among contributors at the early stages, the opportunities for social interaction and escapism become more important motivations at later stages….

He suggests that online volunteering and citizen science projects could incentivise participation by offering clearly defined opportunities for learning, while representing an effective way of increasing scientific literacy and knowledge within society….(More)”.

Elsevier Is Becoming a Data Company. Should Universities Be Wary?


Paul Basken at The Chronicle of Higher Education: “As universities have slowly pushed their scientists to embrace open-access journals, publishers will need new profit centers. Elsevier appears well ahead of the pack in creating a network of products that scientists can use to record, share, store, and measure the value to others of the surging amounts of data they produce.

“Maybe all publishers are going, or wish they were” going, in the direction of becoming data companies, said Vincent Larivière, an associate professor of information science at the University of Montreal. “But Elsevier is the only one that is there.”

A Suite of Services

Universities also recognize the future of data. Their scientists are already seeing that widely and efficiently sharing data in fields such as cancer research has enabled accomplishments that have demonstrably saved lives.

In their eagerness to embrace that future, however, universities may not be paying enough attention to what their choices of systems may eventually cost them, warned Roger C. Schonfeld, a program director at Ithaka S+R. With its comprehensive data-services network, Mr. Schonfeld wrote earlier this year, Elsevier appears ready “to lock in scientists to a research workflow no less powerful than the strength of the lock-in libraries have felt to ‘big deal’ bundles.”….

Some open-access advocates say the situation points to an urgent need to create more robust nonprofit alternatives to Elsevier’s product line of data-compiling and sharing tools. But so far financial backing for the developmental work is thin. One of the best known attempts is the Open Science Framework, a web-based data interface built by the Center for Open Science, which has an annual budget of about $6 million, provided largely by foundations and other private donors.

In general, U.S. research universities — a $70 billion scientific enterprise — have not made major contributions to such projects. The Association of American Universities and the Association of Public and Land-grant Universities have, however, formed a team that’s begun studying the future of data sharing. So far, that effort has been focused on more basic steps such as establishing data-storage facilities, linking them together, and simply persuading scientists to take seriously the need to share data.…(More)”

How data can heal our oceans


Nishan Degnarain and Steve Adler at WEF: “We have collected more data on our oceans in the past two years than in the history of the planet.

There has been a proliferation of remote and near sensors above, on, and beneath the oceans. New low-cost microsatellites ring the Earth and can record what happens below daily. Thousands of tidal buoys follow currents, transmitting ocean temperature, salinity, acidity and current speed every minute. Undersea autonomous drones photograph and map the continental shelf and seabed, explore deep sea volcanic vents, and can help discover mineral and rare earth deposits.

The volume, diversity and frequency of data are increasing as the cost of sensors falls, new low-cost satellites are launched, and an emerging drone sector begins to offer new insights into our oceans. In addition, new processing capabilities are enhancing the value we receive from such data on the biological, physical and chemical properties of our oceans.

Yet it is not enough.

We need much more data at higher frequency, quality, and variety to understand our oceans to the degree we already understand the land. Less than 5% of the oceans are comprehensively monitored. We need more data collection capacity to unlock the sustainable development potential of the oceans and protect critical ecosystems.

More data from satellites will help identify illegal fishing activity, track plastic pollution, and detect whales and prevent vessel collisions. More data will help speed the placement of offshore wind and tide farms, improve vessel telematics, develop smart aquaculture, protect urban coastal zones, and enhance coastal tourism.

Unlocking the ocean data market

But we’re not there yet.

This new wave of data innovation is constrained by inadequate data supply, demand, and governance. The supply of existing ocean data is locked by paper records, old formats, proprietary archives, inadequate infrastructure, and scarce ocean data skills and capacity.

The market for ocean observation is driven by science, and science isn’t adequately funded.

To unlock future commercial potential, new financing mechanisms are needed to create market demand that will stimulate greater investments in new ocean data collection, innovation and capacity.

Efforts such as the Financial Stability Board’s Task Force on Climate-related Financial Disclosures have gone some way to raise awareness and create demand for such ocean-related climate risk data.

Much data that is produced is collected by nations, universities and research organizations, NGOs, and the private sector, but only a small percentage is Open Data and widely available.

Data creates more value when it is widely utilized and well governed. Organizing to improve data infrastructure, quality, integrity, and availability is a requirement for achieving new ocean data-driven business models and markets. New Ocean Data Governance models, standards, platforms, and skills are urgently needed to stimulate new market demand for innovation and sustainable development….(More)”.