The science prize that’s making waves


Gillian Tett at the Financial Times: "…There is another reason why the Ocean Health XPrize fascinates me: what it reveals about the new fashion among philanthropists for handing out big scientific prizes. The idea is not a new one: wealthy people and governments have been giving prizes for centuries. In 1714, for example, the British government passed the Longitude Act, establishing a board to offer reward money for innovation in navigation — the most money was won by John Harrison, a clockmaker who invented the marine chronometer.

But a fascinating shift has taken place in the prize-giving game. In previous decades, governments or philanthropists usually bestowed money to recognise past achievements, often in relation to the arts. In 2012, McKinsey, the management consultants, estimated that before 1991, 97 per cent of prize money was a “recognition” award — for example, the Nobel Prizes. Today, however, four-fifths of all prize money is “incentive” or “inducement” awards. This is because many philanthropists and government agencies have started staging competitions to spur innovation in different fields, particularly science.

The best known of these is the XPrize Foundation, initiated two decades ago by Peter Diamandis, the entrepreneur. The original award, the Ansari XPrize, offered $10m to the first privately financed team to put a vehicle into space. Since then, the XPrize has spread its wings into numerous different fields, including education and life sciences. Indeed, having given $30m in prize money so far, it has another $70m of competitions running, including the Google Lunar XPrize, which is offering $30m to land a privately funded robot on the moon.

McKinsey estimates that if you look across the field of prize-giving around the world, “total funds available from large prizes have more than tripled over the last decade to reach $350m”, while the “total prize sector could already be worth as much as $1bn to $2bn”. The Ocean Health XPrize, in other words, is barely a drop in the prize-giving ocean.

Is this a good thing? Not always, it might seem. As the prizes proliferate, they can sometimes overlap. The money being awarded tends — inevitably — to reflect the pet obsessions of philanthropists, rather than what scientists themselves would like to explore. And even the people running the prizes admit that these only work when there is a clear problem to be solved….(More)”

Science to the people!


John Magan, at Digital Agenda for Europe: "…I attended the 2nd Barcelona Citizen Science Day, organised as part of the city's Science Festival. The programme was full and varied, and in itself a great example of the wonderful world of do-it-yourself, hands-on, accessible, practical science. A huge variety of projects (see below) was delivered with enthusiasm, passion, and energy!

The day was rounded off with a presentation by Public Lab who showed how a bit of technical ingenuity like cheap cameras on kites and balloons can be used to keep governments and large businesses more honest and accountable – for example, data they collected is being used in court cases against BP for the Deepwater Horizon oil spill in the Gulf of Mexico.

But what was most striking is the empowerment that these Citizen Science projects give individuals to do things for themselves – to take measures to monitor, protect or improve their urban or rural environment; to indulge their curiosity or passions; to improve their finances; to work with others; to do good while having serious fun….If you want to have a deeper look, here are some of the many projects presented on a great variety of themes:

Water

Wildlife

Climate

Arts

Public health

Human

A nice booklet capturing them is available, and there's also a summary, in Catalan only.

Read more about citizen science in the European Commission….(More)”

Researcher uncovers inherent biases of big data collected from social media sites


Phys.org: “With every click, Facebook, Twitter and other social media users leave behind digital traces of themselves, information that can be used by businesses, government agencies and other groups that rely on “big data.”

But while the information derived from social network sites can shed light on social behavioral traits, some analyses based on this type of data collection are prone to bias from the get-go, according to new research by Northwestern University professor Eszter Hargittai, who heads the Web Use Project.

Since people don't randomly join Facebook, Twitter or LinkedIn—they deliberately choose to engage—the data are potentially biased in terms of demographics, socioeconomic background or Internet skills, according to the research. This has implications for businesses, municipalities and other groups that rely on such data, because it excludes certain segments of the population and could lead to unwarranted or faulty conclusions, Hargittai said.

The study, “Is Bigger Always Better? Potential Biases of Big Data Derived from Social Network Sites” was published last month in the journal The Annals of the American Academy of Political and Social Science and is part of a larger, ongoing study.

The buzzword “big data” refers to automatically generated information about people’s behavior. It’s called “big” because it can easily include millions of observations if not more. In contrast to surveys, which require explicit responses to questions, big data is created when people do things using a service or system.

“The problem is that the only people whose behaviors and opinions are represented are those who decided to join the site in the first place,” said Hargittai, the April McClain-Delaney and John Delaney Professor in the School of Communication. “If people are analyzing big data to answer certain questions, they may be leaving out entire groups of people and their voices.”

For example, a city could use Twitter to collect local opinion regarding how to make the community more “age-friendly” or whether more bike lanes are needed. In those cases, “it’s really important to know that people aren’t on Twitter randomly, and you would only get a certain type of person’s response to the question,” said Hargittai.

“You could be missing half the population, if not more. The same holds true for companies who only use Twitter and Facebook and are looking for feedback about their products,” she said. “It really has implications for every kind of group.”…

More information: “Is Bigger Always Better? Potential Biases of Big Data Derived from Social Network Sites” The Annals of the American Academy of Political and Social Science May 2015 659: 63-76, DOI: 10.1177/0002716215570866

Open Innovation, Open Science, Open to the World


Speech by Carlos Moedas, EU Commissioner for Research, Science and Innovation: “On 25 April this year, an earthquake of magnitude 7.3 hit Nepal. To get real-time geographical information, the response teams used an online mapping tool called Open Street Map. Open Street Map has created an entire online map of the world using local knowledge, GPS tracks and donated sources, all provided on a voluntary basis. It is open license for any use.

Created in 2004 by a 24-year-old computer science student at University College London, Open Street Map today has 2 million users and has been used for many digital humanitarian and commercial purposes: from the earthquakes in Haiti and Nepal to the Ebola outbreak in West Africa.

This story is one of many that demonstrate that we are moving into a world of open innovation and user innovation. A world where the digital and physical are coming together. A world where new knowledge is created through global collaborations involving thousands of people from across the world and from all walks of life.

Ladies and gentlemen, over the next two days I would like us to chart a new path for European research and innovation policy. A new strategy that is fit for purpose for a world that is open, digital and global. And I would like to set out at the start of this important conference my own ambitions for the coming years….

Open innovation is about involving far more actors in the innovation process, from researchers, to entrepreneurs, to users, to governments and civil society. We need open innovation to capitalise on the results of European research and innovation. This means creating the right ecosystems, increasing investment, and bringing more companies and regions into the knowledge economy. I would like to go further and faster towards open innovation….

I am convinced that excellent science is the foundation of future prosperity, and that openness is the key to excellence. We are often told that it takes many decades for scientific breakthroughs to find commercial application.

Let me tell you a story which shows the opposite. Graphene was first isolated in the laboratory by Profs. Geim and Novoselov at the University of Manchester in 2003 (Nobel Prizes 2010). The development of graphene has since benefitted from major EU support, including ERC grants for Profs. Geim and Novoselov. So I am proud to show you one of the new graphene products that will soon be available on the market.

This light bulb uses the unique thermal dissipation properties of graphene to achieve greater energy efficiency and a longer lifetime than conventional LED bulbs. It was developed by a spin-out company from the University of Manchester, called Graphene Lighting, and is expected to go on sale by the end of the year.

But we must not be complacent. If we look at indicators of the most excellent science, we find that Europe is not top of the rankings in certain areas. Our ultimate goal should always be to promote excellence not only through ERC and Marie Skłodowska-Curie but throughout the entire H2020.

For such an objective we have to move forward on two fronts:

First, we are preparing a call for a European Science Cloud project, in order to explore the possibility of creating a cloud for our scientists. We need more open access to research results and the underlying data. Open access publication is already a requirement under Horizon 2020, but we now need to look seriously at open data…

When innovators like LEGO start fusing real bricks with digital magic, when citizens conduct their own R&D through online community projects, when doctors start printing live tissues for patients … Policymakers must follow suit…(More)”

Improving Crowdsourcing and Citizen Science as a Policy Mechanism for NASA


Paper by Balcom Brittany: “This article examines citizen science projects, defined as “a form of open collaboration where members of the public participate in the scientific process, including identifying research questions, collecting and analyzing the data, interpreting the results, and problem solving,” as an effective and innovative tool for National Aeronautics and Space Administration (NASA) science in line with the Obama Administration’s Open Government Directive. Citizen science projects allow volunteers with no technical training to participate in analysis of large sets of data that would otherwise constitute prohibitively tedious and lengthy work for research scientists. Zooniverse.com hosts a multitude of popular space-focused citizen science projects, many of which have been extraordinarily successful and have enabled new research publications and major discoveries. This article takes a multifaceted look at such projects by examining the benefits of citizen science, effective game design, and current desktop computer and mobile device usage trends. It offers suggestions of potential research topics to be studied with emerging technologies, policy considerations, and opportunities for outreach. This analysis includes an overview of other crowdsourced research methods such as distributed computing and contests. New research and data analysis of mobile phone usage, scientific curiosity, and political engagement among Zooniverse.com project participants has been conducted for this study…(More)”

When America Says Yes to Government


Cass Sunstein in the New York Times: “In recent years, the federal government has adopted a large number of soft interventions that are meant to change behavior without mandates and bans. Among them: disclosure of information, such as calorie labels at chain restaurants; graphic warnings against, for example, distracted driving; and automatic enrollment in programs designed to benefit employees, like pension plans.

Informed by behavioral science, such reforms can have large effects while preserving freedom of choice. But skeptics deride these soft interventions as unjustified paternalism, an insult to dignity and a contemporary version of the nanny state. Some people fear that uses of behavioral science will turn out to be manipulative. They don’t want to be nudged.

But what do Americans actually think about soft interventions? I recently conducted a nationally representative survey of 563 people. Small though that number may seem, it gives a reasonable picture of what Americans think, with a margin of error of plus or minus 4.1 percentage points.
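The quoted margin of error can be checked against the standard formula for a sampled proportion at 95 per cent confidence. A minimal sketch follows; the conservative worst case p = 0.5 is an assumption here, since the article does not describe its methodology:

```python
import math

def margin_of_error(n: int, z: float = 1.96) -> float:
    """95% confidence margin of error for a proportion,
    using the conservative worst case p = 0.5 (p * (1 - p) = 0.25)."""
    return z * math.sqrt(0.25 / n)

# For the survey's n = 563 respondents:
moe = margin_of_error(563)
print(round(100 * moe, 1))  # -> 4.1 (percentage points)
```

This reproduces the article's figure of plus or minus 4.1 percentage points, which suggests the survey used the standard worst-case calculation.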

The remarkable finding is that most Americans approve of these reforms and want a lot more of them — and their approval generally cuts across partisan lines….(More)

Forging Trust Communities: How Technology Changes Politics


Book by Irene S. Wu: “Bloggers in India used social media and wikis to broadcast news and bring humanitarian aid to tsunami victims in South Asia. Terrorist groups like ISIS pour out messages and recruit new members on websites. The Internet is the new public square, bringing to politics a platform on which to create community at both the grassroots and bureaucratic level. Drawing on historical and contemporary case studies from more than ten countries, Irene S. Wu’s Forging Trust Communities argues that the Internet, and the technologies that predate it, catalyze political change by creating new opportunities for cooperation. The Internet does not simply enable faster and easier communication, but makes it possible for people around the world to interact closely, reciprocate favors, and build trust. The information and ideas exchanged by members of these cooperative communities become key sources of political power akin to military might and economic strength.

Wu illustrates the rich world history of citizens and leaders exercising political power through communications technology. People in nineteenth-century China, for example, used the telegraph and newspapers to mobilize against the emperor. In 1970, Taiwanese cable television gave voice to a political opposition demanding democracy. Both Qatar (in the 1990s) and Great Britain (in the 1930s) relied on public broadcasters to enhance their influence abroad. Additional case studies from Brazil, Egypt, the United States, Russia, India, the Philippines, and Tunisia reveal how various technologies function to create new political energy, enabling activists to challenge institutions while allowing governments to increase their power at home and abroad.

Forging Trust Communities demonstrates that the way people receive and share information through network communities reveals as much about their political identity as their socioeconomic class, ethnicity, or religion. Scholars and students in political science, public administration, international studies, sociology, and the history of science and technology will find this to be an insightful and indispensable work…(More)”

A computational algorithm for fact-checking


Kurzweil News: “Computers can now do fact-checking for any body of knowledge, according to Indiana University network scientists, writing in an open-access paper published June 17 in PLoS ONE.

Using factual information from summary infoboxes from Wikipedia* as a source, they built a “knowledge graph” with 3 million concepts and 23 million links between them. A link between two concepts in the graph can be read as a simple factual statement, such as “Socrates is a person” or “Paris is the capital of France.”

In the first use of this method, IU scientists created a simple computational fact-checker that assigns “truth scores” to statements concerning history, geography and entertainment, as well as random statements drawn from the text of Wikipedia. In multiple experiments, the automated system consistently matched the assessment of human fact-checkers in terms of the humans’ certitude about the accuracy of these statements.

Dealing with misinformation and disinformation

In what the IU scientists describe as an "automatic game of trivia," the team applied their algorithm to answer simple questions related to geography, history, and entertainment, including statements that matched states or nations with their capitals, presidents with their spouses, and Oscar-winning film directors with the movies for which they won the Best Picture award. The majority of tests returned highly accurate truth scores.

Lastly, the scientists used the algorithm to fact-check excerpts from the main text of Wikipedia, which were previously labeled by human fact-checkers as true or false, and found a positive correlation between the truth scores produced by the algorithm and the answers provided by the fact-checkers.

Significantly, the IU team found their computational method could even assess the truthfulness of statements about information not directly contained in the infoboxes. For example, it could confirm that Steve Tesich — the Serbian-American screenwriter of the classic Hoosier film "Breaking Away" — graduated from IU, even though that information is not specifically stated in the infobox about him.

Using multiple sources to improve accuracy and richness of data

“The measurement of the truthfulness of statements appears to rely strongly on indirect connections, or ‘paths,’ between concepts,” said Giovanni Luca Ciampaglia, a postdoctoral fellow at the Center for Complex Networks and Systems Research in the IU Bloomington School of Informatics and Computing, who led the study….

“These results are encouraging and exciting. We live in an age of information overload, including abundant misinformation, unsubstantiated rumors and conspiracy theories whose volume threatens to overwhelm journalists and the public. Our experiments point to methods to abstract the vital and complex human task of fact-checking into a network analysis problem, which is easy to solve computationally.”
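The path-based scoring idea described above can be sketched on a toy knowledge graph. Everything below is an illustrative simplification, not the IU team's actual code: the graph, the log-degree penalty on intermediate nodes, and the 1/(1 + cost) scaling are assumptions chosen to convey the intuition that paths through generic, high-degree concepts lend weaker support:

```python
import heapq
import math

# Toy knowledge graph as an undirected adjacency map.
# Nodes and edges are illustrative, not drawn from the IU dataset.
GRAPH = {
    "Socrates": {"person", "Athens"},
    "person": {"Socrates", "Plato"},
    "Plato": {"person", "Athens"},
    "Athens": {"Socrates", "Plato", "Greece"},
    "Greece": {"Athens"},
}

def truth_score(subject: str, obj: str) -> float:
    """Score the statement "subject -> obj" by the cheapest path between
    the two concepts, penalizing paths through high-degree (generic)
    intermediate nodes."""
    if obj in GRAPH.get(subject, set()):
        return 1.0  # a direct link is a stated fact
    # Dijkstra's algorithm, where entering an intermediate node v
    # costs log(degree(v)): hub nodes make a path less convincing.
    best = {subject: 0.0}
    heap = [(0.0, subject)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == obj:
            return 1.0 / (1.0 + cost)  # longer/hubbier path -> lower score
        if cost > best.get(node, float("inf")):
            continue  # stale queue entry
        for nxt in GRAPH.get(node, set()):
            penalty = 0.0 if nxt == obj else math.log(len(GRAPH[nxt]))
            new_cost = cost + penalty
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(heap, (new_cost, nxt))
    return 0.0  # no connecting path: no support for the statement

print(truth_score("Socrates", "person"))  # direct edge -> 1.0
print(truth_score("Socrates", "Greece"))  # indirect, via Athens -> below 1.0
```

A statement supported by a direct link scores 1.0, while "Socrates — Greece" is scored lower because its only support runs through the intermediate node Athens, mirroring the paper's finding that indirect connections still carry measurable evidence.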

Expanding the knowledge base

Although the experiments were conducted using Wikipedia, the IU team’s method does not assume any particular source of knowledge. The scientists aim to conduct additional experiments using knowledge graphs built from other sources of human knowledge, such as Freebase, the open-knowledge base built by Google, and note that multiple information sources could be used together to account for different belief systems….(More)”

What cybersecurity can learn from citizen science


But as anyone who has observed an online forum thread dissecting the minutiae of geek culture can attest, hobbyists can be remarkably thorough in their exploration of topics they are passionate about. And it is often a point of pride to pick the subject that is the least conventional or popular.

The idea of citizen science is to include amateur science enthusiasts in the collection and processing of data. Thanks to the Internet, we’ve seen a surge in the number of self-taught experts in a variety of subjects. New participation platforms are social and gamified – utilizing people’s desire to compete or collaborate with others who share their passion.

How this process plays out differs from one app to the next, according to their needs: StarDust@Home asks volunteers to help sort through samples captured by the Stardust spacecraft when it flew through the coma of comet Wild 2 in 2004. They do this by viewing movies of the contents of the aerogel tiles that were used as collectors.

The security community is ripe for using citizen science in similar ways. Most antimalware vendors make use of customer samples for adding detection and cleaning to their products. Many security companies use customers' reports to gather file reputation, telemetry and prevalence data. And bug reports come from researchers of all ages and education levels – not just professional security researchers. "Month of Bug" events are a more controversial way that security is gamified. Could security companies or organizations be doing more to engage enthusiasts to help improve our response to security issues?

It could be argued that the stuff of security research – especially malware research – is potentially harmful in the hands of amateurs and should be handled only by seasoned professionals. Not only that, security is an adversarial system where the criminals would likely try to game the system to improve their profits. These are important concerns that would need to be addressed.

But the citizen science approach provides good lessons…(More)”

Handbook: How to Catalyze Humanitarian Innovation in Computing Research Institutes


Patrick Meier: "The handbook below provides practical collaboration guidelines for both humanitarian organizations & computing research institutes on how to catalyze humanitarian innovation through successful partnerships. These actionable guidelines are directly applicable now and draw on extensive interviews with leading humanitarian groups and CRI's including the International Committee of the Red Cross (ICRC), United Nations Office for the Coordination of Humanitarian Affairs (OCHA), United Nations Children's Fund (UNICEF), United Nations High Commissioner for Refugees (UNHCR), UN Global Pulse, Carnegie Mellon University (CMU), International Business Machines (IBM), Microsoft Research, Data Science for Social Good Program at the University of Chicago and others.

This handbook, which is the first of its kind, also draws directly on years of experience and lessons learned from the Qatar Computing Research Institute’s (QCRI) active collaboration and unique partnerships with multiple international humanitarian organizations. The aim of this blog post is to actively solicit feedback on this first, complete working draft, which is available here as an open and editable Google Doc. …(More)”