What Is Citizen Science? – A Scientometric Meta-Analysis


Christopher Kullenberg and Dick Kasperowski at PLOS One: “The concept of citizen science (CS) is currently referred to by many actors inside and outside science and research. Several descriptions of this purportedly new approach to science are often heard in connection with large datasets and the possibility of mobilizing crowds outside science to assist with observations and classifications. However, other accounts refer to CS as a way of democratizing science, aiding concerned communities in creating data to influence policy, and as a way of promoting political decision-making processes involving the environment and health.

Objective

In this study we analyse two datasets (N = 1935, N = 633) retrieved from the Web of Science (WoS) with the aim of giving a scientometric description of what the concept of CS entails. We account for its development over time and which strands of research have adopted CS, and we give an assessment of the scientific output achieved in CS-related projects. To attain this, scientometric methods have been combined with qualitative approaches to render more precise search terms.

Results

Results indicate that there are three main focal points of CS. The largest is composed of research on biology, conservation and ecology, and utilizes CS mainly as a methodology for collecting and classifying data. A second strand of research has emerged through geographic information research, where citizens participate in the collection of geographic data. Thirdly, there is a line of research relating to the social sciences and epidemiology, which studies and facilitates public participation in relation to environmental issues and health. In terms of scientific output, the largest body of articles is to be found in biology and conservation research. In absolute numbers, the number of publications generated by CS is low (N = 1935), but over the past decade a new and very productive line of CS based on digital platforms has emerged for the collection and classification of data….(More)”

How Measurement Fails Doctors and Teachers


Robert M. Wachter at the New York Times: “Two of our most vital industries, health care and education, have become increasingly subjected to metrics and measurements. Of course, we need to hold professionals accountable. But the focus on numbers has gone too far. We’re hitting the targets, but missing the point.

Through the 20th century, we adopted a hands-off approach, assuming that the pros knew best. Most experts believed that the ideal “products” — healthy patients and well-educated kids — were too strongly influenced by uncontrollable variables (the sickness of the patient, the intellectual capacity of the student) and were too complex to be judged by the measures we use for other industries.

By the early 2000s, as evidence mounted that both fields were producing mediocre outcomes at unsustainable costs, the pressure for measurement became irresistible. In health care, we saw hundreds of thousands of deaths from medical errors, poor coordination of care and backbreaking costs. In education, it became clear that our schools were lagging behind those in other countries.

So in came the consultants and out came the yardsticks. In health care, we applied metrics to outcomes and processes. Did the doctor document that she gave the patient a flu shot? That she counseled the patient about smoking? In education, of course, the preoccupation became student test scores.

All of this began innocently enough. But the measurement fad has spun out of control. There are so many different hospital ratings that more than 1,600 medical centers can now lay claim to being included on a “top 100,” “honor roll,” grade “A” or “best” hospitals list. Burnout rates for doctors top 50 percent, far higher than in other professions. A 2013 study found that the electronic health record was a dominant culprit. Another 2013 study found that emergency room doctors clicked a mouse 4,000 times during a 10-hour shift. The computer systems have become the dark force behind quality measures.

Education is experiencing its own version of measurement fatigue. Educators complain that the focus on student test performance comes at the expense of learning. Art, music and physical education have withered, because, really, why bother if they’re not on the test?…

Thoughtful and limited assessment can be effective in motivating improvements and innovations, and in weeding out the rare but disproportionately destructive bad apples.

But in creating a measurement and accountability system, we need to tone down the fervor and think harder about the unanticipated consequences….(More)”

 

The impact of open access scientific knowledge


Jack Karsten and Darrell M. West at Brookings: “In spite of technological advancements like the Internet, academic publishing has operated in much the same way for centuries. Scientists voluntarily review their peers’ papers for little or no compensation; the paper’s author likewise does not receive payment from academic publishers. Though most of the costs of publishing a journal are administrative, the cost of subscribing to scientific journals nevertheless increased 600 percent between 1984 and 2002. The funding for the research libraries that form the bulk of journal subscribers has not kept pace, leading to campaigns at universities including Harvard to boycott for-profit publishers.

Though the Internet has not yet brought down the price of academic journal subscriptions, it has led to some interesting alternatives. In 2015, the Twitter hashtag #icanhazPDF was created to request copies of papers located behind paywalls. Anyone with access to a specific paper can download it and then e-mail it to the requester. The practice violates the copyright of publishers, but puts papers in reach of researchers who would otherwise not be able to read them. If researchers cannot read a journal article in the first place, they cannot go on to cite it; citations are what raise the profile of an article and of the journal that published it. The publisher is thus caught between two conflicting goals: increasing the number of citations for its articles and earning revenue to stay in business.

Thinking outside the journal

A trio of University of Chicago researchers examines this issue through the lens of Wikipedia in a paper titled “Amplifying the Impact of Open Access: Wikipedia and the Diffusion of Science.” Wikipedia makes a compelling subject for scientific diffusion given its status as one of the most visited websites in the world, attracting 374 million unique visitors monthly as of September 2015. The study found that in English-language articles, Wikipedia editors are 47 percent more likely to cite an article from an open access journal. Anyone using Wikipedia as a first source for information on a subject is therefore more likely to encounter findings from open access journals. If readers click through the links to cited articles, they can read the actual text of these open access journal articles.

Given how much the federal government spends on scientific research ($66 billion on nondefense R&D in 2015), it has a large role to play in the diffusion of scientific knowledge. Since 2008, the National Institutes of Health (NIH) has required researchers who publish in academic journals to also deposit their papers in PubMed Central, an online open access repository. Expanding provisions like the NIH Public Access Policy to other agencies and to recipients of federal grants at universities would give the public and other researchers a wealth of scientific information. Scientific literacy, even on cutting-edge research, is increasingly important when science informs policy on major issues such as climate change and health care….(More)”

Systematic Thinking for Social Action


Re-issued book by Alice M. Rivlin: “In January 1970 Alice M. Rivlin spoke to an audience at the University of California–Berkeley. The topic was developing a more rational approach to decision-making in government. If digital video, YouTube, and TED Talks had been inventions of the 1960s, Rivlin’s talk would have been a viral hit. As it was, the resulting book, Systematic Thinking for Social Action, spent years on the Brookings Press bestseller list. It is a very personal and conversational volume about the dawn of new ways of thinking about government.

As a deputy assistant secretary for program coordination, and later as assistant secretary for planning and evaluation, at the Department of Health, Education and Welfare from 1966 to 1969, Rivlin was an early advocate of systems analysis, which had been introduced by Robert McNamara at the Department of Defense as PPBS (planning-programming-budgeting system).

While Rivlin brushes aside the jargon, she digs into the substance of systematic analysis and a “quiet revolution in government.” In an evaluation of the evaluators, she issues mixed grades, pointing out where analysts had been helpful in finding solutions and where—because of inadequate data or methods—they had been no help at all.

Systematic Thinking for Social Action offers important insights for anyone interested in working to find the smartest ways to allocate scarce funds to promote the maximum well-being of all citizens.

This reissue is part of Brookings Classics, a series of republished books that lets readers revisit or discover notable earlier works from the Brookings Institution Press.

Chicago Is Predicting Food Safety Violations. Why Aren’t Other Cities?


Julian Spector at CityLab: “The three dozen inspectors at the Chicago Department of Public Health scrutinize 16,000 eating establishments to protect diners from gut-bombing food sickness. Some of those pose more of a health risk than others; approximately 15 percent of inspections catch a critical violation.

For years, Chicago, like most every city in the U.S., scheduled these inspections by going down the complete list of food vendors and making sure they all had a visit in the mandated timeframe. That process ensured that everyone got inspected, but not that the most likely health code violators got inspected first. And speed matters in this case. Every day that unsanitary vendors serve food is a new chance for diners to get violently ill, paying in time, pain, and medical expenses.

That’s why, in 2014, Chicago’s Department of Innovation and Technology started sifting through publicly available city data and built an algorithm to predict which restaurants were most likely to be in violation of health codes, based on the characteristics of previously recorded violations. The program generated a ranked list of which establishments the inspectors should look at first. The project is notable not just because it worked—the algorithm identified violations significantly earlier than business as usual did—but because the team made it as easy as possible for other cities to replicate the approach.
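
A minimal sketch can make the ranking idea concrete. Chicago’s published code is the authoritative reference for how the city actually does this; the Python fragment below only illustrates the general approach, with invented feature names and data.

```python
# Minimal, illustrative sketch of violation-risk ranking -- not Chicago's
# actual model. Feature names and data are invented for the example.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical historical inspections: one row per past inspection,
# with a label marking whether a critical violation was found.
history = pd.DataFrame({
    "days_since_last_inspection":   [400, 90, 700, 120, 365, 30],
    "past_critical_violations":     [2, 0, 3, 1, 0, 0],
    "nearby_sanitation_complaints": [5, 1, 8, 2, 0, 1],
    "critical_violation_found":     [1, 0, 1, 0, 1, 0],
})

features = ["days_since_last_inspection",
            "past_critical_violations",
            "nearby_sanitation_complaints"]

# Learn from past outcomes, then score establishments awaiting inspection
# and rank them so the likeliest violators are visited first.
model = LogisticRegression().fit(history[features],
                                 history["critical_violation_found"])

pending = history[features].copy()  # stand-in for the current inspection queue
pending["risk"] = model.predict_proba(pending[features])[:, 1]
print(pending.sort_values("risk", ascending=False))
```

The essential output is not the model itself but the ordered work list, which determines whom inspectors visit first.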

And yet, more than a year after Chicago published its code, only one local government, in metro D.C., has tried to do the same thing. All cities face the challenge of keeping their food safe and therefore have much to gain from this data program. The challenge, then, isn’t just to design data solutions that work, but to do so in a way that facilitates sharing them with other cities. The Chicago example reveals the obstacles that might prevent a good urban solution from spreading to other cities, but also how to overcome them….(More)”

The Future of Behavioural Change: Balancing Public Nudging vs Private Nudging


2nd AIM Lecture by Alberto Alemanno: “Public authorities, including the European Union and its Member States, are increasingly interested in exploiting behavioral insights through public action. They increasingly do so through choice architecture, i.e. the alteration of the environment of choice surrounding a particular decision-making context, in areas as diverse as energy consumption, tax collection and public health. In this regard, it is useful to distinguish between two situations. The first is that of a public authority which seeks to steer behaviour in the public interest, taking into account one or more mental shortcuts. Thus, default enrollment for organ donation leverages the power of inertia to increase the overall prevalence of organ donors. Placing an emoticon (sad face) or information about average consumption on a prohibitively high energy bill has the potential to nudge consumers towards lower energy consumption. I call this pure public nudging. The second is when public authorities react to exploitative uses of mental shortcuts by market forces by regulating private nudging. I call this ‘counter-nudging’. Pure public nudging helps people correct mental shortcuts so as to achieve legitimate objectives (e.g. increased availability of organs, environmental protection, etc.), regardless of whether those shortcuts are exploited by market forces.
It is against this proposed taxonomy that the 2nd AIM Lecture examines whether private companies, too, may nudge for good. Are corporations well-placed to nudge their customers towards societal objectives, such as the protection of the environment or the promotion of public health? This is what I call benign corporate nudging.
Their record is far from the most credible. Companies have used behaviourally inspired interventions to maximize profits, which has led them to sell more and, in turn, to induce citizens into more consumption. Yet corporate marketing need not always be self-interested. A small but growing number of companies are using their brands, generally through their packaging and marketing efforts, to ‘nudge for good’. By illustrating some actual examples, this lecture defines the conditions under which companies may genuinely and credibly nudge for good. It argues that benign corporate nudging may have – unlike dominant CSR efforts – a positive long-term, habit-forming effect that influences consumers’ future behaviour ‘for good’….(More)”

 

Open Prescribing


“Every month, the NHS in England publishes anonymised data about the drugs prescribed by GPs. But the raw data files are large and unwieldy, with more than 600 million rows. We’re making it easier for GPs, managers and everyone to explore – supporting safer, more efficient prescribing.

OpenPrescribing is one of a range of projects built by Ben Goldacre and Anna Powell-Smith at the EBM Data Lab to help make complex medical and scientific data more accessible and more impactful in the real world….

Data sources

Please read our guide to using the data.

Prescribing data is from the monthly files published by the Health and Social Care Information Centre (HSCIC), used under the terms of the Open Government Licence.

Practice list sizes are from the NHS Business Services Authority’s Information Portal, used under the terms of the Open Government Licence. ASTRO-PUs and STAR-PUs are calculated from list sizes, based on standard formulas (see the sketch at the end of this section).

BNF codes and names are also from the NHS Business Services Authority’s Information Portal, used under the terms of the Open Government Licence.

CCG to practice relations, and practice prescribing settings, are from the HSCIC’s data downloads (epraccur.csv), used under the terms of the Open Government Licence.

CCG names and codes and CCG geographic boundaries are from the Office for National Statistics, used under the terms of the Open Government Licence.

Practice locations are approximate, geocoded using OpenCageData. If you know a better source of practice locations (not including Code-Point Open), please get in touch!…(More)”
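
As a rough illustration of the list-size normalisation mentioned above, the sketch below joins a small, hypothetical monthly prescribing extract to practice list sizes and computes items per 1,000 registered patients. The column names are assumptions for the example rather than the published schema, and the real files (hundreds of millions of rows) would normally go into a database rather than be handled like this.

```python
# Minimal sketch of normalising prescribing volume by practice list size.
# Column names and values are simplified assumptions, not the published schema.
import pandas as pd

# Hypothetical monthly prescribing rows: one row per practice x presentation.
prescribing = pd.DataFrame({
    "practice": ["A81001", "A81001", "A81002"],
    "bnf_code": ["0501013B0AAABAB", "0407010H0AAAMAM", "0501013B0AAABAB"],
    "items":    [120, 340, 45],
})

# Hypothetical practice list sizes (number of registered patients).
list_sizes = pd.DataFrame({
    "practice":  ["A81001", "A81002"],
    "list_size": [9500, 3100],
})

# Total items per practice, then items per 1,000 registered patients --
# the same idea that ASTRO-PU/STAR-PU weightings refine for age and sex.
totals = prescribing.groupby("practice", as_index=False)["items"].sum()
merged = totals.merge(list_sizes, on="practice")
merged["items_per_1000"] = 1000 * merged["items"] / merged["list_size"]
print(merged)
```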

Predictive Analytics


Revised book by Eric Siegel: “Prediction is powered by the world’s most potent, flourishing unnatural resource: data. Accumulated in large part as the by-product of routine tasks, data is the unsalted, flavorless residue deposited en masse as organizations churn away. Surprise! This heap of refuse is a gold mine. Big data embodies an extraordinary wealth of experience from which to learn.

Predictive analytics unleashes the power of data. With this technology, the computer literally learns from data how to predict the future behavior of individuals. Perfect prediction is not possible, but putting odds on the future drives millions of decisions more effectively, determining whom to call, mail, investigate, incarcerate, set up on a date, or medicate.
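
To make “putting odds on the future” concrete, here is a minimal sketch of how a predicted probability can feed a contact decision; the customers, costs and response values are invented for illustration and do not come from the book.

```python
# Minimal sketch: turn predicted odds into a decision about whom to mail.
# All names and numbers are illustrative, not drawn from the book.
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    p_respond: float  # model's predicted probability of responding to the offer

MAIL_COST = 1.50        # cost of sending one offer
RESPONSE_VALUE = 40.00  # profit if the customer responds

def expected_profit(c: Customer) -> float:
    """Expected value of mailing this customer, given the predicted odds."""
    return c.p_respond * RESPONSE_VALUE - MAIL_COST

customers = [Customer("A", 0.02), Customer("B", 0.10), Customer("C", 0.005)]

# Mail only where the odds make the contact worth its cost.
for c in customers:
    decision = "mail" if expected_profit(c) > 0 else "skip"
    print(c.name, round(expected_profit(c), 2), decision)
```

The prediction is never perfect; the point is that even modest odds, applied consistently across millions of individuals, change which decisions get made.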

In this lucid, captivating introduction — now in its Revised and Updated edition — former Columbia University professor and Predictive Analytics World founder Eric Siegel reveals the power and perils of prediction:

    • What type of mortgage risk Chase Bank predicted before the recession.
    • Predicting which people will drop out of school, cancel a subscription, or get divorced before they even know it themselves.
    • Why early retirement predicts a shorter life expectancy and vegetarians miss fewer flights.
    • Five reasons why organizations predict death — including one health insurance company.
    • How U.S. Bank and Obama for America calculated — and Hillary for America 2016 plans to calculate — the way to most strongly persuade each individual.
    • Why the NSA wants all your data: machine learning supercomputers to fight terrorism.
    • How IBM’s Watson computer used predictive modeling to answer questions and beat the human champs on TV’s Jeopardy!
    • How companies ascertain untold, private truths — how Target figures out you’re pregnant and Hewlett-Packard deduces you’re about to quit your job.
    • How judges and parole boards rely on crime-predicting computers to decide how long convicts remain in prison.
    • 183 examples from Airbnb, the BBC, Citibank, ConEd, Facebook, Ford, Google, the IRS, LinkedIn, Match.com, MTV, Netflix, PayPal, Pfizer, Spotify, Uber, UPS, Wikipedia, and more….(More)”

 

‘Design thinking’ is changing the way we approach problems


Tim Johnson in University Affairs on “Why researchers in various disciplines are using the principles of design to solve problems big and small”: “A product of the same trends and philosophies that gave us smartphones, laptop computers and internet search engines, design thinking is changing the way some academics approach teaching and research, the way architects design classrooms and how leaders seek to solve the world’s most persistent problems.

Cameron Norman is a long-time supporter of design thinking (or DT) and an adjunct lecturer at the University of Toronto’s Dalla Lana School of Public Health. He notes that designers, especially product designers, are typically experts in conceptualizing problems and solving them – ideal skills for tackling a wide range of issues, from building a better kitchen table to mapping out the plans for a large building. “The field of design is the discipline of innovation,” he says. “[Design thinking] is about taking these methods, tools and ideas, and applying them in other areas.”

Design thinking centres on the free flow of ideas – far-out concepts aren’t summarily dismissed – and an unusually enthusiastic embrace of failure. “Design thinkers try to figure out what the key problem is – they look around and try to understand what’s going on, and come up with some wild ideas, thinking big and bold, on how to solve it,” Dr. Norman says. “They assume they’re not going to get it right the first time.”

If you were looking to build a better mousetrap, you’d prototype a model, test it for weaknesses, then either trash it and start again, or identify the problems and seek to correct them. DT does the same thing, but in an increasingly broad array of areas, from social policy to healthcare to business.

Deborah Shackleton, dean of design and dynamic media at Emily Carr University of Art + Design in Vancouver, was an early adopter of DT. “Design thinking is a mindset. You can use it as a tool or a technique. It’s very adaptable,” she says.

In 2005, ECUAD revamped much of its curriculum in design and dynamic media, looking to shift the focus from more traditional methods of research, like literature reviews, to something called “generative research.” “It’s the idea that you would invite the participants – for whom the design is intended – to be part of the creation process,” Dr. Shackleton says. She adds that various tools, like “co-creation kits” (which include a range of activities to engage people on a variety of cognitive and emotional levels) and ethnographic and cultural probes (activities which help participants demonstrate details about their private lives to their design partners), prove very useful in this area.

Collaboration among various fields is an important part of the design thinking process. At the University of Alberta, Aidan Rowe, an associate professor in design studies, is using design thinking to help the City of Edmonton improve services for people who are homeless. “Design is a truly interdisciplinary discipline,” says Dr. Rowe. “We always need someone to design with and for. We don’t design for ourselves.”….

Design thinkers often speak of “human-centered design” and “social innovation,” concepts that flow from DT’s assertion that no single person has the answer to a complex problem. Instead, it focuses on collective goals and places a premium on sustainability, community, culture and the empowerment of people, says Greg Van Alstyne, director of research and co-founder of the Strategic Innovation Lab, or sLab, at OCAD University. “It means you go about your problem-solving in a more holistic way. We can say ‘human-centered,’ but it’s actually ‘life-centered,’” Mr. Van Alstyne explains. “Our brand of design thinking is amenable to working within social systems and improving the lot of communities.”

 

Design thinking is also transforming university campuses in a tangible way. One example is at the University of Calgary’s Taylor Institute for Teaching and Learning, which is undergoing a $40-million renovation. “The whole space is designed to help students connect, communicate, collaborate and create knowledge,” says Lynn Taylor, vice-provost, teaching and learning. “Traditional learning was focused on the facts and concepts and procedures of a discipline, and we’re moving toward the goal of having students think far more deeply about their learning.”

To create this new space within a two-floor, 4,000-square-metre building that formerly served as an art museum, the university turned to Diamond Schmitt Architects, who have designed similar spaces at a number of other Canadian campuses. The new space, scheduled to open in February, prioritizes flexibility, with movable walls and collapsible furniture, and the seamless integration of technology.

Lead architect Don Schmitt observes that in a traditional campus building, which usually contains a long corridor and individual classrooms, conversation tends to gravitate to the only true public space: the hallway. “There’s a sense that more learning probably happens outside the classroom or between the classrooms, than happens inside the classroom,” he says.

Gone is the old-model lecture hall, with fixed podium and chairs. They’ve been replaced by a much more malleable space, which in a single day can act as a dance studio, movie theatre, lecture space, or just a big area for students to get together. “It’s about individual learning happening informally, quiet study, gregarious social activity, group study, group projects, flexible studio environments, changeable, ‘hack-able’ spaces and lots of flexibility to use different places in different ways,” Mr. Schmitt explains….(More)”

Initial Conditions Matter: Social Capital and Participatory Development


Paper by Lisa A. Cameron et al: “Billions of dollars have been spent on participatory development programs in the developing world. These programs give community members an active decision-making role. Given the emphasis on community involvement, one might expect that the effectiveness of this approach would depend on communities’ pre-existing social capital stocks. Using data from a large randomised field experiment of Community-Led Total Sanitation in Indonesia, we find that villages with high initial social capital built toilets and reduced open defecation, resulting in substantial health benefits. In villages with low initial stocks of social capital, the approach was counterproductive – fewer toilets were built than in control communities and social capital suffered….(More)”
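
The heterogeneous effect described here is, in regression terms, an interaction between treatment assignment and baseline social capital. The sketch below illustrates that specification on simulated data; the variable names, numbers and model are illustrative assumptions, not the authors’ actual specification or dataset.

```python
# Minimal sketch of a treatment-by-baseline-conditions interaction,
# on simulated data (not the paper's specification or dataset).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),              # random assignment
    "high_social_capital": rng.integers(0, 2, n),  # baseline condition
})
# Simulated outcome: the programme helps only where social capital is high,
# and slightly backfires where it is low.
df["toilet_coverage"] = (
    0.2
    + 0.15 * df["treated"] * df["high_social_capital"]
    - 0.05 * df["treated"] * (1 - df["high_social_capital"])
    + rng.normal(0, 0.1, n)
)

# The interaction term captures how the treatment effect varies with
# initial conditions: positive where baseline social capital is high.
model = smf.ols("toilet_coverage ~ treated * high_social_capital", data=df).fit()
print(model.summary().tables[1])
```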