Netnography: The Essential Guide to Qualitative Social Media Research


Book by Robert Kozinets: “Netnography is an adaptation of ethnography for the online world, pioneered by Robert Kozinets, and is concerned with the study of online cultures and communities as distinct social phenomena, rather than isolated content. In this landmark third edition, Netnography: The Essential Guide provides the theoretical and methodological groundwork as well as the practical applications, helping students both understand and do netnographic research projects of their own.

Packed with enhanced learning features throughout, linking concepts to structured activities in a step-by-step way, the book is also now accompanied by a striking new visual design and further case studies, offering the essential student resource for conducting online ethnographic research. Real-world examples demonstrate netnography in practice across the social sciences, in media and cultural studies, anthropology, education, nursing, travel and tourism, and others….(More)”.

Uses and Reuses of Scientific Data: The Data Creators’ Advantage


Paper by Irene V. Pasquetto, Christine L. Borgman, and Morgan F. Wofford: “Open access to data, as a core principle of open science, is predicated on assumptions that scientific data can be reused by other researchers. We test those assumptions by asking where scientists find reusable data, how they reuse those data, and how they interpret data they did not collect themselves. By conducting a qualitative meta-analysis of evidence on two long-term, distributed, interdisciplinary consortia, we found that scientists frequently sought data from public collections and from other researchers for comparative purposes such as “ground-truthing” and calibration. When they sought others’ data for reanalysis or for combining with their own data, which was relatively rare, most preferred to collaborate with the data creators.

We propose a typology of data reuses ranging from comparative to integrative. Comparative data reuse requires interactional expertise, which involves knowing enough about the data to assess their quality and value for a specific comparison such as calibrating an instrument in a lab experiment. Integrative reuse requires contributory expertise, which involves the ability to perform the action, such as reusing data in a new experiment. Data integration requires more specialized scientific knowledge and deeper levels of epistemic trust in the knowledge products. Metadata, ontologies, and other forms of curation benefit interpretation for any kind of data reuse. Based on these findings, we theorize the data creators’ advantage, that those who create data have intimate and tacit knowledge that can be used as barter to form collaborations for mutual advantage. Data reuse is a process that occurs within knowledge infrastructures that evolve over time, encompassing expertise, trust, communities, technologies, policies, resources, and institutions….(More)”.

The Psychological Basis of Motivation to Take Part in Online Citizen Science


Paper by Liz Dowthwaite et al: “Increasing motivation to contribute to online citizen science projects can improve user experience and is critical in retaining and attracting users. Drawing on previous studies of motivation, this paper suggests self-determination theory as a framework for explaining the psychological constructs behind participation in Citizen Science. Through examining existing studies of motivation for six Zooniverse projects through this lens, the paper suggests how appealing to basic psychological needs could increase participation in online citizen science, considering current practices and directions for future developments and research….(More)”.

Ten ways to optimise evidence-based policy


Paper by Peter Bragge: “Applying knowledge to problems has occupied the minds of great philosophers, scientists and other thinkers for centuries. In more modern times, the challenge of connecting knowledge to practice has been addressed through fields such as evidence-based medicine, which has conceptualised optimal healthcare as the integration of best available research evidence, clinical experience and patients’ values. Similar principles apply to evidence-based public policy, and literature in this field has been growing since the turn of the century.

The exponential rise in knowledge availability has greatly enhanced the ‘supply’ side of the evidence-into-practice equation; however, substantial gaps between evidence and practice remain. Policymakers are therefore increasingly looking to academia to optimise evidence-informed policy. This article presents ten considerations for optimising evidence-based policy, drawn from experience in delivering applied behaviour change research to government….(More)”.

Retrofitting Social Science for the Practical & Moral


Kenneth Prewitt at Issues: “…We cannot reach this fresh thinking without first challenging two formulations that today’s social science considers settled. First, social science should not assume that the “usefulness of useless knowledge” works as our narrative. Yes, it works for natural sciences. But the logic doesn’t translate. Second, we should back off from exaggerated promises about “evidence-based policy,” perhaps terming it “evidence-influenced politics,” a framing that is more accurate descriptively (what happens) and prescriptively (what should happen). The prominence given to these two formulations gets in the way of an alternative positioning of social science as an agent of improvement. I discuss this alternative below, under the label of the Fourth Purpose….

…the “Fourth Purpose.” This joins the three purposes traditionally associated with American universities and colleges: Education, Research, and Public Service. The latter is best described as being “a good citizen,” engaged in volunteer work; it is an attractive feature of higher education, but not in any substantial manner present in the other two core purposes.

The Fourth Purpose is an altogether different vision. It institutionalizes what Ross characterized as a social science being in the “broadest sense practical and moral.” It succeeds only by being fully present in education and research, for instance, including experiential learning in the curriculum and expanding processes that convert research findings into social benefits. This involves more than scattered centers across the university working on particular social problems. As Bollinger puts it, the university itself becomes a hybrid actor, at once academic and practical. “A university,” he says, “is more than simply an infrastructure supporting schools, departments, and faculty in their academic pursuits. As research universities enter into the realm or realms of the outside world, the ‘university’ (i.e., the sum of its parts/constituents) is going to have capacities far beyond those of any segment, as well as effects (hopefully generally positive) radiating back into the institution.”

To oversimplify a bit, the Fourth Purpose has three steps. The first occurs in the lab, library, or field—resulting in fundamental findings. The second ventures into settings where nonacademic players and judgment come into play, actions are taken, and ethical choices confronted, that is, practices of the kind mentioned earlier: translation research, knowledge brokers, boundary organizations, coproduction. Academic and nonacademic players should both come away from these settings with enriched understanding and capabilities. For academics, the skills required for this step differ from, but complement, the more familiar skills of teacher and researcher. The new skills will have to be built into the fabric of the university if the Fourth Purpose is to succeed.

The third step cycles back to the campus. It involves scholarly understandings not previously available. It requires learning something new about the original research findings as a result of how they are interpreted, used, rejected, modified, or ignored in settings that, in fact, control whether the research findings will be implemented as hoped. This itself is new knowledge. If it is paid attention to, and the cycle is repeated endlessly, a new form of scholarship is added to our tool kit….(More)”.

OMB rethinks ‘protected’ or ‘open’ data binary with upcoming Evidence Act guidance


Jory Heckman at Federal News Network: “The Foundations for Evidence-Based Policymaking Act has ordered agencies to share their datasets internally and with other government partners — unless, of course, doing so would break the law.

Nearly a year after President Donald Trump signed the bill into law, agencies still have only a murky idea of what data they can share, and with whom. But soon, they’ll have more nuanced options for ranking the sensitivity of their datasets before sharing them with others.

Chief Statistician Nancy Potok said the Office of Management and Budget will soon release proposed guidelines for agencies to provide “tiered” access to their data, based on the sensitivity of that information….

OMB, as part of its Evidence Act rollout, will also rethink how agencies ensure protected access to data for research. Potok said agency officials expect to pilot a single application governmentwide for people seeking access to sensitive data not available to the public.

The pilot resembles plans for a National Secure Data Service envisioned by the Commission on Evidence-Based Policymaking, an advisory group whose recommendations laid the groundwork for the Evidence Act.

“As a state-of-the-art resource for improving government’s capacity to use the data it already collects, the National Secure Data Service will be able to temporarily link existing data and provide secure access to those data for exclusively statistical purposes in connection with approved projects,” the commission wrote in its 2017 final report.

In an effort to strike a balance between access and privacy, Potok said OMB has also asked agencies to provide a list of the statutes that prohibit them from sharing data amongst themselves….(More)”.

To What Extent Does the EU General Data Protection Regulation (GDPR) Apply to Citizen Scientist-led Health Research with Mobile Devices?


Article by Edward Dove and Jiahong Chen: “In this article, we consider the possible application of the European General Data Protection Regulation (GDPR) to “citizen scientist”-led health research with mobile devices. We argue that the GDPR likely does cover this activity, depending on the specific context and the territorial scope. Remaining open questions that result from our analysis lead us to call for a lex specialis that would provide greater clarity and certainty regarding the processing of health data for research purposes, including by these non-traditional researchers…(More)”.

Becoming a data steward


Shalini Kurapati at the LSE Impact Blog: “In the context of higher education, data stewards are the first point of reference for all data related questions. In my role as a data steward at TU Delft, I was able to advise, support and train researchers on various aspects of data management throughout the life cycle of a research project, from initial planning to post-publication. This included storing, managing and sharing research outputs such as data, images, models and code.

Data stewards also advise researchers on the ethical, policy and legal considerations during data collection, processing and dissemination. In a way, they are general practitioners for research data management and can usually solve most problems faced by academics. In cases that require specialist intervention, they also serve as a key point for referral (e.g. IT, patent or legal experts).

Data stewardship is often organised centrally through the university library. (Subject) Data librarians, research data consultants and research data officers, usually perform similar roles to data stewards. However, TU Delft operates a decentralised model, where data stewards are placed within faculties as disciplinary experts with research experience. This allows data stewards to provide discipline specific support to researchers, which is particularly beneficial, as the concept of what data is itself varies across disciplines….(More)”.

Timing Technology


Blog by Gwern Branwen: “Technological forecasts are often surprisingly prescient: they correctly predict that something is possible & desirable, and what they predict eventually happens; but they are far less successful at predicting the timing, and almost always fail, with the success (and riches) going to someone else.

Why is their knowledge so useless? The right moment cannot be known exactly in advance, so attempts to forecast will typically be off by years or worse. For many claims, there is no way to invest in an idea except by going all in and launching a company, resulting in extreme variance in outcomes, even when the idea is good and the forecasts correct about the (eventual) outcome.

Progress can happen and can be foreseen long before, but the details and exact timing due to bottlenecks are too difficult to get right. Launching too early means failure, but being conservative & launching later is just as bad because regardless of forecasting, a good idea will draw overly-optimistic researchers or entrepreneurs to it like moths to a flame: all get immolated but the one with the dumb luck to kiss the flame at the perfect instant, who then wins everything, at which point everyone can see that the optimal time is past. All major success stories overshadow their long list of predecessors who did the same thing, but got unlucky. So, ideas can be divided into the overly-optimistic & likely doomed, or the fait accompli. On an individual level, ideas are worthless because so many others have them too—‘multiple invention’ is the rule, and not the exception.

This overall problem falls under the reinforcement learning paradigm, and successful approaches are analogous to Thompson sampling/posterior sampling: even an informed strategy can’t reliably beat random exploration that gradually shifts towards successful areas while continuing to take occasional long shots. Since people tend to systematically over-exploit, how is this implemented? Apparently by individuals acting suboptimally on the personal level, but optimally on the societal level, by serving as random exploration.
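The exploration strategy the post appeals to can be sketched as a minimal Thompson sampler for Bernoulli “ideas”: each arm’s unknown success rate gets a Beta posterior, a rate is sampled from each posterior, and the highest sample is played. The arm probabilities and round count below are illustrative assumptions, not figures from the post:

```python
import random

def thompson_sampling(true_probs, rounds, rng):
    # Beta(1, 1) priors over each arm's unknown success probability
    successes = [0] * len(true_probs)
    failures = [0] * len(true_probs)
    pulls = [0] * len(true_probs)
    for _ in range(rounds):
        # Sample a plausible success rate from each arm's posterior...
        samples = [rng.betavariate(1 + s, 1 + f)
                   for s, f in zip(successes, failures)]
        # ...and play the arm whose sample is highest: randomized exploration
        # that gradually concentrates on the best arm without abandoning
        # occasional long shots
        arm = max(range(len(true_probs)), key=lambda i: samples[i])
        pulls[arm] += 1
        if rng.random() < true_probs[arm]:
            successes[arm] += 1
        else:
            failures[arm] += 1
    return pulls

rng = random.Random(0)
pulls = thompson_sampling([0.2, 0.5, 0.8], rounds=1000, rng=rng)
```

Over enough rounds, the best arm accumulates most of the pulls, yet the weaker arms are never permanently written off; that mirrors the claim that individually suboptimal gamblers can be collectively optimal explorers.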

A major benefit of R&D projects, then, lies in their laying fallow until the ‘ripe time’ when they can be immediately exploited in previously-unpredictable ways; applied R&D or VC strategies should focus on maintaining diversity of investments, while continuing to flexibly revisit previous failures which forecasts indicate may have reached ‘ripe time’. This balances overall exploitation & exploration to progress as fast as possible, showing the usefulness of technological forecasting on a global level despite its uselessness to individuals….(More)”.

Supporting priority setting in science using research funding landscapes


Report by the Research on Research Institute: “In this working paper, we describe how to map research funding landscapes in order to support research funders in setting priorities. Based on data on scientific publications, a funding landscape highlights the research fields that are supported by different funders. The funding landscape described here has been created using data from the Dimensions database. It is presented using a freely available web-based tool that provides an interactive visualization of the landscape. We demonstrate the use of the tool through a case study in which we analyze funding of mental health research…(More)”.
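At its simplest, the landscape the report describes is a cross-tabulation of funders against research fields, built from publication records. The records, funder names and field labels below are hypothetical stand-ins for what a bibliometric database such as Dimensions would supply:

```python
from collections import Counter

# Hypothetical publication records; a real landscape would draw these from
# a bibliometric database (funder and field labels here are assumptions)
publications = [
    {"funder": "Funder A", "field": "Psychiatry"},
    {"funder": "Funder A", "field": "Neuroscience"},
    {"funder": "Funder B", "field": "Psychiatry"},
    {"funder": "Funder B", "field": "Psychiatry"},
]

# A funding landscape reduced to its simplest form: counts of funded
# publications per (funder, field) pair
landscape = Counter((p["funder"], p["field"]) for p in publications)

for (funder, field), n in sorted(landscape.items()):
    print(f"{funder} -> {field}: {n}")
```

An interactive tool like the one described would layer visualization on top of such a matrix, letting funders see which fields they and their peers concentrate on.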