The Hand-Book of the Modern Development Specialist


Responsible Data Forum: “The engine room is excited to release new adaptations of the responsible development data book that we now fondly refer to as ‘The Hand-Book of the Modern Development Specialist: Being a Complete Illustrated Guide to Responsible Data Usage, Manners & General Deportment.’

You can now view this resource on its new webpage, where you can read chapter summaries for quick resources, utilize slide decks complete with presenter notes, and read the original resource with a new design make-over….

Freshly Released Adaptations

The following adaptations can be found on our Hand-book webpage.

  • Chapter summaries: Chapter summaries enable readers to get a taste of each section’s content, help them judge whether a particular section is relevant to them, provide a simple overview for those who aren’t comfortable diving right into the book, and offer a memory jog for those who are already familiar with the content.
  • Slide deck templates: The slide decks enable in-depth presentations based on the structure of the book, using its diagrams. They will help responsible data advocates customize slides for their own organization’s needs. These decks come complete with thorough notes to aid a presenter who may not be an expert on the contents.
  • New & improved book format: Who doesn’t love a makeover? The original resource is still available to download as a printable file for those who prefer book formatting, and the document now sports improved visuals and graphics….(More)”

Crowdcrafting


Crowdcrafting is a web-based service that invites volunteers to contribute to scientific projects developed by citizens, professionals or institutions that need help to solve problems, analyze data or complete challenging tasks that can’t be done by machines alone but require human intelligence. The platform is 100% open source – that is, its software is developed and distributed freely – and 100% open science, making scientific research accessible to everyone.

Crowdcrafting uses PyBossa software: our open source framework for crowdsourcing projects. Institutions such as the British Museum, CERN and the United Nations (UNITAR) are also PyBossa users.
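
To make the PyBossa workflow concrete, here is a minimal sketch of creating a project and loading tasks with the pybossa-client Python library (pbclient). The endpoint, API key, project details and task fields are placeholder assumptions for illustration, and the exact calls may vary between PyBossa versions.

```python
# A minimal sketch of putting a project onto a PyBossa server.
# Assumes the pybossa-client library (pip install pybossa-client); the
# endpoint, API key and task fields below are placeholders, not real values.
import pbclient

# Point the client at a PyBossa server (Crowdcrafting or a self-hosted instance).
pbclient.set('endpoint', 'https://crowdcrafting.org')
pbclient.set('api_key', 'YOUR-API-KEY')

# Create a project that volunteers will see on the platform.
project = pbclient.create_project(
    name='Classify Wildlife Photos',
    short_name='wildlife-photos',
    description='Help tag animals in camera-trap images.')

# Each task wraps one unit of work (here, one image URL) for volunteers.
for url in ['https://example.org/img/001.jpg',
            'https://example.org/img/002.jpg']:
    pbclient.create_task(project_id=project.id, info={'image_url': url})
```

Volunteers then complete the tasks through the project’s web interface, and the answers can be retrieved later through the same API.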

What is citizen science?

Citizen science is the active contribution of people who are not professional scientists to science. It provides volunteers with the opportunity to contribute intellectually to the research of others, to share resources or tools at their disposal, or even to start their own research projects. Volunteers provide real value to ongoing research while they themselves acquire a better understanding of the scientific method.

Citizen science opens the doors of laboratories and makes science accessible to all. It facilitates a direct conversation between scientists and enthusiasts who wish to contribute to scientific endeavour.

Who can collaborate, and how?

Anyone can create a new project or contribute to an existing project in Crowdcrafting.

All projects start with a simple tutorial explaining how they work and providing all the information required to participate. There is thus no specific knowledge or experience required to complete proposed tasks. All volunteers need is a keen attitude to learn and share science with everyone….(More)”

Why our peer review system is a toothless watchdog


Ivan Oransky and Adam Marcus at StatNews: “While some — namely, journal editors and publishers — would like us to consider it the opposable thumb of scientific publishing, the key to differentiating rigor from rubbish, some of those very same people seem to think it’s good for nothing. Here is a partial list of the things that editors, publishers, and others have told the world peer review is not designed to do:

1. Detect irresponsible practices

Don’t expect peer reviewers to figure out if authors are “using public data as if it were the author’s own, submitting papers with the same content to different journals, or submitting an article that has already been published in another language without reference to the original,” said the InterAcademy Partnership, a consortium of national scientific academies.

2. Detect fraud

“Journal editors will tell you that peer review is not designed to detect fraud — clever misinformation will sail right through no matter how scrupulous the reviews,” Dan Engber wrote in Slate in 2005.

3. Pick up plagiarism

Peer review “is not designed to pick up fraud or plagiarism, so unless those are really egregious it usually doesn’t,” according to the Rett Syndrome Research Trust.

4. Spot ethics issues

“It is not the role of the reviewer to spot ethics issues in papers,” said Jaap van Harten, executive publisher of Elsevier (the world’s largest academic imprint), in a recent interview. “It is the responsibility of the author to abide by the publishing ethics rules. Let’s look at it in a different way: If a person steals a pair of shoes from a shop, is this the fault of the shop for not protecting their goods or the shoplifter for stealing them? Of course the fault lies with the shoplifter who carried out the crime in the first place.”

5. Spot statistical flaccidity

“Peer reviewers do not check all the datasets, rerun calculations of p-values, and so forth, except in the cases where statistical reviewers are involved — and even in these cases, statistical reviewers often check the methodologies used, sample some data, and move on.” So wrote Kent Anderson, who has served as a publishing exec at several top journals, including Science and the New England Journal of Medicine, in a recent blog post.

6. Prevent really bad research from seeing the light of day

Again, Kent Anderson: “Even the most rigorous peer review at a journal cannot stop a study from being published somewhere. Peer reviewers can’t stop an author from self-promoting a published work later.”

But …

Even when you lower expectations for peer review, it appears to come up short. Richard Smith, former editor of the BMJ, reviewed research showing that the system may be worse than no review at all, at least in biomedicine. “Peer review is supposed to be the quality assurance system for science, weeding out the scientifically unreliable and reassuring readers of journals that they can trust what they are reading,” Smith wrote. “In reality, however, it is ineffective, largely a lottery, anti-innovatory, slow, expensive, wasteful of scientific time, inefficient, easily abused, prone to bias, unable to detect fraud and irrelevant.”

So … what’s left? And are whatever scraps that remain worth the veneration peer review receives? Don’t write about anything that isn’t peer-reviewed, editors frequently admonish us journalists, even creating rules that make researchers afraid to talk to reporters before they’ve published. There’s a good chance it will turn out to be wrong. Oh? Greater than 50 percent? Because that’s the risk of preclinical research in biomedicine being wrong after it’s been peer-reviewed.

With friends like these, who needs peer review? In fact, we do need it, but not just in the black box that happens before publication. We need continual scrutiny of findings, at sites such as PubMed Commons and PubPeer, in what is known as post-publication peer review. That’s where the action is, and where the scientific record actually gets corrected….(More)”

citizenscience.gov


citizenscience.gov is an official government website designed to accelerate the use of crowdsourcing and citizen science across the U.S. government. The site provides a portal to three key components for federal practitioners: a searchable catalog of federally supported citizen science projects, a toolkit to assist with designing and maintaining projects, and a gateway to a community of practice to share best practices.

Simplexity


Paper by Joshua D. Blank and Leigh Osofsky: “In recent years, federal government agencies have increasingly attempted to use plain language in written communications with the public. The Plain Writing Act of 2010, for instance, requires agencies to incorporate “clear and simple” explanations of rules and regulations into their official publications. In the tax context, as part of its “customer service” mission, the Internal Revenue Service bears a “duty to explain” the tax law to hundreds of millions of taxpayers who file tax returns each year. Proponents of the plain language movement have heralded this form of communication as leading to simplicity in tax compliance, more equitable access to federal programs and increased open government.

This Article casts plain language efforts in a different light. As we argue, rather than achieving simplicity, which would involve reform of the underlying law, the use of plain language to describe complex legal rules and regulations often yields “simplexity.” As we define it, simplexity occurs when the government presents clear and simple explanations of the law without highlighting its underlying complexity or reducing this complexity through formal legal changes. We show that in its numerous taxpayer publications, the IRS frequently uses plain language to transform complex, often ambiguous tax law into seemingly simple statements that (1) present contested tax law as clear tax rules, (2) add administrative gloss to the tax law and (3) fail to fully explain the tax law, including possible exceptions. Sometimes these plain language explanations benefit the government; at other times, they benefit taxpayers.

While simplexity offers a number of potential tax administration benefits, such as making the tax law understandable and even bolstering the IRS’s ability to collect tax revenue, it can also threaten vital values of transparency and democratic governance and can result in inequitable treatment of different taxpayers. We offer approaches for preserving some of the benefits of simplexity while also responding to some of its drawbacks. We also forecast the likely emergence of simplexity in potential future tax compliance measures, such as government-prepared tax returns, interactive tax return filing and increased third-party reporting….(More)”.

How to See Gentrification Coming


Nathan Collins at Pacific Standard: “Depending on whom you ask, gentrification is either damaging, not so bad, or maybe even good for the low-income people who live in what we euphemistically call up-and-coming neighborhoods. Either way, it’d be nice for everybody to know which neighborhoods are going to get revitalized/eviscerated next. Now, computer scientists think they’ve found a way to do exactly that: Using Twitter and Foursquare, map the places visited by the most socially diverse crowds. Those, it turns out, are the most likely to gentrify.

Led by University of Cambridge graduate student Desislava Hristova, the researchers began their study by mapping out the social network of 37,722 Londoners who posted Foursquare check-ins via Twitter. Two people were presumed to be friends—connected on the social network—if they followed each other’s Twitter feeds. Next, Hristova and her colleagues built a geographical network of 42,080 restaurants, clubs, shops, apartments, and so on. Quaint though it may seem, the researchers treated two places as neighbors in the geographical network if they were, in fact, physically near each other. The team then linked the social and geographical networks using 549,797 Foursquare check-ins, each of which ties a person in the social network to a place in the geographical one.

Gentrification doesn’t start when outsiders move in; it starts when outsiders come to visit.

Using the network data, the team next constructed several measures of the social diversity of places, each of which helps distinguish places that bring together friends from those that bring together strangers, and spots that attract socially diverse crowds from those that draw a steady group of regulars. Among other things, those measures showed that places in the outer boroughs of London brought together more socially homogenous groups of people—in terms of their Foursquare check-ins, at least—compared with boroughs closer to the core.
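
The paper defines its own diversity metrics, but a rough sketch of the idea can be computed over toy check-in data: score each venue by how many of its visitor pairs are strangers in the social network, and by how evenly its check-ins are spread across visitors. The data and the two measures below are illustrative stand-ins, not the authors’ definitions.

```python
# Illustrative sketch only: two toy place-level diversity measures in the
# spirit of the study, not the paper's exact definitions.
import itertools
from collections import defaultdict
from math import log

import networkx as nx

# Social network: an edge means two users follow each other on Twitter.
social = nx.Graph()
social.add_edges_from([('ana', 'bo'), ('bo', 'cy'), ('dee', 'eli')])

# Check-ins link users (social network) to venues (geographical network).
checkins = [('ana', 'cafe'), ('bo', 'cafe'), ('cy', 'cafe'),
            ('dee', 'cafe'), ('ana', 'pub'), ('bo', 'pub'), ('ana', 'pub')]

visitors = defaultdict(list)          # venue -> list of check-in users
for user, venue in checkins:
    visitors[venue].append(user)

for venue, users in visitors.items():
    unique = set(users)
    # "Stranger share": fraction of visitor pairs who are not friends.
    pairs = list(itertools.combinations(unique, 2))
    strangers = sum(1 for u, v in pairs if not social.has_edge(u, v))
    stranger_share = strangers / len(pairs) if pairs else 0.0

    # Entropy of visitors: high when check-ins are spread over many people
    # (a diverse crowd), low when a few regulars dominate.
    counts = [users.count(u) for u in unique]
    total = sum(counts)
    entropy = -sum((c / total) * log(c / total) for c in counts)

    print(f'{venue}: stranger share={stranger_share:.2f}, '
          f'visitor entropy={entropy:.2f}')
```

In this toy example the cafe, visited mostly by strangers, scores higher on both measures than the pub frequented by a pair of friends — the kind of contrast the researchers correlated with later gentrification.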

But the real question is what social diversity has to do with gentrification. To measure that, the team used the United Kingdom’s Index of Multiple Deprivation, which takes into account income, education, environmental factors such as air quality, and more to quantify the socioeconomic state of affairs in localities across the U.K., including each of London’s 32 boroughs.

The rough pattern, according to the analysis: The most socially diverse places in London were also the most deprived. This is about the opposite of what you’d expect, based on social networks studied in isolation from geography, which indicates that, generally, the people with the most diverse social networks are the most prosperous….(More)”

Friended, but not Friends: Federal Ethics Authorities Address Role of Social Media in Politics


CRS Reports & Analysis: “Since the rise of social media over the past decade, new platforms of technology have reinforced the adage that the law lags behind developments in technology. Government agencies, officials, and employees regularly use a number of social media options – e.g., Twitter, Facebook, etc. – that have led agencies to update existing ethics rules to reflect the unique issues that they may present. Two areas of ethics regulation affected by the increased role of social media are the ethical standards governing gifts to federal employees and the restrictions on employees’ political activities. These rules apply to employees in the executive branch, though separate ethics rules and guidance on similar topics apply to the House and Senate….(More)”

Fairness in Machine Learning


Presentation by Delip Rao: “…The models you create have the power to get people arrested or vindicated, get loans approved or rejected, determine what interest rate should be charged for such loans, who should be shown to you in your long list of pursuits on Tinder, what news you read, and who gets called for a job phone screen or even a college admission… the list goes on.

So what can you do about it?…
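
One concrete starting point — an illustration of the theme, not taken from the slides — is to audit a model’s decisions with a simple group fairness check such as demographic parity. The synthetic decisions, group labels and the four-fifths threshold below are assumptions for the sketch.

```python
# Illustration only: checking demographic parity, i.e. whether a model
# approves loans at similar rates across groups. The data, group labels
# and the 0.8 "four-fifths" threshold are assumptions, not from the talk.
import numpy as np

rng = np.random.default_rng(0)
y_pred = rng.integers(0, 2, size=1000)          # model decisions: 1 = approve
group = rng.choice(['A', 'B'], size=1000)       # a protected attribute

rate_a = y_pred[group == 'A'].mean()
rate_b = y_pred[group == 'B'].mean()

print(f'approval rate A: {rate_a:.2f}, B: {rate_b:.2f}')
print(f'demographic parity difference: {abs(rate_a - rate_b):.3f}')

# A common rule of thumb (the "four-fifths rule") flags disparate impact
# when one group's approval rate falls below 80% of the other's.
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print('potential disparate impact' if ratio < 0.8 else 'within four-fifths rule')
```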

I have detailed notes for some of these slides. If you would like to follow those, try going directly to Google Slides.


Data protection laws around the world


Fifth edition Handbook by DLA Piper’s Data Protection and Privacy practice: “More than ever, it is crucial that organisations manage and safeguard personal information and address their risks and legal responsibilities in relation to processing personal data, in light of the growing thicket of applicable data protection legislation.

A well‑constructed and comprehensive compliance program can reconcile these competing interests and is an important risk‑management tool.

This handbook sets out an overview of the key privacy and data protection laws and regulations across nearly 100 different jurisdictions and offers a primer to businesses as they consider this complex and increasingly important area of compliance….(More)”

Infomediaries and accountability


Paper by Becky Carter: “A synthesis of what the existing evidence says (and where there are gaps) on:

1) What role might ‘infomediaries’, and specifically the media, have in helping translate transparency into greater government accountability? In generating that accountability? In empowering citizens?

2) In what contexts or types of contexts do ‘infomediaries’ and media play such a facilitating role, and why?

3) What enabling factors contributed to success?

4) What role, if any, have donors had in supporting these sectors in this capacity?

5) What risks exist in this space?…(More)”