The Secret in the Information Society


Paper by Dennis Broeders in Philosophy & Technology: “Who can still keep a secret in a world in which everyone and everything are connected by technology aimed at charting and cross-referencing people, objects, movements, behaviour, relationships, tastes and preferences? The possibilities to keep a secret have come under severe pressure in the information age. That goes for the individual as well as the state. This development merits attention as secrecy is foundational for individual freedom as well as essential to the functioning of the state. Building on Simmel’s work on secrecy, this paper argues that the individual’s secrets should be saved from the ever-expanding digital transparency. The legitimate function of state secrecy in turn needs rescuing from a culture of secrecy and over-classification that has exploded in recent years. Contrary to popular expectation, the digital revolution adds another layer of secrecy that is increasingly hidden behind the facade of the ‘big usable systems’ we work and play with every day. Our dependence on information systems and their black-boxed algorithmic analytical core leads to a certain degree of Weberian (re)enchantment that may increase the disconnect between the system, user and object….(More)”

The era of development mutants


Giulio Quaggiotto at Nesta: “If you were looking for the cutting edge of the development sector, where would you go these days? You would probably look at startups like Premise who have predicted food trends 25 days faster than national statistics in Brazil, or GiveDirectly who are pushing the boundaries on evidence – from RCTs to new ways of mapping poverty – to fast track the adoption of cash transfers.

Or perhaps you might turn your attention to PetaJakarta who are experimenting with new responses to crises by harnessing human sensor networks. You might be tempted to consider Airbnb’s Disaster Response programme as an indicator of an emerging alternative infrastructure for disaster response (and perhaps raising questions about the political economy of this all).

And could Bitnation’s Refugee Emergency programme in response to the European refugee crisis be the possible precursor of future solutions for transnational issues – among the development sector’s hardest challenges? Are the business models of One Acre Fund, which provides services for smallholder farmers, or Floodtags, which analyses citizen data during floods for water and disaster managers, an indicator of future pathways to scale – that elusive development unicorn?

If you want to look at the future of procuring solutions for the development sector, should you be looking at initiatives like Citymart, which works with municipalities across the world to rethink traditional procurement and unleash the expertise and innovation capabilities of their citizens? By the same token, projects like Pathogen Box, Poverty Stoplight or Patient Innovation point to a brave new world where lead-user innovation and harnessing ‘sticky’ local knowledge becomes the norm, rather than the exception. You would also be forgiven for thinking that social movements across the world are the place to look for signs of future mechanisms for harnessing collective intelligence – Kawal Pamilu’s “citizen experts” self-organising around the Indonesian elections in 2014 is a textbook case study in this department.

The list could go on and on: welcome to the era of development mutants. While established players in the development sector are engrossed in soul-searching and their fitness for purpose is being scrutinised from all quarters, a whole new set of players is emerging, unfettered by legacy and borrowing from a variety of different disciplines. They point to a potentially different future – indeed, many potentially different futures – for the sector…..

But what if we wanted to invert this paradigm? How could we move from denial to fruitful collaboration with the ‘edgeryders’ of the development sector and accelerate its transformation?

Adopting new programming principles

Based on our experience working with development organisations, we believe that partnering with the mutants involves two types of shifts for traditional players: at the programmatic and the operational level. At the programmatic level, our work on the ground led us to articulate the following emerging principles:

  1. Mapping what people have, not what they need: even though approaches like jugaad and positive deviance have been around for a long time, unfortunately the default starting point for many development projects is still mapping needs, not assets. Inverting this paradigm allows for potentially disruptive project design and partnerships to emerge. (Signs of the future: Patient Innovation, Edgeryders, Community Mirror, Premise)

  2. Getting ready for multiple futures: When distributed across an organisation and not limited to a centralised function, the discipline of scanning the horizon for emergent solutions that contradict the dominant paradigm can help move beyond the denial phase and develop new interfaces to collaborate with the mutants. Here the link between analysis (to understand not only what is probable, but also what is possible) and action is critical – otherwise this remains purely an academic exercise. (Signs of the future: OpenCare, Improstuctures, Seeds of Good Anthropocene, Museum of the Future)

  3. Running multiple parallel experiments: According to Dave Snowden, in order to intervene in a complex system “you need multiple parallel experiments and they should be based on different and competing theories/hypotheses”. Unfortunately, many development projects are still based on linear narratives and assumptions such as “if only we run an awareness raising campaign citizens will change their behaviour”. Turning linear narratives into hypotheses to be tested (without becoming religious on a specific approach) opens up the possibility to explore the solution landscape and collaborate with non-obvious partners that bring new approaches to the table. (Signs of the future: Chukua Hatua, GiveDirectly, Finnish PM’s Office of Experiments, Ideas42, Cognitive Edge)

  4. Embracing obliquity: A deep, granular understanding of local assets and dynamics along with system mapping (see point 5 below) and pairing behavioural experts with development practitioners can help identify entry points for exploring new types of intervention based on obliquity principles. Mutants are often faster in adopting this approach and partnering with them is a way to bypass organisational inertia and explore nonlinear interventions. (Signs of the future: Sardex, social prescriptions, forensic architecture)

  5. From projects to systems: development organisations genuinely interested in developing new partnerships need to make the shift from the project logic to system investments. This involves, among other things, shifting the focus from providing solutions to helping every actor in the system to develop a higher level of consciousness about the issues they are facing and to take better decisions over time. It also entails partnering with mutants to explore entirely new financial mechanisms. (Signs of the future: Lankelly Chase, Indonesia waste banks, Dark Matter Labs)

Adopting new interfaces for working with the mutants

Harvard Business School professor Carliss Baldwin argued that most bureaucracies these days have a ‘non-contractible’ problem: they don’t know where smart people are, or how to evaluate how good they are. Most importantly, most smart people don’t want to work for them because they find them either too callous, unrewarding or slow (or a combination of all of these)….(More)”

Open Reblock


Open Reblock: “In the developed world, we take it for granted that every home or place of work has access to basic services. This includes clean water, electricity, sanitation, and access for emergency vehicles in case of need, such as ambulances or fire trucks. But this is far from being the rule in most developing cities and is a particularly stark challenge in informal settlements (slums), home to over 1 billion people around the world.

Reblocking is the process of physically transforming an informal settlement to provide an access path to all its structures. This project analyzes the spatial structure of informal city blocks, and uses an algorithm to suggest reblocking solutions that provide access to all structures within the block in a minimally disruptive way. Click on one of the examples below to see the process unfold….
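Open Reblock's own optimisation code isn't reproduced in the excerpt, but the core idea — extend the existing street network along minimally disruptive paths until every structure has access — can be sketched with a toy grid model and a greedy breadth-first-search heuristic. Everything below (the grid encoding, the cell symbols, the greedy order) is our illustrative assumption, not the project's actual algorithm:

```python
from collections import deque

# Toy settlement grid: 'R' = existing road, '#' = structure, '.' = open space.
# Greedy sketch: for each structure lacking road access, pave the shortest
# open path to the nearest road cell, reusing paths paved for earlier ones.

def neighbors(r, c, rows, cols):
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < rows and 0 <= nc < cols:
            yield nr, nc

def has_access(grid, r, c):
    """A structure has access if any adjacent cell is road."""
    return any(grid[nr][nc] == "R"
               for nr, nc in neighbors(r, c, len(grid), len(grid[0])))

def shortest_path_to_road(grid, start):
    """BFS from a structure through open cells; return the open cells to pave."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        for nxt in neighbors(*cur, rows, cols):
            if nxt in prev:
                continue
            cell = grid[nxt[0]][nxt[1]]
            if cell == "R":                 # reached the road network
                path, node = [], cur
                while node != start:        # walk back to the structure
                    path.append(node)
                    node = prev[node]
                return path
            if cell == ".":                 # only open space can be paved
                prev[nxt] = cur
                queue.append(nxt)
    return None  # structure is fully enclosed

def reblock(grid):
    structures = [(r, c) for r, row in enumerate(grid)
                  for c, cell in enumerate(row) if cell == "#"]
    for r, c in structures:
        if has_access(grid, r, c):
            continue
        path = shortest_path_to_road(grid, (r, c))
        if path:
            for pr, pc in path:
                grid[pr][pc] = "R"          # pave the connecting path
    return grid
```

The real tool works over actual parcel geometry and weighs disruption to existing structures; this sketch simply paves shortest open paths one structure at a time, which conveys the "minimal extension of the street network" idea without claiming optimality.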

This project aims to create a web-based service for an open-source code base that develops the least-disruptive extension of the existing street network to bring infrastructure and services to informal settlements and set up the conditions for the formalization of land use and property records….(More)”

GIS Research Methods: Incorporating Spatial Perspectives


GIS Research Methods: Incorporating Spatial Perspectives shows researchers how to incorporate spatial thinking and geographic information system (GIS) technology into research design and analysis. Topics include research design, digital data sources, volunteered geographic information, analysis using GIS, and how to link research results to policy and action. The concepts presented in GIS Research Methods can be applied to projects in a range of social and physical sciences by researchers using GIS for the first time and experienced practitioners looking for new and innovative research techniques….(More)”

The Hand-Book of the Modern Development Specialist


Responsible Data Forum: “The engine room is excited to release new adaptations of the responsible development data book that we now fondly refer to as, “The Hand-Book of the Modern Development Specialist: Being a Complete Illustrated Guide to Responsible Data Usage, Manners & General Deportment.”

You can now view this resource on its new webpage, where you can read chapter summaries for quick resources, utilize slide decks complete with presenter notes, and read the original resource with a new design make-over….

Freshly Released Adaptations

The following adaptations can be found on our Hand-book webpage.

  • Chapter summaries: Chapter summaries give readers a taste of each section’s content, help them judge whether a particular section is of use, provide a simple overview for those not comfortable diving right into the book, and offer a memory jog for those who are already familiar with the content.
  • Slide deck templates: The slide decks enable in-depth presentations based on the structure of the book by using its diagrams. This will help responsible data advocates customize slides for their own organization’s needs. These decks are complete with thorough notes to aid a presenter who may not be an expert on the contents.
  • New & improved book format: Who doesn’t love a makeover? The original resource is still available to download as a printable file for those that prefer book formatting, and now the document sports improved visuals and graphics….(More)”

Crowdcrafting


Crowdcrafting is a web-based service that invites volunteers to contribute to scientific projects developed by citizens, professionals or institutions that need help to solve problems, analyze data or complete challenging tasks that can’t be done by machines alone, but require human intelligence. The platform is 100% open source – that is, its software is developed and distributed freely – and 100% open science, making scientific research accessible to everyone.

Crowdcrafting uses PyBossa software: Our open source framework for crowdsourcing projects. Institutions, such as the British Museum, CERN and United Nations (UNITAR), are also PyBossa users.

What is citizen science?

Citizen science is the active contribution to science of people who are not professional scientists. It provides volunteers with the opportunity to contribute intellectually to the research of others, to share resources or tools at their disposal, or even to start their own research projects. Volunteers provide real value to ongoing research while they themselves acquire a better understanding of the scientific method.

Citizen science opens the doors of laboratories and makes science accessible to all. It facilitates a direct conversation between scientists and enthusiasts who wish to contribute to scientific endeavour.

Who can collaborate, and how?

Anyone can create a new project or contribute to an existing project in Crowdcrafting.

All projects start with a simple tutorial explaining how they work and providing all the information required to participate. There is thus no specific knowledge or experience required to complete proposed tasks. All volunteers need is an eagerness to learn and share science with everyone….(More)”

Why our peer review system is a toothless watchdog


Ivan Oransky and Adam Marcus at StatNews: “While some — namely, journal editors and publishers — would like us to consider it the opposable thumb of scientific publishing, the key to differentiating rigor from rubbish, some of those very same people seem to think it’s good for nothing. Here is a partial list of the things that editors, publishers, and others have told the world peer review is not designed to do:

1. Detect irresponsible practices

Don’t expect peer reviewers to figure out if authors are “using public data as if it were the author’s own, submitting papers with the same content to different journals, or submitting an article that has already been published in another language without reference to the original,” said the InterAcademy Partnership, a consortium of national scientific academies.

2. Detect fraud

“Journal editors will tell you that peer review is not designed to detect fraud — clever misinformation will sail right through no matter how scrupulous the reviews,” Dan Engber wrote in Slate in 2005.

3. Pick up plagiarism

Peer review “is not designed to pick up fraud or plagiarism, so unless those are really egregious it usually doesn’t,” according to the Rett Syndrome Research Trust.

4. Spot ethics issues

“It is not the role of the reviewer to spot ethics issues in papers,” said Jaap van Harten, executive publisher of Elsevier (the world’s largest academic imprint) in a recent interview. “It is the responsibility of the author to abide by the publishing ethics rules. Let’s look at it in a different way: If a person steals a pair of shoes from a shop, is this the fault of the shop for not protecting their goods or the shoplifter for stealing them? Of course the fault lies with the shoplifter who carried out the crime in the first place.”

5. Spot statistical flaccidity

“Peer reviewers do not check all the datasets, rerun calculations of p-values, and so forth, except in the cases where statistical reviewers are involved — and even in these cases, statistical reviewers often check the methodologies used, sample some data, and move on.” So wrote Kent Anderson, who has served as a publishing exec at several top journals, including Science and the New England Journal of Medicine, in a recent blog post.

6. Prevent really bad research from seeing the light of day

Again, Kent Anderson: “Even the most rigorous peer review at a journal cannot stop a study from being published somewhere. Peer reviewers can’t stop an author from self-promoting a published work later.”

But …

Even when you lower expectations for peer review, it appears to come up short. Richard Smith, former editor of the BMJ, reviewed research showing that the system may be worse than no review at all, at least in biomedicine. “Peer review is supposed to be the quality assurance system for science, weeding out the scientifically unreliable and reassuring readers of journals that they can trust what they are reading,” Smith wrote. “In reality, however, it is ineffective, largely a lottery, anti-innovatory, slow, expensive, wasteful of scientific time, inefficient, easily abused, prone to bias, unable to detect fraud and irrelevant.”

So … what’s left? And are whatever scraps that remain worth the veneration peer review receives? Don’t write about anything that isn’t peer-reviewed, editors frequently admonish us journalists, even creating rules that make researchers afraid to talk to reporters before they’ve published. There’s a good chance it will turn out to be wrong. Oh? Greater than 50 percent? Because that’s the risk of preclinical research in biomedicine being wrong after it’s been peer-reviewed.

With friends like these, who needs peer review? In fact, we do need it, but not only in the black box that happens before publication. We need continual scrutiny of findings, at sites such as PubMed Commons and PubPeer, in what is known as post-publication peer review. That’s where the action is, and where the scientific record actually gets corrected….(More)”

citizenscience.gov


citizenscience.gov is an official government website designed to accelerate the use of crowdsourcing and citizen science across the U.S. government. The site provides a portal to three key components for federal practitioners: a searchable catalog of federally supported citizen science projects, a toolkit to assist with designing and maintaining projects, and a gateway to a community of practice to share best practices.

Simplexity


Paper by Joshua D. Blank and Leigh Osofsky: “In recent years, federal government agencies have increasingly attempted to use plain language in written communications with the public. The Plain Writing Act of 2010, for instance, requires agencies to incorporate “clear and simple” explanations of rules and regulations into their official publications. In the tax context, as part of its “customer service” mission, the Internal Revenue Service bears a “duty to explain” the tax law to hundreds of millions of taxpayers who file tax returns each year. Proponents of the plain language movement have heralded this form of communication as leading to simplicity in tax compliance, more equitable access to federal programs and increased open government.

This Article casts plain language efforts in a different light. As we argue, rather than achieving simplicity, which would involve reform of the underlying law, the use of plain language to describe complex legal rules and regulations often yields “simplexity.” As we define it, simplexity occurs when the government presents clear and simple explanations of the law without highlighting its underlying complexity or reducing this complexity through formal legal changes. We show that in its numerous taxpayer publications, the IRS frequently uses plain language to transform complex, often ambiguous tax law into seemingly simple statements that (1) present contested tax law as clear tax rules, (2) add administrative gloss to the tax law and (3) fail to fully explain the tax law, including possible exceptions. Sometimes these plain language explanations benefit the government; at other times, they benefit taxpayers.

While simplexity offers a number of potential tax administration benefits, such as making the tax law understandable and even bolstering the IRS’s ability to collect tax revenue, it can also threaten vital values of transparency and democratic governance and can result in inequitable treatment of different taxpayers. We offer approaches for preserving some of the benefits of simplexity while also responding to some of its drawbacks. We also forecast the likely emergence of simplexity in potential future tax compliance measures, such as government-prepared tax returns, interactive tax return filing and increased third-party reporting….(More)”.

How to See Gentrification Coming


Nathan Collins at Pacific Standard: “Depending on whom you ask, gentrification is either damaging, not so bad, or maybe even good for the low-income people who live in what we euphemistically call up-and-coming neighborhoods. Either way, it’d be nice for everybody to know which neighborhoods are going to get revitalized/eviscerated next. Now, computer scientists think they’ve found a way to do exactly that: Using Twitter and Foursquare, map the places visited by the most socially diverse crowds. Those, it turns out, are the most likely to gentrify.

Led by University of Cambridge graduate student Desislava Hristova, the researchers began their study by mapping out the social network of 37,722 Londoners who posted Foursquare check-ins via Twitter. Two people were presumed to be friends—connected on the social network—if they followed each other’s Twitter feeds. Next, Hristova and her colleagues built a geographical network of 42,080 restaurants, clubs, shops, apartments, and so on. Quaint though it may seem, the researchers treated two places as neighbors in the geographical network if they were, in fact, physically near each other. The team then linked the social and geographical networks using 549,797 Foursquare check-ins, each of which ties a person in the social network to a place in the geographical one.

Gentrification doesn’t start when outsiders move in; it starts when outsiders come to visit.

Using the network data, the team next constructed several measures of the social diversity of places, each of which helps distinguish between places that bring together friends versus strangers, and to distinguish between spots that attract socially diverse crowds versus a steady group of regulars. Among other things, those measures showed that places in the outer boroughs of London brought together more socially homogenous groups of people—in terms of their Foursquare check-ins, at least—compared with boroughs closer to the core.

But the real question is what social diversity has to do with gentrification. To measure that, the team used the United Kingdom’s Index of Multiple Deprivation, which takes into account income, education, environmental factors such as air quality, and more to quantify the socioeconomic state of affairs in localities across the U.K., including each of London’s 32 boroughs.

The rough pattern, according to the analysis: The most socially diverse places in London were also the most deprived. This is about the opposite of what you’d expect, based on social networks studied in isolation from geography, which indicates that, generally, the people with the most diverse social networks are the most prosperous….(More)”