Science to the People


David Lang on how citizen science bridges the gap between science and society: “It’s hard to find a silver lining in the water crisis in Flint, Michigan. The striking images of jugs of brown water being held high in protest are a symbol of institutional failure on a grand scale. It’s a disaster. But even as questions of accountability and remedy remain unanswered, there is already one lesson we can take away: Citizen science can be used as a powerful tool to build (or rebuild) the public’s trust in science.

Because the other striking image from Flint is this: Citizen-scientists sampling and testing their own water, from their homes and neighborhoods, and reporting the results as scientific data. Dr. Marc Edwards is the Virginia Tech civil engineering professor who led the investigation into the lead levels in Flint’s water supply, and in a February 2016 interview with The Chronicle of Higher Education, he gave an important answer about the methods his team used to obtain the data: “Normal people really appreciate good science that’s done in their interest. They stepped forward as citizen-scientists to explore what was happening to them and to their community, we provided some funding and the technical and analytical expertise, and they did all the work. I think that work speaks for itself.”

It’s a subtle but important message: The community is rising up and rallying by using science, not by reacting to it. Other scientists trying to highlight important issues and influence public opinion would do well to take note, because there’s a disconnect between what science reports and what the general public chooses to believe. For instance, 97 percent of scientists agree that the world’s climate is warming, likely due to human activities. Yet only 70 percent of Americans believe that global warming is real. Many of the most important issues of our time have the same growing gap between scientific and societal consensus: genetically modified foods, evolution, and vaccines are often widely distrusted or disputed despite strong, positive scientific evidence….

The good news is that we’re learning. Citizen science — the growing trend of involving non-professional scientists in the process of discovery — is proving to be a supremely effective tool. It now includes far more than birders and backyard astronomers, its first amateur champions. Over the past few years, the discipline has been gaining traction and popularity in academic circles too. Involving groups of amateur volunteers is now a proven strategy for collecting data over large geographic areas or over long periods of time. Online platforms like Zooniverse have shown that even an untrained human eye can spot anomalies in everything from wildebeest migrations to Martian surfaces. For certain types of research, citizen science just works.

While a long list of peer-reviewed papers now backs up the efficacy of citizen science, and a series of papers has shown its positive impact on students’ view of science, we’re just beginning to understand the impact of that participation on the wider perception of science. Truthfully, for now, most of what we know so far about its public impact is anecdotal, as in the work in Flint, or even on our online platform for explorers, OpenExplorer….

It makes sense that citizen science should affect public perception of science. The difference between “here are the results of a study” and “please help us in the process of discovery” is profound. It’s the difference between a rote learning moment and an immersive experience. And even if not everyone is getting involved, the fact that this is possible and that some members of a community are engaging makes science instantly more relatable. It creates what Tim O’Reilly calls an “architecture of participation.” Citizen scientists create the best interface for convincing the rest of the populace.

A recent article in Nature argued that the DIY biology community was, in fact, ahead of the scientific establishment in terms of proactively thinking about the safety and ethics of rapidly advancing biotechnology tools. They had to be. For those people opening up community labs so that anyone can come and participate, public health issues can’t be pushed aside or dealt with later. After all, they are the public that will be affected….(More)”

The Open Data Barometer (3rd edition)


The Open Data Barometer: “Once the preserve of academics and statisticians, data has become a development cause embraced by everyone from grassroots activists to the UN Secretary-General. There’s now a clear understanding that we need robust data to drive democracy and development — and a lot of it.

Last year, the world agreed the Sustainable Development Goals (SDGs) — seventeen global commitments that set an ambitious agenda to end poverty, fight inequality and tackle climate change by 2030. Recognising that good data is essential to the success of the SDGs, the Global Partnership for Sustainable Development Data and the International Open Data Charter were launched as the SDGs were unveiled. These alliances mean the “data revolution” now has over 100 champions willing to fight for it. Meanwhile, Africa adopted the African Data Consensus — a roadmap to improving data standards and availability in a region that has notoriously struggled to capture even basic information such as birth registration.

But while much has been made of the need for bigger and better data to power the SDGs, this year’s Barometer follows the lead set by the International Open Data Charter by focusing on how much of this data will be openly available to the public.

Open data is essential to building accountable and effective institutions, and to ensuring public access to information — both goals of SDG 16. It is also essential for meaningful monitoring of progress on all 169 SDG targets. Yet the promise and possibilities offered by opening up data to journalists, human rights defenders, parliamentarians, and citizens at large go far beyond even these….

At a glance, here are this year’s key findings on the state of open data around the world:

    • Open data is entering the mainstream. The majority of the countries in the survey (55%) now have an open data initiative in place and a national data catalogue providing access to datasets available for re-use. Moreover, new open data initiatives are getting underway or are promised for the near future in a number of countries, including Ecuador, Jamaica, St. Lucia, Nepal, Thailand, Botswana, Ethiopia, Nigeria, Rwanda and Uganda. Demand is high: civil society and the tech community are using government data in 93% of countries surveyed, even in countries where that data is not yet fully open.
    • Despite this, there’s been little to no progress on the number of truly open datasets around the world. Even with the rapid spread of open government data plans and policies, too much critical data remains locked in government filing cabinets. For example, only two countries publish acceptably detailed open public spending data. Of all 1,380 government datasets surveyed, almost 90% are still closed — roughly the same as in the last edition of the Open Data Barometer (when only 130 out of 1,290 datasets, or 10%, were open). What is more, much of the approximately 10% of data that meets the open definition is of poor quality, making it difficult for potential data users to access, process and work with it effectively.
    • “Open-washing” is jeopardising progress. Many governments have advertised their open data policies as a way to burnish their democratic and transparent credentials. But open data, while extremely important, is just one component of a responsive and accountable government. Open data initiatives cannot be effective if not supported by a culture of openness where citizens are encouraged to ask questions and engage, and supported by a legal framework. Disturbingly, in this edition we saw a backslide on freedom of information, transparency, accountability, and privacy indicators in some countries. Until all these factors are in place, open data cannot be a true SDG accelerator.
    • Implementation and resourcing are the weakest links. Progress on the Barometer’s implementation and impact indicators has stalled or even gone into reverse in some cases. Open data can result in net savings for the public purse, but getting individual ministries to allocate the budget and staff needed to publish their data is often an uphill battle, and investment in building user capacity (both inside and outside of government) is scarce. Open data is not yet entrenched in law or policy, and the legal frameworks supporting most open data initiatives are weak. This is a symptom of the tendency of governments to view open data as a fad or experiment with little to no long-term strategy behind its implementation. This results in haphazard implementation, weak demand and limited impact.
    • The gap between data haves and have-nots needs urgent attention. Twenty-six of the top 30 countries in the ranking are high-income countries. Half of open datasets in our study are found in just the top 10 OECD countries, while almost none are in African countries. As the UN pointed out last year, such gaps could create “a whole new inequality frontier” if allowed to persist. Open data champions in several developing countries have launched fledgling initiatives, but too often those good open data intentions are not adequately resourced, resulting in weak momentum and limited success.
    • Governments at the top of the Barometer are being challenged by a new generation of open data adopters. Traditional open data stalwarts such as the USA and UK have seen their rate of progress on open data slow, signalling that new political will and momentum may be needed as more difficult elements of open data are tackled. Fortunately, a new generation of open data adopters, including France, Canada, Mexico, Uruguay, South Korea and the Philippines, are starting to challenge the ranking leaders and are adopting a leadership attitude in their respective regions. The International Open Data Charter could be an important vehicle to sustain and increase momentum in challenger countries, while also stimulating renewed energy in traditional open data leaders….(More)”

Foreign Policy has lost its creativity. Design thinking is the answer.


Elizabeth Radziszewski at The Wilson Quarterly: “Although the landscape of threats has changed in recent years, U.S. strategies bear a striking resemblance to the ways policymakers dealt with crises in the past. Whether it involves diplomatic overtures, sanctions, bombing campaigns, or the use of special ops and covert operations, the range of responses suffers from an innovation deficit. Even the use of drones, while a new tool of warfare, is still part of the limited categories of responses that focus mainly on whether or not to kill, cooperate, or do nothing. To meet the evolving nature of threats posed by nonstate actors such as ISIS, the United States needs a strategy makeover — a creative lift, so to speak.

Sanctions, diplomacy, bombing campaigns, special ops, covert operations — the range of our foreign policy responses suffers from an innovation deficit.

Enter the business world. Today’s top companies face an increasingly competitive marketplace where innovative approaches to product and service development are a necessity. Just as the market has changed for companies since the forces of globalization and the digital economy took over, so has the security landscape evolved for the world’s leading hegemon. Yet the responses of top businesses to these changes stand in stark contrast to the United States’ stagnant approaches to current national security threats. Many of today’s thriving businesses have embraced design thinking (DT), an innovative process that identifies consumer needs through immersive ethnographic experiences that are melded with creative brainstorming and quick prototyping.

What would happen if U.S. policymakers took cues from the business world and applied DT in policy development? Could the United States prevent the threats from metastasizing with more proactive rather than reactive strategies — by discovering, for example, how ideas from biology, engineering, and other fields could help analysts inject fresh perspective into tired solutions? Put simply, if U.S. policymakers want to succeed in managing future threats, then they need to start thinking more like business innovators who integrate human needs with technology and economic feasibility.

In his 1969 book The Sciences of the Artificial, Herbert Simon made the first connection between design and a way of thinking. But it was not until the 1980s and 1990s that Stanford scientists began to see the benefits of design practices used by industrial designers as a method for creative thinking. At the core of DT is the idea that solving a challenge requires a deeper understanding of the problem’s true nature and the processes and people involved. This approach contrasts greatly with more standard innovation styles, where a policy solution is developed and then resources are used to fit the solution to the problem. DT reverses the order.

DT encourages divergent thinking, the process of generating many ideas before converging to select the most feasible ones, including making connections between different-yet-related worlds. Finally, the top ideas are quickly prototyped and tested so that early solutions can be modified without investing many resources and risking the biggest obstacle to real innovation: the impulse to try fitting an idea, product, or policy to the people, rather than the other way around…

If DT has reenergized the innovative process in the business and nonprofit sector, a systematic application of its methodology could just as well revitalize U.S. national security policies. Innovation in security and foreign policy is often framed around the idea of technological breakthroughs. Thanks to the Defense Advanced Research Projects Agency (DARPA), the Department of Defense has been credited with such groundbreaking inventions as GPS, the Internet, and stealth fighters — all of which have created rich opportunities to explore new military strategies. Reflecting this infatuation with technology, but with a new edge, is Defense Secretary Ashton Carter’s unveiling of the Defense Innovation Unit Experimental, an initiative to scout for new technologies, improve outreach to startups, and form deeper relationships between the Pentagon and Silicon Valley. The new DIUE effort signals what businesses have already noticed: the need to be more flexible in establishing linkages with people outside of the government in the search for new ideas.

Yet because the primary objective of DIUE remains technological prowess, the effort alone is unlikely to drastically improve the management of national security. Technology is not a substitute for an innovative process. When new invention is prized as the sole focus of innovation, it can, paradoxically, paralyze innovation. Once an invention is adopted, it is all too tempting to mold subsequent policy development around emergent technology, even if other solutions could be more appropriate….(More)”

Juries as Problem Solving Institutions


Series of interviews on Collective Problem Solving by Henry Farrell: “Over the last two years, a group of scholars from disciplines including political science, political theory, cognitive psychology, information science, statistics and computer science have met under the auspices of the MacArthur Foundation Research Network on Opening Governance. The goal of these meetings has been to bring the insights of different disciplines to bear on fundamental problems of collective problem solving. How do we best solve collective problems? How should we study and think about collective intelligence? How can we apply insights to real world problems? A wide body of work leads us to believe that complex problems are most likely to be solved when people with different viewpoints and sets of skills come together. This means that we can expect that the science of collective problem solving too will be improved when people from diverse disciplinary perspectives work together to generate new insights on shared problems.

Political theorists are beginning to think in different ways about institutions such as juries. Here, the crucial insights will involve how these institutions can address the traditional concerns of political theory, such as justice and recognition, while also solving the complex problem of figuring out how best to resolve disputes, and establishing the guilt or innocence of parties in criminal cases.

Melissa Schwartzberg is an associate professor of political science at New York University, working on the political theory of democratic decision making. I asked her a series of questions about the jury as a problem-solving institution.

Henry: Are there any general ways for figuring out the kinds of issues that juries (based on random selection of citizens and some voting rule) are good at deciding on, and the issues that they might have problems with?

Melissa: This is a difficult question, in part because we don’t have unmediated access to the “true state of the world”: our evidence about jury competence essentially derives from the correlation of jury verdicts with what the judge would have rendered, but obviously that doesn’t mean that the judge was correct. One way around the question is to ask instead what, historically, have been the reasons why we would wish to assign judgment to laypersons: what the “jury of one’s peers” signifies. Placing a body of ordinary citizens between the state and the accused serves as an important protective device, so the use of the jury is quite clearly not all about judgment. But there is a long history of thinking that juries have special access to local knowledge – the established norms, practices, and expectations of a community, but in early periods knowledge of the parties and the alleged crime – that helps to shed light on why we still think “vicinage” is important….(More)”

E-Regulation and the Rule of Law: Smart Government, Institutional Information Infrastructures, and Fundamental Values


Rónán Kennedy in Information Polity: “Information and communications technology (ICT) is increasingly used in bureaucratic and regulatory processes. With the development of the ‘Internet of Things’, some researchers speak enthusiastically of the birth of the ‘Smart State’. However, there are few theoretical or critical perspectives on the role of ICT in these routine decision-making processes and the mundane work of government regulation of economic and social activity. This paper therefore makes an important contribution by putting forward a theoretical perspective on smartness in government and developing a values-based framework for the use of ICT as a tool in the internal machinery of government.

It critically reviews the protection of the rule of law in digitized government. As an addition to work on e-government, a new field of study, ‘e-regulation’ is proposed, defined, and critiqued, with particular attention to the difficulties raised by the use of models and simulation. The increasing development of e-regulation could compromise fundamental values by embedding biases, software errors, and mistaken assumptions deeply into government procedures. The article therefore discusses the connections between the ‘Internet of Things’, the development of ‘Ambient Law’, and how the use of ICT in e-regulation can be a support for or an impediment to the operation of the rule of law. It concludes that e-government research should give more attention to the processes of regulation, and that law should be a more central discipline for those engaged in this activity….(More)”

Accountable Algorithms


Paper by Joshua A. Kroll et al: “Many important decisions historically made by people are now made by computers. Algorithms count votes, approve loan and credit card applications, target citizens or neighborhoods for police scrutiny, select taxpayers for an IRS audit, and grant or deny immigration visas.

The accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology. The tools currently available to policymakers, legislators, and courts were developed to oversee human decision-makers and often fail when applied to computers instead: for example, how do you judge the intent of a piece of software? Additional approaches are needed to make automated decision systems — with their potentially incorrect, unjustified or unfair results — accountable and governable. This Article reveals a new technological toolkit to verify that automated decisions comply with key standards of legal fairness.

We challenge the dominant position in the legal literature that transparency will solve these problems. Disclosure of source code is often neither necessary (because of alternative techniques from computer science) nor sufficient (because of the complexity of code) to demonstrate the fairness of a process. Furthermore, transparency may be undesirable, such as when it permits tax cheats or terrorists to game the systems determining audits or security screening.

The central issue is how to assure the interests of citizens, and society as a whole, in making these processes more accountable. This Article argues that technology is creating new opportunities — more subtle and flexible than total transparency — to design decision-making algorithms so that they better align with legal and policy objectives. Doing so will improve not only the current governance of algorithms, but also — in certain cases — the governance of decision-making in general. The implicit (or explicit) biases of human decision-makers can be difficult to find and root out, but we can peer into the “brain” of an algorithm: computational processes and purpose specifications can be declared prior to use and verified afterwards.
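
To make the “declare beforehand, verify afterwards” idea concrete, here is a minimal sketch of how a decision-maker could commit to a decision rule before using it, and let an auditor later confirm that the disclosed rule matches the commitment. This is not the Article’s actual toolkit, which rests on more sophisticated computer science techniques; the policy text, salt handling, and function names below are illustrative assumptions.

```python
import hashlib
import secrets

def commit(policy_source: str, salt: str) -> str:
    """Digest of the decision rule, published before any decisions are made."""
    return hashlib.sha256((salt + policy_source).encode()).hexdigest()

def verify(policy_source: str, salt: str, published_digest: str) -> bool:
    """Auditor recomputes the digest from the later-disclosed rule and salt."""
    return commit(policy_source, salt) == published_digest

# Hypothetical decision rule; in practice this might be the full source code
# of the software that scores applications.
policy = "approve if score >= 70"
salt = secrets.token_hex(16)          # keeps the committed rule confidential
digest = commit(policy, salt)         # announced in advance of any decision
assert verify(policy, salt, digest)   # audit: the rule in use matches the commitment
```

The salt means the published commitment reveals nothing about the rule itself (addressing the gaming worry about audits and security screening noted above) while still binding the agency to the rule it actually used.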

The technological tools introduced in this Article apply widely. They can be used in designing decision-making processes from both the private and public sectors, and they can be tailored to verify different characteristics as desired by decision-makers, regulators, or the public. By forcing a more careful consideration of the effects of decision rules, they also engender policy discussions and closer looks at legal standards. As such, these tools have far-reaching implications throughout law and society.

Part I of this Article provides an accessible and concise introduction to foundational computer science concepts that can be used to verify and demonstrate compliance with key standards of legal fairness for automated decisions without revealing key attributes of the decision or the process by which the decision was reached. Part II then describes how these techniques can assure that decisions are made with the key governance attribute of procedural regularity, meaning that decisions are made under an announced set of rules consistently applied in each case. We demonstrate how this approach could be used to redesign and resolve issues with the State Department’s diversity visa lottery. In Part III, we go further and explore how other computational techniques can assure that automated decisions preserve fidelity to substantive legal and policy choices. We show how these tools may be used to assure that certain kinds of unjust discrimination are avoided and that automated decision processes behave in ways that comport with the social or legal standards that govern the decision. We also show how algorithmic decision-making may even complicate existing doctrines of disparate treatment and disparate impact, and we discuss some recent computer science work on detecting and removing discrimination in algorithms, especially in the context of big data and machine learning. And lastly in Part IV, we propose an agenda to further synergistic collaboration between computer science, law and policy to advance the design of automated decision processes for accountability….(More)”

A New Dark Age Looms


William B. Gail in the New York Times: “Imagine a future in which humanity’s accumulated wisdom about Earth — our vast experience with weather trends, fish spawning and migration patterns, plant pollination and much more — turns increasingly obsolete. As each decade passes, knowledge of Earth’s past becomes progressively less effective as a guide to the future. Civilization enters a dark age in its practical understanding of our planet.

To comprehend how this could occur, picture yourself in our grandchildren’s time, a century hence. Significant global warming has occurred, as scientists predicted. Nature’s longstanding, repeatable patterns — relied on for millenniums by humanity to plan everything from infrastructure to agriculture — are no longer so reliable. Cycles that have been largely unwavering during modern human history are disrupted by substantial changes in temperature and precipitation….

Our foundation of Earth knowledge, largely derived from historically observed patterns, has been central to society’s progress. Early cultures kept track of nature’s ebb and flow, passing improved knowledge about hunting and agriculture to each new generation. Science has accelerated this learning process through advanced observation methods and pattern discovery techniques. These allow us to anticipate the future with a consistency unimaginable to our ancestors.

But as Earth warms, our historical understanding will turn obsolete faster than we can replace it with new knowledge. Some patterns will change significantly; others will be largely unaffected, though it will be difficult to say what will change, by how much, and when.

The list of possible disruptions is long and alarming. We could see changes to the prevalence of crop and human pests, like locust plagues set off by drought conditions; forest fire frequency; the dynamics of the predator-prey food chain; the identification and productivity of reliably arable land, and the predictability of agriculture output.

Historians of the next century will grasp the importance of this decline in our ability to predict the future. They may mark the coming decades of this century as the period during which humanity, despite rapid technological and scientific advances, achieved “peak knowledge” about the planet it occupies. They will note that many decades may pass before society again attains the same level.

One exception to this pattern-based knowledge is the weather, whose underlying physics governs how the atmosphere moves and adjusts. Because we understand the physics, we can replicate the atmosphere with computer models. Monitoring by weather stations and satellites provides the starting point for the models, which compute a forecast for how the weather will evolve. Today, forecast accuracy based on such models is generally good out to a week, sometimes even two.
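
A toy numerical experiment helps explain why physics-based forecasts are trustworthy for days but not for months. The sketch below integrates the Lorenz-63 equations (a classic chaotic stand-in for atmospheric dynamics, not an actual weather model) from two nearly identical “observed” initial states; the runs agree over short horizons and then diverge, mirroring the week-or-two limit described above. The step size and starting values are arbitrary choices for illustration.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz-63 system, a toy chaotic 'atmosphere'."""
    x, y, z = state
    deriv = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * deriv

def forecast(initial_state, steps):
    """Run the 'model' forward from an observed starting point."""
    state = np.array(initial_state, dtype=float)
    for _ in range(steps):
        state = lorenz_step(state)
    return state

truth = [1.0, 1.0, 1.0]
observed = [1.001, 1.0, 1.0]          # same state, with a tiny measurement error

for steps in (100, 500, 2000):        # short, medium, and long forecast horizons
    error = np.linalg.norm(forecast(truth, steps) - forecast(observed, steps))
    print(f"{steps:5d} steps ahead: divergence {error:.3f}")
```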

But farmers need to think a season or more ahead. So do infrastructure planners as they design new energy and water systems. It may be feasible to develop the science and make the observations necessary to forecast weather a month or even a season in advance. We are also coming to understand enough of the physics to make useful global and regional climate projections a decade or more ahead.

The intermediate time period is our big challenge. Without substantial scientific breakthroughs, we will remain reliant on pattern-based methods for time periods between a month and a decade. … Our best knowledge is built on what we have seen in the past, like how fish populations respond to El Niño’s cycle. Climate change will further undermine our already limited ability to make these predictions. Anticipating ocean resources from one year to the next will become harder.

Civilization’s understanding of Earth has expanded enormously in recent decades, making humanity safer and more prosperous. As the patterns that we have come to expect are disrupted by warming temperatures, we will face huge challenges feeding a growing population and prospering within our planet’s finite resources. New developments in science offer our best hope for keeping up, but this is by no means guaranteed….(More)”

Tag monitors air pollution and never loses charge


Springwise: “The battle to clean up the air of major cities is well underway, with businesses and politicians pledging to help with the pollution issue. We have seen projects using mobile air sensors mounted on pigeons to bring the problem to public attention, and now a new crowdsourcing campaign is attempting to map the UK’s air pollution.

CleanSpace uses a portable, air pollution-sensing tag to track exposure to harmful pollutants in real-time. The tag is connected to an app, which analyzes the data and combines it with that of other users in the UK to create an air pollution map.
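
Springwise does not describe how the CleanSpace app actually builds its map, but the general pattern for crowdsourced pollution mapping (pooling many users’ geotagged readings and averaging them over grid cells) can be sketched roughly as follows. The coordinates, values, and grid resolution are invented for illustration.

```python
from collections import defaultdict

# Hypothetical crowd-sourced readings: (latitude, longitude, pollutant level).
readings = [
    (51.5074, -0.1278, 42.0),   # central London
    (51.5080, -0.1290, 38.5),   # a second user nearby
    (53.4808, -2.2426, 55.0),   # Manchester
]

def grid_cell(lat, lon, resolution=0.01):
    """Snap a coordinate to a coarse grid cell (roughly 1 km at UK latitudes)."""
    return (round(lat / resolution), round(lon / resolution))

def build_map(readings, resolution=0.01):
    """Average all readings that fall in the same grid cell."""
    cells = defaultdict(list)
    for lat, lon, value in readings:
        cells[grid_cell(lat, lon, resolution)].append(value)
    return {cell: sum(vals) / len(vals) for cell, vals in cells.items()}

for cell, average in build_map(readings).items():
    print(cell, round(average, 1))   # two nearby readings merge into one cell
```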

An interesting part of the CleanSpace Tag’s technology is the fact that it never needs to be charged. The startup says the tag is powered by harvesting 2G, 3G, 4G and wifi signals, which meet its small power requirements. The app also rewards users for traveling on-foot or by bike, offering them “CleanMiles” that can be exchanged for discounts with CleanSpace’s partners.

The startup successfully raised more than GBP 100,000 in a crowdfunding campaign last year, and the team has given back GBP 10,000 to their charitable partners this year. …(More)”

Matchmaking Algorithms Are Unraveling the Causes of Rare Genetic Diseases


Regan Penaluna at Nautilus: “Jill Viles, an Iowa mother, was born with a rare type of muscular dystrophy. The symptoms weren’t really noticeable until preschool, when she began to fall while walking. She saw doctors, but they couldn’t diagnose her or supply a remedy. When she left for college, she was 5-foot-3 and weighed just 87 pounds.

How she would spend her time there turned into part of a remarkable story by David Epstein, published in ProPublica in January. Viles tore through her library’s medical literature and came up with a self-diagnosis—Emery-Dreifuss, a rare form of muscular dystrophy—and she was right. Then she came across photos of a female Canadian Olympic hurdler, Priscilla Lopes-Schliep, and she realized that, despite the hurdler’s muscular frame, she still displayed some of the same physical characteristics—similarly prominent arm and leg veins, peculiarly missing fat, and the same separation between butt and hip muscles. Eventually, in a slow, roundabout way, Viles managed to contact Lopes-Schliep and confirm that they shared the same type of partial lipodystrophy, Dunnigan-type. By comparing their genomes, scientists could determine that both women had a mutation in the same gene, though they were mutated in different ways—explaining, perhaps, why Viles’ muscles degenerated and Lopes-Schliep’s didn’t.

Viles’ story illustrates the challenge of finding the genetic cause for rare diseases, which some define as affecting fewer than 5 in 10,000 people. Heidi Rehm, a professor of pathology at Harvard Medical School, has set out to speed up and streamline the matching process. Last July, Rehm and a group of geneticists launched Matchmaker Exchange, a network of gene databases that helps solve the causes of rare disease by matching the disease symptoms and genotype between at least two people’s cases. The goal in the next 5 to 10 years, Rehm says, is to see if there is a common variant in a novel gene that’s never been implicated in a disease. It’s been likened to the online dating site of rare genetic diseases. Nautilus caught up with Rehm to learn more about her work….(More)”
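
The interview does not spell out the matching algorithm, but the core idea of a matchmaker, flagging two unsolved cases that implicate the same candidate gene and share overlapping symptoms, can be sketched in simplified form. The case records, gene names, and similarity threshold below are illustrative assumptions; the real Matchmaker Exchange federates such queries across a network of independent databases.

```python
# Hypothetical unsolved cases: candidate gene plus phenotype terms.
cases = {
    "case_A": {"gene": "LMNA", "phenotypes": {"muscular dystrophy", "lipodystrophy"}},
    "case_B": {"gene": "LMNA", "phenotypes": {"lipodystrophy", "prominent veins"}},
    "case_C": {"gene": "TTN",  "phenotypes": {"cardiomyopathy"}},
}

def phenotype_overlap(a, b):
    """Jaccard similarity between two sets of phenotype terms."""
    return len(a & b) / len(a | b)

def find_matches(cases, min_overlap=0.25):
    """Pair up cases that implicate the same gene and share symptoms."""
    ids = sorted(cases)
    matches = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if cases[a]["gene"] == cases[b]["gene"]:
                score = phenotype_overlap(cases[a]["phenotypes"],
                                          cases[b]["phenotypes"])
                if score >= min_overlap:
                    matches.append((a, b, round(score, 2)))
    return matches

print(find_matches(cases))   # [('case_A', 'case_B', 0.33)]
```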

Smart City and Smart Government: Synonymous or Complementary?


Paper by Leonidas G. Anthopoulos and Christopher G. Reddick: “Smart City is an emerging and multidisciplinary domain. It has been recently defined as innovation, not necessarily but mainly through information and communications technologies (ICT), which enhance urban life in terms of people, living, economy, mobility and governance. Smart government is also an emerging topic, which attracts increasing attention from scholars who work in public administration, political and information sciences. There is no widely accepted definition for smart government, but it appears to be the next step of e-government with the use of technology and innovation by governments for better performance. However, it is not clear whether these two terms co-exist or concern different domains. The aim of this paper is to investigate the term smart government and to clarify its meaning in relationship to the smart city. In this respect, the paper performed a comprehensive literature review and concluded that smart government is not synonymous with smart city. Our findings show that smart city has a dimension of smart government, and smart government uses smart city as an area of practice. The authors conclude that smart city is complementary to, and part of, a larger smart government movement….(More)”