Introduction to Special Issue of California Management Review
By Boyd Cohen, Esteve Almirall, and Henry Chesbrough: “This article introduces the special issue on the increasing role of cities as a driver for (open) innovation and entrepreneurship. It frames the innovation space being cultivated by proactive cities. Drawing on the diverse papers selected in this special issue, this introduction explores a series of tensions that are emerging as innovators and entrepreneurs seek to engage with local governments and citizens in an effort to improve the quality of life and promote local economic growth… Urbanization, the democratization of innovation and technology, and collaboration are converging paradigms helping to drive entrepreneurship and innovation in urban areas around the globe. These three factors are converging to drive innovation and entrepreneurship in cities and have been referred to as the urbanpreneur spiral….(More)”
Using GitHub in Government: A Look at a New Collaboration Platform
Justin Longo at the Center for Policy Informatics: “…I became interested in the potential for using GitHub to facilitate collaboration on text documents. This was largely inspired by the 2012 TED Talk by Clay Shirky where he argued that open source programmers could teach us something about how to do open governance:
Somebody put up a tool during the copyright debate last year in the Senate, saying, “It’s strange that Hollywood has more access to Canadian legislators than Canadian citizens do. Why don’t we use GitHub to show them what a citizen-developed bill might look like?” …
For this research, we undertook a census of Canadian government and public servant accounts on GitHub and surveyed those users, supplemented by interviews with key government technology leaders.
This research has now been published in the journal Canadian Public Administration. (If you don’t have access to the full document through the publisher, you can also find it here).
Despite the growing enthusiasm for GitHub (mostly from those familiar with open source software development), and the general rhetoric in favour of collaboration, we suspected that getting GitHub used in public sector organizations for text collaboration might be an uphill battle – not least because of the steep learning curve involved in using GitHub, and its inflexibility when used to edit text.
The history of computer-supported collaborative work platforms is littered with really cool interfaces that failed to appeal to users. The experience to date with GitHub in Canadian governments reflects this, as far as our research shows.
We found that few government agencies have an active presence on GitHub, compared with their social media presence in general. And while federal departments and public servants on GitHub are rare, provincial, territorial, First Nations and local governments are even rarer.
Of the individual accounts held by public servants, most were found in the federal government, at rates higher than in broader society (see Mapping Collaborative Software). Within this small community, the distribution of contributions per user follows the classic long-tail pattern: a small number of contributors are responsible for most of the work, a larger number contribute very little on average, and many users contribute nothing.
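To make the shape of that distribution concrete, here is a toy illustration in Python; the contribution counts below are invented for the example and are not the study’s data:

```python
# Toy long-tail: a handful of accounts do most of the work, most do nothing.
# These counts are invented for illustration, not taken from the study.
contributions = [250, 120, 40, 12, 5, 3, 1, 1, 0, 0, 0, 0, 0, 0, 0]

total = sum(contributions)
top_two = sum(sorted(contributions, reverse=True)[:2])
print(f"Top 2 of {len(contributions)} users: {top_two / total:.0%} of all contributions")

inactive = sum(1 for c in contributions if c == 0)
print(f"Users contributing nothing: {inactive / len(contributions):.0%}")
```

Here the two most active accounts produce 86% of all contributions while nearly half the accounts produce none – the same qualitative pattern the census found.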
GitHub is still resisted by all but the most technically savvy. Its peculiar terminology and work model presuppose familiarity with command-line operations and the conventions of software coding, presenting many barriers to the novice user. But while it is tempting to dismiss GitHub, as it currently exists, as ill-suited to collaborative document writing, it holds potential as a useful platform for facilitating collaboration in the public sector.
As an example, to help understand how GitHub might be used within governments for collaboration on text documents, we discuss a briefing note document flow in the paper (see the paper for a description of this lovely graphic).
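The paper’s graphic is not reproduced here, but as a rough indication of the mechanics, the sketch below models a briefing note’s drafting and review as Git branches using the GitPython library. The file name, branch names, and commit messages are hypothetical rather than taken from the paper, and the script assumes git user.name and user.email are configured:

```python
# A minimal sketch of a briefing note moving through a Git-based flow.
# Requires GitPython (pip install GitPython); all names are hypothetical.
from pathlib import Path
from git import Repo

repo = Repo.init("briefing-note")
note = Path(repo.working_tree_dir) / "briefing-note.md"

# The analyst drafts the note on the default branch.
note.write_text("# Briefing Note\n\nIssue: ...\nRecommendation: ...\n")
repo.index.add(["briefing-note.md"])
repo.index.commit("Analyst: first draft")
draft = repo.active_branch

# The manager reviews on a separate branch, the equivalent of a pull request.
review = repo.create_head("manager-review")
review.checkout()
note.write_text(note.read_text() + "\n> Manager: tighten the recommendation.\n")
repo.index.add(["briefing-note.md"])
repo.index.commit("Manager: requested revisions")

# Merging the review branch back is the sign-off step.
draft.checkout()
repo.git.merge("manager-review")
print(repo.git.log("--oneline"))  # a complete, attributable edit history
```

Whatever interface sits on top, the appeal for government is the audit trail: every revision, comment, and approval becomes a timestamped, attributable commit.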
A few other findings are addressed in the paper, from why public servants may choose not to collaborate even though they believe it’s the right thing to do, to an interesting story about what propelled the use of GitHub in the Government of Canada in the first place….(More)”
Scientists have a word for studying the post-truth world: agnotology
But scientists have another word for “post-truth”: you might have heard of epistemology, or the study of knowledge. This field helps define what we know and why we know it. On the flip side of this is agnotology, or the study of ignorance. Agnotology is not often discussed, because studying the absence of something — in this case knowledge — is incredibly difficult.
Doubt is our product
Agnotology is more than the study of what we don’t know; it’s also the study of why we are not supposed to know it. One of its more important aspects is revealing how people, usually powerful ones, use ignorance as a strategic tool to hide or divert attention from societal problems in which they have a vested interest.
A perfect example is the tobacco industry’s dissemination of reports that continuously questioned the link between smoking and cancer. As one tobacco employee famously stated, “Doubt is our product.”
In a similar way, conservative think tanks such as The Heartland Institute work to discredit the science behind human-caused climate change.
Despite the fact that 97% of scientists support the anthropogenic causes of climate change, hired “experts” have been able to populate talk shows, news programmes, and the op-ed pages to suggest a lack of credible data or established consensus, even with evidence to the contrary.
These institutes generate pseudo-academic reports to counter scientific results. In this way, they are responsible for promoting ignorance….
Under agnotology 2.0, truth becomes a moot point. It is the sensation that counts. Public media leaders create an impact with whichever arguments they can muster, based on whatever fictional data they can create… Donald Trump entering the White House is the pinnacle of agnotology 2.0. Washington Post journalist Fareed Zakaria has argued that in politics, what matters is no longer the economy but identity; we would like to suggest that the problem runs deeper than that.
The issue is not whether we should search for identity, for fame, or for sensational opinions and entertainment. The overarching issue is the fallen status of our collective search for truth, in its many forms. It is no longer a positive attribute to seek out truth, determine biases, evaluate facts, or share knowledge.
Under agnotology 2.0, scientific thinking itself is under attack. In a post-fact and post-truth era, we could very well become post-science….(More)”.
How statistics lost their power – and why we should fear what comes next
William Davies in The Guardian: “In theory, statistics should help settle arguments. They ought to provide stable reference points that everyone – no matter what their politics – can agree on. Yet in recent years, divergent levels of trust in statistics have become one of the key schisms that have opened up in western liberal democracies. Shortly before the November presidential election, a study in the US discovered that 68% of Trump supporters distrusted the economic data published by the federal government. In the UK, a research project by Cambridge University and YouGov looking at conspiracy theories discovered that 55% of the population believes that the government “is hiding the truth about the number of immigrants living here”.
Rather than defusing controversy and polarisation, it seems as if statistics are actually stoking them. Antipathy to statistics has become one of the hallmarks of the populist right, with statisticians and economists chief among the various “experts” that were ostensibly rejected by voters in 2016. Not only are statistics viewed by many as untrustworthy, there appears to be something almost insulting or arrogant about them. Reducing social and economic issues to numerical aggregates and averages seems to violate some people’s sense of political decency.
Nowhere is this more vividly manifest than with immigration. The thinktank British Future has studied how best to win arguments in favour of immigration and multiculturalism. One of its main findings is that people often respond warmly to qualitative evidence, such as the stories of individual migrants and photographs of diverse communities. But statistics – especially regarding alleged benefits of migration to Britain’s economy – elicit quite the opposite reaction. People assume that the numbers are manipulated and dislike the elitism of resorting to quantitative evidence. Presented with official estimates of how many immigrants are in the country illegally, a common response is to scoff. Far from increasing support for immigration, British Future found, pointing to its positive effect on GDP can actually make people more hostile to it. GDP itself has come to seem like a Trojan horse for an elitist liberal agenda. Sensing this, politicians have now largely abandoned discussing immigration in economic terms.
All of this presents a serious challenge for liberal democracy. Put bluntly, the British government – its officials, experts, advisers and many of its politicians – does believe that immigration is on balance good for the economy. The British government did believe that Brexit was the wrong choice. The problem is that the government is now engaged in self-censorship, for fear of provoking people further.
This is an unwelcome dilemma. Either the state continues to make claims that it believes to be valid and is accused by sceptics of propaganda, or else politicians and officials are confined to saying what feels plausible and intuitively true, but may ultimately be inaccurate. Either way, politics becomes mired in accusations of lies and cover-ups.
The declining authority of statistics – and the experts who analyse them – is at the heart of the crisis that has become known as “post-truth” politics. And in this uncertain new world, attitudes towards quantitative expertise have become increasingly divided. From one perspective, grounding politics in statistics is elitist, undemocratic and oblivious to people’s emotional investments in their community and nation. It is just one more way that privileged people in London, Washington DC or Brussels seek to impose their worldview on everybody else. From the opposite perspective, statistics are quite the opposite of elitist. They enable journalists, citizens and politicians to discuss society as a whole, not on the basis of anecdote, sentiment or prejudice, but in ways that can be validated. The alternative to quantitative expertise is less likely to be democracy than an unleashing of tabloid editors and demagogues to provide their own “truth” of what is going on across society.
Is there a way out of this polarisation? Must we simply choose between a politics of facts and one of emotions, or is there another way of looking at this situation? One way is to view statistics through the lens of their history. We need to try and see them for what they are: neither unquestionable truths nor elite conspiracies, but rather as tools designed to simplify the job of government, for better or worse. Viewed historically, we can see what a crucial role statistics have played in our understanding of nation states and their progress. This raises the alarming question of how – if at all – we will continue to have common ideas of society and collective progress, should statistics fall by the wayside….(More).”
DataCollaboratives.org – A New Resource on Creating Public Value by Exchanging Data
Recent years have seen exponential growth in the amount of data being generated and stored around the world. There is increasing recognition that this data can play a key role in solving some of the most difficult public problems we face.
However, much of the potentially useful data is currently privately held and not available for public insights. Data in the form of web clicks, social “likes,” geolocation and online purchases are typically tightly controlled, usually by entities in the private sector. Companies today generate an ever-growing stream of information from our proliferating sensors and devices. Increasingly, they—and various other actors—are asking if there is a way to make this data available for the public good. There is an ongoing search for new models of corporate responsibility around data in the digital era, toward the creation of “data collaboratives”.
Today, the GovLab is excited to launch a new resource for Data Collaboratives (datacollaboratives.org). Data Collaboratives are an emerging form of public-private partnership in which participants from different sectors — including private companies, research institutions, and government agencies — exchange data to help solve public problems.
The resource results from partnerships with UNICEF (focused on creating data collaboratives to improve children’s lives) and the Omidyar Network (studying new ways to match (open) data demand and supply to increase impact).
Natalia Adler, a data, research and policy planning specialist and the UNICEF Data Collaboratives Project Lead notes, “At UNICEF, we’re dealing with the world’s most complex problems affecting children. Data Collaboratives offer an exciting opportunity to tap on previously inaccessible datasets and mobilize a wide range of data expertise to advance child rights around the world. It’s all about connecting the dots.”
To better understand the potential of these Collaboratives, the GovLab collected information on dozens of examples from across the world. These many and diverse initiatives clearly suggest the potential of Data Collaboratives to improve people’s lives when done responsibly. As Stefaan Verhulst, co-founder of the GovLab, puts it: “In the coming months and years, Data Collaboratives will be essential vehicles for harnessing the vast stores of privately held data toward the public good.”
In particular, our research to date suggests that Data Collaboratives offer a number of potential benefits, including enhanced:
- Situational Awareness and Response: For example, Orbital Insight and the World Bank are using satellite imagery to measure and track poverty. This technology can, in some instances, “be more accurate than U.S. census data.”
- Public Service Design and Delivery: The global mapping company Esri and Waze’s Connected Citizens program are using crowdsourced traffic information to help governments design better transportation systems.
- Knowledge Creation and Transfer: The National Institutes of Health (NIH), the U.S. Food and Drug Administration (FDA), 10 biopharmaceutical companies and a number of non-profit organizations are sharing data to create new, more effective diagnostics and therapies for medical patients.
- Prediction and Forecasting: Intel and the Earth Research Institute at the University of California Santa Barbara (UCSB) are using satellite imagery to predict drought conditions and develop targeted interventions for farmers and governments.
- Impact Assessment and Evaluation: Nielsen and the World Food Program (WFP) have been using data collected via mobile phone surveys to better monitor food insecurity in order to advise the WFP’s resource allocations….(More)
Popular Democracy: The Paradox of Participation
Book by Gianpaolo Baiocchi and Ernesto Ganuza: “Local participation is the new democratic imperative. In the United States, three-fourths of all cities have developed opportunities for citizen involvement in strategic planning. The World Bank has invested $85 billion over the last decade to support community participation worldwide. But even as these opportunities have become more popular, many contend that they have also become less connected to actual centers of power and the jurisdictions where issues relevant to communities are decided.
With this book, Gianpaolo Baiocchi and Ernesto Ganuza consider the opportunities and challenges of democratic participation. Examining how one mechanism of participation has traveled the world—with its inception in Porto Alegre, Brazil, and spread to Europe and North America—they show how participatory instruments have become more focused on the formation of public opinion and are far less attentive to, or able to influence, actual reform. Though the current impact and benefit of participatory forms of government are far more ambiguous than their advocates would suggest, Popular Democracy concludes with suggestions of how participation could better achieve its political ideals….(More)”
Urban Exposures: How Cell Phone Data Helps Us Better Understand Human Exposure To Air Pollution
Senseable City Lab: “Global urbanization has led to one of the world’s most pressing environmental health concerns: the increasing number of people contributing to and being affected by air pollution, leading to 7 million early deaths each year. The key issue is human exposure to pollution within cities and the consequential effects on human health.
With new research conducted at MIT’s Senseable City Lab, human exposure to air pollution can now be accurately quantified at an unprecedented scale. Researchers mapped the movements of several million people using ubiquitous cell phone data, and intersected this information with neighborhood air pollution measures. Covering the expanse of New York City and its 8.5 million inhabitants, the study reveals where and when New Yorkers are most at risk of exposure to air pollution – with major implications for environment and public health policy… (More)”
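The Lab’s actual pipeline is not described in this excerpt, but the calculation it implies – weighting the time a person spends in each neighborhood by that neighborhood’s pollution level – can be sketched in a few lines of Python; every name and number below is hypothetical:

```python
# Time-weighted daily exposure: hours spent in each neighborhood, weighted
# by that neighborhood's PM2.5 concentration. All values are hypothetical.
hours_by_person = {
    "commuter": {"midtown": 9.0, "home_neighborhood": 15.0},
    "stay_local": {"home_neighborhood": 24.0},
}
pm25 = {"midtown": 14.2, "home_neighborhood": 8.1}  # micrograms per cubic meter

def daily_exposure(hours):
    """Average concentration breathed over the day, weighted by time in place."""
    total_hours = sum(hours.values())
    return sum(h * pm25[place] for place, h in hours.items()) / total_hours

for person, hours in hours_by_person.items():
    print(f"{person}: {daily_exposure(hours):.1f} ug/m3")
```

The cell phone traces supply precisely this mobility term: exposure estimated from home location alone (the “stay_local” case) can differ substantially from exposure along a person’s actual daily path.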
From Servants to Stewards: Design-led Innovation in the Public Sector
Adam Hasler: “For years, and very acutely in the last few months, citizens of the United States and many other parts of the world have been pitched into an often uncomfortable morass of debate and discussion about the direction of their country. Problems exist, and persist, which government at all levels has tried to address or currently addresses, and government’s efficacy at addressing problems affects all of us in some way. At a historical moment like the one in which we live, in which competing visions of government excite or frighten so many, we remember how much government matters to us.
A very powerful anecdote told to a crowd of listeners at Harvard recently recounted how, during a United States Digital Service project, the prototype delivered to a decision maker and her team didn’t include a feature that had been very clearly dictated in the requirements. The head of the United States Digital Service team that facilitated the project received an angry call summoning her to the director’s office. There, the policy maker who had added the requirement asked for an explanation of why the prototype didn’t meet requirements. “We described to her that we actually took this prototype to a school, and had people use it. It wasn’t a feature they wanted or used, so it didn’t make sense to build it.” The simple common sense of the logic of design thinking immediately resonated with the policy maker. “Yeah, we shouldn’t build it if they don’t need it.” She stopped for a moment, and continued, “Oh my gosh, this is great, we should do everything like this, we should make policy like this!”
This story demonstrates how a growing movement within governments around the world has begun to improve the public sector through design-led innovation. This article, presented in four parts, explores various aspects of that movement. To get right to it, the “design” in design-led innovation refers in this work specifically to design thinking, or the idea that design is a process, rather than a domain of outputs. You’ll see that I advocate strongly for a particular design process known as human-centered design, commonly referred to as HCD. HCD is a process made up of alternating divergence and convergence by which an individual or team starts by empathetically understanding a problem through close interaction with the people who experience it. The team then extends that co-creation to the solution phase, experimenting with ideas originating from both the team and the people who have the problem. It relies heavily on prototyping and small-scale releases of potential solutions to facilitate multiple iterations and get as close as possible to a solution, whose effectiveness the team measures by its ability to solve the original problem. This may represent a bit of a switch to some: rather than become enamored of and advocate for a favored genius idea, many of today’s best designers fall in love with the problem, and don’t rest until a solution, originating from anywhere, gets it closer to solved.
I define innovation here as the process of developing and cultivating new ideas, often from individuals throughout an organization and even outside of it, thereby maximizing the potential of all of the resources at an organization’s disposal and often breaking down organizational silos. The marriage of innovation and design thinking suggests a strategy in which innovation encourages new ideas and helps an organization adapt to ever-changing conditions, and a transparent process that helps to develop a deep understanding of a problem, decreases cost and mitigates the risk of releasing something that doesn’t solve the problem, and provides a mechanism for questioning the system itself.
This work culminates an introductory research project for me. At the heart of the work is the question, “How can design thinking and innovation improve public sector effectiveness, provide more opportunities for rewarding political participation, and facilitate the pursuit of ambitious, shared goals that move us into the future?…(More)”
Cancer Research Orgs Release Big Data for Precision Medicine
Jennifer Bresnick at HealthITAnalytics: “The American Association for Cancer Research (AACR) is releasing more than 19,000 de-identified genomic records to further the international research community’s explorations into precision medicine.
The big data dump, which includes information on 59 major types of cancer, including breast, colorectal, and lung cancer, is a result of the AACR Project Genomics Evidence Neoplasia Information Exchange (GENIE) initiative, and includes both genomic and some clinical data on consenting patients….
“These data were generated as part of routine patient care and without AACR Project GENIE they would likely never have been shared with the global cancer research community.”
Eight cancer research institutions, including five based in the United States, have contributed to the first phase of the GENIE project. Dana-Farber Cancer Institute in Boston, Memorial Sloan Kettering Cancer Center in New York City, and the University of Texas MD Anderson Cancer Center in Houston are among the collaborators.
Alongside institutions in Paris, the Netherlands, Toronto, Nashville, and Baltimore, these organizations aim to expand the research community’s knowledge of cancer and its potential treatments by continuing to make the exchange of high-grade clinical data a top priority.
“We are committed to sharing not only the real-world data within the AACR Project GENIE registry but also our best practices, from tips about assembling an international consortium to the best variant analysis pipeline, because only by working together will information flow freely and patients benefit rapidly,” Sawyers added…
Large-scale initiatives like the AACR Project GENIE, alongside separate data collection efforts like the VA’s Million Veteran Program, the CancerLinQ platform, Geisinger Health System’s MyCode databank, and the nascent PMI Cohort, will continue to make critical genomic and clinical data available to investigators across the country and around the world…(More)”.
Beyond IRBs: Designing Ethical Review Processes for Big Data Research
Conference Proceedings by Future of Privacy Forum: “The ethical framework applying to human subject research in the biomedical and behavioral research fields dates back to the Belmont Report. Drafted in 1976 and adopted by the United States government in 1991 as the Common Rule, the Belmont principles were geared towards a paradigmatic controlled scientific experiment with a limited population of human subjects interacting directly with researchers and manifesting their informed consent. These days, researchers in academic institutions as well as private sector businesses not subject to the Common Rule conduct analysis of a wide array of data sources, from massive commercial or government databases to individual tweets or Facebook postings publicly available online, with little or no opportunity to directly engage human subjects to obtain their consent or even inform them of research activities.
Data analysis is now used in multiple contexts, such as combatting fraud in the payment card industry, reducing the time commuters spend on the road, detecting harmful drug interactions, improving marketing mechanisms, personalizing the delivery of education in K-12 schools, encouraging exercise and weight loss, and much more. And companies deploy data research not only to maximize economic gain but also to test new products and services to ensure they are safe and effective. These data uses promise tremendous societal benefits but at the same time create new risks to privacy, fairness, due process and other civil liberties.
Increasingly, corporate officers find themselves struggling to navigate unsettled social norms and make ethical choices that are more befitting of philosophers than business managers or even lawyers. The ethical dilemmas arising from data analysis transcend privacy and trigger concerns about stigmatization, discrimination, human subject research, algorithmic decision making and filter bubbles.
The challenge of fitting the round peg of data-focused research into the square hole of existing ethical and legal frameworks will determine whether society can reap the tremendous opportunities hidden in the data exhaust of governments and cities, health care institutions and schools, social networks and search engines, while at the same time protecting privacy, fairness, equality and the integrity of the scientific process. One commentator called this “the biggest civil rights issue of our time.”…(More)”