The war to free science


Brian Resnick and Julia Belluz at Vox: “The 27,500 scientists who work for the University of California generate 10 percent of all the academic research papers published in the United States.

Their university recently put them in a strange position: Sometime this year, these scientists will not be able to directly access much of the world’s published research they’re not involved in.

That’s because in February, the UC system — one of the country’s largest academic institutions, encompassing Berkeley, Los Angeles, Davis, and several other campuses — dropped its nearly $11 million annual subscription to Elsevier, the world’s largest publisher of academic journals.

On the face of it, this seemed like an odd move. Why cut off students and researchers from academic research?

In fact, it was a principled stance that may herald a revolution in the way science is shared around the world.

The University of California decided it doesn’t want scientific knowledge locked behind paywalls, and thinks the cost of academic publishing has gotten out of control.

Elsevier owns around 3,000 academic journals, and its articles account for some 18 percent of all the world’s research output. “They’re a monopolist, and they act like a monopolist,” says Jeffrey MacKie-Mason, head of the campus libraries at UC Berkeley and co-chair of the team that negotiated with the publisher. Elsevier makes huge profits on its journals, generating billions of dollars a year for its parent company RELX.

This is a story about more than subscription fees. It’s about how a private industry has come to dominate the institutions of science, and how librarians, academics, and even pirates are trying to regain control.

The University of California is not the only institution fighting back. “There are thousands of Davids in this story,” says University of California Davis librarian MacKenzie Smith, who, like so many other librarians around the world, has been pushing for more open access to science. “But only a few big Goliaths.”…(More)”.

Return on Data


Paper by Noam Kolt: “Consumers routinely supply personal data to technology companies in exchange for services. Yet, the relationship between the utility (U) consumers gain and the data (D) they supply — “return on data” (ROD) — remains largely unexplored. Expressed as a ratio, ROD = U / D. While lawmakers strongly advocate protecting consumer privacy, they tend to overlook ROD. Are the benefits of the services enjoyed by consumers, such as social networking and predictive search, commensurate with the value of the data extracted from them? How can consumers compare competing data-for-services deals?

Currently, the legal frameworks regulating these transactions, including privacy law, aim primarily to protect personal data. They treat data protection as a standalone issue, distinct from the benefits that consumers receive. This article suggests that privacy concerns should not be viewed in isolation, but as part of ROD. Just as companies can quantify return on investment (ROI) to optimize investment decisions, consumers should be able to assess ROD in order to better spend and invest personal data. Making data-for-services transactions more transparent will enable consumers to evaluate the merits of these deals, negotiate their terms and make more informed decisions. Pivoting from the privacy paradigm to ROD will both incentivize data-driven service providers to offer consumers higher ROD and create opportunities for new market entrants….(More)”.
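
Kolt’s core ratio is simple enough to sketch. What follows is a minimal, hypothetical illustration (not from the paper) of how a consumer, or a comparison tool acting on their behalf, might rank competing data-for-services deals by their return on data; the deal names, utility figures, and data valuations are invented, and the paper does not prescribe particular units or an estimation method.

```python
from dataclasses import dataclass


@dataclass
class Deal:
    """A hypothetical data-for-services deal."""
    name: str
    utility: float     # estimated value of the service to the consumer (e.g., dollars per month)
    data_value: float  # estimated value of the personal data supplied, in the same units

    @property
    def rod(self) -> float:
        """Return on data: ROD = U / D."""
        return self.utility / self.data_value


# Illustrative numbers only -- not taken from the paper.
deals = [
    Deal("Social network A", utility=12.0, data_value=8.0),
    Deal("Search engine B", utility=20.0, data_value=5.0),
]

for deal in sorted(deals, key=lambda d: d.rod, reverse=True):
    print(f"{deal.name}: ROD = {deal.rod:.2f}")
# Higher ROD means more utility received per unit of data given up.
```

Expressing utility and data value in comparable units is, of course, the hard part; the sketch simply assumes such estimates exist.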

Federal Data Strategy: Use Cases


US Federal Data Strategy: “For the purposes of the Federal Data Strategy, a “Use Case” is a data practice or method that leverages data to support an articulable federal agency mission or public interest outcome. The Federal Data Strategy sought use cases from the public that solve problems or demonstrate solutions that can help inform the four strategy areas: Enterprise Data Governance; Use, Access, and Augmentation; Decision-making and Accountability; and Commercialization, Innovation, and Public Use. The Federal Data Strategy team was in part informed by these submissions, which are posted below….(More)”.

We Read 150 Privacy Policies. They Were an Incomprehensible Disaster.


Kevin Litman-Navarro at the New York Times: “….I analyzed the length and readability of privacy policies from nearly 150 popular websites and apps. Facebook’s privacy policy, for example, takes around 18 minutes to read in its entirety – slightly above average for the policies I tested….

Despite efforts like the General Data Protection Regulation to make policies more accessible, there seems to be an intractable tradeoff between a policy’s readability and length. Even policies that are shorter and easier to read can be impenetrable, given the amount of background knowledge required to understand how things like cookies and IP addresses play a role in data collection….
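
As a rough illustration of the kind of measurement behind figures like the 18-minute estimate above, the sketch below estimates reading time from a word count at an assumed 250 words per minute and reports average sentence length as a crude readability proxy. The reading speed, the file name, and the proxy itself are assumptions made for illustration; the article’s own readability scoring may differ.

```python
import re

WORDS_PER_MINUTE = 250  # assumed average adult reading speed


def reading_time_minutes(text: str, wpm: int = WORDS_PER_MINUTE) -> float:
    """Estimate the minutes needed to read `text` at `wpm` words per minute."""
    words = re.findall(r"\b\w+\b", text)
    return len(words) / wpm


def avg_sentence_length(text: str) -> float:
    """Average words per sentence -- a crude proxy for readability."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"\b\w+\b", text)
    return len(words) / max(len(sentences), 1)


# `privacy_policy.txt` stands in for a locally saved copy of any policy.
policy_text = open("privacy_policy.txt", encoding="utf-8").read()
print(f"Estimated reading time: {reading_time_minutes(policy_text):.1f} minutes")
print(f"Average sentence length: {avg_sentence_length(policy_text):.1f} words")
```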

So what might a useful privacy policy look like?

Consumers don’t need a technical understanding of data collection processes in order to protect their personal information. Instead of explaining the excruciatingly complicated inner workings of the data marketplace, privacy policies should help people decide how they want to present themselves online. We tend to go on the internet privately – on our phones or at home – which gives the impression that our activities are also private. But, often, we’re more visible than ever.

A good privacy policy would help users understand how exposed they are: Something as simple as a list of companies that might purchase and use your personal information could go a long way towards setting a new bar for privacy-conscious behavior. For example, if you know that your weather app is constantly tracking your whereabouts and selling your location data as marketing research, you might want to turn off your location services entirely, or find a new app.

Until we reshape privacy policies to meet our needs — or we find a suitable replacement — it’s probably best to act with one rule in mind. To be clear and concise: Someone’s always watching….(More)”.

Information Sharing as a Dimension of Smartness: Understanding Benefits and Challenges in Two Megacities


Paper by J. Ramon Gil-Garcia, Theresa A. Pardo, and Manuel De Tuya: “Cities around the world are facing increasingly complex problems.

These problems frequently require collaboration and information sharing across agency boundaries.

In our view, information sharing can be seen as an important dimension of what is recently being called smartness in cities, one that enables improvements in decision making and day-to-day operations in urban settings. Unfortunately, what many city managers are learning is that there are important challenges to sharing information both within their city and with others.

Based on nonemergency service integration initiatives in New York City and Mexico City, this article examines important benefits from and challenges to information sharing in the context of what the participants characterize as smart city initiatives, particularly in large metropolitan areas.

The research question guiding this study is as follows: To what extent do previous findings about information sharing hold in the context of city initiatives, particularly in megacities?

The results provide evidence on the importance of some specific characteristics of cities and megalopolises and how they affect benefits and challenges of information sharing. For instance, cities seem to have more managerial flexibility than other jurisdictions such as state governments.

In addition, megalopolises have most of the technical skills and financial resources needed for information sharing and, therefore, these challenges are not as relevant as in other local governments….(More)”.

How Organizations with Data and Technology Skills Can Play a Critical Role in the 2020 Census


Blog Post by Kathryn L.S. Pettit and Olivia Arena: “The 2020 Census is less than a year away, and it’s facing new challenges that could result in an inaccurate count. The proposed inclusion of a citizenship question, the lack of comprehensive and unified messaging, and the new internet-response option could worsen the undercount of vulnerable and marginalized communities and deprive these groups of critical resources.

The US Census Bureau aims to count every US resident. But some groups are more likely to be missed than others. Communities of color, immigrants, young children, renters, people experiencing homelessness, and people living in rural areas have long been undercounted in the census. Because the census count is used to apportion federal funding and draw legislative districts for political seats, an inaccurate count means that these populations receive less than their fair share of resources and representation.

Local governments and community-based organizations have begun forming Complete Count Committees, coalitions of trusted community voices established to encourage census responses, to achieve a more accurate count in 2020. Local organizations with data and technology skills—like civic tech groups, libraries, technology training organizations, and data intermediaries—can harness their expertise to help these coalitions achieve a complete count.

As the coordinator of the National Neighborhood Indicators Partnership (NNIP), we are learning about 2020 Census mobilization in communities across the country. We have found that data and technology groups are natural partners in this work; they understand what is at risk in 2020, are embedded in communities as trusted data providers, and can amplify the importance of the census.

Threats to a complete count

The proposed citizenship question, currently being challenged in court, would likely suppress the count of immigrants and households in immigrant communities in the US. Though federal law prohibits the Census Bureau from disclosing individual-level data, even to other agencies, people may still be skeptical about the confidentiality of the data or generally distrust the government. Acknowledging these fears is important for organizations partnering in outreach to vulnerable communities.

Another potential hurdle is that, for the first time, the Census Bureau will encourage people to complete their census forms online (though answering by mail or phone will still be options). Though a high-tech census could be more cost-effective, the digital divide, compounded by underfunding of the Census Bureau that limited initial testing of new methods and outreach, could worsen the undercount….(More)”.

Measuring impact by design: A guide to methods for impact measurement


Privy Council Office (Canada): “…This document is intended to be both an accessible introduction to the topic and a reference for those involved in the design, delivery, procurement or appraisal of impact measurement strategies for Impact Canada projects. Drawing on best practices, Measuring Impact by Design was written to guide its readers to think differently about measuring impact than we have traditionally done within the federal public service.

In its role leading Impact Canada as a whole-of-government effort, the IIU works with an ever-expanding network of partners to deliver a range of innovative, outcomes-based program approaches. We are aware that program spending is an investment that we are making on behalf of, and directly for Canadians, and we need to place a greater emphasis on understanding what differences these investments make in improving the lives of citizens. That means we need a better understanding of what works, for whom, and in what contexts; and we need a better understanding of what kinds of investments are likely to maximize the social, economic and environmental returns we seek.

“We are aware that program spending is an investment that we are making on behalf of, and directly for Canadians, and we need to place a greater emphasis on understanding what differences these investments make in improving the lives of citizens.”

Good impact measurement practices are fundamental to these understandings, and it is incumbent upon us to be rigorous in our efforts. We recognize that we are still building our capacity in government to deliver on these approaches. It is why we built flexibility within Impact Canada authorities to use grants and contributions to fund research organizations with expertise in the kinds of techniques outlined in this guide. We encourage our partner departments to consider taking up these flexibilities.

Measuring Impact by Design is one of a number of supports that the IIU provides to deliver on its commitment to improve measurement practices for Impact Canada. We look forward to continued collaboration with our partners in the delivery of these important outcomes-based approaches across the public sector….(More)”.

The Landscape of Open Data Policies


Apograf: “Open Access (OA) publishing has a long history, going back to the early 1990s, and was born with the explicit intention of improving access to scholarly literature. The internet has played a pivotal role in garnering support for free and reusable research publications, as well as stronger and more democratic peer-review systems — ones that are not bogged down by the restrictions of influential publishing platforms….

Looking back, looking forward

Launched in 1991, ArXiv.org was a pioneering platform in this regard, a telling example of how researchers could cooperate to publish academic papers for free and in full view of the public. Though it has limitations — papers are curated by moderators and are not peer-reviewed — arXiv is a demonstration of how technology can be used to overcome some of the incentive and distribution problems that scientific research had long been subjected to.

The scientific community has itself assumed the mantle to this end: the Budapest Open Access Initiative (BOAI) and the Berlin Declaration on Open Access, launched in 2002 and 2003 respectively, are considered landmark movements in the push for unrestricted access to scientific research. While mostly symbolic, these efforts highlighted the growing desire to solve the problems plaguing the space through technology.

The BOAI manifesto begins with a statement that encapsulates the movement’s purpose:

“An old tradition and a new technology have converged to make possible an unprecedented public good. The old tradition is the willingness of scientists and scholars to publish the fruits of their research in scholarly journals without payment, for the sake of inquiry and knowledge. The new technology is the internet. The public good they make possible is the world-wide electronic distribution of the peer-reviewed journal literature and completely free and unrestricted access to it by all scientists, scholars, teachers, students, and other curious minds.”

Plan S is a more recent attempt to make publicly funded research available to all. Launched by Science Europe in September 2018, Plan S — short for ‘Shock’ — has energized the research community with its resolution to make access to publicly funded knowledge a right to everyone and dissolve the profit-driven ecosystem of research publication. Members of the European Union have vowed to achieve this by 2020.

Plan S has been supported by governments outside Europe as well. China has thrown itself behind it, and the state of California has enacted a law that requires open access to research one year after publishing. It is, of course, not without its challenges: advocacy and ensuring that publishing is not restricted to a few venues are two such obstacles. However, the organization behind the guidelines, cOAlition S, has agreed to make the guidelines more flexible.

The emergence of this trend is not without its difficulties, however, and numerous obstacles continue to hinder the dissemination of information in a manner that is truly transparent and public. Chief among these are the many gates that continue to keep research as somewhat of an exclusive property, besides the fact that the infrastructure and development for such systems are short on funding and staff….(More)”.

Democracy (Re)Imagined


Chapter by Oldrich Bubak and Henry Jacek in Trivialization and Public Opinion: “Democracy (Re)Imagined begins with a brief review of opinion surveys, which, over recent decades, indicate steady increases in the levels of mistrust of the media, skepticism of the government’s effectiveness, and the public’s antipathy toward politics. It thus continues to explore the realities and the logic behind these perspectives. What can be done to institute good governance and renew the faith in the democratic system? It is becoming evident that rather than relying on the idea of more democracy, governance for the new age is smart, bringing in people where they are most capable and engaged. Here, the focus is primarily on the United States, providing an extreme case in the evolution of democratic systems and a rationale for revisiting the tenets of governance.

Earlier, we identified some deep lapses in public discourse and alluded to a number of negative political and policy outcomes across the globe. It may thus not be a revelation that the past several decades have seen a disturbing trend apparent in the views and choices of people throughout the democratic world—a declining political confidence and trust in government. These have been observed in European nations, Canada, and the United States, countries different in their political and social histories (Dalton 2017). Consider some numbers from a recent US poll, the 2016 Survey of American Political Culture. The survey found, for example, that 64% of the American public had little or no confidence in the federal government’s capacity to solve problems (up from 60% in 1996), while 56% believed “the government in Washington threatens the freedom of ordinary Americans.” About 88% of respondents thought “political events these days seem more like theater or entertainment than like something to be taken seriously” (up from 79% in 1996). As well, 75% of surveyed individuals thought that one cannot “believe much” of the mainstream media content (Hunter and Bowman 2016). As in other countries, such numbers, consistent across polls, tell a story much different from that told by responses collected half a century ago.

Some, unsurprised, argue citizens have always had a level of skepticism and mistrust toward their government but appreciated their regime legitimacy, a democratic capacity to exercise their will and choose a new government. However, other scholars are arriving at a more pessimistic conclusion: People have begun questioning the very foundations of their systems of government—the legitimacy of liberal democratic regimes. Foa and Mounk, for example, examined responses from three waves of cross-national surveys (1995–2014) focusing on indicators of regime legitimacy: “citizens’ express support for the system as a whole; the degree to which they support key institutions of liberal democracy, such as civil rights; their willingness to advance their political causes within the existing political system; and their openness to authoritarian alternatives such as military rule” (2016, 6). They find citizens to be not only progressively critical of their government but also “cynical about the value of democracy as a political system, less hopeful that anything they do might influence public policy, and more willing to express support for authoritarian alternatives” (2016, 7). The authors point out that in 2011, 24% of those born in the 1980s thought democracy was a “bad” system for the US, while 26% of the same cohort believed it was unimportant for people to “choose their leaders in free elections.” Also in 2011, 32% of respondents of all ages reported a preference for a “strong leader” who need not “bother with parliament and elections” (up from 24% in 1995). As well, Foa and Mounk (2016) observe a decrease in interest and participation in conventional (including voting and political party membership) and non-conventional political activities (such as participation in protests or social movements).

These responses only beckon more questions, particularly as some scholars believe that “[t]he changing values and skills of Western publics encourage a new type of assertive or engaged citizen who is skeptical about political elites and the institutions of representative democracy” (Dalton 2017, 391). In this and the next chapter, we explore the realities and the logic behind these perspectives. Is the current system working as intended? What can be done to renew the faith in government and citizenship? What can we learn from how the public comes to its opinions? We focus primarily on the developments in the United States, providing an extreme case in the evolution of a democratic system and a rationale for revisiting the tenets of governance. We will begin to discern the roots of many of the above stances and see that regaining effectiveness and legitimacy in modern governance demands more than just “more democracy.” Governance for the new age is smart, bringing in citizens where they are most capable and engaged. But change will demand a proper understanding of the underlying problems and a collective awareness of the solutions. And getting there requires us to cope with trivialization….(More)”

How Can We Overcome the Challenge of Biased and Incomplete Data?


Knowledge@Wharton: “Data analytics and artificial intelligence are transforming our lives. Be it in health care, in banking and financial services, or in times of humanitarian crises — data determine the way decisions are made. But often, the way data is collected and measured can result in biased and incomplete information, and this can significantly impact outcomes.  

In a conversation with Knowledge@Wharton at the SWIFT Institute Conference on the Impact of Artificial Intelligence and Machine Learning in the Financial Services Industry, Alexandra Olteanu, a post-doctoral researcher at Microsoft Research, U.S. and Canada, discussed the ethical and people considerations in data collection and artificial intelligence and how we can work towards removing the biases….

….Knowledge@Wharton: Bias is a big issue when you’re dealing with humanitarian crises, because it can influence who gets help and who doesn’t. When you translate that into the business world, especially in financial services, what implications do you see for algorithmic bias? What might be some of the consequences?

Olteanu: A good example is from a new law in New York state according to which insurance companies can now use social media to decide the level of your premiums. But they could in fact end up using incomplete information. For instance, you might be buying your vegetables from the supermarket or a farmer’s market, but these retailers might not be tracking you on social media. So nobody knows that you are eating vegetables. On the other hand, a bakery that you visit might post something when you buy from there. Based on this, the insurance companies may conclude that you only eat cookies all the time. This shows how even incomplete data can affect you….(More)”.
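
Olteanu’s insurance example can be made concrete with a toy simulation. The sketch below is purely illustrative (the purchase mix and the observation rates are invented, not taken from the interview): it shows how an observer who only sees the purchases that happen to surface on social media ends up with a picture of someone’s habits that differs sharply from the truth.

```python
import random

random.seed(42)

# True purchasing behaviour of one hypothetical consumer.
TRUE_PURCHASE_MIX = {"vegetables": 0.7, "baked goods": 0.3}

# Probability that a purchase of each kind becomes visible on social media:
# the farmers' market rarely tags customers, while the bakery posts often.
OBSERVATION_RATE = {"vegetables": 0.05, "baked goods": 0.60}


def simulate(n_purchases: int = 1000) -> dict:
    """Count how many purchases of each kind an online observer would see."""
    observed = {item: 0 for item in TRUE_PURCHASE_MIX}
    for _ in range(n_purchases):
        item = random.choices(
            list(TRUE_PURCHASE_MIX), weights=list(TRUE_PURCHASE_MIX.values())
        )[0]
        if random.random() < OBSERVATION_RATE[item]:
            observed[item] += 1
    return observed


observed = simulate()
total = sum(observed.values()) or 1
for item, count in observed.items():
    print(f"{item}: true share {TRUE_PURCHASE_MIX[item]:.0%}, "
          f"observed share {count / total:.0%}")
# The observed data heavily overstate baked-goods consumption simply because
# those purchases are far more likely to be recorded.
```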