100 Stories: The Impact of Open Access


Report by Jean-Gabriel Bankier and Promita Chatterji: “It is time to reassess how we talk about the impact of open access. Early thought leaders in the field of scholarly communications sparked our collective imagination with a compelling vision for open access: improving global access to knowledge, advancing science, and providing greater access to education.1 But despite the fact that open access has gained a sizable foothold, discussions about the impact of open access are often still stuck at the level of aspirational or potential benefit. Shouldn’t we be able to gather real examples of positive outcomes to demonstrate the impact of open access? We need to get more concrete.

Measurements like altmetrics and download counts provide useful data about usage, but remain largely indicators of early-level interest rather than actual outcomes and benefits. There has been considerable research into how open access affects citation counts,2 but beyond that discussion there is still a gap between the hypothetical societal good of open access and the minutiae of usage and interest measurements. This report begins to bridge that gap by presenting a framework drawn from 100 real stories that describe the impact of open access. Collected by bepress from across 500 institutions and 1400 journals using Digital Commons as their publishing and/or institutional repository platform, these stories present information about actual outcomes, benefits, and impacts.

This report brings to light the wide variety of scholarly and cultural activity that takes place on university campuses and the benefit resulting from greater visibility and access to these materials. We hope that administrators, authors, students, and others will be empowered to articulate and amplify the impact of their own work. We also created the framework to serve as a tool for stakeholders who are interested in advocating for open access on their campus yet lack the specific vocabulary and suitable examples. Whether it is a librarian hoping to make the case for open access with reluctant administrators or faculty, a faculty member who wants to educate students about changing modes of publishing, a funding agency looking for evidence in support of its open access requirement, or students advocating for educational affordability, the framework and stories themselves can be a catalyst for these endeavors. Put more simply, these are 100 stories to answer the question: “why does open access matter?”…(More)”


Open parliament policy applied to the Brazilian Chamber of Deputies


Paper by  &   in The Journal of Legislative Studies:”…analyse the implementation of an open parliament policy that is taking place at the Chamber of Deputies, in accordance with the guidelines of the Open Government Partnership international programme (OGP), regarding the action plan of the Opening Parliament Work Group in particular, one of the subgroups of OGP. The authors will evaluate two blocks of initiatives for open parliaments executed by the Chamber in the last few years, that is, digital participation in the legislative process and Transparency 2.0, in order to observe their impasses and results obtained until now. In the first part the authors will study the e-Democracy portal and in the second part the authors will focus on open data, collaborative activities to use those data (hackathons) and the creation of the Hacker Lab, a permanent space dedicated to open parliament practices. The analysis considers the initiatives that the authors evaluated as part of the transformative and arena profiles of the Brazilian Parliament, according to Polsby’s classification, with exclusive characteristics…. (More)”

See also Hacking Parliament

Crowdsourcing and cellphone data could help guide urban revitalization


Science Magazine: “For years, researchers at the MIT Media Lab have been developing a database of images captured at regular distances around several major cities. The images are scored according to different visual characteristics — how safe the depicted areas look, how affluent, how lively, and the like…. Adjusted for factors such as population density and distance from city centers, the correlation between perceived safety and visitation rates was strong, but it was particularly strong for women and people over 50. The correlation was negative for people under 30, which means that males in their 20s were actually more likely to visit neighborhoods generally perceived to be unsafe than to visit neighborhoods perceived to be safe.
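
The adjustment described here is, in essence, a regression that controls for confounders such as density and distance before reading off the relationship between perceived safety and visitation. Below is a minimal sketch of that idea in Python; the data are synthetic and the variable names are illustrative assumptions, not the study's actual dataset or method.

```python
# Illustrative sketch: estimating the association between perceived safety
# and visitation while controlling for population density and distance to
# the city center. Data below are synthetic; the real study used scored
# street images and aggregated cellphone records.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "perceived_safety": rng.uniform(0, 10, n),      # crowd-scored image safety
    "pop_density": rng.uniform(1_000, 20_000, n),   # residents per km^2
    "dist_to_center_km": rng.uniform(0, 15, n),
})
# Synthetic outcome: visits rise with safety and density, fall with distance.
df["visits"] = (
    50 * df["perceived_safety"]
    + 0.01 * df["pop_density"]
    - 20 * df["dist_to_center_km"]
    + rng.normal(0, 50, n)
)

X = sm.add_constant(df[["perceived_safety", "pop_density", "dist_to_center_km"]])
model = sm.OLS(df["visits"], X).fit()
# The coefficient on perceived_safety is the adjusted association.
print(model.summary().tables[1])
```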

In the same paper, the researchers also identified several visual features that are highly correlated with judgments that a particular area is safe or unsafe. Consequently, the work could help guide city planners in decisions about how to revitalize declining neighborhoods….

Jacobs’ theory, Hidalgo says, is that neighborhoods in which residents can continuously keep track of street activity tend to be safer; a corollary is that buildings with street-facing windows tend to create a sense of safety, since they imply the possibility of surveillance. Newman’s theory is an elaboration on Jacobs’, suggesting that architectural features that demarcate public and private spaces, such as flights of stairs leading up to apartment entryways or archways separating plazas from the surrounding streets, foster the sense that crossing a threshold will bring on closer scrutiny….(More)”

Artificial Intelligence can streamline public comment for federal agencies


John Davis at the Hill: “…What became immediately clear to me was that — although not impossible to overcome — the lack of consistency and shared best practices across all federal agencies in accepting and reviewing public comments was a serious impediment. The promise of Natural Language Processing and cognitive computing to make the public comment process light years faster and more transparent becomes that much harder to realize without a consensus among federal agencies on what type of data is collected – and how.

“There is a whole bunch of work we have to do around getting government to be more customer friendly and making it at least as easy to file your taxes as it is to order a pizza or buy an airline ticket,” President Obama recently said in an interview with WIRED. “Whether it’s encouraging people to vote or dislodging Big Data so that people can use it more easily, or getting their forms processed online more simply — there’s a huge amount of work to drag the federal government and state governments and local governments into the 21st century.”

…expanding the discussion around Artificial Intelligence and regulatory processes to include how the technology should be leveraged to ensure fairness and responsiveness in the very basic processes of rulemaking – in particular public notices and comments. These technologies could also enable us to consider not just public comments formally submitted to an agency, but the entire universe of statements made through social media posts, blogs, chat boards — and conceivably every other electronic channel of public communication.

Obviously, an anonymous comment on the Internet should not carry the same credibility as a formally submitted, personally signed statement, just as sworn testimony in court holds far greater weight than a grapevine rumor. But so much public discussion today occurs on Facebook pages, in Tweets, on news website comment sections, etc. Anonymous speech enjoys explicit protection under the Constitution, based on a justified expectation that certain sincere statements of sentiment might result in unfair retribution from the government.

Should we simply ignore the valuable insights about actual public sentiment on specific issues made possible through the power of Artificial Intelligence, which can ascertain meaning from an otherwise unfathomable ocean of relevant public conversations? With certain qualifications, I believe Artificial Intelligence, or AI, should absolutely be employed in the critical effort to gain insights from public comments – signed or anonymous.
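
As a rough illustration of the kind of machine-assisted reading this implies, the sketch below groups a handful of invented comments by topical similarity using TF-IDF vectors and k-means clustering. It is a toy example of the general approach, not any agency's actual pipeline; the comments, cluster count and library choices are assumptions made for the example.

```python
# Minimal sketch: clustering public comments by theme so reviewers can see
# the main lines of argument instead of reading submissions one by one.
# The example comments are invented; a real pipeline would ingest the full
# docket plus social-media statements, with heavy cleaning and deduplication.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

comments = [
    "The proposed rule will raise costs for small businesses.",
    "Small firms cannot absorb the compliance costs in this rule.",
    "Please strengthen the privacy protections for consumer data.",
    "Consumer data privacy needs stronger safeguards than proposed.",
    "The comment period should be extended by 60 days.",
    "Sixty more days of public comment would allow better input.",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(comments)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for cluster in sorted(set(labels)):
    print(f"Theme {cluster}:")
    for text, label in zip(comments, labels):
        if label == cluster:
            print("  -", text)
```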

“In the criminal justice system, some of the biggest concerns with Big Data are the lack of data and the lack of quality data,” the NSTC report authors state. “AI needs good data. If the data is incomplete or biased, AI can exacerbate problems of bias.” As a former federal criminal prosecutor and defense attorney, I am well familiar with the absolute necessity to weigh the relative value of various forms of evidence – or in this case, data…(More)

The Future of Drone Use: Opportunities and Threats from Ethical and Legal Perspectives


Book by Bart Custers: “Given the popularity of drones and the fact that they are easy and cheap to buy, it is generally expected that the ubiquity of drones will significantly increase within the next few years. This raises questions as to what is technologically feasible (now and in the future), what is acceptable from an ethical point of view and what is allowed from a legal point of view. Drone technology is to some extent already available and to some extent still in development. The aim and scope of this book is to map the opportunities and threats associated with the use of drones and to discuss the ethical and legal issues of the use of drones.
This book provides an overview of current drone technologies and applications and of what to expect in the next few years. The question of how to regulate the use of drones in the future is addressed by considering the conditions and contents of future drone legislation and by analyzing issues surrounding privacy and the safeguards that can be taken. As such, this book is valuable to scholars in several disciplines, such as law, ethics, sociology, politics and public administration, as well as to practitioners and others who may be confronted with the use of drones in their work, such as professionals working in the military, law enforcement, disaster management and infrastructure management. Individuals and businesses with a specific interest in drone use may also find unexpected perspectives on this new field of research and innovation in the nineteen contributions contained in this volume….(More)”

The power of prediction markets


Adam Mann in Nature: “It was a great way to mix science with gambling, says Anna Dreber. The year was 2012, and an international group of psychologists had just launched the ‘Reproducibility Project’ — an effort to repeat dozens of psychology experiments to see which held up1. “So we thought it would be fantastic to bet on the outcome,” says Dreber, who leads a team of behavioural economists at the Stockholm School of Economics.

In particular, her team wanted to see whether scientists could make good use of prediction markets: mini Wall Streets in which participants buy and sell ‘shares’ in a future event at a price that reflects their collective wisdom about the chance of the event happening. As a control, Dreber and her colleagues first asked a group of psychologists to estimate the odds of replication for each study on the project’s list. Then the researchers set up a prediction market for each study, and gave the same psychologists US$100 apiece to invest.
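
The pricing mechanism behind such markets can be illustrated with Hanson's logarithmic market scoring rule (LMSR), a standard automated market maker for prediction markets. The sketch below shows, under that assumption, how a trade moves the implied probability; it is a generic illustration, not the specific market software used in the replication study, and the liquidity parameter and trade sizes are arbitrary.

```python
# Sketch of a two-outcome prediction market run by a logarithmic market
# scoring rule (LMSR) market maker. Prices always sum to 1 and can be read
# as the market's implied probability of each outcome. This is a generic
# illustration, not the platform used in the Reproducibility Project bets.
import math

class LMSRMarket:
    def __init__(self, liquidity=100.0):
        self.b = liquidity          # higher b = prices move less per trade
        self.shares = [0.0, 0.0]    # outstanding shares: [replicates, fails]

    def _cost(self, shares):
        return self.b * math.log(sum(math.exp(q / self.b) for q in shares))

    def price(self, outcome):
        """Current implied probability of the given outcome."""
        total = sum(math.exp(q / self.b) for q in self.shares)
        return math.exp(self.shares[outcome] / self.b) / total

    def buy(self, outcome, amount):
        """Buy `amount` shares of an outcome; returns the cost in dollars."""
        before = self._cost(self.shares)
        self.shares[outcome] += amount
        return self._cost(self.shares) - before

market = LMSRMarket(liquidity=100.0)
print(f"start: P(replicates) = {market.price(0):.2f}")   # 0.50
cost = market.buy(0, 50)  # a trader bets that the study will replicate
print(f"after a ${cost:.2f} purchase: P(replicates) = {market.price(0):.2f}")
```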

When the Reproducibility Project revealed last year that it had been able to replicate fewer than half of the studies examined2, Dreber found that her experts hadn’t done much better than chance with their individual predictions. But working collectively through the markets, they had correctly guessed the outcome 71% of the time3.

Experiments such as this are a testament to the power of prediction markets to turn individuals’ guesses into forecasts of sometimes startling accuracy. That uncanny ability ensures that during every US presidential election, voters avidly follow the standings for their favoured candidates on exchanges such as Betfair and the Iowa Electronic Markets (IEM). But prediction markets are increasingly being used to make forecasts of all kinds, on everything from the outcomes of sporting events to the results of business decisions. Advocates maintain that they allow people to aggregate information without the biases that plague traditional forecasting methods, such as polls or expert analysis….

Prediction markets have also had some high-profile misfires, however — such as giving the odds of a Brexit ‘stay’ vote as 85% on the day of the referendum, 23 June. (UK citizens in fact narrowly voted to leave the European Union.) And prediction markets lagged well behind conventional polls in predicting that Donald Trump would become the 2016 Republican nominee for US president.

Such examples have inspired academics to probe prediction markets. Why do they work as well as they do? What are their limits, and why do their predictions sometimes fail?…(More)”


Crowdsourcing Gun Violence Research


Penn Engineering: “Gun violence is often described as an epidemic, but as visible and shocking as shooting incidents are, epidemiologists who study that particular source of mortality have a hard time tracking them. The Centers for Disease Control is prohibited by federal law from conducting gun violence research, so there is little in the way of centralized infrastructure to monitor where, how, when, why and to whom shootings occur.

Chris Callison-Burch, Aravind K. Joshi Term Assistant Professor in Computer and Information Science, and graduate student Ellie Pavlick are working to solve this problem.

They have developed the Gun Violence Database, which combines machine learning and crowdsourcing techniques to produce a national registry of shooting incidents. Callison-Burch and Pavlick’s algorithm scans thousands of articles from local newspaper and television stations, determines which are about gun violence, then asks everyday people to pull out vital statistics from those articles, compiling that information into a unified, open database.

For natural language processing experts like Callison-Burch and Pavlick, the most exciting prospect of this effort is that it is training computer systems to do this kind of analysis automatically. They recently presented their work on that front at Bloomberg’s Data for Good Exchange conference.

The Gun Violence Database project started in 2014, when it became the centerpiece of Callison-Burch’s “Crowdsourcing and Human Computation” class. There, Pavlick developed a series of homework assignments that challenged undergraduates to develop a classifier that could tell whether a given news article was about a shooting incident.
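
A stripped-down version of that exercise might look like the sketch below: a TF-IDF bag-of-words model with logistic regression that separates shooting-related snippets from other local news. The training examples and labels are invented placeholders, not data from the course or from the database itself.

```python
# Minimal sketch of the classification step: decide whether a news article
# is about a shooting incident. Training snippets below are invented; the
# real project trained on articles catalogued by "The Gun Report" blog.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

articles = [
    "Police say a man was shot outside a convenience store late Friday.",
    "Two wounded in drive-by shooting near the elementary school.",
    "Officers responded to reports of gunfire; one victim hospitalized.",
    "City council approves budget for new downtown bike lanes.",
    "Local bakery wins state fair ribbon for its sourdough.",
    "High school robotics team advances to the national finals.",
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = gun violence, 0 = other news

classifier = make_pipeline(TfidfVectorizer(stop_words="english"),
                           LogisticRegression())
classifier.fit(articles, labels)

new_article = "One person was injured in a shooting at a gas station Tuesday."
print(classifier.predict([new_article])[0])           # predicted label
print(classifier.predict_proba([new_article])[0][1])  # probability it is gun violence
```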

“It allowed us to teach the things we want students to learn about data science and natural language processing, while giving them the motivation to do a project that could contribute to the greater good,” says Callison-Burch.

The articles students used to train their classifiers were sourced from “The Gun Report,” a daily blog from New York Times reporters that attempted to catalog shootings from around the country in the wake of the Sandy Hook massacre. Realizing that their algorithmic approach could be scaled up to automate what the Times’ reporters were attempting, the researchers began exploring how such a database could work. They consulted with Douglas Wiebe, an Associate Professor of Epidemiology in Biostatistics and Epidemiology in the Perelman School of Medicine, to learn more about what kind of information public health researchers needed to better study gun violence on a societal scale.

From there, the researchers enlisted people to annotate the articles their classifier found, connecting with them through Mechanical Turk, Amazon’s crowdsourcing platform, and their own website, http://gun-violence.org/…(More)”

Empowering cities


“The real story on how citizens and businesses are driving smart cities” by the Economist Intelligence Unit: “Digital technologies are the lifeblood of today’s cities. They are applied widely in industry and society, from information and communications technology (ICT) to the Internet of Things (IoT), in which objects are connected to the Internet. As sensors turn any object into part of an intelligent urban network, and as computing power facilitates analysis of the data these sensors collect, elected officials and city administrators can gain an unparalleled understanding of the infrastructure and services of their city. However, to make the most of this intelligence, another ingredient is essential: citizen engagement. Thanks to digital technologies, citizens can provide a steady flow of feedback and ideas to city officials.

This study by The Economist Intelligence Unit (EIU), supported by Philips Lighting, investigates how citizens and businesses in 12 diverse cities around the world—Barcelona, Berlin, Buenos Aires, Chicago, London, Los Angeles, Mexico City, New York City, Rio de Janeiro, Shanghai, Singapore and Toronto—envision the benefits of smart cities. The choices of the respondents to the survey reflect the diverse nature of the challenges and opportunities facing different cities, from older cities in mature markets, where technology is at work with infrastructure that may be centuries old, to new cities in emerging markets, which have the opportunity to incorporate digital technologies as they grow.

Coupled with expert perspectives, these insights paint a fresh picture of how digital technologies can empower people to contribute, giving city officials a roadmap to smart city life in the 21st century….(More)”

Crowdsourcing investigative journalism


Convoca in Peru: “…collaborative effort is the essence of Convoca. We are a team of journalists and programmers who work with professionals from different disciplines and generations to expose facts that are hidden by networks of power and affect the lives of citizens. We bet on working in partnership to publish high-impact findings from Peru, where the Amazon survives across almost 60% of the country amid the exploitation of oil and minerals and criminal activities such as logging, illegal mining and human trafficking. Fifty percent of social conflicts are centered in natural-resource extraction areas, where the populations and communities with the highest poverty rates live.

Over one year and seven months, Convoca has uncovered facts of public relevance, such as patterns of corruption and secrecy, working in a network with journalists from Latin America and the world. The series of reports with the BRIO platform revealed the cost overruns of highways and public works in Latin American countries in the hands of Brazilian companies financed by the National Bank of Economic and Social Development (BNDES), now under investigation in the region’s most notorious corruption scandal, ‘Lava Jato’. This research won the 2016 Journalistic Excellence Award granted by the Inter American Press Association (SIP). On a global scale, we dove into eleven and a half million files of the ‘Panama Papers’ with more than a hundred media outlets and organizations led by the International Consortium of Investigative Journalists (ICIJ), which allowed us to lay bare the world of tax havens where companies and public figures hide their fortunes.

Our work on extractive industries, ‘Excesses Unpunished’, won the world’s most important data journalism prize, the Data Journalism Awards 2016, and is a finalist for the Gabriel Garcia Marquez Award, which recognizes the best journalism in Latin America. We invite you to be the voice of this effort so we can keep publishing new reports that allow citizens to make better decisions about their destinies and compel groups in power to come clean about their activities and fulfill their commitments. So join ConBoca: The Power of Citizens Call, our first fundraising campaign alongside our readers. We believe that journalism is a public service….(More)”

Rethinking Society for the 21st Century


The International Panel on Social Progress: “The crisis of social-democracy in recent decades seems, in the rich countries, to have generated a decline of hope for a just society. In developing countries, the trend is now to mimic the developed countries, rather than inventing a new model, and, in spite of reduced poverty in several countries, social hardships reminiscent of the early phase of Western capitalism are widespread.

Yet neither the collapse of utopian illusions nor booming capitalism in developing countries should mean the end of the quest for justice.

Can we hope for a better society?

Social scientists have never been so well equipped to provide an answer, thanks to the development of all the relevant disciplines since WWII….

The International Panel on Social Progress (IPSP) will harness the competence of hundreds of experts about social issues and will deliver a report addressed to all social actors, movements, organizations, politicians and decision-makers, in order to provide them with the best expertise on questions that bear on social change.

The Panel will seek consensus whenever possible but will not hide controversies and will honestly present up-to-date arguments and analyses, and debates about them, in an accessible way.

The Panel will have no partisan political agenda, but will aim at restoring hope in social progress and stimulating intellectual and public debates. Different political and philosophical views may conceive of social progress in different ways, emphasizing values such as freedom, dignity, or equality.

The Panel will retain full independence from political parties, governments, and organizations with a partisan agenda.

While the Panel will primarily work for the dissemination of knowledge to all relevant actors in society, it will also foster research on the topics it will study and help to revive interest for research in social long-term prospective analysis….(More)”

IPSP Report: open for comment.