Paper by Wouter Boon et al.: “How is public engagement perceived to contribute to open science? This commentary highlights common reflections on this question from interviews with 12 public engagement fellows in Utrecht University’s Open Science Programme in the Netherlands. We identify four reasons why public engagement is an essential enabler of open science. Interaction between academics and society can: (1) better align science with the needs of society; (2) secure a relationship of trust between science and society; (3) increase the quality and impact of science; and (4) support the impact of open access and FAIR data practices (data which meet principles of findability, accessibility, interoperability and reusability). To be successful and sustainable, such public engagement requires support in skills training and a form of institutionalisation in a university-wide system, but, most of all, the fellows express the importance of a formal and informal recognition and rewards system. Our findings suggest that in order to make public engagement an integral part of open science, universities should invest in institutional support, create awareness, and stimulate dialogue among staff members on how to ‘do’ good public engagement….(More)”.
Ethics, Integrity and Policymaking
Book by Dónal O’Mathúna and Ron Iphofen: “…provides illustrative case studies that explore various research and innovation topics that raise challenges requiring ethical reflection and careful policymaking responses. The cases highlight diverse ethical challenges and provide lessons for the various options available for policymaking. Cases are drawn from many fields, including artificial intelligence, space science, energy, data protection, professional research practice and pandemic planning. Case studies are particularly helpful for ethical issues because they provide crucial context. This book reflects the ambiguity of ethical dilemmas in contemporary policymaking. Analyses reflect current debates where consensus has not yet been achieved. These cases illustrate key points made throughout the PRO-RES EU-funded project from which they arise: that ethical judgement is a fluid enterprise, where values, principles and standards must constantly adjust to new situations, new events and new research developments. This book is an indispensable aid to policymaking that addresses, and/or uses evidence from, novel research developments….(More)”.
The Case for Abolishing Elections
Essay by Nicholas Coccoma: “Terry Bouricius remembers the moment he converted to democracy by lottery. A bookish Vermonter, now 68, he was elected to the State House in 1990 after working for years as a public official in Burlington. At first state government excited him, but he quickly grew disillusioned. “During my time as a legislator,” he told me in an interview last year, “it became obvious to me that the ‘people’s house’ was not very representative of the people who actually lived in Vermont.”
The revelation came while Bouricius was working on a housing committee. “The committee members were an outgoing and garrulous bunch,” he observed. “Shy wallflowers almost never become legislators.” More disturbing, he noted how his fellow politicians—all of whom owned their homes—tended to legislate in favor of landlords and against tenants. “I saw that the experiences and beliefs of legislators shape legislation far more than facts,” he said. “After that, I frequently commented that any 150 Vermonters pulled from the phone book would be more representative than the elected House membership.”
Many Americans agree. In a poll conducted in January 2020, 65 percent of respondents said that everyday people selected by lottery—who meet some basic requirements and are willing and able to serve—would perform better or much better compared to elected politicians. In March last year a Pew survey found that a staggering 79 percent believe it’s very or somewhat important for the government to create assemblies where everyday citizens from all walks of life can debate issues and make recommendations about national laws. “My decade of experience serving in the state legislature convinces me that this popular assessment is correct,” Bouricius said.
The idea—technically known as “sortition”—has been spreading. Perhaps its most prominent academic advocate is Yale political theorist Hélène Landemore. Her 2020 book Open Democracy: Reinventing Popular Rule for the Twenty-First Century explores the limitations of both direct democracy and electoral-representative democracy, advocating instead for government by large, randomly selected “mini-publics.” As she put it in conversation with Ezra Klein at the New York Times last year, “I think we are realizing the limits of just being able to choose rulers, as opposed to actually being able to choose outcomes.” She is not alone. Rutgers philosopher Alex Guerrero and Belgian public intellectual David Van Reybrouck have made similar arguments in favor of democracy by lottery. In the 2016 translation of his book Against Elections, Van Reybrouck characterizes elections as “the fossil fuel of politics.” “Whereas once they gave democracy a huge boost,” he writes, “much like the boost that oil gave the economy, now it turns out they cause colossal problems of their own.”…(More)”.
Data Structures the Fun Way
Book by Jeremy Kubica: “This accessible and entertaining book provides an in-depth introduction to computational thinking through the lens of data structures — a critical component in any programming endeavor. Through diagrams, pseudocode, and humorous analogies, you’ll learn how the structure of data drives algorithmic operations, gaining insight into not just how to build data structures, but precisely how and when to use them.
This book will give you a strong background in implementing and working with more than 15 key data structures, from stacks, queues, and caches to Bloom filters, skip lists, and graphs. Master linked lists by standing in line at a cafe, hash tables by cataloging the history of the summer Olympics, and quadtrees by neatly organizing your kitchen cabinets. Along with basic computer science concepts like recursion and iteration, you’ll learn:
- The complexity and power of pointers
- The branching logic of tree-based data structures
- How different data structures insert and delete data in memory
- Why mathematical mappings and randomization are useful
- How to make tradeoffs between speed, flexibility, and memory usage
Data Structures the Fun Way shows how to efficiently apply these ideas to real-world problems—a surprising number of which focus on procuring a decent cup of coffee. At any level, fully understanding data structures will teach you core skills that apply across multiple programming languages, taking your career to the next level….(More)”.
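To give a flavor of one structure the book covers, here is a minimal Bloom filter sketch (not taken from the book; the class and its parameters are illustrative). A Bloom filter trades a small chance of false positives for very compact set-membership tests — it can say “definitely not present” or “probably present,” never a false negative:

```python
import hashlib


class BloomFilter:
    """Minimal Bloom filter: compact, probabilistic set membership.

    May report false positives; never reports false negatives.
    """

    def __init__(self, size=1024, num_hashes=3):
        self.size = size
        self.num_hashes = num_hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive num_hashes independent bit positions by salting one hash.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        # All bits set => probably present; any bit clear => definitely absent.
        return all(self.bits[pos] for pos in self._positions(item))


coffee_orders = BloomFilter()
coffee_orders.add("espresso")
```

The tradeoff in the book’s bullet list is visible here: a fixed bit array of 1,024 booleans answers membership queries regardless of how many items were added, at the cost of occasional false positives.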
Writing the Revolution
Book by Heather Ford: “A close reading of Wikipedia’s article on the Egyptian Revolution reveals the complexity inherent in establishing the facts of events as they occur and are relayed to audiences near and far.
Wikipedia bills itself as an encyclopedia built on neutrality, authority, and crowd-sourced consensus. Platforms like Google and digital assistants like Siri distribute Wikipedia’s facts widely, further burnishing its veneer of impartiality. But as Heather Ford demonstrates in Writing the Revolution, the facts that appear on Wikipedia are often the result of protracted power struggles over how data are created and used, how history is written and by whom, and the very definition of facts in a digital age.
In Writing the Revolution, Ford looks critically at how the Wikipedia article about the 2011 Egyptian Revolution evolved over the course of a decade, both shaping and being shaped by the Revolution as it happened. When data are published in real time, they are subject to an intense battle over their meaning across multiple fronts. Ford answers key questions about how Wikipedia’s so-called consensus is arrived at; who has the power to write dominant histories and which knowledges are actively rejected; how these battles play out across the chains of circulation in which data travel; and whether history is now written by algorithms…(More)”
How Food Delivery Workers Shaped Chinese Algorithm Regulations
Article by Matt Sheehan and Sharon Du: “In 2021, China issued a series of policy documents aimed at governing the algorithms that underpin much of the internet today. The policies included a regulation on recommendation algorithms and a draft regulation on synthetically generated media, commonly known as deepfakes. Domestically, Chinese media touted the recommendation engine regulations for the options they gave Chinese internet users, such as the choice to “turn off the algorithm” on major platforms. Outside China, these regulations have largely been seen through the prism of global geopolitics, framed as questions over whether China is “ahead” in algorithm regulations or whether it will export a “Chinese model” of artificial intelligence (AI) governance to the rest of the world.
These are valid questions with complex answers, but they overlook the core driver of China’s algorithm regulations: they are designed primarily to address China’s domestic social, economic, and political problems. The Chinese Communist Party (CCP) is the ultimate arbiter here, deciding both what counts as a problem and how it should be solved. But the CCP doesn’t operate in a vacuum. Like any governing party, it is constantly creating new policies to try to put out fires, head off problems, and respond to public desires.
Through a short case study, we can see how Chinese food delivery drivers, investigative journalists, and academics helped shape one part of the world’s first regulations on recommendation algorithms. From that process, we can learn how international actors might better predict and indirectly influence Chinese algorithm policy…(More)”.
People’s Plan for Nature
About: “The nature crisis affects everyone, and we believe everyone should have a say in how we solve it. The People’s Plan for Nature is the UK’s biggest ever conversation about the future of nature.
The People’s Plan for Nature will include recommendations for governments (local and national), food and farming businesses, non-governmental organisations, communities, and individuals.
These recommendations will be the outcome of the People’s Assembly for Nature, a citizens’ assembly that will run as part of the project. This assembly will bring together a group of people from all walks of life to have an honest conversation, find common ground and make recommendations for the protection and restoration of nature in the UK.
This will ensure the People’s Plan for Nature is rooted in the values, ideas and experiences of people from all corners of the UK…(More)”.
Bad Data
Book by Georgina Sturge: “Our politicians make vital decisions and declarations every day that rely on official data. But should all statistics be trusted?
In BAD DATA, House of Commons Library statistician Georgina Sturge draws back the curtain on how governments of the past and present have been led astray by figures littered with inconsistency, guesswork and uncertainty.
Discover how a Hungarian businessman’s bright idea caused half a million people to go missing from UK migration statistics. Find out why it’s possible for two politicians to disagree over whether poverty has gone up or down, using the same official numbers, and for both to be right at the same time. And hear about how policies like ID cards, super-casinos and stopping ex-convicts from reoffending failed to live up to their promise because they were based on shaky data.
With stories that range from the troubling to the empowering to the downright absurd, BAD DATA reveals secrets from the usually closed-off world of policy-making. It also suggests how – once we understand the human story behind the numbers – we can make more informed choices about who to trust, and when…(More)”.
What Moneyball-for-Everything Has Done to American Culture
Article by Derek Thompson: “…The analytics revolution, which began with the movement known as Moneyball, led to a series of offensive and defensive adjustments that were, let’s say, catastrophically successful. Seeking strikeouts, managers increased the number of pitchers per game and pushed up the average velocity and spin rate per pitcher. Hitters responded by increasing the launch angles of their swings, raising the odds of a home run, but making strikeouts more likely as well. These decisions were all legal, and more important, they were all correct from an analytical and strategic standpoint….
When universal smarts lead to universal strategies, it can lead to a more homogenous product. Take the NBA. When every basketball team wakes up to the calculation that three points is 50 percent more than two points, you get a league-wide blitz of three-point shooting to take advantage of the discrepancy. Before the 2011–12 season, the league as a whole had never averaged more than 20 three-point-shot attempts per game. This year, no team is attempting fewer than 25 threes per game; four teams are attempting more than 40.
As I’ve written before, the quantitative revolution in culture is a living creature that consumes data and spits out homogeneity. Take the music industry. Before the ’90s, music labels routinely lied to Billboard about their sales figures to boost their preferred artists. In 1991, Billboard switched methodologies to use more objective data, including point-of-sale information and radio surveys that didn’t rely on input from the labels. The charts changed overnight. Rock-and-roll bands were toppled, and hip-hop and country surged. When the charts became more honest, they also became more static. Popular songs stick around longer than they used to. One analysis of the history of pop-music styles found that rap and hip-hop have dominated American pop music longer than any other musical genre. As the analytics revolution in music grew, radio playlists became more repetitive, and by some measures, the most popular songs became more similar to one another…(More)”.
Improving Access and Management of Public Transit ITS Data
Report by the National Academies: “With the proliferation of automated vehicle location (AVL), automated passenger counters (APCs), and automated fare collection (AFC), transit agencies are collecting increasingly granular data on service performance, ridership, customer behavior, and financial recovery. While granular intelligent transportation systems (ITS) data can meaningfully improve transit decision-making, transit agencies face many challenges in accessing, validating, storing, and analyzing these data sets. These challenges are made more difficult in that the tools for managing and analyzing transit ITS data generally cannot, at this point, be shared across transit agencies because of variation in data collection systems and data formats. Multiple vendors provide ITS hardware and software, and data formats vary by vendor. Moreover, agencies may employ a patchwork of ITS that has been acquired and modified over time, leading to further consistency challenges.
Standardization of data structures and tools can help address these challenges. Not only can standardization streamline data transfer, validation, and database structuring, it encourages the development of analysis tools that can be used across transit agencies, as has been the case with route and schedule data, standardized in the General Transit Feed Specification (GTFS) format…(More)”.
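GTFS itself shows why standardization enables shared tooling: a feed is just a zip of plain CSV files (stops.txt, routes.txt, trips.txt, and so on) with standardized field names, so any agency’s data can be read with generic code. A minimal sketch of parsing a stops.txt file follows — the field names (stop_id, stop_name, stop_lat, stop_lon) are real GTFS fields, but the stop rows themselves are invented for illustration:

```python
import csv
import io

# Hypothetical excerpt of a GTFS stops.txt file (standard header fields,
# invented example rows).
STOPS_TXT = """stop_id,stop_name,stop_lat,stop_lon
1001,Main St & 1st Ave,40.7128,-74.0060
1002,Main St & 2nd Ave,40.7140,-74.0048
"""


def load_stops(text):
    """Parse GTFS stops.txt content into a dict keyed by stop_id."""
    reader = csv.DictReader(io.StringIO(text))
    return {
        row["stop_id"]: {
            "name": row["stop_name"],
            "lat": float(row["stop_lat"]),
            "lon": float(row["stop_lon"]),
        }
        for row in reader
    }


stops = load_stops(STOPS_TXT)
```

Because every GTFS-compliant agency publishes the same fields in the same format, this one loader works unchanged across feeds — exactly the cross-agency reuse the report argues is missing for ITS data.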