The Potentially Adverse Impact of Twitter 2.0 on Scientific and Research Communication


Article by Julia Cohen: “In just over a month after the change in Twitter leadership, there have been significant changes to the social media platform, in its new “Twitter 2.0” version. For researchers who use Twitter as a primary source of data, including many of the computer scientists at USC’s Information Sciences Institute (ISI), the effects could be debilitating…

Over the years, Twitter has been extremely friendly to researchers, providing and maintaining a robust API (application programming interface) specifically for academic research. The Twitter API for Academic Research allows researchers with specific objectives who are affiliated with an academic institution to gather historical and real-time data sets of tweets, and related metadata, at no cost. Currently, the Twitter API for Academic Research continues to be functional and maintained in Twitter 2.0.
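The historical data-gathering the article describes was typically done through the v2 full-archive search endpoint available on the academic track. As a minimal sketch of what assembling such a request looks like (the query, time window, and field list below are illustrative placeholders, not from the article; actual use required an approved academic account and bearer-token authentication):

```python
# Sketch of building a request to the Twitter API for Academic Research
# (v2 full-archive search). Query and timestamps are hypothetical examples.
import urllib.parse

SEARCH_ALL_URL = "https://api.twitter.com/2/tweets/search/all"

def build_search_request(query, start_time, end_time, max_results=500):
    """Assemble the URL for a historical tweet search over a time window."""
    params = {
        "query": query,
        "start_time": start_time,    # ISO 8601 timestamps
        "end_time": end_time,
        "max_results": max_results,  # up to 500 per page on the academic track
        "tweet.fields": "created_at,public_metrics,lang",
    }
    return SEARCH_ALL_URL + "?" + urllib.parse.urlencode(params)

url = build_search_request(
    "(#climatepolicy) lang:en",
    "2022-01-01T00:00:00Z",
    "2022-02-01T00:00:00Z",
)
print(url.split("?")[0])  # the endpoint researchers paged through for data sets
```

In practice researchers would send this request with an `Authorization: Bearer <token>` header and follow pagination tokens in the JSON response to build the large historical data sets the article mentions.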

The data obtained from the API provides a means to observe public conversations and understand people’s opinions about societal issues. Luca Luceri, a Postdoctoral Research Associate at ISI called Twitter “a primary platform to observe online discussion tied to political and social issues.” And Twitter touts its API for Academic Research as a way for “academic researchers to use data from the public conversation to study topics as diverse as the conversation on Twitter itself.”

However, if people continue deactivating their Twitter accounts, which appears to be the case, the makeup of the user base will change, with data sets and related studies proportionally affected. This is especially true if the user base evolves in a way that makes it more ideologically homogeneous and less diverse.

According to MIT Technology Review, in the first week after its transition, Twitter may have lost one million users, which translates to a 208% increase in lost accounts. And there’s also the concern that the site may not work as effectively, because of the substantial decrease in the size of the engineering teams. This includes concerns about the durability of the service researchers rely on for data, namely the Twitter API. Jason Baumgartner, founder of Pushshift, a social media data collection, analysis, and archiving platform, said that in several recent API requests, his team saw a significant increase in error rates – in the 25-30% range – when they typically see rates near 1%. Though for now this is anecdotal, it leaves researchers wondering whether they will be able to rely on Twitter data for future research.

One example of how the makeup of the less-regulated Twitter 2.0 user base could be significantly altered is if marginalized groups leave Twitter at a higher rate than the general user base, e.g. due to increased hate speech. Keith Burghardt, a Computer Scientist at ISI who studies hate speech online, said, “It’s not that an underregulated social media changes people’s opinions, but it just makes people much more vocal. So you will probably see a lot more content that is hateful.” In fact, a study by Montclair State University found that hate speech on Twitter skyrocketed in the week after the acquisition of Twitter….(More)”.

Language and the Rise of the Algorithm


Book by Jeffrey M. Binder: “Bringing together the histories of mathematics, computer science, and linguistic thought, Language and the Rise of the Algorithm reveals how recent developments in artificial intelligence are reopening an issue that troubled mathematicians well before the computer age: How do you draw the line between computational rules and the complexities of making systems comprehensible to people? By attending to this question, we come to see that the modern idea of the algorithm is implicated in a long history of attempts to maintain a disciplinary boundary separating technical knowledge from the languages people speak day to day.
 
Here Jeffrey M. Binder offers a compelling tour of four visions of universal computation that addressed this issue in very different ways: G. W. Leibniz’s calculus ratiocinator; a universal algebra scheme Nicolas de Condorcet designed during the French Revolution; George Boole’s nineteenth-century logic system; and the early programming language ALGOL, short for algorithmic language. These episodes show that symbolic computation has repeatedly become entangled in debates about the nature of communication. Machine learning, in its increasing dependence on words, erodes the line between technical and everyday language, revealing the urgent stakes underlying this boundary.
 
The idea of the algorithm is a levee holding back the social complexity of language, and it is about to break. This book is about the flood that inspired its construction…(More)”.

Research Methods in Deliberative Democracy


Book edited by Selen A. Ercan et al.: “… brings together a wide range of methods used in the study of deliberative democracy. It offers thirty-one different methods that scholars use for theorizing, measuring, exploring, or applying deliberative democracy. Each chapter presents one method by explaining its utility in deliberative democracy research and providing guidance on its application by drawing on examples from previous studies. The book hopes to inspire scholars to undertake methodologically robust, intellectually creative, and politically relevant research. It fills a significant gap in a rapidly growing field of research by assembling diverse methods and thereby expanding the range of methodological choices available to students, scholars, and practitioners of deliberative democracy…(More)”.

Industry Data for Society Partnership


Press Release: “On Wednesday, a new Industry Data for Society Partnership (IDSP) was launched by GitHub, Hewlett Packard Enterprise (HPE), LinkedIn, Microsoft, Northumbrian Water Group, R2 Factory and UK Power Networks. The IDSP is a first-of-its-kind cross-industry partnership to help advance more open and accessible private-sector data for societal good. The founding members of the IDSP agree to provide greater access to their data, where appropriate, to help tackle some of the world’s most pressing challenges in areas such as sustainability and inclusive economic growth.

In the past few years, open data has played a critical role in enabling faster research and collaboration across industries and with the public sector. As we saw during COVID-19, pandemic data that was made more open enabled researchers to make faster progress and gave citizens more information to inform their day-to-day activities. The IDSP’s goal is to continue this model into new areas and help address other complex societal challenges. The IDSP will serve as a forum for the participating companies to foster collaboration, as well as a resource for other entities working on related issues.

IDSP members commit to the following:

  • To open data or provide greater access to data, where appropriate, to help solve pressing societal problems in a usable, responsible and inclusive manner.
  • To share knowledge and information for the effective use of open data and data collaboration for social benefit.
  • To invest in skilling a broad class of professionals to use data effectively and responsibly for social impact.
  • To protect individuals’ privacy in all these activities.

The IDSP will also bring in other organizations with expertise in societal issues. At launch, The GovLab’s Data Program based at New York University and the Open Data Institute will both be partnership Affiliates to provide guidance and expertise for partnership endeavors…(More)”.

Learnings on the Importance of Youth Engagement


Blog by Anna Ibru and Dane Gambrell at The GovLab: “…In recent years, public institutions around the world have been piloting new youth engagement initiatives like Creamos that tap the expertise and experiences of young people to develop projects, programs, and policies and address complex social challenges within communities.

To learn from and scale best practices from international models of youth engagement, The GovLab has developed case studies about three pathbreaking initiatives: Nuortenbudjetti, Helsinki’s participatory budgeting initiative for youth; Forum Jove BCN, Barcelona’s youth-led citizens’ assembly; and Creamos, an open innovation and coaching program for young social innovators in Chile. For government decision makers and institutions looking to engage and empower young people to get involved in their communities, develop real-world solutions, and strengthen democracy, these case studies describe the initiatives and their outcomes, along with guidance on how to design and replicate such projects in your community. Young people are still a widely untapped resource who are too often left out of policy and program design. The United Nations affirms that it is impossible to meet the UN SDGs by 2030 without the active participation of the 1.8 billion youth in the world. Government decision makers and institutions must capitalize on the opportunity to engage and empower young people. The successes of Nuortenbudjetti, Forum Jove BCN, and Creamos provide a roadmap for policymakers looking to engage in this space….(More)” See also: Nuortenbudjetti: Helsinki’s Youth Budget; Creamos: Co-creating youth-led social innovation projects in Chile; and Forum Jove BCN: Barcelona’s Youth Forum.

Using private sector geospatial data to inform policy


OECD Report: “Over the last decade, a large variety of geospatial data sources, such as GPS trajectories, geotagged photos, and social media have become available for research and statistical applications. These new data sources are often generated, voluntarily or non-voluntarily, by private sector organisations and can provide highly granular and timely information to policymakers. Drawing on experiences of several OECD countries, this paper highlights the potential of combining traditional and unconventional data from both public and private sources, and makes the case for facilitating co-operation between data providers and organisations responsible for public policy. In addition, the paper provides a series of best practices on leveraging private data for the public good and identifies opportunities, challenges, and ways forward for public and private sector partnerships on data sharing….(More)”.

Can citizen deliberation address the climate crisis? Not if it is disconnected from politics and policymaking


Blog by John Boswell, Rikki Dean and Graham Smith: “…Modelled on the deliberative democratic ideal, much of the attention on climate assemblies focuses on their internal features. The emphasis is on their novelty in providing respite from the partisan bickering of politics-as-usual, instead creating space for the respectful free and fair exchange of reasons.

On these grounds, the Global Citizens’ Assembly in 2021 and experimental ‘wave’ of climate assemblies across European countries are promising. Participating citizens have demonstrated they can grapple with complex information, deliberate respectfully, and come to a well thought-through set of recommendations that are – every time – more progressive than current climate policies.

But, before we get carried away with this enthusiasm, it is important to focus on a fundamental point usually glossed over. Assemblies are too often talked about in magical terms, as if by their moral weight alone citizen recommendations will win the day through the forceless force of their arguments. But this expectation is naive.

Designing for impact requires much more attention to the nitty-gritty of how policy actually gets made. That means taking seriously the technical uncertainties and complexities associated with policy interventions, and confronting the political challenges and trade-offs required in balancing priorities in the shadow of powerful interests.

In a recent study, we have examined the first six national climate assemblies – in Ireland, France, the UK, Scotland, Germany and Denmark – to see how they tried to achieve impact. Our novel approach is to take the focus away from their (very similar) ‘internal design characteristics’ – such as random selection – and instead put it on their ‘integrative design characteristics’…(More)”.

How data restrictions erode internet freedom


Article by Tom Okman: “Countries across the world – small, large, powerful and weak – are accelerating efforts to control and restrict private data. According to the Information Technology and Innovation Foundation, the number of laws, regulations and policies that restrict or require data to be stored in a specific country more than doubled between 2017 and 2021, rising from 67 to 144.

Some of these laws may be driven by benevolent intentions. After all, citizens will support stopping the spread of online disinformation, hate, and extremism or systemic cyber-snooping. Cyber-libertarian John Perry Barlow’s call for the government to “leave us alone” in cyberspace rings hollow in this context.

Government internet oversight is on the rise. Image: Information Technology and Innovation Foundation

But some digital policies may prove to be repressive for companies and citizens alike. They extend the justifiable concern over the dominance of large tech companies to other areas of the digital realm.

These “digital iron curtains” can take many forms. What they have in common is that they seek to silo the internet (or parts of it) and private data into national boxes. This risks dividing the internet, reducing its connective potential, and infringing basic digital freedoms…(More)”.

Abandoned: the human cost of neurotechnology failure


Article by Liam Drew: “…Hundreds of thousands of people benefit from implanted neurotechnology every day. Among the most common devices are spinal-cord stimulators, first commercialized in 1968, that help to ease chronic pain. Cochlear implants that provide a sense of hearing, and deep-brain stimulation (DBS) systems that quell the debilitating tremor of Parkinson’s disease, are also established therapies.

Encouraged by these successes, and buoyed by advances in computing and engineering, researchers are trying to develop evermore sophisticated devices for numerous other neurological and psychiatric conditions. Rather than simply stimulating the brain, spinal cord or peripheral nerves, some devices now monitor and respond to neural activity.

For example, in 2013, the US Food and Drug Administration approved a closed-loop system for people with epilepsy. The device detects signs of neural activity that could indicate a seizure and stimulates the brain to suppress it. Some researchers are aiming to treat depression by creating analogous devices that can track signals related to mood. And systems that allow people who have quadriplegia to control computers and prosthetic limbs using only their thoughts are also in development and attracting substantial funding.

The market for neurotechnology is predicted to expand by around 75% by 2026, to US$17.1 billion. But as commercial investment grows, so too do the instances of neurotechnology companies giving up on products or going out of business, abandoning the people who have come to depend on their devices.

Shortly after the demise of ATI, a company called Nuvectra, which was based in Plano, Texas, filed for bankruptcy in 2019. Its device — a new kind of spinal-cord stimulator for chronic pain — had been implanted in at least 3,000 people. In 2020, artificial-vision company Second Sight, in Sylmar, California, laid off most of its workforce, ending support for the 350 or so people who were using its much heralded retinal implant to see. And in June, another manufacturer of spinal-cord stimulators — Stimwave in Pompano Beach, Florida — filed for bankruptcy. The firm has been bought by a credit-management company and is now embroiled in a legal battle with its former chief executive. Thousands of people with the stimulator, and their physicians, are watching on in the hope that the company will continue to operate.

When the makers of implanted devices go under, the implants themselves are typically left in place — surgery to remove them is often too expensive or risky, or simply deemed unnecessary. But without ongoing technical support from the manufacturer, it is only a matter of time before the programming needs to be adjusted or a snagged wire or depleted battery renders the implant unusable.

People are then left searching for another way to manage their condition, but with the added difficulty of a non-functional implant that can be an obstacle both to medical imaging and future implants. For some people, including Möllmann-Bohle, no clear alternative exists.

“It’s a systemic problem,” says Jennifer French, executive director of Neurotech Network, a patient advocacy and support organization in St. Petersburg, Florida. “It goes all the way back to clinical trials, and I don’t think it’s received enough attention.”…(More)”.

Beyond Measure: The Hidden History of Measurement from Cubits to Quantum Constants


Book by James Vincent: “From the cubit to the kilogram, the humble inch to the speed of light, measurement is a powerful tool that humans invented to make sense of the world. In this revelatory work of science and social history, James Vincent dives into its hidden world, taking readers from ancient Egypt, where measuring the annual depth of the Nile was an essential task, to the intellectual origins of the metric system in the French Revolution, and from the surprisingly animated rivalry between metric and imperial, to our current age of the “quantified self.” At every turn, Vincent is keenly attuned to the political consequences of measurement, exploring how it has also been used as a tool for oppression and control.

Beyond Measure reveals how measurement is not only deeply entwined with our experience of the world, but also how its history encompasses and shapes the human quest for knowledge…(More)”.