Which Connections Really Help You Find a Job?


Article by Iavor Bojinov, Karthik Rajkumar, Guillaume Saint-Jacques, Erik Brynjolfsson, and Sinan Aral: “Whom should you connect with the next time you’re looking for a job? To answer this question, we analyzed data from multiple large-scale randomized experiments involving 20 million people to measure how different types of connections impact job mobility. Our results, published recently in Science Magazine, show that your strongest ties — namely your connections to immediate coworkers, close friends, and family — were actually the least helpful for finding new opportunities and securing a job. You’ll have better luck with your weak ties: the more infrequent, arm’s-length relationships with acquaintances.

To be more specific, the ties that are most helpful for finding new jobs tend to be moderately weak: They strike a balance between exposing you to new social circles and information and having enough familiarity and overlapping interests so that the information is useful. Our findings uncovered the relationship between the strength of the connection (as measured by the number of mutual connections prior to connecting) and the likelihood that a job seeker transitions to a new role within the organization of a connection.

The observation that weak ties are more beneficial for finding a job is not new. Sociologist Mark Granovetter first laid out this idea in a seminal 1973 paper that described how a person’s network affects their job prospects. Since then, the theory, known as the “strength of weak ties,” has become one of the most influential in the social sciences — underpinning network theories of information diffusion, industry structure, and human cooperation….(More)”.
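As a side note, the tie-strength measure used in the study (the number of mutual connections two people shared before connecting) is straightforward to compute on any social graph. Below is a minimal illustrative sketch using the networkx library; the toy graph and names are assumptions made up for the example, not the study's data or pipeline.

```python
import networkx as nx

# Toy social graph; the people and edges are invented for illustration.
g = nx.Graph()
g.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("alice", "dan"),
    ("bob", "carol"), ("bob", "dan"), ("carol", "eve"),
])

def tie_strength(graph, u, v):
    """Proxy for tie strength: the number of mutual connections between u and v."""
    return len(list(nx.common_neighbors(graph, u, v)))

# "alice" and "bob" share two mutual connections (carol and dan), so this is a
# relatively strong tie; "carol" and "eve" share none, making it a weak tie.
print(tie_strength(g, "alice", "bob"))   # -> 2
print(tie_strength(g, "carol", "eve"))   # -> 0
```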

The Dangers of Systems Illiteracy


Review by Carol Dumaine: “In 1918, as the Great War was coming to an end after four bloody years of brutal conflict, an influenza pandemic began to ravage societies around the globe. Evidence indicates that US president Woodrow Wilson was stricken with the flu while in Paris negotiating the terms of the peace agreement in the spring of 1919.

Wilson, who had been intransigent in insisting on just peace terms for the defeated nations (what he called “peace without victory”), underwent a profound change of mental state that his personal physician and closest advisors attributed to his illness. While sick, Wilson suddenly agreed to all the terms he had previously adamantly rejected and approved a treaty that made onerous demands of Germany. 

Wilson’s reversal left Germans embittered and his own advisors disillusioned. Historian John M. Barry, who recounts this episode in his book about the 1918 pandemic, The Great Influenza, observes that most historians agree “that the harshness toward Germany of the Paris peace treaty helped create the economic hardship, nationalistic reaction, and political chaos that fostered the rise of Hitler.” 

This anecdote is a vivid illustration of how a public health disaster can intersect with world affairs, potentially sowing the seeds for a future of war. Converging crises can leave societies with too little time to regroup, breaking down resilience and capacities for governance. Barry concludes from his research into the 1918 pandemic that to forestall this loss of authority—and perhaps to avoid future, unforeseen repercussions—government leaders should share the unvarnished facts and evolving knowledge of a situation. 

Society is ultimately based on trust; during the flu pandemic, “as trust broke down, people became alienated not only from those in authority, but from each other.” Barry continues, “Those in authority must retain the public’s trust. The way to do that is to distort nothing, to put the best face on nothing, to try to manipulate no one.”

Charles Weiss makes a similar argument in his new book, The Survival Nexus: Science, Technology, and World Affairs. Weiss contends that the preventable human and economic losses of the COVID-19 pandemic were the result of politicians avoiding harsh truths: “Political leaders suppressed evidence of virus spread, downplayed the importance of the epidemic and the need to observe measures to protect the health of the population, ignored the opinions of local experts, and publicized bogus ‘cures’—all to avoid economic damage and public panic, but equally importantly to consolidate political power and to show themselves as strong leaders who were firmly in control.” …(More)”.

The Potentially Adverse Impact of Twitter 2.0 on Scientific and Research Communication


Article by Julia Cohen: “In just over a month after the change in Twitter leadership, there have been significant changes to the social media platform in its new “Twitter 2.0” version. For researchers who use Twitter as a primary source of data, including many of the computer scientists at USC’s Information Sciences Institute (ISI), the effects could be debilitating…

Over the years, Twitter has been extremely friendly to researchers, providing and maintaining a robust API (application programming interface) specifically for academic research. The Twitter API for Academic Research allows researchers with specific objectives who are affiliated with an academic institution to gather historical and real-time data sets of tweets, and related metadata, at no cost. Currently, the Twitter API for Academic Research continues to be functional and maintained in Twitter 2.0.
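For context, the sketch below shows how a researcher might pull historical tweets through the academic full-archive search endpoint using Python's requests library. The bearer token, query, and selected fields are placeholders, and the details reflect the publicly documented v2 API rather than any particular ISI workflow.

```python
import os
import requests

# Placeholder: an Academic Research bearer token issued by Twitter, read from the environment.
BEARER_TOKEN = os.environ["TWITTER_BEARER_TOKEN"]

# Full-archive search endpoint available on the Academic Research track.
SEARCH_URL = "https://api.twitter.com/2/tweets/search/all"

def search_tweets(query, start_time, end_time, max_results=100):
    """Fetch one page of historical tweets matching `query`, with basic metadata."""
    headers = {"Authorization": f"Bearer {BEARER_TOKEN}"}
    params = {
        "query": query,
        "start_time": start_time,          # ISO 8601 timestamps
        "end_time": end_time,
        "max_results": max_results,
        "tweet.fields": "created_at,public_metrics,lang",
    }
    resp = requests.get(SEARCH_URL, headers=headers, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Illustrative query: one week of English-language conversation about a societal issue.
page = search_tweets("climate policy lang:en",
                     "2022-11-01T00:00:00Z", "2022-11-08T00:00:00Z")
print(len(page.get("data", [])), "tweets returned")
```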

The data obtained from the API provides a means to observe public conversations and understand people’s opinions about societal issues. Luca Luceri, a Postdoctoral Research Associate at ISI, called Twitter “a primary platform to observe online discussion tied to political and social issues.” And Twitter touts its API for Academic Research as a way for “academic researchers to use data from the public conversation to study topics as diverse as the conversation on Twitter itself.”

However, if people continue deactivating their Twitter accounts, which appears to be the case, the makeup of the user base will change, with data sets and related studies proportionally affected. This is especially true if the user base evolves in a way that makes it more ideologically homogeneous and less diverse.

According to MIT Technology Review, in the first week after its transition, Twitter may have lost one million users, which translates to a 208% increase in lost accounts. And there’s also the concern that the site might not work as effectively, because of the substantial decrease in the size of the engineering teams. This includes concerns about the durability of the service researchers rely on for data, namely the Twitter API. Jason Baumgartner, founder of Pushshift, a social media data collection, analysis, and archiving platform, said that in several recent API requests his team also saw a significant increase in error rates – in the 25-30% range – when they typically see rates near 1%. Though for now this is anecdotal, it leaves researchers wondering whether they will be able to rely on Twitter data for future research.

One example of how the makeup of the less-regulated Twitter 2.0 user base could be significantly altered is if marginalized groups leave Twitter at a higher rate than the general user base, e.g. due to increased hate speech. Keith Burghardt, a computer scientist at ISI who studies hate speech online, said, “It’s not that an underregulated social media changes people’s opinions, but it just makes people much more vocal. So you will probably see a lot more content that is hateful.” In fact, a study by Montclair State University found that hate speech on Twitter skyrocketed in the week after the acquisition of Twitter….(More)”.

How data restrictions erode internet freedom


Article by Tom Okman: “Countries across the world – small, large, powerful and weak – are accelerating efforts to control and restrict private data. According to the Information Technology and Innovation Foundation, the number of laws, regulations and policies that restrict or require data to be stored in a specific country more than doubled between 2017 and 2021, rising from 67 to 144.

Some of these laws may be driven by benevolent intentions. After all, citizens will support stopping the spread of online disinformation, hate, and extremism or systemic cyber-snooping. Cyber-libertarian John Perry Barlow’s call for the government to “leave us alone” in cyberspace rings hollow in this context.

Government internet oversight is on the rise. Image: Information Technology and Innovation Foundation

But some digital policies may prove to be repressive for companies and citizens alike. They extend the justifiable concern over the dominance of large tech companies to other areas of the digital realm.

These “digital iron curtains” can take many forms. What they have in common is that they seek to silo the internet (or parts of it) and private data into national boxes. This risks dividing the internet, reducing its connective potential, and infringing basic digital freedoms…(More)”.

Abandoned: the human cost of neurotechnology failure


Article by Liam Drew: “…Hundreds of thousands of people benefit from implanted neurotechnology every day. Among the most common devices are spinal-cord stimulators, first commercialized in 1968, that help to ease chronic pain. Cochlear implants that provide a sense of hearing, and deep-brain stimulation (DBS) systems that quell the debilitating tremor of Parkinson’s disease, are also established therapies.

Encouraged by these successes, and buoyed by advances in computing and engineering, researchers are trying to develop ever more sophisticated devices for numerous other neurological and psychiatric conditions. Rather than simply stimulating the brain, spinal cord or peripheral nerves, some devices now monitor and respond to neural activity.

For example, in 2013, the US Food and Drug Administration approved a closed-loop system for people with epilepsy. The device detects signs of neural activity that could indicate a seizure and stimulates the brain to suppress it. Some researchers are aiming to treat depression by creating analogous devices that can track signals related to mood. And systems that allow people who have quadriplegia to control computers and prosthetic limbs using only their thoughts are also in development and attracting substantial funding.

The market for neurotechnology is predicted to expand by around 75% by 2026, to US$17.1 billion. But as commercial investment grows, so too do the instances of neurotechnology companies giving up on products or going out of business, abandoning the people who have come to depend on their devices.

Shortly after the demise of ATI, a company called Nuvectra, which was based in Plano, Texas, filed for bankruptcy in 2019. Its device — a new kind of spinal-cord stimulator for chronic pain — had been implanted in at least 3,000 people. In 2020, artificial-vision company Second Sight, in Sylmar, California, laid off most of its workforce, ending support for the 350 or so people who were using its much heralded retinal implant to see. And in June, another manufacturer of spinal-cord stimulators — Stimwave in Pompano Beach, Florida — filed for bankruptcy. The firm has been bought by a credit-management company and is now embroiled in a legal battle with its former chief executive. Thousands of people with the stimulator, and their physicians, are watching on in the hope that the company will continue to operate.

When the makers of implanted devices go under, the implants themselves are typically left in place — surgery to remove them is often too expensive or risky, or simply deemed unnecessary. But without ongoing technical support from the manufacturer, it is only a matter of time before the programming needs to be adjusted or a snagged wire or depleted battery renders the implant unusable.

People are then left searching for another way to manage their condition, but with the added difficulty of a non-functional implant that can be an obstacle both to medical imaging and future implants. For some people, including Möllmann-Bohle, no clear alternative exists.

“It’s a systemic problem,” says Jennifer French, executive director of Neurotech Network, a patient advocacy and support organization in St. Petersburg, Florida. “It goes all the way back to clinical trials, and I don’t think it’s received enough attention.”…(More)”.

The Wireless Body


Article by Jeremy Greene: “Nearly half the US adult population will pass out at some point in their lives. Doctors call this “syncope,” and it is bread-and-butter practice for any emergency room or urgent care clinic. While most cases are benign—a symptom of dehydration or mistimed medication—syncope can also be a sign of something gone terribly wrong. It may be a symptom of a heart attack, a blood clot in the lungs, an embolus to the arteries supplying the brain, or a life-threatening arrhythmia. After a series of tests ruling out the worst, most patients go home without incident. Many of them also go home with a Holter monitor. 

The Holter monitor is a device about the size of a pack of cards that records the electrical activity of the heart over the course of a day or more. Since its invention more than half a century ago, it has become such a common object in clinical medicine that few pause to consider its origins. But, as the makers of new Wi-Fi and cloud-enabled devices, smartphone apps, and other “wearable” technologies claim to be revolutionizing the world of preventive health care, there is much to learn from the history of this older instrument of medical surveillance…(More)”.

Citizen assemblies and the challenges of democratic equality


Article by Annabelle Lever: “…Creating a citizens’ assembly that truly reflects society as a whole isn’t so simple, however. In particular, only a very small percentage of those invited to participate actually agree to do so. According to a 2017 study published in the European Journal of Political Research, the precise percentage depends on how large, complex and time-consuming the process is likely to be. It ranges from 4% for larger, more onerous assemblies to 30% in a couple of exceptional cases, averaging out at 15% across all countries and all forms of assembly. As a consequence, the formal equality of opportunity that unweighted lotteries promise tends to result in assemblies skewed to the socially advantaged, the partisan, and those most confident in their practical and cognitive abilities, whatever the reality.

To create an assembly that is more descriptively representative of the population – or one that looks more like us – several approaches are used. One is to have an initial phase of unweighted selection followed by a second phase that uses weighted lotteries. Another is to use stratified sampling or forms of stratification from the beginning.

For the Climate Assembly UK, organisers sent out 20% of the 30,000 letters of invitation to people randomly selected from the lowest-income postcodes, and then used random stratified sampling by computer to select 110 participants from all the people who were over 16 and free on the relevant dates.
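To make the mechanics concrete, here is a minimal sketch in Python of the kind of stratified lottery described above. The respondent pool, strata and seat quotas are invented for illustration; real sortition processes balance many more criteria (gender, ethnicity, geography, attitudes) at once.

```python
import random
from collections import defaultdict

# Invented pool of people who responded to the invitation letters.
respondents = [
    {"id": i,
     "age_band": random.choice(["16-29", "30-49", "50+"]),
     "region": random.choice(["urban", "rural"])}
    for i in range(2000)
]

# Illustrative quotas: seats per (age_band, region) stratum in a 110-person
# assembly, roughly mirroring the shares of each group in the population.
quotas = {
    ("16-29", "urban"): 15, ("16-29", "rural"): 7,
    ("30-49", "urban"): 25, ("30-49", "rural"): 13,
    ("50+", "urban"): 30,   ("50+", "rural"): 20,
}

def stratified_lottery(pool, quotas, seed=42):
    """Randomly draw each stratum's quota of members from the matching respondents."""
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for person in pool:
        by_stratum[(person["age_band"], person["region"])].append(person)
    assembly = []
    for stratum, seats in quotas.items():
        candidates = by_stratum.get(stratum, [])
        assembly.extend(rng.sample(candidates, min(seats, len(candidates))))
    return assembly

assembly = stratified_lottery(respondents, quotas)
print(len(assembly), "members selected")  # 110 when every stratum has enough respondents
```

Because each stratum is drawn separately, the resulting assembly mirrors the target proportions even when some groups were much less likely to respond to the initial invitation.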

Because citizen assemblies are very small compared to the population as a whole – France’s Convention for the Climate was made up of just 150 people – the descriptively representative character of the assembly can occur on only a few dimensions. Organisers must therefore decide what population characteristics the assembly should embody and in what proportion. Randomisation thus does not preclude difficult moral, political and scientific choices about the assembly to be constructed, any more than it precludes voluntariness or self-selection…(More)”.

How to think about policy in a polycrisis


Article by Martin Wolf: “Welcome to the “polycrisis” — a world in which, as historian Adam Tooze says, “economic and non-economic shocks” are entangled “all the way down”. We have an inflation shock that emanates from the disruptions caused by a pandemic, the policy responses to that pandemic and an energy shock caused by a war. That war in turn is related to the breakdown in relations among great powers. Slow growth, rising inequality and over-reliance on credit have undermined political stability in many high-income democracies. The credit boom led to a great financial crisis whose outcome included a decade of ultra-low interest rates and so even more financial fragility worldwide. Adding to these stresses is the threat of climate change.

It is indeed convenient to think about the world in intellectual silos, focusing in turn on macroeconomics, finance, politics, social change, disease and the environment, to the exclusion of the others. In a reasonably stable world, this may even work well. The alternative of thinking about the interactions among these aspects of experience is also too hard. But sometimes, as now, it becomes inescapable.

It is not just theoretically true that everything depends on everything else. It is a truth we can no longer ignore in practice. As my colleague Gillian Tett often warns, silos are perilous. We have to think systemically. Economists have to recognise how the economy is interconnected with other forces. Navigating today’s storms compels us to develop a wider understanding.

This is not an argument against detailed analysis of individual elements in the picture. Economists should still look carefully at the things they know about, because they are both complex and important in themselves. Thus the data and analysis in the OECD’s latest Economic Outlook continue to be both invaluable and illuminating. But, inevitably, they also omit vital aspects….(More)”.

Orbán used Hungarians’ COVID data to boost election campaign, report says


Article by Louis Westendarp: “Hungarian Prime Minister Viktor Orbán’s ruling party Fidesz used citizens’ data from COVID-19 vaccine signups to spread Fidesz campaign messages before Hungary’s election in April 2022, according to a report by Human Rights Watch.

Not only was data from vaccine jabs used to help Fidesz, but also data from tax benefits applications and association membership registrations. This violates privacy rights, said the report — and blurs the line between the ruling party and government resources in Hungary, which has repeatedly been warned by the EU to clean up its act regarding the rule of law.

“Using people’s personal data collected so they could access public services to bombard them with political campaign messages is a betrayal of trust and an abuse of power,” said Deborah Brown, senior technology researcher at Human Rights Watch…(More)”.

Closing the gap between user experience and policy design 


Article by Cecilia Muñoz & Nikki Zeichner: “…Ask the average American to use a government system, whether it’s for a simple task like replacing a Social Security Card or a complicated process like filing taxes, and you’re likely to be met with groans of dismay. We all know that government processes are cumbersome and frustrating; we have grown used to the government struggling to deliver even basic services.

Unacceptable as the situation is, fixing government processes is a difficult task. Behind every exhausting government application form or eligibility screener lurks a complex policy that ultimately leads to what Atlantic staff writer Anne Lowrey calls the time tax, “a levy of paperwork, aggravation, and mental effort imposed on citizens in exchange for benefits that putatively exist to help them.” 

Policies are complex, in part because they each represent many voices. The people we call policymakers are key actors in governments and elected officials at every level, from city councils to the U.S. Congress. As they seek to solve public problems like child poverty or improve economic mobility, they consult with experts at government agencies, researchers in academia, and advocates working directly with affected communities. They also hear from lobbyists from affected industries. They consider current events and public sentiment. All of these voices and variables, representing different and sometimes conflicting interests, contribute to the policies that become law. And as a result, laws reflect a complex mix of objectives. After a new law is in place, relevant government agencies are responsible for implementing it by creating new programs and services to carry it out. Complex policies then get translated into complex processes and experiences for members of the public. They become long application forms, unclear directions, and, too often, barriers that keep people from accessing a benefit. 

Policymakers and advocates typically declare victory when a new policy is signed into law; if they think about the implementation details at all, that work mostly happens after the ink is dry. While these policy actors may have deep expertise in a given issue area, or deep understanding of affected communities, they often lack experience designing services in a way that will be easy for the public to navigate…(More)”.