The New York Times

“Belonging Is Stronger Than Facts”: The Age of Misinformation

There’s a good chance you’ve been exposed to at least one of these rumors, all of them false, all recently presented as fact: that President Joe Biden plans to force Americans to eat less meat; that Virginia is eliminating advanced math in schools to promote racial equality; and that border officials are buying bulk copies of Vice President Kamala Harris’s book to hand out to refugee children.

All were amplified by partisan actors. But it is just as likely, if not more so, that you heard them passed along by someone you know. And you may have noticed that these cycles of false outrage keep recurring.

We are in an era of endemic misinformation, and outright disinformation. Plenty of bad actors are helping the trend along. But some experts believe the real drivers are the social and psychological forces that make people prone to sharing and believing misinformation in the first place. And those forces are on the rise.

“Why do misperceptions about contentious issues in politics and science seem so persistent and difficult to correct?” Brendan Nyhan, a Dartmouth College political scientist, asked in a new article in Proceedings of the National Academy of Sciences.

It is not for lack of good information, which is ubiquitous. And exposure to good information does not reliably instill accurate beliefs anyway. Rather, Nyhan writes, a growing body of evidence suggests that the ultimate culprits are “cognitive and memory limitations, directional motivations to defend or support group identity or existing beliefs, and messages from other people and political elites.”

Put more simply, people become more prone to misinformation when three things happen.

First, and perhaps most importantly, when conditions in society increase people’s need for what social scientists call ingrouping: a belief that their social identity is a source of strength and superiority, and that other groups can be blamed for their problems.

As much as we like to think of ourselves as rational beings who put the pursuit of truth above all else, we are social animals wired for survival. In times of perceived conflict or social change, we seek security in groups. And that makes us eager to consume information, true or not, that lets us see the world as a conflict pitting our righteous ingroup against a nefarious outgroup.

This need can be especially acute amid a sense of social destabilization. As a result, misinformation thrives in communities that feel destabilized by unwanted change or, in the case of some minorities, powerless in the face of dominant forces. Framing everything as a grand conflict against scheming enemies can feel enormously reassuring.

That is why, perhaps more than any other single factor, the era-defining rise of social polarization drives today’s misinformation.

“At the mass level, greater partisan divisions in social identity are generating intense hostility toward opposition partisans,” which “seems to have increased the political system’s vulnerability to partisan misinformation,” Nyhan wrote in an earlier paper.

Growing hostility between the two halves of America feeds social distrust, making people more prone to rumor and falsehood. It also leads people to cling much more tightly to their partisan identities. And once our brains switch into identity-based conflict mode, we become desperately hungry for information that confirms that sense of us versus them, and much less concerned about things like truth or accuracy.
In an email, Nyhan said it can be methodologically difficult to pin down the exact relationship between overall polarization in society and overall misinformation, but there is ample evidence that a person with more polarized views is more prone to believing falsehoods.

The second driver of the misinformation era is the rise of high-profile political figures who encourage their followers to indulge their appetite for identity-affirming misinformation. After all, an atmosphere of all-out political conflict often benefits those leaders, at least in the short term, by rallying people behind them.

Then there is the third factor: the shift to social media, which is a powerful medium for misinformation’s composers, a ubiquitous vector for the misinformation itself, and a multiplier of the other risk factors.

“Media has changed, the environment has changed, and that has a potentially huge impact on our natural behavior,” said William Brady, a social psychologist at Yale University.

“When you post things, you’re highly aware of the feedback that you get, the social feedback in terms of likes and shares,” Brady said. So when misinformation appeals to social impulses more than the truth does, it gets more attention online, which means people feel rewarded and encouraged for spreading it.

“Depending on the platform, people are very sensitive to social rewards,” he said. Research shows that people who receive positive feedback for posting inflammatory or false statements become much more likely to do so again in the future. “You are affected by that.”

In 2016, the media scholars Jieun Shin and Kjerstin Thorson analyzed a data set of 300 million tweets from the 2012 election. They found that Twitter users “selectively share fact-checking messages that cheerlead their own candidate and denigrate the opposing party’s candidate.” And when users encountered a fact check that revealed their candidate had gotten something wrong, their response was not to get angry at the politician for lying. It was to attack the fact-checkers.

“We found that Twitter users tend to retweet to show agreement, argue, attract attention and entertain,” the researcher Jon-Patrick Allem wrote last year, summarizing a study he had co-authored. “The truthfulness of a post or the accuracy of a claim was not an identified motivation for retweeting.”

In another study, published last month in Nature, a team of psychologists tracked thousands of users interacting with false information. Republican test subjects who were shown a false headline about migrants trying to enter the United States (“Over 500 Migrant Caravaners Arrested With Suicide Vests”) mostly identified it as false; only 16% called it accurate. But when the experimenters instead asked subjects whether they would share the headline, 51% said they would.

“Most people do not want to spread misinformation,” the study’s authors wrote. “But the social media context focuses their attention on factors other than truth and accuracy.”

In a highly polarized society like today’s United States, or for that matter India or parts of Europe, those incentives pull heavily toward ingroup solidarity and outgroup derogation. They do not much favor consensus reality or abstract ideals of accuracy.

As people become more prone to misinformation, opportunists and charlatans are better able to exploit that. That can mean populists who promise to smash the establishment and who blame minorities for society’s problems.
It can also mean government agencies or freelance hacker groups looking to stir up social divisions abroad to their own benefit. But the roots of the crisis go deeper.

“The problem is that when we encounter opposing views in the age and context of social media, it’s not like reading them in a newspaper while sitting alone,” the sociologist Zeynep Tufekci wrote in a widely circulated MIT Technology Review article. “It’s like hearing them from the opposing team while sitting with our fellow fans in a football stadium. Online, we’re connected with our communities, and we seek approval from our like-minded peers. We bond with our team by yelling at the fans of the other one.”

In an ecosystem where that sense of identity conflict is pervasive, she wrote, “belonging is stronger than facts.”

This article originally appeared in The New York Times.

© 2021 The New York Times Company
