Earlier this year, Meta, the company that owns Facebook, Instagram and WhatsApp, announced it would end its fact-checking program.
Instead, the company said it would rely on other users to correct misinformation, following in the footsteps of X. The decision has raised concerns that the spread of misinformation and disinformation will grow.
Misinformation refers to inaccurate information, while fake news and disinformation refer to information meant to mislead. It can come in many forms, including manipulated photography, false press releases and propaganda.
Ben Lyons, an assistant professor of communication at the University of Utah, described how misinformation works online.
“Even if you set aside misinformation, just think about how much faster the 24-hour news cycle has sped up, how much more information people are consuming, more pieces of information at less depth,” he said.
Echo Chambers
One phenomenon that helps disinformation spread online is the echo chamber — a space or environment where a person encounters only information and perspectives that fit their own worldview.
“You’re not getting a lot of outside perspectives interjected in. You’re not getting a lot of outside moderation or something like that, and so those can be breeding grounds for extremism,” Lyons said of more clustered social media like Reddit compared to more open platforms.
Platforms like Facebook and Instagram are built more around a person’s personal connections, encouraging users to follow family or friends. On the other hand, platforms like Reddit and X create networks based less on personal relationships and more on shared interests. This makes it easier for echo chambers to form, as there is less room for opposing opinions.
“Any sort of disjunction between a headline and the actual body of the text can be quite misleading, as well,” Lyons said. “Moralized content online gets a lot more engagement, things that turn politics into a moral debate where people can grandstand rather than talk about policy.”
Social media feeds run on algorithms that exploit a consumer’s interests to keep them online for as long as possible. Extreme opinions about events or people, which are more likely to convert views into engagement such as likes or comments, are therefore pushed to consumers. Accurate, measured pieces of news are not favored.
Combating Misinformation
A 2022 study from the Reuters Institute found that only 41% of Americans said they trust the news they see. The same study found that online sources have overtaken print and television as the preferred way to get news. This means Americans, including students and voters, are seeking out potentially sensitive news and information in a space with a high risk of falsehood, given how easily fake news can be produced and shared on social media.
Misinformation relies on an air of trust with consumers, usually founded on their preconceived beliefs. Recognizing that reliance makes it possible to know when to be critical and to validate a source.
In a 2021 study on experimental evidence for media literacy, participants asked to sort false information from true cited a lack of evidence or source credibility, grammatical errors and ambiguous headlines as “technical” red flags.
The study also showed that making a habit of scrutinizing new information helps prevent individuals from forming echo chambers online. Checking information across several sources, diversifying news sources and being skeptical of information that elicits a strong emotional reaction are other ways to avoid misinformation.
t.govitviwat@dailyutahchronicle.com
The post Social Media Echo Chambers and How to Avoid Them first appeared on The Daily Utah Chronicle.