In the aftermath of last week’s election, the media has been actively looking for someone to blame for the election of President-elect Donald Trump. The cultural blowback against Grubhub CEO Matt Maloney, who sent a company-wide email suggesting that Trump supporters resign their positions within his company, has demonstrated that a large percentage of Americans understand that Trump didn’t win the election on the strength of a strictly racist and misogynistic voting population. Instead of acknowledging that Hillary Clinton may not have been a particularly electable candidate, a number of outlets have taken to blaming Facebook; specifically, the fake news stories frequently shared by its users.
The theory is not without precedent. Forty-four percent of Americans say they get at least some of their news from the social media platform, and Facebook admits to conducting experiments to gauge the effect different content has on its user base. In 2010, Facebook ran an experiment in which it showed sixty-one million users one of two messages encouraging Americans to vote in that year’s congressional elections. One group was shown a simple message suggesting that they go vote, while the other group was shown the same message with the addition of profile pictures of friends who had indicated that they voted. By comparing its user information to publicly available voting records, Facebook determined that the people who saw the version of the “go vote” message that included their friends were significantly more likely to go vote themselves.
In 2012, Facebook conducted another experiment in which it intentionally showed one large group of users only negatively toned content and another group only “upbeat” posts. It found that those shown negatively toned content were significantly more likely to post negative content themselves, and vice versa. Earlier this year, Facebook came under fire for allegedly censoring conservative news stories in its trending news feature. Despite disputing those claims, Facebook switched from a human curation team, which was capable of intentional or unintentional bias, to an algorithm for selecting trending news topics.
That algorithm, however, has made it easier than ever for fake news stories to appear in the “trending” section of users’ feeds. But is Facebook really to blame for the number of fake news stories shared across its site? At what point does the responsibility to exercise a critical eye transfer away from the user and onto the platform itself?
The real issue is psychological, not political. The human brain evolved without an internet’s worth of knowledge available at our fingertips, so when presented with the overwhelming influx of information we now face, it struggles to keep up. Two of the mental shortcuts the brain relies on tie directly to the issue of fake news on social media: confirmation bias and source amnesia.
Confirmation bias speaks to our inherent desire to find information that supports our pre-existing positions on a subject. If you believe Donald Trump to be a successful businessman, you are more likely to find and agree with articles that support that position. If you believe Trump is an anti-Semite, you’ll likely scroll right past the article discussing his business acumen and instead home in on an article about the KKK endorsing his candidacy for president. Both stories may be accurate and factual (or neither of them may be), but you will subconsciously assign more credibility to the article that agrees with your political bias.
Source amnesia plays an even more vital role in the effect that sharing fake news on social media outlets such as Facebook can have. In effect, source amnesia speaks to our brain’s inability to retain everything we read; it holds on to only the portions that seem important to recall. A classic example of source amnesia is the knowledge that Paris is in France: you know it to be true, though you likely don’t recall where you learned it. The human brain is forced to distinguish between what’s important to retain and what isn’t, and more often than not, your brain will keep the fact rather than the source.
This becomes a complicated issue online. Six out of ten adult Americans admit that they rarely read past the headlines on social media or news sites, instead gleaning what they can as they scroll by. If you happen to spot an article with a bold headline that supports your existing position (confirmation bias), you are more likely to recall what the headline said, but unlikely to recall where you saw it or whether the source was credible (source amnesia). Two weeks later, you may recall that headline as a fact you know to be true, though you can’t quite recall where you learned it.
However, these psychological shortcomings can be combated by intentionally managing your attention. If you spot an article that confirms your suspicion that whiskey makes you a better singer, take a moment to assess the credibility of the source before clicking the “share” button. If we each played a more active role in vetting the content we shared online, the quality of the content we all scrolled past would improve.
So, armed with a better understanding of why we, as social media consumers, may be ill-equipped to deal with the onslaught of fake and misleading “news” stories we find online, a clearer picture of why many are blaming Zuckerberg and company for the election results begins to emerge… but it’s still wrong.
Zuckerberg posits that, while there is certainly fake news shared through his site, it does not account for the majority of the content, nor are fake news stories inherently conservative in nature. To assume that an otherwise undecided voter may have been swayed by a fake news story seems plausible, but to suggest that enough Americans were so thoroughly swayed by fake news that it cost Clinton the election is dismissive of the real concerns red-state voters cast their ballots to address.
“I do think there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is they saw some fake news,” Zuckerberg said at a Thursday evening conference. “If you believe that, then I don’t think you have internalized the message the Trump supporters are trying to send in this election.”
The issue with fake news being distributed online isn’t strictly political; attention-grabbing, clickbait headlines have infiltrated every demographic and every type of content that could feasibly draw clicks. The real problem is us, not the platform: we fail to look past inflammatory headlines, assess the credibility of sources, and truly read what’s being said. If we want to improve the quality of the content we see shared, we need to share higher-quality content; content written by writers who care about the truth more than they care about hashtags, and about what’s right more than what’s trending.
Ya know, like SOFREP.