Fake news has been the real news. But it’s not really news: How many of us were surprised to hear the claims that fake news influenced the election? How many of us fell over when we learned that in the month leading up to the election, fake news outperformed real news? And even if that stat was higher than you would have guessed, if you’ve ever used Facebook it makes perfect sense in about three seconds.
What’s more, most of this reporting and analysis blames Facebook and Twitter for influencing the election. I’m not so sure of the extent of that, but in any case we can’t just point the finger at social media. Well, we can. And I do, partly — and from experience as (I admit it) a digital marketer. But there’s a lot more to this story: Don’t get suckered by the fake news about fake news.
We all know that “fake news” isn’t anything new. Talk radio blowhards such as Rush Limbaugh and Alex Jones and publications such as the National Enquirer and the Globe have had this beat covered for years — centuries, if you include propaganda. Fake news combines tabloid sensationalism with the persuasive power of propaganda, and then Facebook distributes it — all of it — faster than any newsie or station manager could have dreamed.
That’s how fake news works in the broadest sense. A lot of excellent and popular articles explain it in detail, so I’ll assume you’ve read or heard what they have to say and have at least a general idea of what’s going on. If not, these will cover the basics.
Once we understand how fake news works, it becomes clear there are many bigger questions for which Facebook has no answer. Those questions have serious implications not just for the future of media and politics, but for the future of information and American culture generally. Because for all the stupidity and cynicism of fake news, it’s truly an existential problem. Forgive me for going there so quickly, but it really is: Fake news is only a symptom of an irreconcilable and exponentially deepening split. Fake or real, wrong or right — those have nothing to do with it.
Sadly, not many people seem to be digging into these questions. First, though, we need to understand a little more about Facebook.
It’s worth noting that Google hasn’t come under the same kind of fire that Facebook has. Yet Google is also a news aggregator, and it also worked pretty hard to weed fake news out of its search results. Those efforts, predictably, birthed another alt-right conspiracy, but the company has managed to stay largely out of the post-election fracas. Facebook isn’t necessarily doing anything different than Google — but its platform works quite differently, and serves a different purpose.
From the beginning, Facebook knowingly blurred the line of what counts as “news.” The company not only anticipated it would quickly become a media platform, it geared itself years ago to be one and probably didn’t want us to notice.
I mean, they called it a news feed.
Yet last week Zuckerberg said Facebook was “a tech company, not a media company.” And when the company caught shit this spring for suppressing conservative media, it chose to employ robots over journalists to control the news feed. Facebook could have easily gone the other way and hired scores of journalists, but it wanted to avoid the label of being a media company and the human responsibility of editorial integrity.
But media and tech aren’t mutually exclusive. Tech companies don’t just make tech to make tech; they make tech to make money. Facebook has a lot of business interests, and media is without a doubt one of them. And the simple fact that Facebook chose to call it a news feed — consciously or not — shows just how little concern the company has for vetting the importance or veracity of posts. The top news you see on your news feed isn’t necessarily the top news of the day — it’s just the most popular stuff your circle of friends is talking about. Kids. Vacations. Sucking your cat’s foot when you’re drunk. To Facebook, it’s all news. The most popular stuff gets the most engagement, and that gives Facebook more of the valuable data it sells to marketers as its core business.
To Facebook, all news is good news, and no news is bad news.
The platform has a moral responsibility to its users when it comes to serving up fake news, but its reactions to the accusations it’s faced lately show that driving engagement for advertising dollars is more important than the content that drives that engagement. That’s a level of cynicism that borders on nihilism. And as if he wanted to reinforce that fact explicitly, Zuckerberg chose to use robots instead of human experts to control fake news, suggesting he’d prefer it if Facebook avoided moral responsibility altogether. How can we hold Facebook morally responsible, he seems to challenge us, if the platform is itself amoral?
Or is it simply inept, and up against a much more powerful social force?
Facebook clearly has an incentive to keep you engaged on the site. It took the company years, for instance, to let you do anything other than “like” or “share” a post. And it’s telling to look at the new “reactions” now: love; laugh; sad; wow; angry.
These aren’t exactly complex emotions. They don’t communicate critical thinking or present other points of view or competing arguments. But these simpler, more primal emotions that everyone shares are more engaging, and by definition have more reach. And outrageous fake news that inflames voters? Max emotion. Max engagement. Max clicks. Max data. Max money. Truth be damned.
“It’s hard to identify truth,” Zuckerberg said. He’s right. Truth is fuzzy. Some say we’re in a “post-truth” era, but you could make a solid argument that capital-T Truth doesn’t exist and never has. That’s a red herring here, though. We’re not asking Facebook to identify the truth. We’re asking Facebook to identify lies, and it’s not hard to identify the kinds of flaming lies fake news sites promote. Even misleading news — nuanced and deceptive as it is — isn’t as tough to weed out as Zuckerberg suggests. Takes me a couple clicks.
But it is controversial. Earlier this year the right wing took Facebook to task for suppressing “conservative” stories. Facebook investigated those claims and said it found no evidence of bias, but the company still fired its editors and revised its algorithm. And the result? Fake news outperformed real news.
How did that happen?
For instance, in my last piece for Paste I pointed out that undecided voters reacted almost exclusively to news stories about Hillary Clinton. The biggest negative reactions were to the email “scandal” and to the revelation that she had pneumonia. Both of those stories were in part fueled by wildly irresponsible reporting from white nationalist sites (i.e., “the alt-right”). Outright lies and conspiracy theories about both those stories spread across the country like mercury. There’s at least some evidence that the fake news putsch influenced not just partisans but also, reportedly, undecided voters.
It had nothing to do with the truth, even when the truth was out there in the mainstream media.
Fake news mainly — but certainly not only — targets conservative audiences. At least we can say this is true of this election cycle, as well as the few years leading up to it. In this election cycle, partly or completely fake “news” sites grew twice as much on the right as they did on the left. (These guys can tell you all about that.) And those “legitimate” conservative stories Facebook exonerated itself of suppressing? They weren’t so legitimate. Facebook, according to a Gizmodo report, chose not to suppress fake news in general because fake news “disproportionately impacted right-wing news sites by downgrading or removing that content from people’s feeds.”
Why does fake or misleading information appeal to conservatives? Related: Why did conservatives just elect a fake news site to be the next President of the United States of America?
Perhaps it’s because conservatives — though certainly not only conservatives — have another common trait.
Recent research suggests that the more religious you are, the more likely you are to favor unfalsifiable evidence to explain your faith (“the lord works in mysterious ways”), as opposed to using falsifiable (that is, empirical or scientific) evidence (“man walked the earth with the dinosaurs”).
Further, when people with high religiosity feel their beliefs are threatened, this preference for unfalsifiable evidence — for the unprovable — intensifies.
The appeal seems pretty easy to explain: People want to be right. Or really, they want to feel they’re right and be perceived by others they respect as being right. This doesn’t have anything to do with being correct, or with using the system of logic to reach sound conclusions.
Even more interestingly, when the testability of a claim isn’t mentioned — when people don’t know whether the evidence is falsifiable or not — highly religious people show no preference between the two types of evidence. It’s only when they know they’re using unfalsifiable information that they prefer to use it — and especially when threatened.
If you’ve ever called someone out on a lie on Facebook you’re intimately familiar with this phenomenon: that person will either cling tighter to the lie, or they will lie more. The fact that they’ve been called out is sort of a badge of honor or rite of passage into voting for Trump. Don’t we understand their choice has nothing at all to do with what people on the left believe or tell them is true?
Further, this absolves conservatives of being the ones to draw the line — the person challenging their belief is the one who’s being divisive. Clinging to your story is expressing solidarity with your tribe. The wilder the claims get, the more you’re proving your faith. It digs you in deeper to a common identity — an identity that could be religious, but in our case happens to be political.
So, what exactly does this have to do with Trump? These authors don’t believe their results only apply to religion: “Political and religious ideologies are relatively substitutable for one another because they are rooted in the same psychological needs. Therefore it seems plausible that unfalsifiability might also bolster other types of worldviews or self-views that fulfill deeply held existential psychological needs.”
That is, cognitive research explains the popularity of fake news as an existential phenomenon, be it religious or political. This phenomenon is driven not by Facebook’s algorithm, and not simply by people believing what they want to believe — they want to believe what others like them believe. It’s about deeper issues: identity, belonging to a tribe, faith, and agency. After all, it’s not Facebook’s fault you believe the crap it feeds you, not its fault you share it. It’s your own fault. They just show it to you. They could show less of it, especially in the “trending” section, but that doesn’t address the root problem. It can’t change beliefs, and evidence shows it will probably just strengthen them, uniting the true believers against another perceived liberal, elitist platform.
Believing in the miracles of Jesus Christ and that he is the son of God gets you exclusive access into Club Heaven. You can’t prove either of those things, but any Christian will tell you that proof is not the point. In fact, proof is the very thing you are asked to deny. Ultimately it’s about exclusiveness (being sanctified among other sanctified people, and going to heaven) and belonging.
Can we link religious belief to conservatism? We can in two ways.
According to the Pew Research Center, 33 percent of Americans believe evolution happened solely by natural selection; 25 percent say evolution was guided by a supreme being; and 34 percent reject evolution entirely, saying we’ve existed in our present form all along (that is, for roughly the past 10,000 years).
That means a minority of Americans fully accept evolution through natural selection. And though 98 percent of scientists believe humans evolved over time, just 66 percent of Americans think scientists agree about evolution.
Further, evangelical denominations are among the most likely to reject evolution. And as we all intuit but another Pew study illustrates, evangelicals are overwhelmingly conservative. In fact, nearly all fundamentalist Christian denominations are majority conservative, many overwhelmingly so.
But let’s look at the young voters. According to another Pew report, 51 percent of Americans under 30 believe in secular evolution (that is, evolution not guided by a divine power). It might be coincidence, but it’s more likely correlation: 55 percent of Americans under 30 voted for Clinton.
There’s another way to connect conservatism, in any political system, to unfalsifiable belief: It’s the basic function of conservatism itself.
Consider, as just one example, that Republicans (as Trump did at the RNC) call themselves, without detectable irony, “the party of Lincoln.” Here we can draw another parallel: Among some conservatives, there is a strain of self-delusion, a kind of “fake old news.” Of course we’re not racist, even though history shows otherwise.
The political role conservatives play in any culture is to serve as a check on wanton change happening too quickly. This also means that to be conservative is to always be losing (bit by bit) but to pretend you never lost. Fake news has helped to clarify that for us.
And one last passage from that research that links the conservative movement in the United States to religious belief: “Political ideologies, too, are driven by motives such as needs for meaning, symbolic immortality, control, justice, identity, or some combination thereof.”
“I alone,” said Trump, perhaps the most un-Christlike presidential candidate we’ve ever seen.
“All this I will give you,” the devil said to Jesus in the desert, “if you will bow down and worship me.”
Have we given in to Trumptation?
In the end, it’s not about truth, it’s about true belief. The tribe of white America, perceiving that its country is leaving it behind, will carve out a new set of beliefs based on the premise of an identity defined by faith, not by truth. For instance, the faith that the phenomenally un-Christlike Trump can and will prove a savior. And Steve Bannon — a sort of cynical Cyrano — is now tweeting Gabriel’s trumpet for him, playing as he always has on the primitive drivers of fear: fear of the other, fear of losing control, fear that white America, an entire ethnic identity, is in danger of being wiped off the face of the earth.
And we are increasingly dividing ourselves not into echo chambers, but into separate belief systems based on fundamentally incompatible premises — belief and information systems that are incapable of talking to each other, let alone resolving differences with pesky things like facts. We’re not post-truth here. It’s more profound than that: We’re post-logic. We disagree on the very systems by which we arrive at belief. It’s no coincidence that conservatives “don’t believe” humans are causing climate change, for instance. And the fact that about 95 percent of scientists tell us that this is the case only fuels the believers’ contempt and strengthens their opposition.
Further, what role can morality play in such a world? Perhaps Facebook is onto something there. The company might realize that though it can address fake news, it is unable to resolve these two systems that are fundamentally at odds. Who wants to be the whipping boy for that?
And it’s going to get worse: The high priest of misinformation, Steve Bannon, is in our next president’s ear. Our next president himself is a fake news feed. And Alex Jones — whom our next president called after the election to thank for his and his audience’s role in his victory — is already mounting a fight to fact-check “real” news. The overarching national narrative could very well, and very quickly, turn into a fiction.
And in the end, that’s what fake news has to do with the election of Trump: It’s a new Civil War, an infowar — as Alex Jones presciently named his insane publication. We’ve had a secession of identity. We’re a house divided.