How we digest news is changing.
Oversized, gaudy headlines still adorn the front pages of the world’s largest print publications; clickbait titles remain standard practice for online outlets chasing viewership; and social media has grown from a dorm-room network of millennials into a deafening echo chamber circulating baseless claims.
Traversing today’s media landscape is akin to firewalking in search of facts: it requires a diligent, tuned-in reader, a hunter of truth.
But who has the time or the desire to be so meticulous?
The average human attention span shrinks every year: recent data suggests consumers’ focus maxes out at eight seconds, shorter than that of a goldfish.
Nearly 60 percent of Twitter users share articles without reading beyond the headline.
Voter turnout in the U.S. bumbles along in the 55 percent range each election cycle, showcasing just how little Americans care about participating. And it makes sense.
Very few are engaged enough to truly apply the level of observation and scrutiny required to wade through the sheer volume of misinformation, which applies to every facet of life.
At root is the convenience of going big: big print, big talk, big lies. There’s very little retribution for doing so. The margin for error is a canyon, but the punishment for acting deceptively is negligible, which incentivizes risky behavior such as the spread of false information.
The nexus between verifiable facts, half-truths and blatant falsehoods is increasingly unsettling; they are often construed to be one and the same, depending on who releases the information and who interprets it.
For instance, half the population can knowingly absorb untruths with little concern for accuracy, citation or agenda. As long as a claim conforms to their brain’s wiring, it reaches the threshold of belief, or even truth. At first glance, there’s not much at stake in thinking this way, and, in fact, science has shown that the human brain rewards itself when it receives pleasurable news that fits its existing ideology.
Scaling this wall of dissonance is a colossal, and likely futile, exercise; the payoff for breaking down cognitive barriers is virtually nonexistent, at least at any scale that could prove momentous. So most people don’t try.
Another component to understand is the context in which information is released in the first place, and by whom.
President Trump has personified the idea of the bully pulpit, and his personal Twitter account stands in as an effective mouthpiece for disseminating critical information exactly as he intends. Short, staccato sentences, heavy exclamation and undisguised digital threats do the work of his press staff: there’s no dilution of the message when it comes directly from him.
The same could be said of other administrations (sans Twitter), because the power inherent in having the loudest and largest microphone in the world is too important to pass up. Trump, however, has singlehandedly blurred the lines of truth so thoroughly that by the time information reaches the ears and eyes of consumers, it’s no longer recognizable.
It’s this prism, through which information is filtered, that further entrenches incorrect views and fuels outright anti-truth positions. Trump is now the sole arbiter of right and wrong, the namesake of contemporary mass media: everything begins and ends with him.
And the noise coming from the deluge of Trump coverage is numbing.
The rise of Trumpism is not an isolated incident, however.
Online forums are littered with vitriolic, one-sided conversations; social media enables friends and followers to be banned for contradicting a user’s beliefs; popular news sites devote millions of dollars to broadcasting narrow-minded stories and think pieces; college campuses protest speakers with opposing viewpoints; and cable and radio news stations solicit biased guests.
So who’s to blame for the sequestering of honesty?
There’s no way to truly point fingers at a single culprit of our collective trust gap. Nearly every entity that Americans interact with has a level of residual bias that’s capable of shaping one’s opinions for life, for better or worse.
Recent polling shows that trust in the media slips each year, and that a plurality of voters believe journalists make up their sources. Trust in other core institutions has eroded as well.
On the other end of the spectrum, trust in the media is still higher than trust in Trump, which does not bode well for the president’s approval ratings. But Trump’s lackluster ratings don’t explain the broader crisis of credibility, which existed before Trump and will likely outlast his administration.
The bigger issue facing the average citizen may be selecting what form of media to believe in the future. Perhaps we’ve reached rock bottom and a renewed faith in investigative journalism and fairness of our political organizations is on the horizon.
But studies show that who shares or reports information is a better predictor of whether a person believes it than the accuracy of the information itself. In this sense, it’s our own fault if fake news spreads, and the only remedy is to hope that people place their trust in sources that are categorically more honest, a tall order.
What is known is that singular moments (good or bad) can have an outsized influence on forming opinions, and each entity that shares news is responsible for placing truth on a pedestal. This could result in social media companies refining algorithms to penalize false news sources, as Facebook has experimented with, but it also means that those very same standards we pin on external organizations must be applied at an individual level as well.
In the world of alt-facts, as George Saunders says, we are each “our own alt-president.” We must all stand trial for our own peculiar form of impeachment, or elect a better version of ourselves to the high office of everyday life.