In August, Facebook released a statement in its newsroom explaining that humans no longer write descriptions for the Trending feature on the sidebar of your Facebook profile; instead, the whole product will now be automated.
Back in July, I wrote a piece examining Facebook’s attempt to return to its roots as a platform about people. That piece came after Facebook announced it was steering away from an algorithm-driven news feed so you would see less from publishers and more from the people you actually interact with on its platform. Facebook also stopped recommending friends and events based on your location; instead, recommendations are now made based on a human opinion as to whether you actually know a person or would want to attend a specific event.
But, in light of recent news about changes made to the Trending module, it seems Facebook is putting algorithms first again, only to find out that automation doesn’t lead to quality content.
Algorithms are Flawed
The main reason Facebook decided to switch to algorithms is, as its statement on the matter says, to help expand coverage in more languages and cater to more countries. Basically, Facebook wants to be a news source for more users abroad, and the only way to reach that scale (and to do so cost-effectively) is to automate the process.
Which is exactly what Facebook did: it fired most of its editorial contractors (mostly journalists), who wrote descriptions for news stories and confirmed their accuracy, trading the steady editorial hand of humans for an algorithm. Unfortunately, the company learned the hard way just how flawed algorithms can be.
First, there was a video from iHeartdogs.com depicting a dog’s adorable reaction to seeing its owner after two years. The description on the video said something along the lines of, “You have to watch this dog’s reaction,” thus resulting in the algorithm posting the video under the heading, “Watch Dogs 2,” referring to the soon-to-be-released video game about hackers and cyber security.
Second, there was the more serious Megyn Kelly incident, in which a fake story about the Fox News reporter remained in the Trending news section for hours before anyone on the Trending team realized. Talk about an embarrassing fumble for a platform that provides the news for 44% of the U.S. population.
And finally, there was this, which recently happened to me. Take a look at my Trending topics feed:
The story about Dropbox tells me about a data breach, and the story about Hawaii makes me aware of two hurricanes bearing down on the Big Island. Then there’s something trending about Instagram, and I would assume the top story is about the new pinch-and-zoom feature that has finally been released.
But, no, alas, when you hover over and click the story, the top trending piece is about Justin Bieber’s new Instagram account for his dog. It only lasted for a brief amount of time, but, at one point, the algorithm deemed this story the top-ranking piece of content and auto-generated a description, photo and link to a Teen Vogue article. Was it removed because a human decided it did not belong on the popular page? Or did the algorithm eventually push the story back down?
Humans do still make decisions about trending topics
An algorithm now determines which stories are popular, though a human still curates which of those stories make it into the Trending bar. A Facebook editor will then remove inappropriate and fake articles, hopefully faster than the team managed to take down the fake stories above.
It’s possible this change is a response to a story Gizmodo published back in May (whether or not Facebook wants to admit it was the catalyst), when reports revealed that employees on the Facebook Trending team were routinely suppressing conservative stories.
In its statement about the new Trending module, Facebook notes that, “Earlier this year, we shared more information about Trending in response to questions about alleged political bias in the product. We looked into these claims and found no evidence of systematic bias. Still, making these changes to the product allows our team to make fewer individual decisions about topics.”
It’s important to note that the statement describes the changes as allowing employees to “make fewer individual decisions about topics.” This is a vague description of how involved these editors are in the process. Sure, maybe humans will have a smaller impact on what shows up in the Trending section, but they still have the opportunity to impose their bias on what we see, because the algorithms themselves are human-made. Moreover, editors still have the chance to filter out what they consider inappropriate or fake news.
All that being said, humans are still determining what you end up seeing as trending news on Facebook and the company continues to be vague about exactly how involved they truly are.
What’s trending and what should be trending
I’ve always been annoyed by the Facebook Trending module, in large part because of how much clickbait and unimportant news it tells me to read. Before it had a so-called anti-clickbait algorithm in place, Facebook would trend articles based on how often users shared them, making it incredibly easy for stories from BuzzFeed and Upworthy to trend. I can’t tell you how many times my Facebook Trending section has been filled with “important news” about the Kardashians, Dancing With The Stars and whatever new jaw-dropping video I just have to see about a young kid who absolutely killed their audition on America’s Got Talent.
Now, with this new Trending module, Facebook is using a mix of an algorithm and the patterns of people who seemingly don’t know the difference between a reputable news source and a straight-up fake one. There seem to be many flaws in how these humans and algorithms work together to determine what should be viewed.
But where I get annoyed is how Facebook determines what’s trending and what should be trending. Considering that almost half of Americans use Facebook to get their news, that gives a lot of power to a company that is continually vague about how it determines what pops up in someone’s Trending section. In an attempt to “remain neutral,” Facebook doesn’t seem to recognize that it is, in fact, operating like a newspaper.
When the majority of my Trending feed continues to be about entertainment news and celebrities, I have to wonder—is this really what everyone is talking about and cares about? Or is it just Facebook surfacing content it wants us to react to? Why are certain articles trending more than others, especially when said article comes from a news source many of us have never heard of? Does Facebook even understand how its Trending module works? Again, I think it comes down to Facebook’s confusing cross of a part-human, part-algorithm process in which both sides carry bias and neither can actually determine whether something is trending by accident or not.
At the end of the day, maybe people just shouldn’t rely on Facebook Trending topics for finding their news. But how do you tell that to billions of users who do?