Anonymous sources recently claimed that Facebook’s Trending Topics may not be as algorithm-driven as believed. Facebook strongly denied the allegations, but are the algorithms receiving help from curators to surface topics in the Trending box? With so many people getting their news from Facebook, is it ethical to emphasize or devalue certain news in the trending topics?
Citing anonymous former Facebook employees, Gizmodo reported that the trending box was manipulated to suppress conservative content. Additionally, the sources alleged that they were instructed to insert topics into Trending simply because they came from major news organizations, not because they were organically trending.
Facebook is a primary source of news for many users now, and that segment is growing. Therefore, it seems deeply unethical that Facebook would manipulate a feature users expect to be based on buzz. Facebook denied the accusations, yet the allegations have provoked inquiries from members of Congress.
Facebook isn’t necessarily struggling to compete with other networks, but when it comes to live news, Twitter comes first. In some ways, Facebook may be losing out because of how it surfaces popular stories. Twitter favors timeliness, while Facebook favors critical mass, which means Facebook will always be a step behind. Manipulating the trends would be one way to stay current.
Fighting the spread of bad information can also be a challenge because of the user-generated nature of news and trending topics on Facebook. Likes and shares for primary sources, particularly mainstream science publishers, pale in comparison to those for conspiracy theory publishers on the site. Hoaxes, conspiracy theories and other bad information tend to spread like wildfire there.
When we add the filter bubble effect of personalization and individual content selection, and the amplifying nature of internet outrage, the reality may be that social networks are not a good place to get news.
Obviously, Facebook doesn’t want to disseminate dubious news through its most powerful news discovery tool. But what if that’s what users are talking about? In its attempts to be a better news source, Facebook may be at odds with a user base that isn’t interested in facts.
Users come to networks with their biases, and sometimes the networks they use will reinforce their biases. Facebook’s statements claim that it has no interest in supporting biased content, but is it reasonable to assume that human content gatekeepers will remain perpetually unbiased in their duty?