Filtering News Feeds & Fake News on Facebook: It’s All in the Terms & Conditions

Recently released studies show that many people are unaware that their Facebook news feed is being filtered by the social network. Facebook uses an algorithm that looks at what videos people watch, what content they click on, and who they interact with through “likes” and “comments,” and, using that information, it filters new content to show things that are “more of the same.”
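Facebook's actual ranking system is proprietary, but the "more of the same" dynamic described above can be illustrated with a toy sketch: rank candidate posts by how often the user has previously engaged with each post's topic. The function name, the topic labels, and the data shapes here are all hypothetical, chosen only for illustration.

```python
from collections import Counter

def rank_feed(candidate_posts, engagement_history):
    """Toy engagement-based ranking (not Facebook's real algorithm).

    candidate_posts: list of (post_id, topic) tuples.
    engagement_history: list of topics the user has liked, clicked,
    or commented on.
    """
    # Count past engagements per topic; a higher count means more affinity.
    affinity = Counter(engagement_history)
    # Posts on heavily-engaged topics rise to the top of the feed.
    return sorted(candidate_posts,
                  key=lambda post: affinity[post[1]],
                  reverse=True)

history = ["politics", "politics", "sports", "politics"]
posts = [("p1", "science"), ("p2", "politics"), ("p3", "sports")]
print(rank_feed(posts, history))
# The politics post ranks first; the never-engaged science post ranks last.
```

Even this crude version shows the feedback loop: the more a user engages with one kind of content, the more of it they are shown, and the less room remains for anything else.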

What ends up happening as a result of these “filter bubbles” is that people mostly see content they already agree with, while the Facebook algorithm screens out dissenting points of view.

This makes it easier than ever for propaganda, in the form of “fake news,” to spread among groups of people who share the same beliefs and interests, and that propaganda can come from anywhere in the world. For example, teens in Macedonia were found to be behind a large portion of fake pro-Trump news.

In the wake of the Parkland school shooting, the danger of this filtering became apparent when students who survived the shooting and publicly called for greater gun control were accused of being hired crisis actors in a series of videos that went viral on conservative, pro-Trump news feeds. Some were quick to condemn these attacks on the victims of the shooting, but others genuinely believed that the teenagers were hired actors paid to push a “left-wing agenda.” Nor is this the first time that videos questioning the validity of a mass shooting have circulated afterward. Following the Sandy Hook shooting, for example, there was a slew of stories claiming that it was all a hoax.

This is not to say that Facebook’s algorithm is responsible for creating the propaganda. Propaganda existed long before the internet and Facebook were created. Take what happened in Little Rock in 1957, for example. There, the “nine African American teens who braved racist crowds to enroll in Little Rock Central High School in Arkansas” were accused of being paid protestors brought down from the North by the NAACP, when they were actually the children of local residents. The difference between then and now is that those false stories circulated in newspapers, and people did not have the ability to quickly search for evidence to the contrary. Today, a world of information is a few keystrokes away, but many do not search for it. They do not search because they do not think they have to; they are unaware that they are not getting the full picture.

Though Facebook denies it is a media company, almost half of U.S. adults get their news from Facebook. As a result, almost half of U.S. adults get news that is highly filtered based on who they are friends with and what political affiliations they have indicated through their page. The news is also filtered based on emotional content; Facebook has admitted to manipulating users’ feeds to adjust how many positive and negative posts they saw, because it found that positive feeds lead to positive posts, and vice versa. Facebook argues that its news feed algorithm “only slightly decreases users’ exposure to news shared by those with opposing viewpoints,” because individual choices, in terms of friends and “what content they select, matters more than the effect of algorithmic sorting.” But when a major source of news shelters people from information that contradicts their beliefs, it is not doing those individuals, or society as a whole, any favors.

Is it legal for Facebook to filter news feeds and manipulate what information users see? Users agree to such manipulation in Facebook’s terms and conditions, which are long, complicated, and, for the majority of people, impossible to understand. The good news is that Facebook users can alter what information appears in their feeds. They can do so by intentionally seeking out content contrary to their own views, becoming friends with individuals whose views differ from their own, and following a wider range of pages. The question is whether they will choose to do so.