Facebook was a crazy place leading up to the election; however, much of the content that was shared did little to change people's views on the issues. According to Pew Research, only 20% of people have changed their minds because of information they encountered on social media. I believe the percentage is so low because Facebook showed you only one side of the story: the side you already believed. Because of this, the Facebook algorithm played a large role in creating one of the most divisive and polarizing elections the United States has ever seen, as it strengthened individuals' existing beliefs and did little to expose them to the other side.
Facebook’s newsfeed algorithm is set up to show individuals content they are most likely to find interesting and click on. If your views skewed liberal, you would see more and more liberal content; if your views skewed conservative, you would see more and more conservative content. The Wall Street Journal actually curates content from each side and lets people see both simultaneously in “Blue Feed, Red Feed: See Liberal Facebook and Conservative Facebook, Side by Side.” The Facebook algorithm shows individuals only content that aligns with their views in order to increase engagement. This created what many, including a recent TechCrunch article, have called an “echo chamber.” The echo chamber simply reinforced existing beliefs and pushed people further and further apart.
The fact that Facebook considers itself a “technology company” rather than a “media company” further exacerbates the issue, because Facebook has done little to verify that the content shared on its network is factual. This allowed fake news stories to become extremely popular on the network, as many have suggested in recent days. The Washington Post calls attention to this in the article “Facebook has repeatedly trended fake news since firing its human editors.” A BuzzFeed article highlights how people saw this as an opportunity to make money, uncovering how some teens in Macedonia set up a slew of pro-Trump websites, created fake news stories, and disseminated them on Facebook for profit. Because these fake news articles were never verified, and because 83% of people trust the recommendations of friends and family, according to Nielsen, these stories were perceived as reality by many on Facebook. The stories were then shared and amplified, and the echo chamber effect continued to reinforce existing beliefs. This made Facebook the ultimate propaganda machine for this election cycle.
Facebook, and specifically CEO Mark Zuckerberg, has been getting a lot of heat for this. Many, including myself, believe this is a huge problem that Facebook chose to ignore during the election. According to BuzzFeed, even some Facebook employees feel the newsfeed and the promotion of fake news had an impact on the election, prompting them to form an unofficial task force to investigate. Facebook did take some action in recent days, announcing that it has updated the language of its Audience Network policy to explicitly include fake news sites among those banned from its ad network. This came on the heels of a Google announcement that it will ban fake news sites from using its advertising platform. According to a New York Times article, “Taken together, the decisions were a clear signal that the tech behemoths could no longer ignore the growing outcry over their power in distributing information to the American electorate.”
I know I want a free internet, but with that freedom comes the ability to say anything, regardless of fact. It will be interesting to see how these tech giants continue to navigate this issue while remaining independent.