Politics in the age of Internet algorithms

We are more than 500 days out from the 2016 presidential election, but the race has already begun to dominate headlines. It will be the 58th quadrennial U.S. presidential election, but this campaign season will be like nothing that came before.

In 2008 and even in 2012, social media, mobile technology and the general sophistication of the Internet were still in their early stages. The role the Internet plays in how our elections play out can no longer be ignored. But what does that mean to those of us who do not work in campaign strategy?

In this digital age, more than ever before, we have the power of choice. I don’t have to watch whatever movie Hallmark is playing at 8 p.m. I can choose what I want to watch on Netflix. I can choose to skip those terrible political ads on television. I choose my own ad-free Spotify playlist, tuning out the radio babble. I decide what I want when I want it, right?

Well, that is only partially true. It seems that now a Filter Bubble algorithm makes many of these decisions for me. A Filter Bubble is a “result of a personalized search in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behavior and search history),” according to Internet activist Eli Pariser. “As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles.”

This invisible algorithmic editing of the Internet is used by all of the Internet giants. Powered by heavy commercial incentives, algorithms are used because companies know that if they feed you the “right” content, you will spend more time on their sites. You make them more money. Our increasing use of these Internet services proves the value of the algorithm to companies such as Facebook or Yahoo. Personalization algorithms are not limited to social media or search engine sites. Media outlets such as the New York Times and even the Washington Post have begun to flirt with this method of attracting Internet traffic.
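To see why two people end up with different feeds, consider a toy sketch of engagement-driven personalization. This is a deliberately simplified illustration, not any company's actual algorithm: the `personalize` function, the story topics and the interest sets are all hypothetical, standing in for the far richer signals real systems use.

```python
# Toy illustration of a personalization algorithm (NOT any company's
# real system): rank stories by how well their topics overlap with a
# user's inferred interests, e.g. from past clicks.

def personalize(stories, user_interests):
    """Return stories sorted so those matching the user's
    interests come first; users with different histories
    therefore see different orderings of the same content."""
    def score(story):
        return sum(1 for topic in story["topics"] if topic in user_interests)
    return sorted(stories, key=score, reverse=True)

stories = [
    {"title": "Tax plan debate", "topics": {"politics", "economy"}},
    {"title": "Playoff recap", "topics": {"sports"}},
    {"title": "New phone review", "topics": {"tech"}},
]

# Two users, same stories, two different "front pages."
feed_a = personalize(stories, {"politics"})   # leads with politics
feed_b = personalize(stories, {"sports"})     # leads with sports
```

The commercial logic follows directly: the closer the top of the feed matches what you already click on, the longer you stay, so the stories that score lowest for you quietly sink out of view.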

Your Google search results and your Facebook timeline will never be the same as those of the person next to you. Even if you are not logged into a specific account, there are “57 signals that Google looks at, everything from what kind of computer you are on, to where you are located. There is no standard Google anymore,” according to Pariser. As more and more people discover news and content through personalized feeds, the stuff that really matters can easily fall out of the picture.

Pariser has been an open critic of the algorithmic personalization of the web and its potentially harmful effects on our collective future. Faced with backlash, Facebook’s data science team put part of the Filter Bubble effects theory to the test and published a (heavily criticized) study. “We conclusively establish that on average in the context of Facebook, individual choices more than algorithms limit exposure to attitude-challenging content,” the three authors, Eytan Bakshy, Solomon Messing and Lada Adamic, wrote. “Our work suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals.” In summary, Facebook claims it’s our own fault.

Whether you believe personal choice or personalization algorithms are the dominant cause of the Filter Bubble effect, the distinction hardly matters: the Filter Bubble is a phenomenon that can easily have a significant effect on how you view reality.

As the election season carries on, it is important to be aware that the opinions and stories you see on the web are strongly influenced by what these algorithms think you should see. The Internet shows you what you want, whether by your active choice or as the product of an algorithm. Your news feed and your search results are a reflection of your own interests and position in society.

Despite what Facebook repeatedly claims, the algorithms do significantly influence what you are exposed to. However, Facebook is partially correct in one regard: Choice is also a factor, so please choose to look beyond your personal reality.

Sarah Catherine Smith is the digital content editor of the N.C. State newspaper Technician in Raleigh.