Political Polarization

"How to Win Every Argument" by Eric Barker perfectly encapsulates an idea that has only worsened in our current political climate in the United States. As a nation, we have become increasingly politically polarized, to the extent that we no longer discuss or attempt to persuade; instead, we argue. We fight for our political views as if politics were a strategy game, as if we need to win. Consequently, compromise seems to have been forgotten by a significant portion of the public. Whether we construct them ourselves or social media algorithms construct them for us, we live within filter bubbles.

Self-Filtering

Amid the lead-up to and aftermath of the 2016 presidential election, people on both sides of the political spectrum have opted to unfriend, unfollow, and unsubscribe from people and media sources that promote the opposing side. By doing so, they isolate themselves from opposing viewpoints and create an environment, whether physical or digital, in which their ideas are only reinforced instead of called into question. I am guilty of this very thing. I have chosen to unfriend people on Facebook because their ideas were too conservative for my liking. My justification has been to avoid toxicity. However, by doing so, I am actively choosing to see only my side and my views. As Drew Westen's experiment suggests, exposure to stimuli that oppose one's worldview triggers the fight-or-flight response. Political arguments become aggressive and battle-like as each side attempts to "win." Yet both sides lose, because neither can make any actual progress by shoving their views down the other person's throat. To avoid such confrontation, people steer clear of these interactions as much as possible. Thus, people self-filter and surround themselves with like-minded people and media.

Algorithm Filtering

Another source of filtering comes from social media algorithms themselves. It is becoming increasingly apparent that these algorithms curate content for each individual based on their respective interests. The purpose is to keep people on the site or platform as long as possible by continually showing them content that holds their interest. Sites like YouTube do this through video suggestions based on the video a person is currently watching. Other platforms have altered people's timelines, dashboards, and feeds to show curated content rather than the most recently posted content. Extending this logic, an algorithm will feed people content that aligns with their political beliefs in order to keep their attention and keep them satisfied while on the site. Thus, another form of filtering is in effect.
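To make the mechanism concrete, here is a toy sketch of interest-based feed curation. This is purely illustrative and assumes a made-up post format; it is not any real platform's algorithm, which would involve engagement signals, machine learning models, and far more data. The core idea is the same, though: rank posts by how well they match a user's recorded interests, and show only the top matches.

```python
# Toy illustration of interest-based feed curation (hypothetical data
# format; not any real platform's algorithm). Each post carries topic
# tags; posts are ranked by overlap with the user's interests.

def curate_feed(posts, user_interests, top_n=2):
    """Rank posts by how many of the user's interest tags they match,
    then return the titles of the top_n highest-scoring posts."""
    def score(post):
        return len(set(post["tags"]) & set(user_interests))
    ranked = sorted(posts, key=score, reverse=True)
    return [p["title"] for p in ranked[:top_n]]

posts = [
    {"title": "Tax policy debate",    "tags": ["politics", "economy"]},
    {"title": "Local bake sale",      "tags": ["community", "food"]},
    {"title": "Campaign rally recap", "tags": ["politics", "election"]},
]

# A politically engaged user sees only political content; the bake
# sale never surfaces, and the filter bubble reinforces itself.
feed = curate_feed(posts, user_interests=["politics", "election"])
print(feed)  # ['Campaign rally recap', 'Tax policy debate']
```

Because the user's clicks on curated posts feed back into their interest profile, each round of filtering narrows the next, which is exactly the self-reinforcing loop described above.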

A brief explanation of filter bubbles and echo chambers can be found in this NPR interview with Eli Pariser, CEO of the liberal news site Upworthy.
