Editorial: Show Me What I Can’t See

Instagram. Facebook. Twitter. Snapchat. YouTube. Ever wonder how these sites curate your content? The truth is: not well. Social media algorithms divert us from connecting with opposing viewpoints, stalling progress.

New applications of artificial intelligence continue to plague our world with inaccurate information that narrows our viewpoints even further. The reality is that media has an enormous influence on public opinion, as the 2016 election made clear.

Sure, social media can evoke a good laugh and act as comic relief from the horrific events that have spread like a gross STD. However, there needs to be a balance between being playful and being serious.

The content that floods our devices serves as static noise, hiding the real political climate of our country. I mean, how many times can you say that an article accurately depicts all sides of a topic or argument? Isn’t that what makes a true news story?

The Berkman Klein Center for Internet and Society at Harvard University has conducted research on algorithmic curation and other forms of automated content control. This work engages the idea of the “filter bubble,” a term used to describe feeds of content tailored specifically to a user’s existing viewpoints.

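The filter-bubble idea above can be illustrated with a toy ranking function. This is only a minimal sketch of content-based tailoring, not any platform’s actual algorithm; all post titles, topics, and data here are invented for illustration:

```python
from collections import Counter

def rank_feed(liked_topics, candidate_posts, k=3):
    """Rank candidate posts by overlap with topics the user already
    engages with -- the basic mechanism behind a 'filter bubble'."""
    profile = Counter(liked_topics)
    # Score each post by how strongly its topics match the user's history.
    scored = sorted(
        candidate_posts,
        key=lambda post: sum(profile[t] for t in post["topics"]),
        reverse=True,
    )
    return scored[:k]

# A hypothetical user who only engages with one side of an issue...
liked = ["tax cuts", "tax cuts", "small government"]
posts = [
    {"title": "Why tax cuts work", "topics": ["tax cuts"]},
    {"title": "The case for public spending", "topics": ["public spending"]},
    {"title": "Shrinking the state", "topics": ["small government"]},
    {"title": "Debating both sides of tax policy",
     "topics": ["tax cuts", "public spending"]},
]
feed = rank_feed(liked, posts, k=2)
# The purely opposing article never makes the cut.
```

Even in this tiny sketch, the article that only represents the opposing view is ranked last and filtered out, which is exactly the narrowing effect described above.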
Sounds pleasant, right? Who doesn’t want to see more of what they like? The problem is that this kind of system breeds more extreme views among users, blinding them to the points of the opposing side.

This polarization creates a disconnect between individuals, stopping them from ever reaching a middle ground. This is exactly what we don’t want. Being open-minded and accepting of other views is virtually the only way to satisfy the greatest number of people. Not everyone is going to be happy with every decision. However, if we were more accepting of new ideas, we could craft solutions to the problems we face today.

Not only are these kinds of algorithms hiding the opposing side’s arguments from us, but they are also shaping our political and individual identities. The gravity of this goes largely undiscussed, and it is important to understand a system we live by on a daily basis.

That being said, there is always a positive to a negative. This algorithmic system pulls together information from your content consumption and creates a stream of posts in line with your views. That means others with the same views are being grouped into the same section as you. There is great power in numbers, and by grouping individuals of the same political stance, these systems make it easier for like-minded people to communicate and organize ways of setting things in motion.

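The grouping effect just described can be sketched in a few lines. This is a hypothetical stand-in for the engagement-based clustering real platforms perform; the usernames and view labels are invented:

```python
from collections import defaultdict

def group_users(user_views):
    """Bucket users by their dominant view label -- a toy stand-in
    for how curation systems cluster like-minded people together."""
    groups = defaultdict(list)
    for user, view in user_views.items():
        groups[view].append(user)
    return dict(groups)

# Hypothetical users and their dominant political views.
views = {
    "ana": "pro-tax-cuts",
    "ben": "pro-public-spending",
    "cam": "pro-tax-cuts",
}
groups = group_users(views)
# Like-minded users end up in the same bucket, able to find each other.
```

Once users land in the same bucket, organizing among them becomes trivial, which is the “power in numbers” upside the paragraph above points to.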
Unfortunately, at the moment there is no remedy for this ailment. If only we could take a pill to get rid of this headache of a problem. What we can do is be more conscious of our content intake and move towards being more understanding of other views.