Blackbox algorithms: the modern-day Plato’s cave

The internet world in which we currently live is perfect. As social media users, we wield immense power to tailor who we are, picking and choosing what we want the world to see of us. Our Facebooks, Twitters, Instagrams, SoundClouds, and so on become highlight reels: presentations of perfected characters and personas we have crafted. Why would anyone post about that one shitty night they had when they weren't invited to that one big thing happening at that one place, when instead they could post a #throwbackthursday photo of last month's trip to Punta Cana with extra saturation and a C1 filter? The internet world is perfect because of the power it holds to ignore, manipulate, and conceal alternative facts.

Major contributors to this perfection and alteration online are blackbox algorithms and their interference with our access to information on social media. Blackbox algorithms operate on guesstimation: they take all the data you provide, calculate what might be relevant to you based on that information, and then show you what you may want to see. It is an imperfect process of suggestion, one that learns and improves the more data you give out and the more practice it receives.
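The guess-and-refine loop described above can be sketched in a few lines of toy code. Everything here is a hypothetical illustration, not any platform's actual system: a "relevance" score is computed from a learned interest profile, only the top-scoring posts are shown, and engagement with what is shown feeds the profile in turn.

```python
# Toy sketch of a blackbox feed loop: score posts by similarity to a
# user's inferred interest profile, show only the top guesses, then
# update the profile from what the user engages with.
# All names, topics, and weights are illustrative assumptions.

def score(post_topics, interests):
    """Relevance = sum of the user's learned weight for each topic."""
    return sum(interests.get(t, 0.0) for t in post_topics)

def rank_feed(posts, interests, k=2):
    """Show only the k posts the model guesses the user wants to see."""
    return sorted(posts, key=lambda p: score(p["topics"], interests),
                  reverse=True)[:k]

def learn(interests, engaged_post, rate=0.1):
    """Engagement nudges the profile toward what was just shown."""
    for t in engaged_post["topics"]:
        interests[t] = interests.get(t, 0.0) + rate
    return interests

interests = {"travel": 0.5, "friends": 0.4}   # inferred from past data
posts = [
    {"id": "vacation-photo", "topics": ["travel", "friends"]},
    {"id": "national-news",  "topics": ["politics"]},
    {"id": "meme",           "topics": ["humor", "friends"]},
]

feed = rank_feed(posts, interests)
# The "politics" post never surfaces, and engaging with what *is*
# shown only reinforces the existing profile.
interests = learn(interests, feed[0])
```

Note the self-reinforcing shape of the loop: the model never learns a weight for a topic it never shows, which is exactly the imperfection the paragraph describes.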

The problem with this is twofold. First, blackbox algorithms inhibit the broadcasting of news and information about current events. By promoting posts that seem relevant to you over controversial national or global news that seems foreign to your online presence, they stifle important stories. In a well-known TED talk on the subject, sociologist Zeynep Tufekci recalls how during the summer of 2014 her Facebook feed was flooded with fun posts from the ALS ice bucket challenge, just as it was for many others. Only later did she realize it had been entirely devoid of references to the Ferguson shooting controversy unfolding at the same time. We are all aware, too, of the threat these algorithms pose to efforts to curb fake news. Algorithms are the gatekeepers of the knowledge and truth we have access to online; it is the essence of blackbox algorithms to interfere with our access to information in the modern world.

Second, by keeping us in these filtered and tailored bubbles, algorithms restrict our progress toward civil discourse. Our current resources and state of technological advancement put us in an incredible position to connect with people across the world, in different cultures and political climates. Yet the workings of algorithms promote isolation from different information and viewpoints, which are necessary for greater understanding and therefore for more productive discourse. Because your friends are prioritized for their relevance, your world becomes smaller, along with what you know about it. You risk entering an echo chamber where communication comes only from the sources closest to you, such as family and friends, and from advertisements tailored to you, all of which are mostly reflections of your own values.
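The narrowing effect can be made concrete with a small simulation. Under the purely hypothetical assumption that each round of engagement with familiar sources boosts their weight by a fixed factor, the share of the feed given to "outside" viewpoints shrinks round after round; none of the numbers here are measurements.

```python
# Toy simulation of the filter-bubble feedback loop: a feed that
# repeatedly prioritizes already-engaged sources crowds out the rest.
# The starting weights, boost factor, and feed size are assumptions
# chosen only to illustrate the direction of the effect.

def run_feed(rounds=5, feed_size=10, boost=1.5):
    weights = {"friends": 1.0, "outside": 1.0}  # start with an even split
    shown_friends_per_round = []
    for _ in range(rounds):
        total = weights["friends"] + weights["outside"]
        shown = round(feed_size * weights["friends"] / total)
        shown_friends_per_round.append(shown)
        # Engagement with familiar sources boosts them further.
        weights["friends"] *= boost
    return shown_friends_per_round

history = run_feed()
print(history)  # share of familiar content grows each round
```

Even with generous starting diversity, the feed drifts toward the familiar because only what is shown can be engaged with, and only what is engaged with gets boosted.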

One might argue that there is a reason for these algorithms: these social media platforms are designed to improve our experience as consumers, so what if you actually prefer that? What's wrong with some personalization or tailoring? Perhaps you truly just don't want to hear what others have to say. However, given that online and social media serve as the second most popular source of information for Americans today, they need to be representative of reality in their diversity of news and opinion. One cannot gain an accurate understanding of the world from tailored information. With blackbox algorithms, one's communication online truly can become void of other viewpoints, which are necessary for conversation and understanding. They can inhibit, or even reverse, the development of civil discourse.

Engagement with reality offers the privilege of exposure, but unfiltered exposure and unmanipulated information are nearly impossible to find in this wildly interconnected world. Still, given the serious ethical concerns such issues raise, we ought to do what we can to take control of our access to information. We need more agency online, in both our exposure to and our consumption of information. As social media users, we must demand the power to deactivate these algorithms if we so choose. Otherwise, we knowingly permit continued interference with our access to raw and unfiltered information, and forfeit the ability to engage outside personalized bubbles of hyperreality. Without more control over blackbox algorithms, we risk trapping ourselves in our own futuristic allegory of the cave, watching shadows programmed to keep us satisfied and, ultimately, to shape our understanding and perception of the world. We risk staying alone and static, without others and without progress. We must leave the cave.