All content-based social media platforms, from YouTube to Twitter, are built upon complicated algorithms. These algorithms are coded to feed users content from beyond their existing follow lists, and they work in one (or both) of two ways: surfacing content they predict you’ll like (often drawing on the likes of accounts you follow, along with similarities to content you’ve previously liked), and surfacing content you’ll engage with the most.
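To make the idea concrete, here is a toy sketch of those two signals, similarity to past likes and predicted engagement, combined into one ranking score. This is a hypothetical illustration, not any platform’s actual code; the field names and the simple additive scoring are assumptions for the example.

```python
# Toy feed-ranking sketch (hypothetical; real platforms use far more
# complex models). Each post is scored by two signals described above:
#   1. similarity: overlap between the post's topics and topics the
#      user has previously liked
#   2. engagement: a predicted-engagement signal (e.g., expected comments)

def rank_feed(posts, user):
    """Return posts sorted by a combined similarity + engagement score."""
    def score(post):
        similarity = len(post["topics"] & user["liked_topics"])
        engagement = post["predicted_comments"]
        return similarity + engagement  # equal weights, chosen arbitrarily
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "cat-video", "topics": {"pets"}, "predicted_comments": 2},
    {"id": "hot-take", "topics": {"politics"}, "predicted_comments": 40},
]
user = {"liked_topics": {"pets"}}

ranked = rank_feed(posts, user)
# The divisive post outranks the on-interest post because the
# predicted-engagement signal dominates the combined score.
```

Even in this stripped-down model, a post expected to generate heavy comment activity beats a post the user would merely enjoy, which is the dynamic the rest of this essay describes.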
The algorithm. Engagement. These are big ol’ buzzwords in the content creation sphere, and both, in their own way, feed the flames of division and political polarization. Success on social media is all about cracking the algorithm and keeping people engaged. For some companies or groups, this means keeping their content light, fun, and politically neutral. But for most, content succeeds when it is carefully built to piss half the viewers off and rally the other half to defend it.
The light, politically neutral posts have a chance at garnering likes, brief and happy comments, and maybe a share with a close friend. Meanwhile, the divisive posts prompt paragraph-long comments from the viewers who love them and the viewers who hate them. Those viewers wind up debating in the comment sections, driving up the comment count while letting the content run over and over in the background. These posts, and the comments on them, often force viewers to pick their side of the mob, inching further out toward the passionate fringes of their political beliefs with every post. Their brains feed them rushes of dopamine to associate with the feeling of passion and belonging they get when typing to defend those they agree with or attack those they don’t. Then all of it loops with more posts, an endless feed, just one more hit beyond every scroll.
Another factor that contributes to the political division we see online is the idea that everyone who doesn’t believe the same as we do must be ‘blind’ or ‘stupid’. Clearly they aren’t paying attention to the information they have. Unless they don’t have it. Platforms combining the two aforementioned algorithm types will interweave the kinds of content they’re showing you. You’ll see a few posts that you’re likely to like, from political leaders, news organizations, influencers, or even bot pages that believe the way you do. You get lulled into a political mean world syndrome, assuming the world you’re seeing on your specific feed is how life really is for everyone. Then the divisive post comes, and it’s hard to fathom how the people you’re disagreeing with in the comments have missed it all. The thing is, they’re likely getting a completely different strand of information, maybe even on the same topics and events.
So, what could possibly be done to fix all this and bridge the political divide? Call me a pessimist, but short of a complete shutdown of major social media platforms, there is no feasible solution. Any real fix would take some sort of government regulation of social media content, a thorough fact-checking system, and an overhaul of the existing algorithm structure. With the way things are built now, so long as we have any social media installed on our devices, or are even surrounded by others who do, we are locked in a mental workhouse, tricked into exchanging valuable mental real estate for fleeting hits of dopamine. Until we are all seeing the same content and being given the same facts, we can’t expect to agree on anything. We can never expect basic reason to bring us to a consensus on any topic when we are getting our information from sources that profit off of division.
Marlowe Lalonde, a student in Jon Pfeiffer’s media law class at Pepperdine University, wrote the above essay in response to the following prompt: “The Influence of Social Media on Political Polarization: Assess how social media contributes to political polarization and the potential solutions to mitigate this effect.” Marlowe is a Public Relations major with a focus in Sport, aspiring to a career in PR and Communications in international motorsport.