The Consequences of Social Media Require Regulation


Jan 30, 2023

Today, more people than ever before have access to a vast stream of information on everything from movie plot synopses to recipes to the price of a neighbor's house. Additionally, social media networks "allow users to have conversations, share information and create web content" with people across the globe, according to the University of South Florida's University Communications and Marketing webpage. This information and connectedness, however, come with consequences — consequences that point to a need for regulation.

The Social Dilemma, a film about the dangers of sustained social media use, details some of these consequences. One is the danger social media poses to democracy. Jonathan Haidt and Tobias Rose-Stockwell wrote in the Atlantic's "The Dark Psychology of Social Networks" that in the Federalist Papers, Madison thought the United States was uniquely suited for democracy because it was "hard for anyone to spread outrage over such a large distance." However, social media's ability to spread information instantly undermines the protection once afforded by the country's size. If someone in Los Angeles and someone in New York City can see a post containing disinformation at the same moment, the vastness of America offers no defense. "Nuance and truth are casualties in this competition to gain the approval of the audience," write Haidt and Rose-Stockwell. Because of this need to gain an audience's approval, content gets more extreme, and algorithms promote that content, widening the gaps between opinions.

Additionally, as Haidt and Rose-Stockwell write, social media platforms' desire to expedite the user experience results in fewer bumps in the road — making it easier to send messages and information without thinking them through. All of this combines to make social media a minefield for democratic thought and action. As outrage builds, social media users devolve into an "us vs. them" attitude, crowding out moderate opinions and moving people farther and farther away from impartial observation.

Social media also has a marked mental effect on its users. Users report feelings of inadequacy, isolation, addiction, depression, anxiety and self-absorption, as well as an increase in cyberbullying, according to Help Guide.

However, just because the government should regulate social media does not mean it is able to. The average age of a House of Representatives member is 58, and the average age of a Senator is 64, according to the Library of Congress Research Guides. This is well above the age of a member of Generation Z, the generation that uses social media the most, according to Pew Research Center's Social Media Fact Sheet. And let's face it — we've all seen CNETTV's video of Mark Zuckerberg explaining the internet to members of Congress. If Congress cannot understand the internet, how can it be expected to regulate it effectively?

Social media itself is addictive — that is how it makes a profit. The goal of a social media company is to get as many people as possible hooked on the product. For this reason, the companies themselves cannot be trusted to set boundaries that put the consumer's best interest at the forefront of their decisions — and yet these media organizations might be the only hope for effective regulation. Bad regulation might be worse than no regulation: as Pew Research Center shows, 72% of adults use at least one social media site, making these platforms an effective way to get information to a large group of people quickly. In the case of an emergency like a mudslide or fire, the faster information moves, the better.

Twitter's crisis misinformation policy seeks to place misinformation alerts on false coverage or accusations — and prioritizes investigating high-profile accounts first. This means that the more likely information is to be seen, the quicker it is vetted for incorrect claims. Information found to be false is suppressed in an effort to limit harm. These alerts are one step toward harm mitigation. Another option is including a warning on harmful content, such as posts promoting disordered behavior or suicidal ideation.


Samantha Torre, a student in Jon Pfeiffer’s media law class at Pepperdine University, wrote the above essay in response to the following question: Should the government regulate social media platforms such as Twitter, Facebook, TikTok, YouTube or Instagram? The class covers copyright and social media. Samantha is a journalism major.
