Legal Liability for Social Media Algorithms


Nov 21, 2025

In our current world of innovation and social media, algorithms have taken over the digital sphere. Social media algorithms deliver content to users based on their interests and their interactions with previous posts (Golino, 2021). The purpose of such an algorithm is to filter through millions of posts and surface the information most likely to interest the user; however, these algorithms can cause serious problems. From reinforcing bias to promoting harmful content, algorithms can negatively affect users, especially younger and otherwise vulnerable ones. They can also produce discriminatory impacts. Because algorithms have the potential to harm users, social media companies should be held legally liable for discriminatory impacts caused by their algorithms.

Cambridge Dictionary defines discrimination as “treating a person or particular group of people differently, especially in a worse way from the way in which you treat other people, because of their race, gender, sexuality, etc.” (Cambridge Dictionary, n.d.). The personalization that drives these algorithms can produce discriminatory impacts: the algorithm gathers personal data in order to target users, which can lead to stereotyping and, ultimately, discrimination. For example, in job advertising, algorithms use “gender stereotypes [to] determine how… [to display] job ads” (Sombetski, n.d.). By relying on these stereotypes, the algorithm directly harms women in the workplace, since a woman may be ranked lower for, or never shown, a job opportunity that is shown to a man.

The discriminatory harms amplified by these algorithms should be minimized. To reduce these harmful effects, social media companies should be held accountable for discriminatory actions taken by their algorithms. Holding these companies legally liable gives them an incentive to protect their users from this harm and to curb the promotion of stereotypes. Social media platforms host a variety of users, and those third parties are currently responsible for the content they personally post. The algorithms, however, are formulated and deployed by the companies themselves, so the companies should be responsible for the harms those algorithms cause. Even when an algorithm comes from a third-party vendor, the company has either purchased it or otherwise chosen to use it; because adopting the algorithm was the company’s own decision, the company should be held legally accountable for it. One could argue that social media companies are not currently responsible for third-party activity; however, that immunity applies when the platform has little or no say in whether it hosts or presents the content. When posting is a third party’s direct decision, the third party should bear responsibility; in the case of algorithms, though, the platform itself has made the direct decision to host and deploy that third-party technology.

Currently, Section 230 of the Communications Decency Act grants social media companies immunity when “content is posted by third parties on their platforms” (Stephen, 2021). Another argument for this immunity is that liability would dampen innovation. However, innovation should come with restrictions where public safety is concerned. Innovation is important for the world to continue to grow and develop, but as we invent new technologies we should remain aware of their harms and strive to minimize them whenever possible. For this reason, some party should be held legally responsible for the discrimination users face because of these algorithms. Since social media companies host the algorithms and, through them, promote certain kinds of discrimination, the platforms should bear this responsibility. The companies that developed the algorithms should also be held legally accountable alongside the platforms when their algorithms openly discriminate.

Holding social media companies legally liable for algorithmic discrimination pushes them to put measures in place that prevent discriminatory harm. When a user experiences discrimination because of an algorithm, the responsible parties, including both the algorithm makers and the platforms that host the algorithms, should be liable. Limiting harm to users should be at the forefront of social media companies’ goals, and legal accountability strengthens that incentive.


Betsy Burrow, a student in Jon Pfeiffer’s media law class at Pepperdine University, wrote the above essay in response to the following prompt: “Should social media companies be held legally liable for discriminatory impacts caused by their algorithms?” Betsy is an Advertising major, with a minor in Multimedia Design.

Resources

Cambridge Dictionary. (n.d.). Discrimination. Cambridge Dictionary. https://dictionary.cambridge.org/us/dictionary/english/discrimination

Golino, M. A. (2021, April). Algorithms in social media platforms. Institute for Internet & the Just Society. https://www.internetjustsociety.org/algorithms-in-social-media-platforms

Sombetski, P. (n.d.). How and why algorithms discriminate. AlgorithmWatch. https://algorithmwatch.org/en/how-and-why-algorithms-discriminate/

Stephen, K. (2021, December). The social responsibility of social media platforms. https://www.theregreview.org/2021/12/21/stephen-social-responsibility-social-media-platforms/
