Business

Are Addiction-Inducing Algorithms in Social Media Lawful?

Studies have revealed the harmful effects of social media. These harms are no coincidence: social media companies intentionally craft algorithms to manipulate users into giving their apps as much attention as possible. But it is still an open question whether they can be held legally responsible for doing so.

A depressed man chained and shackled to a giant smartphone. © Golden Sikorka / shutterstock.com

January 04, 2024 07:00 EDT

Addiction to social media applications is a growing concern worldwide. There is not yet a consensus on how severe and widespread the problem is; socioeconomic, technological, cultural and even geographic factors play a role in its prevalence. Some researchers estimate that social media addiction affects over 5% of the global population of Internet users. This figure may be an underestimate: aggregated studies covering 32 countries show that over a fifth of their populations have already developed some degree of social media addiction.

Researchers, including medical professionals, agree that social media is addictive and that its excessive use can lead to the development of mental health issues among its users. They point, for instance, at how excessive use of social media might interfere with aspects of a healthy life, including regular sleep, fulfillment of commitments, exercise or socialization. In extreme cases, usage can even lead to life-endangering scenarios. These risks are higher among younger or vulnerable individuals who might not regulate their social media consumption.

Who is responsible for social media addiction?

Despite the damage social media addiction causes, in consumerist societies users are expected to hold themselves accountable for their social media use, since individuals are assumed to be making their decisions freely. As with any other potentially addictive yet legally commercialized product, managing consumption is left to the consumer.

Consider, for comparison, tobacco and alcohol. Despite overwhelming evidence of the negative health consequences of their use, more than a billion people around the world continue to consume both substances of their own purported free will. Some might claim that comparing social media to such products is unfair: social media is not a chemical substance and does not have anywhere near the same effects on the human body as alcohol and tobacco. Yet, just as happened with the alcohol industry in the 1920s and the tobacco industry in the 1960s, a debate is growing over the marketing behind the product. Individuals are asking whether social media companies intentionally create addictive products and whether legal action against them is possible.

The companies’ motivation is clear: their profits skyrocket with increased interaction. Social media companies invest significant resources in creating, developing and perfecting their platforms to drive ever more user engagement. They monetize the time users spend looking at photos, watching videos, posting messages or sharing memes. By tailoring content to the extreme for their audiences, they manipulate users into spending more of their time consuming content. In turn, users create their own content, feeding a cycle of constant engagement. Unfortunately, and intentionally, this process thrives on the human need for personal connection and social approval.

Algorithms manipulate our minds — intentionally

This level of hyper-specific content tailoring is only possible because of carefully crafted algorithms. An algorithm is a set of instructions, made of text strings and mathematical operations, designed to deliver an answer or perform a task. Algorithms are useful for sorting data, processing it into information and using that information to solve a problem. Some authors have likened algorithms to cooking recipes: like an algorithm, a recipe needs structured input to produce an output. After all, a recipe is just a collection of words until it is structured logically.
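
To make the recipe analogy concrete, here is a minimal, purely illustrative sketch in Python. It has nothing to do with any platform’s actual code; it simply shows what a set of instructions that turns structured input into an output looks like in practice.

```python
# A minimal, illustrative "recipe": selection sort.
# Like a cooking recipe, the algorithm takes structured input (a list of
# numbers) and follows a fixed sequence of steps to produce an output
# (the same numbers in order).
def selection_sort(values):
    items = list(values)  # work on a copy so the original input is untouched
    for i in range(len(items)):
        # Step 1: find the position of the smallest remaining item
        smallest = min(range(i, len(items)), key=lambda j: items[j])
        # Step 2: swap it into place
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(selection_sort([5, 3, 8, 1]))  # prints [1, 3, 5, 8]
```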

All algorithms are created with a purpose. Their intent is determined by the person, or group, designing them. With the help of multiple algorithms, computers can “learn” how to find better solutions to a problem and improve the relevance of their outputs. This is called machine learning. Machine learning can train an algorithm to sift through a user’s internet searches and learn about, for instance, their love for vintage cars. The frequency or specificity of a query can factor into the algorithm’s behavior. Then, a company can use the algorithm’s findings to tailor search results for that user. It may generate advertisements about vintage cars more often to heighten the likelihood of the user’s interaction with them. This engagement is one way that search engines and social media companies generate revenue.
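
To illustrate the principle, the toy sketch below infers an interest in vintage cars from a hypothetical search history by simple keyword counting, then ranks ads accordingly. The data, topics and scoring method are invented for this example; real platforms use far more sophisticated machine-learning models, but the underlying idea of inferring interests from behavior and tailoring output to maximize interaction is the same.

```python
from collections import Counter

# Hypothetical data, invented for illustration only.
SEARCH_HISTORY = [
    "1967 mustang restoration tips",
    "vintage car shows near me",
    "best hiking trails",
    "vintage car insurance cost",
]

TOPIC_KEYWORDS = {
    "vintage_cars": ["vintage", "mustang", "classic car"],
    "outdoors": ["hiking", "camping", "trail"],
}

def infer_interests(history):
    """Score each topic by how many searches mention one of its keywords."""
    scores = Counter()
    for query in history:
        for topic, keywords in TOPIC_KEYWORDS.items():
            if any(word in query.lower() for word in keywords):
                scores[topic] += 1
    return scores

def rank_ads(ads, interests):
    """Show ads for the user's strongest inferred interests first."""
    return sorted(ads, key=lambda ad: interests[ad["topic"]], reverse=True)

ads = [
    {"topic": "outdoors", "text": "New tents on sale"},
    {"topic": "vintage_cars", "text": "Classic car auction this weekend"},
]

interests = infer_interests(SEARCH_HISTORY)  # vintage_cars scores highest here
for ad in rank_ads(ads, interests):
    print(ad["text"])
```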

Algorithms do not write themselves yet. They need humans to create them. Even when algorithms are created using machine learning, a human has to specify the desired behavior for the machine to learn how to do it.

To maximize potential profit, social media companies invest a considerable share of resources, including time, money and work hours, in better understanding their user base. It should be unsurprising, for example, that in 2021 alone, Alphabet (the parent company of Google), Meta (formerly Facebook) and Twitter (now called X) collectively spent nearly $57.5 billion on research and development (R&D). For comparison, that is almost $25 billion more than all the companies, research institutes, universities and government-owned laboratories in Türkiye invested in R&D that same year.

The claim that social media companies are not liable for creating addiction among their users is implausible, to say the least. The amount of resources invested in these algorithms and the intent of their parent companies paint a picture of corruption.

What legal mechanisms can protect us from addictive algorithms?

However, the existing legal frameworks regarding addiction-inducing algorithms are complex. They often must strike a balance between too much and too little transparency. On one hand, users have the right to know which algorithms influence their lives. They should be informed about how a platform’s code can control their decision-making capabilities. On the other hand, these well-crafted algorithms are trade secrets for their companies.

Even if a company made the design of its algorithms public, it would likely be too difficult for users to understand. In this situation, individuals would be left in a weak, passive position as consumers, with established social media platforms dictating almost all the terms of the relationship. One option for a consumer would be to use the product without fully understanding its dangers. Another would be to place the algorithms under the scrutiny of an independent authority, such as a court or third party, that would oversee their use on consumers’ behalf. In reality, however, the law has not caught up with the ethical issues brought about by technological advancements. So, both options are non-starters at this point.

Although some countries already impose restrictions on social media platforms (most often motivated by political interests or socially conservative preoccupations), there are still no regulations that specifically protect users from addiction-inducing algorithms. Such regulations must secure individuals’ right to know how the platforms affect their mental health, because the danger of social media addiction is a critical public health issue. Yet attempts to legislate along these lines have so far been limited.

Some countries have been working separately on public health initiatives to ensure the effective protection of digital consumers. These endeavors aim to minimize the harmful effects of social media use on their populations. Yet lawmakers must go beyond attenuating the effects of social media use; they must address the root of the problem: the creation of addiction-inducing algorithms. For social media companies, greedy objectives trump individuals’ right to a healthy life. Unfortunately, it is unlikely that laws will be enacted fast enough to keep pace with the rapid growth of technology. Given their lobbying power, these companies will substantially slow any attempt to regulate their platforms.

For this reason, it is almost inevitable that the use of addiction-inducing algorithms in social media will remain lawful for years to come, despite the evidence of their harmful effects on users around the world.

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.
