
Social Media Regulation: What Happens If It Becomes Law?

Social media has grown into a multiplatform tool used to communicate information within seconds. (Image: Shutterstock)

“Freedom of speech” is often the first phrase that comes to mind when navigating social media. Platforms like Twitter and Reddit are great for threading conversations about trending topics among people of different backgrounds, while Instagram and Snapchat excel at visual storytelling, highlighting life moments in an easy-to-navigate portfolio.

Figuring out how to regulate social media, though, is complex. Established laws would need added nuance to account for the different interactions and dynamics of the digital landscape, which raises the question: How would constitutional law affect social media regulation?

Social Media During COVID-19

COVID-19 quarantine measures in 2020 forced corporations and schools into online environments. (Image: Shutterstock)

Internet usage skyrocketed at the start of the pandemic in March 2020. Workplaces shifted from on-site to remote, group video calls replaced physical classrooms, and in-person social events had to adapt to a digital space. Many users took to social media to express their opinions on how these major lifestyle changes affected different communities. 

Emotions ranging from joy to anger surfaced globally, sparking threads of intense conversation about vaccines, medical care, and financial stability, among many other concerns. Reliance on phones and other digital devices eventually decreased as more people looked to curb phone addiction in favor of time away from the internet, but pandemic protocols persisted, and with them, the conversations.

Misinformation and ‘Fake News’

With misinformation so easily spread, it’s often difficult to differentiate between accurate reporting and “fake news.” (Image: Shutterstock)

The public pointed fingers at social media companies and demanded they hold users accountable for their words and actions. During the pandemic, users relied heavily on media outlets for accurate reports on mask mandates, social distancing, and other COVID-19 protocols.

Many news outlets provided research-backed findings and proper calls to action, but others spread misinformation about vaccines, often citing government control as the supposed motive behind official guidance. This caused a divide between people for and against the enforcement of COVID-19 protocols, with both sides claiming the other was publishing “fake news.”

With two opposing sides publishing reports with vastly different viewpoints, how do we ensure that individuals can exercise their right to free speech while still maintaining some level of social media regulation? Blocking or removing a user for hate speech may seem like the easy solution, but reaching a proper one takes many more steps. Even then, what constitutes a “proper solution”? Who holds that responsibility has been a continuous discussion, one that made its way into U.S. law.

Section 230 and Social Media Platforms

The 1996 Communications Decency Act, specifically Section 230, shields social media companies from liability for the content users post, with certain restrictions. It has sparked mixed reactions: Tech companies say it allowed the internet to thrive, while others have said it does little to combat the spread of hate speech and misinformation online. Former President Donald Trump said Section 230 negatively affected his social media experience, citing at least two incidents of “selective censorship.” As a result, he issued an executive order in May 2020 aiming to limit these legal protections. 

While measures are being taken to investigate how Section 230 should be interpreted and how it affects social media, it’s still unclear whether it will remain part of U.S. law or be repealed. The application of its protections, however, has sparked many discussions that may ultimately decide how social media platforms are monitored.

Internet Safety for Younger People

Parents aren’t the only ones responsible for monitoring their kids online. (Image: Shutterstock)

One reason social media requires active regulation is how thoroughly technology now connects people to the internet. With educational resources and online learning tools becoming more accessible, even children and young adults are experiencing increased screen time on computers and phones. Technology, and by extension social media, is becoming more integral to everyday life, especially as more people work from home as small business owners or company employees.

When media influences a generation, unreviewed and unregulated platforms can foster insular conversation spaces that leave little room for opposing opinions. By creating social media accounts, people of any age are exposed to a myriad of targeted business ads, trending controversial headlines, and age-inappropriate content.

One of the concepts that can feel most intense to people who don’t use social media regularly is “mob mentality”: swarms of people acting in accordance with what others say or think in response to a major event. Depending on the context, this behavior can be detrimental to casual internet users or, at the very least, overwhelming. Information overload and high traffic on social media platforms can also lead to heavy data usage and make it difficult to properly curb the spread of misinformation. That’s where calls for government action come in, as we’ve seen recently.

Where the Law Stands With Social Media

Figuring out how much federal involvement is needed for social media may require more than one solution. (Image: Shutterstock)

Social media is an ever-evolving technology; it adapts and grows as more people flock to digital spaces. As platforms continue to foster communities, however, policies that apply to the outside world may need to migrate into the digital landscape to better govern how people interact with one another online.

The lengthy, complex process of enacting policy into law runs on a very different timeline than fast-paced internet culture, and reconciling the two will require careful consideration and planning to reach any long-term solution for social media regulation.