Raising the minimum age for social media - will it work?

It's probably best for social media platforms to have a minimum age of at least 16, but there are numerous challenges that would keep such a rule from being effective.

Social media: we all know and love (or hate) it. These platforms have such a grip on our lives that it borders on a stranglehold, and the effect is amplified among youth. Social media has been used for good and has helped us connect in countless ways. However, concerns have been raised about the safety, mental health, and well-being of minors using it. To address these concerns, some lawmakers and experts have proposed that social media platforms raise the minimum age to open an account to 16. While such a measure could help protect young people from known dangers, there are bumps in the road that could set harmful precedents and cause side effects of their own. I will explore the benefits and drawbacks of a 16+ age limit for social media platforms, and give my personal opinion on what should be done to keep everyone, of all ages, safe online.

How it is now

In most cases, the minimum age to register an online account is 13. This traces back to the Children's Online Privacy Protection Act (COPPA), which took effect in 2000. COPPA was not written for social media specifically; rather, it restricts the collection of personal data from children under the age of 13, requiring explicit parental consent before a minor's information can be collected.

Violations of COPPA are not cheap: they can carry a hefty price of over $40,000 per violation. The FTC hit ByteDance (parent of TikTok) with a $5.7 million fine for violating this law in 2019. This is why most platforms simply restrict access to anyone under 13; it's easier to deny access to underage users than to rework the fundamentals of the platform to comply with the law.

But how about our friends across the pond? That's where things get a bit more complicated. In Europe, the General Data Protection Regulation (GDPR), which took effect in 2018, sets the minimum age for data collection without parental consent at 16. However, the regulation allows individual member states to lower that age to as low as 13.

The problems with this

There are several issues with adolescents aged 13-16 having unrestricted access to social media platforms. Platforms such as Instagram, Twitter, and TikTok run highly addictive recommendation algorithms, designed to maximize the engagement and time you spend in the app. A recent study from the University of North Carolina found that habitual social media checking can affect young adolescents' brain development: those who frequently checked social media became more sensitive to feedback from their peers, which in turn shaped their perception of the real world.
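To make the "addictive algorithm" point concrete, here's a minimal sketch of how an engagement-driven feed ranker works in principle. Everything in it is hypothetical (the Post fields, the weights, the function names); real ranking systems are vastly more complex, but the core loop of "score every candidate by predicted engagement, surface the top scorers" is the same idea.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    # Hypothetical per-user engagement predictions a platform might compute.
    p_like: float     # predicted probability the user likes the post
    p_comment: float  # predicted probability the user comments on it
    p_rewatch: float  # predicted probability the user watches it again

def engagement_score(post: Post) -> float:
    # Illustrative weights: interactions that keep you in the app longer
    # count for more. Real platforms tune weights like these relentlessly.
    return 1.0 * post.p_like + 3.0 * post.p_comment + 5.0 * post.p_rewatch

def build_feed(candidates: list[Post], size: int = 10) -> list[Post]:
    # The feed is simply whatever scores highest, i.e. whatever is most
    # likely to keep the user engaged, healthy or not.
    return sorted(candidates, key=engagement_score, reverse=True)[:size]
```

Notice there is no age, well-being, or safety term anywhere in that objective; that absence is exactly what the research above is worried about.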

Take the popular social media "challenges" that pop up from time to time. The algorithm pushes these challenges to users, showing people performing dangerous acts such as exiting a moving car (Kiki Challenge, 2018), gluing down hair (Gorilla Glue Challenge, 2021), eating Tide Pods (Tide Pod Challenge, 2018), dangerous driving (Cha-Cha Slide Challenge, 2020), deliberately spreading germs (Coronavirus Challenge, 2020), or just flat-out vandalism (Devious Licks, 2021) and grand theft auto (Kia Boyz, 2022). There are dozens of these challenges and we could sit here all day going through them, but those I mentioned were among the most popular. These challenges have resulted in numerous injuries and even deaths, because participants assume they are an easy way to rack up views and interactions.

In addition to powerful algorithms influencing developing brains, the age requirement itself is about as effective as putting a band-aid on a broken bone. Because most kids aged 13-15 do not have easily accessible government-issued identification (and collecting it would be a privacy and security nightmare), platforms are forced to rely on the honor system and hope kids won't lie about their age. As the internet has grown, more and more have lied (I certainly did when I was a youngin'). A 2018 survey by Ofcom, the UK's communications regulator, found that 23% of 8-11 year olds had a social media profile.
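For illustration, here's roughly what the "honor system" amounts to in code: a minimal sketch, assuming the only input is a self-reported birthdate, as on most sign-up forms. Any kid who can do subtraction can type a date that passes.

```python
from datetime import date

MIN_AGE = 13  # the COPPA-driven cutoff most platforms use

def can_register(claimed_birthdate: date, today: date | None = None) -> bool:
    # The entire "age check" on most platforms: trust whatever birthdate
    # the user types in. There is no verification step at all.
    today = today or date.today()
    age = today.year - claimed_birthdate.year - (
        (today.month, today.day) < (claimed_birthdate.month, claimed_birthdate.day)
    )
    return age >= MIN_AGE

# A 10-year-old simply claims an earlier birth year and sails through:
print(can_register(date(2005, 1, 1)))  # True
```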

The highly addictive algorithms, ease of use, and the laughably easy circumvention of the 13+ age restriction throw the floodgates open to minors anywhere from 5 to 16. And even for those who respect the age limit, there is no guarantee they won't encounter content that is dangerous or inappropriate. Much of what they are exposed to comes from the algorithm, peer influence, or plain natural curiosity. Everyone develops differently, and those between 13 and 16 may not yet have the judgment to recognize that something is dangerous or illegal and to steer clear of it.

A solution?

With all of that in mind, it sounds like the good ol' government wants to step in and make a modernized version of COPPA, this time aimed directly at social media platforms. In January 2023, the US Surgeon General said that he believed 13 was "too young" for children to be on social media platforms. Two bills, the MATURE Act introduced in the United States Senate by Sen. Josh Hawley (R-MO) and the Social Media Child Protection Act introduced in the United States House by Rep. Chris Stewart (R-UT), would "require social media platforms to verify that all individuals who create an account on the platform are age 16 or older". The choice of 16 can likely be attributed to two things:

  • By age 16, the human brain, while still developing, is more likely to have matured enough to recognize harmful acts
  • Most people aged 16+ have valid and easily accessible government identification, such as a state learner's permit or driver's license

Both of these bills state that platforms would be required to collect personally identifiable information from any registering user to verify their age, including a valid photo ID. The ID would be used to verify the user's age and determine whether they can sign up for the platform outright or, if under 16, need parental consent.
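As a rough sketch of the gating logic the bills describe, registration would look something like the following. The function names are hypothetical, and verify_photo_id is a stand-in for whatever ID-verification service a platform would contract; the bills mandate the outcome, not an implementation.

```python
from datetime import date

def verify_photo_id(id_document: bytes) -> date:
    # Stand-in for a real ID-verification service: validate the document and
    # extract the holder's date of birth. This single step is what forces
    # every user, adult or minor, to hand over PII.
    return date(2009, 6, 1)  # dummy value; a real service would parse the ID

def registration_decision(id_document: bytes, has_parental_consent: bool,
                          today: date) -> str:
    dob = verify_photo_id(id_document)
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    if age >= 16:
        return "allow"               # 16+: account permitted outright
    if has_parental_consent:
        return "allow_with_consent"  # under 16: needs parental consent
    return "deny"

# A 13-year-old (as of the bills' introduction) without parental consent:
print(registration_decision(b"<scanned id>", False, today=date(2023, 3, 13)))  # deny
```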

Only the MATURE Act explicitly specifies what companies can and cannot do with the collected PII, and what happens if you delete your account: "the operator of the social media platform shall delete any information collected from an individual for the purpose of verifying the individual's identity and age not later than 30 days after the date of such deletion." The general idea of raising the minimum age has bipartisan support, but the outcome of each bill is yet to be known (as of March 13, 2023).
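That 30-day rule is effectively a retention deadline, and compliance would likely look like a scheduled purge job. A minimal sketch, with hypothetical names:

```python
from datetime import datetime, timedelta

# MATURE Act deadline: verification PII must be erased no later than 30 days
# after account deletion. Purging a day early keeps a daily job safely compliant.
PURGE_AFTER = timedelta(days=29)

def accounts_due_for_purge(deleted_accounts: dict[str, datetime],
                           now: datetime) -> list[str]:
    # IDs of deleted accounts whose age-verification records (photo ID scans,
    # dates of birth, etc.) are now due to be wiped.
    return [account_id
            for account_id, deleted_at in deleted_accounts.items()
            if now - deleted_at >= PURGE_AFTER]
```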

The solution is not foolproof

While these bills may be well-intentioned and could protect kids and adolescents from online harms, neither of them is foolproof, for multiple reasons.

First is the devastating effect this would have on privacy and security for everyone, not just kids. Both of these bills would require anyone who creates a social media account to fork over their:

  • Full, legal name
  • Date of birth
  • Government-issued ID, which contains data such as address, driver's license number, gender, etc.

In 2021, I wrote a post about a petition in the United Kingdom that would have required the same thing, and the same principles apply here. Companies such as Facebook and Twitter have already proved that they cannot be trusted with your private data. All it takes is one unauthorized third party exploiting some vulnerability, and every single social media user is identified. Beyond enabling doxxing, that information could be enough to pull off identity theft.

Second, the bills may end up being counterproductive to their own goals. While age verification technology has come a long way, fake IDs can still slip through at times. But kids don't even have to resort to fake IDs if they can simply share accounts or have a parent sign up for them. Alternatively, they may turn to other platforms that are less regulated, or more harmful, than mainstream social media sites. Both bills would only apply to companies that operate in the United States. What's stopping a kid from searching up some chat room they can access with no verification required? Nothing! The root problem of exposure to dangerous content still exists; it has simply been pushed to places that are arguably worse.

Third is the potential for unintended consequences. Social media platforms are not only used for entertainment and personal communication but also for public discourse and information sharing. Shutting adolescents out of these spaces can hurt their awareness of events happening around the world. These platforms also give marginalized groups, such as minorities and the LGBTQ+ community, a place to connect and find support and advice when discrimination is common where they live.

Invest in education instead

These reasons are why I believe an age requirement of 16 for social media is not as far-fetched or crazy as some people think, but at the same time you cannot solve the problem just by age-gating it. ID verification creates significant privacy and security risks, and some offshore site probably won't follow the law anyway.

Instead of attempting to merely regulate social media platforms, we should heavily invest in educating the youth on proper social media use and etiquette. Many schools do not teach digital citizenship enough (and for those that do, the content is lackluster at best), so kids are up the creek without a paddle.

Here's what I personally suggest:

  • Create a year-round digital citizenship course in schools that teaches the fundamentals, proper use, and dangers of social media.
  • Encourage more parental involvement in social media and technology as kids age. Tech will continue to advance as time passes, and it is extremely important for everyone to stay on the same page.
  • Don't punish kids for the simple act of stumbling across something harmful. Instead, use it as a learning opportunity: explain that what they came across is not okay, and that it could land them in a lot of trouble or cause injury.
  • Create activities that involve kids, parents, and schools where they use technology critically. This can involve research, coding challenges, social media analysis, video editing, etc.
  • Develop clear policies on acceptable use of technology in the home and at school, such as rules on device usage during class or family time and consequences for violating them. These rules should apply to everyone equally: parents, teachers, and kids.

I know these suggestions may seem challenging to implement given school budgets and how different every parenting style is, but I think they can seriously go a long way. Regulation is not always the solution to our problems, and I believe that when it comes to social media, regulation and education can coexist to create a better, safer digital future for kids and adults alike.