
Should Social Media Be Restricted to Users Over 16?

What should the age limit on social media access be, and how can this be enforced in a meaningful and effective way?

Current age-gating tools are largely ineffective, relying on honesty from the user. And while various alternatives have been proposed, and some are even in production within certain apps, there's no definitive solution that can stop youngsters from lying about their age to sign up.

Or is there?

The Australian Government is the latest to propose enforceable age limits for social media platforms, gating them for teens under 16. Though again, without viable age-checking systems, that's all it is: a proposal that highlights an issue without a means to address it. The Australian Government says that it's trialing several methods of age verification, but it's provided limited detail, so we don't know whether it actually has a workable solution as yet.

Similar age restriction proposals have already been tabled in Denmark, Florida, the U.K., and various other regions, though again, without a true measure with which to hold the platforms to account, there's really no solution as yet.

So while legislators are saying the right things, it's hard to know whether they're legitimately exploring this, or if it's just PR spin to win more votes.

One of the companies set to be most significantly impacted by any such rule change would be Meta, and for its part, it's actually looking to put the onus onto Google and Apple to enforce age restrictions at the app download level instead. Meta's proposal would require users under the age of 16 to get parental permission to download any apps deemed to fall within whatever the classification parameters may be.

That would seemingly be a more workable solution than leaving each app to establish its own verification parameters. A more centralized, OS-level approach might address many of the challenges of scaling a uniform solution, and Apple and Google accounts already require parental consent for youngsters. The proposal is still being debated, but on the surface at least, it would cover many key aspects.
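To make that idea more concrete, here's a minimal, purely hypothetical sketch in Kotlin of what an OS-level install gate might look like. None of this reflects an actual Apple or Google API; the `StoreAccount`, `AppListing`, and `gateInstall` names are invented for illustration, and the sketch assumes the app store already knows the account holder's age and any linked parent account.

```kotlin
// Hypothetical sketch only: neither Apple nor Google currently exposes an API like this.
// It illustrates the idea of gating age-restricted app installs at the store/OS level,
// assuming the store account already records the user's age and a linked parent account.

data class AppListing(val packageName: String, val isAgeRestricted: Boolean)
data class StoreAccount(val age: Int, val parentAccountId: String?)

sealed class InstallDecision {
    object Allowed : InstallDecision()
    data class ParentalApprovalRequired(val parentAccountId: String) : InstallDecision()
    object Blocked : InstallDecision()
}

// Decide at download time whether the install can proceed, must be routed to a
// parent for approval, or is blocked because no parent account is linked.
fun gateInstall(account: StoreAccount, app: AppListing, ageLimit: Int = 16): InstallDecision {
    if (!app.isAgeRestricted || account.age >= ageLimit) return InstallDecision.Allowed
    val parent = account.parentAccountId ?: return InstallDecision.Blocked
    return InstallDecision.ParentalApprovalRequired(parent)
}

fun main() {
    val teen = StoreAccount(age = 14, parentAccountId = "parent-123")
    val socialApp = AppListing(packageName = "com.example.social", isAgeRestricted = true)
    println(gateInstall(teen, socialApp)) // ParentalApprovalRequired(parentAccountId=parent-123)
}
```

The appeal of this model, at least in theory, is that the age and consent check happens once, at the account level, rather than being re-implemented (and re-circumvented) inside every individual app.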

But then again, it's also worth noting which apps would be most impacted by such a change, and which would benefit.

Sure, Meta would lose out if users under 16 were suddenly blocked from its apps. But other platforms would lose out more: 20% of Snapchat's 850 million active users are reportedly under 17, while a third of TikTok's total U.S. audience is believed to be 15 or under.

Both apps would have a lot more to lose if any such rules were implemented. And while Meta's publicly supporting new measures to implement age restrictions, it's also reducing the age requirements for its Horizon Worlds VR social experience. If you believe, as Meta does, that this is the future of social media interaction, then Meta is seemingly pushing for restrictions in the old-school paradigm while recreating the same problem in the next stage.

Maybe, then, Meta's hoping that any such measures will only impact it in the short term, but hurt its competitors more, while these VR changes slip under the radar as regulators pay less attention.

So, is it actually workable, and should age limits be put in place?

Well, there's a growing body of evidence to suggest that social media usage can be harmful for teens, and as such, restrictions on access may reduce that harm. But still, a lot of it is too little, too late, and we're yet to see a truly viable detection and gating process.

Until we have that, the proposals are mostly PR spin, which is worth keeping in mind when considering when and how they're announced.
