TLDR: Starting January 2026, Roblox requires all users to submit to facial recognition age verification or upload government ID before they can use any chat features.
Parents are furious because the scans collect permanent biometric data from children, often without proper parental consent, and there are serious questions about data retention and whether this actually makes kids safer.
If you’re a parent who opened Roblox with your kid recently and discovered that chat suddenly stopped working, you’re not alone. And if you’re absolutely livid about what Roblox is now asking you to do to fix it, welcome to the club.
Starting the first week of January 2026, Roblox rolled out a global mandate that’s causing an absolute firestorm among parents: mandatory facial recognition scans for anyone who wants to use chat features in the game.
Yes, you read that right. Your child has to either record a video of their face for an AI to analyze, or upload a government-issued ID, just to talk to their friends in a video game.
The backlash has been swift and brutal. Parents are flooding social media, Reddit forums, and consumer protection sites with complaints ranging from privacy violations to questions about whether this “safety measure” might actually make their kids less safe.
Roblox is calling it the “gold standard” of child protection. Parents are calling it something else entirely.
Here’s everything you need to know about what’s happening, why Roblox did this, and why so many families are ready to delete the app altogether.
What Exactly Is Roblox Asking Kids to Do?
Let’s break down the actual requirement, because it’s more invasive than most parents initially realize. When a user tries to access any chat feature in Roblox (text chat, voice chat, or party chat), they now hit a wall.
The game doesn’t just ask for a birthday anymore. It demands biometric proof of age.
Roblox is giving users two options, and both are invasive:
- Facial Age Estimation: Your child opens the app, grants camera access, and records a video selfie where they turn their head left and right while an AI analyzes their facial geometry, skin texture, and bone structure to estimate their age. The system assigns them to one of six age groups: Under 9, 9-12, 13-15, 16-17, 18-20, or 21+. This video is processed by a third-party company called Persona, not Roblox directly.
- Government ID Upload: For users 13 and older, they can instead scan a passport, driver’s license, or state ID, then take a live selfie to prove they’re the person in the photo. Yes, that means handing over sensitive government documents to a gaming company.
And in both cases, the processing happens outside Roblox. The biometric work is handled by Persona, a San Francisco-based identity verification vendor.
Your child’s biometric data passes through Persona’s systems before Roblox gets the age determination.
And while Roblox swears up and down that the images are “deleted immediately after processing,” the fine print tells a different story.
The Privacy Nightmare That’s Making Parents See Red
The number one complaint from parents isn’t just that this requirement exists. It’s the type of data being collected and what happens to it afterward. Biometric data is permanent. You can change a password. You can get a new credit card. You cannot change your face.
Parents understand instinctively that facial geometry is one of the most sensitive pieces of information their child will ever have, and handing it over to access a video game feels insane.
Here are the specific privacy concerns that have parents losing their minds:
- The retention discrepancy: Roblox says images are deleted immediately, but Persona’s privacy policy allows them to retain biometric data for up to three years. Which one is actually happening? Nobody knows for sure, and that ambiguity creates a massive trust gap.
- The “honeypot” problem: Millions of children’s facial scans in one centralized system creates an incredibly valuable target for hackers. One breach could expose permanent biological identifiers that can never be changed. In an era of AI-driven identity theft and deepfakes, that’s a parent’s worst nightmare.
- Third-party vendor access: Your child’s data doesn’t just go to Roblox. It passes through Persona’s systems, and their privacy policy mentions sharing with “service providers,” “cloud services,” and “fraud prevention services.” How many companies ultimately touch this data?
- It’s permanent and immutable: Unlike a password or credit card number, your face is the one piece of identifying information you can never change. Once this data exists somewhere in a corporate database, it exists forever in some form.
Kids Are Scanning Their Faces Without Parents Even Knowing
Here’s where the fury gets really personal. Parents are reporting on forums like Reddit that their children performed the facial scan without any parental approval being required.
The sequence goes like this: kid opens Roblox, sees a prompt saying “Chat is locked. Unlock it now!”, follows the on-screen instructions, scans their face, and boom—verified. All before mom or dad even knows it happened.
This is a massive problem under COPPA (the Children’s Online Privacy Protection Act), which requires verified parental consent before collecting personal information from kids under 13. The law exists specifically to prevent companies from going directly to children to extract data.
But according to numerous parent reports, the Roblox interface is prompting kids to complete the scan themselves, often without a clear “ask your parent first” barrier.
Parents feel completely sidelined. They’re supposed to be the gatekeepers of their children’s privacy, but Roblox is leveraging the child’s desire to chat with friends (and the brutal social pressure of being left out) to go around parental authority.
By the time many parents found out about the requirement, their kids had already done the scan. The sense of betrayal is palpable.
Making matters worse, notifications about this major policy change were buried in blog posts and terms of service updates that most parents never read. There was no big flashing email to every parent account saying “YOUR CHILD WILL BE ASKED TO SUBMIT BIOMETRIC DATA.” It just rolled out, and families discovered it when chat stopped working.
The Creepy Factor Nobody’s Talking About Enough
Beyond the legal and privacy concerns, there’s a visceral emotional reaction many parents are having that’s hard to quantify but impossible to ignore. The process of a video game asking a child to “turn your head left, now right, now move closer” feels dystopian.
Parents are using words like “creepy,” “invasive,” and “Big Brother” to describe their gut reactions.
This is what sociologists call the normalization of surveillance. By making facial scans a routine step in accessing a game, Roblox is conditioning an entire generation of children to accept biometric checks as a mundane administrative task. Parents worry this isn’t just about Roblox.
It’s about training kids to be comfortable with AI analyzing their faces, to see surveillance as normal, to lower their defenses against privacy violations in every aspect of their digital lives.
One parent put it bluntly on a Reddit thread: “Why should I teach my kid to protect their privacy online if a gaming company is going to undo all that by making them think scanning their face is no big deal?”
The concern is that today it’s Roblox, tomorrow it’s every app, every website, every digital interaction demanding biometric proof before participation. And kids who grow up thinking this is normal won’t push back.
Wait, This Might Actually Make Kids Less Safe?
Here’s the part that’s making security experts and child safety advocates deeply uncomfortable. Roblox is marketing this as the ultimate protection for kids, but there’s a compelling argument that it could actually increase risk.
They warn that age-verified zones could backfire in three specific ways:
- Concentrated targets: If a predator bypasses the check using deepfake tools (which are increasingly accessible), they’re not entering a random mixed-age server anymore. They’re entering a verified pool of confirmed children—a “pen of sheep” instead of a haystack. Roblox has essentially created a curated menu of verified minors.
- False sense of security: Kids in age-verified zones might lower their guard, assuming “everyone here is a kid like me because Roblox checked.” They’ve been told the system is the “gold standard” of safety. That psychological disarmament makes them more vulnerable to grooming tactics, not less.
- No more benevolent bystanders: The old system had older teens and adults playing legitimately who might notice inappropriate behavior and report it. By strictly separating ages, Roblox removed those potential protectors. The “it takes a village” aspect of community moderation is gone, replaced by algorithmic walls that determined predators can climb.
Parents are asking the obvious question: if a sophisticated predator can bypass this system (and tech experts say they absolutely can), hasn’t Roblox just made their job easier by concentrating all the targets in one place and giving them a false sense of security?
So Why Did Roblox Do This Now?
To understand the fury, you have to understand the pressure Roblox was under. This wasn’t just a random corporate decision. It was a defensive maneuver made under the threat of financial and legal annihilation.
Roblox CEO and co-founder Dave Baszucki has discussed the AI-powered facial age estimation technology and how it is expected to work in an interview with CBS Mornings.
As of late 2025, Roblox was facing nearly 80 lawsuits from parents and state attorneys general accusing the platform of being a “breeding ground” for predators.
The cases are horrifying. Lawsuits detail instances where adults posed as children, gained trust through Roblox chat, then moved victims to encrypted apps like Snapchat for sexual exploitation.
Texas Attorney General Ken Paxton filed suit accusing Roblox of “flagrantly ignoring state and federal online safety laws.” Florida issued criminal subpoenas. Louisiana’s AG called the platform a haven for child predators.
But the real hammer came from overseas. The UK’s Online Safety Act went into full enforcement in 2025, and it doesn’t mess around. The law requires platforms likely to be accessed by children to implement “highly effective age assurance” and specifically endorses facial age estimation as a compliant method.
The penalties for non-compliance? Fines of up to 10% of global revenue and potential criminal liability for executives.
Because Roblox operates one global platform rather than separate regional versions, the UK’s strict requirements effectively set the standard for everyone. So American parents who are furious about this policy are, in a weird way, dealing with the fallout of British regulation.
Their kids are being scanned to satisfy British regulators, and they never got a vote.
From Roblox’s perspective, the facial scan mandate is a legal shield. If they get sued again for a predator incident, they can now stand in court and say, “We implemented the gold standard of age verification. We did everything technologically possible.”
The predator becomes a sophisticated criminal who defeated state-of-the-art security, rather than someone who exploited a negligent open platform. For the legal team, this is brilliant. For parents, it looks like their kids are being surveilled to protect corporate profits, not actual children.
The Technical Problems Making Everything Worse
Even parents who reluctantly accept that some age verification might be necessary are infuriated by how poorly the system actually works. The AI age estimation has a reported margin of error of 1.4 years, which sounds good until you’re the teenager with a baby face who gets flagged as 11 and locked out of age-appropriate content.
Or the 12-year-old with mature features who gets sorted into the 16-17 bracket and exposed to content they’re not ready for.
Adults trying to verify are running into nightmares. People in their twenties are being rejected as “too young” and forced to upload government IDs to prove they’re adults. The appeals process is slow, manual, and requires sharing even more sensitive documents.
For many users, this feels less like safety and more like bureaucratic harassment.
Then there are the accessibility and equity issues. What about kids whose parents don’t have a smartphone with a good enough camera? What about families who can’t afford the latest devices? What about users with facial differences or disabilities that might confuse the AI?
These kids are being excluded from the social aspects of Roblox through no fault of their own, creating a two-tier system where “verified” becomes a status symbol and “unverified” users are treated like second-class citizens.
Is This Just Security Theater?
Perhaps the most cynical take among parents is that this entire system is “security theater.” It looks impressive, it sounds high-tech, and it might satisfy regulators and judges. But does it actually stop determined predators?
Probably not.
With the rapid advancement of AI-generated deepfakes, creating a convincing video that passes a liveness check is getting easier every month. The tools exist. They’re accessible. A predator with moderate technical knowledge could absolutely bypass this system.
So you end up with a situation where average families are being forced to sacrifice their children’s privacy, while the actual bad actors the system is supposed to stop just use deepfake tools to waltz right through.
Parents are calling it a “Maginot Line”—a formidable-looking defense that’s actually quite easy to circumvent if you know where to go.
The real winners? Roblox’s legal team, who now have evidence they can present in court showing they took “all reasonable measures.”
The real losers? Families who gave up permanent biometric data for a security measure that might not actually work against the threats it’s designed to stop.
What Can Parents Actually Do?
The situation has left parents feeling powerless. Many are choosing to simply not verify, which means their kids lose chat functionality. Some are deleting Roblox entirely and moving to other platforms. Others are reluctantly going through with the verification because the social pressure on their kids is too intense to resist.
There’s a growing movement calling for legislative action. Privacy advocates are asking Congress to clarify COPPA rules around biometric data from minors. Some are calling for age-verification laws that don’t rely on biometric collection.
Others are demanding that platforms like Roblox be held to the same data security standards as financial institutions if they’re going to collect this sensitive information.
For now, parents are stuck in an impossible position. They want their kids to be safe online. They also don’t want their 8-year-old’s facial geometry stored somewhere in a corporate database. They want predators stopped.
They also don’t trust that this system actually accomplishes that goal without creating new risks.
The fury isn’t going away anytime soon. Because at its core, this controversy represents a fundamental breakdown of trust between a platform and the families that made it successful.
Roblox may have built the “gold standard” of age verification, but they’ve done it on a foundation of parental resentment, privacy violations, and serious questions about whether they’ve made children safer or just made their own lawyers happier.
Time will tell which side of that equation history lands on.
Frequently Asked Questions

Do I have to let my child do the facial scan?
Technically, no. The age verification is “optional” in that your child can still play Roblox games without it. However, they cannot use any chat features (text, voice, or party chat) without verifying. For most kids, this makes the game essentially unplayable since communication is central to the Roblox experience. So while it’s technically optional, it feels mandatory if your child wants to actually interact with friends.
Can my child still play Roblox without doing the verification?
Yes, but with severe limitations. They can play single-player experiences or games that don’t require communication, but any game that involves teamwork, coordination, or social interaction becomes difficult or impossible without chat. Many parents report their kids saying the game is “broken” or “boring” without the ability to talk to other players.
Is the facial scan actually deleted immediately like Roblox claims?
This is where it gets murky. Roblox says images are deleted immediately after processing. However, the third-party vendor they use (Persona) has a privacy policy that allows retention of biometric data for up to three years. Roblox maintains they’ve instructed Persona to delete immediately, but the discrepancy between their promise and the vendor’s legal terms is exactly what’s making parents furious. There’s no independent verification of immediate deletion.
Can I verify my child’s age as a parent instead of having them scan their face?
Not really. The verification is tied to the individual user account, not the parent account. While parents can manage settings through parental controls, the actual age verification (facial scan or ID upload) has to be done by or on behalf of the specific user. For kids under 13, parental consent is supposed to be required, but many parents report their children completed the scan before they were even aware of the requirement.
What happens if the AI gets my child’s age wrong?
If the facial age estimation incorrectly categorizes your child (which happens, especially with “baby faces” or mature-looking kids), they can appeal the decision. However, the appeals process requires uploading a government-issued ID for manual verification, which means sharing even more sensitive documents. The process can take several days, during which chat remains locked. Many users report frustration with being incorrectly flagged and having to prove their real age through documentation.
Are there any alternatives to the facial scan or ID upload?
No. Those are currently the only two options Roblox offers. There’s no option to verify through credit card (like some age-gated sites use), through a different form of parental consent, or through any other method. It’s either facial scan, government ID, or no chat access.
Can predators actually bypass this system with deepfakes?
Security experts say yes, it’s possible. While Roblox uses “liveness checks” (having users move their head, come closer to camera, etc.) to prevent someone from just holding up a photo, AI-generated deepfake videos are becoming increasingly sophisticated. A determined predator with moderate technical knowledge and access to deepfake tools could potentially create a convincing video that passes the liveness checks. This is why many parents view the system as “security theater” that inconveniences legitimate users while determined bad actors can circumvent it.
What are the six age groups Roblox uses?
Once verified, users are placed into one of six age brackets: Under 9, 9-12, 13-15, 16-17, 18-20, and 21+. Users can generally only chat with people in their own age group and adjacent groups. For example, a 9-12 user can chat with Under 9, 9-12, and 13-15 users, but not with anyone 16 or older. The system is designed to prevent adults from communicating with young children by default.
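For the technically curious, here is a minimal, purely hypothetical sketch (in Python, not Roblox’s actual code) of how an adjacency rule like the one described above could work: each user lands in one bracket, and chat is allowed only within that bracket and its immediate neighbors.

```python
# Hypothetical sketch of an "adjacent age bracket" chat rule.
# Not Roblox's actual implementation; bracket names are taken from the article.
BRACKETS = ["Under 9", "9-12", "13-15", "16-17", "18-20", "21+"]

def can_chat(bracket_a: str, bracket_b: str) -> bool:
    """Allow chat only when two users are in the same or an adjacent bracket."""
    i, j = BRACKETS.index(bracket_a), BRACKETS.index(bracket_b)
    return abs(i - j) <= 1

print(can_chat("9-12", "13-15"))  # True: adjacent brackets
print(can_chat("9-12", "16-17"))  # False: two brackets apart
```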
Is this requirement legal under COPPA?
That’s a major point of contention. COPPA requires verified parental consent before collecting personal information from children under 13. Many parents argue that Roblox is violating the spirit (if not the letter) of COPPA by prompting children directly to complete facial scans without requiring explicit parental approval first. The interface shows kids a “Chat is locked. Unlock it now!” message that leads them through the verification process, often before parents know it’s happening. Privacy advocates are calling for regulatory clarification on whether biometric collection falls under COPPA’s stricter requirements.
Why did Roblox make this change now?
Two main reasons: legal pressure and regulatory requirements. Roblox was facing nearly 80 lawsuits from parents and state attorneys general over child safety failures, with horrific cases of predators using the platform to groom children. Additionally, the UK’s Online Safety Act went into full enforcement in 2025, requiring platforms to implement “highly effective age assurance” with penalties of up to 10% of global revenue for non-compliance. Because Roblox operates one global platform, the UK’s strict rules effectively forced this change worldwide.