Roblox Implements Mandatory Age Verification and Restricted Chat Features Globally to Enhance Child Safety
Roblox implements mandatory age verification for chat features, restricting interactions to similar age groups. This global initiative enhances child safety and addresses legal challenges.
Overview
Roblox is implementing mandatory age verification for chat features, starting in select countries in December and expanding globally by January, to enhance user safety.
After verification, users will be assigned to one of six age groups and will be able to chat only with users in the same or adjacent groups, preventing minors from chatting with adult strangers.
The age verification process requires users to submit either a video selfie for AI-based facial age estimation or a government ID, with all submitted data deleted after processing.
This initiative makes Roblox the first major gaming platform to mandate age checks for communication, utilizing Persona technology for the verification process.
The new safety measures are a direct response to multiple lawsuits and sustained criticism over child safety and inappropriate content on the platform.
Analysis
Center-leaning sources cover the story neutrally, focusing on factual reporting of Roblox's new safety measures. They present the company's actions in response to child safety concerns, providing context from regulators and advocates without employing loaded language or selective emphasis in their own narrative. The reporting prioritizes informing readers about the changes and their implications.