Amid a lawsuit alleging child safety failures, online gaming service Roblox announced on Wednesday that it will expand its age estimation technology to all users and partner with the International Age Rating Coalition (IARC) to provide age and content ratings for the games and apps on its platform.
The company said that by the end of the year, it will roll out a system to estimate the age of all Roblox users who access the company's communication tools, such as voice and text-based chat. The system scans users' selfies and analyzes their facial features to estimate their age.
This age estimation technique, combined with other systems such as ID-based age verification and validated parental consent, provides a more accurate measure of a user's age than relying on children to type in their birth dates when creating an account. The company also says it plans to launch a system that further restricts communication between adults and minors on the platform.
Meanwhile, the company's partnership with IARC will replace its own content and maturity labels with those used by rating authorities around the world. That means US users will see ratings from the ESRB, while users in other countries will see ratings from their local rating authorities. For example, players in South Korea will see ratings from GRAC, German players will see ratings from the USK, and players elsewhere in Europe and in the UK will see ratings from PEGI.
The system is intended to help parents better understand the types of games their children are playing, based on factors that can cause concern, such as blood and gore, violence, substance use, gambling, adult language, and more.
These updates build on the changes the company announced in July, which were designed to better protect younger users.
At the time, Roblox introduced a set of safety features, including an age estimation system that analyzes users' ages via video selfies. That information is used to prevent users under the age of 13 from accessing certain features on Roblox, such as unfiltered voice and text chat. Roblox also restricts users aged 13 to 17 from adding adults they don't know in real life as “trusted connections.”
The move also follows the rollout of increasingly stringent laws and regulations around the world that require social platforms to verify the ages of their users, such as the UK's Online Safety Act and Mississippi's age assurance law. Similar laws are at various stages in other states, including Arizona, Wyoming, South Dakota, and Virginia.
Roblox has also invested in safety features for many years.
Among those tools is Roblox Sentinel, an open-source AI system designed to detect early signals of child endangerment. The company also offers parental controls, tools to limit communications, and technology to detect servers where many users are breaking the rules so it can shut them down.
Despite the company's efforts, child predators have still been able to access the platform and target children, according to a complaint filed by Louisiana Attorney General Liz Murrill and other lawsuits in California, Texas, and Pennsylvania.
Additionally, The Guardian recently reported on a research study that found children on Roblox's service could easily come across inappropriate content and interact with bad actors.
Recently, the popular Roblox farming simulator Grow a Garden made headlines after its virtual items, in violation of the platform's rules, began being bought and sold for real money on online marketplaces. Parents and others raised concerns that the game was luring children in and pressuring them to spend cash to keep up with other players.
Changing the rating system may not end all of these bad experiences, but it will at least give parents more insight into the games their children are playing.
Matt Kaufman, Roblox's chief safety officer, said: “We are excited to partner with IARC and hope to provide greater clarity and confidence around age-appropriate content globally.”