Are kids safe in social virtual reality? Experts warn of rising risks

CO-EDP, VisionRI | Updated: 21-03-2025 20:03 IST | Created: 21-03-2025 20:03 IST

Social virtual reality (VR) platforms are becoming increasingly popular among children. As children spend more time in virtual spaces, new research highlights widespread concerns among both parents and non-parent adults about harassment, inappropriate interactions, and the absence of sufficient tools to safeguard young users.

A study, "Exploring the Perspectives of Social VR-Aware Non-Parent Adults and Parents on Children's Use of Social Virtual Reality," reveals that children under 13 are actively engaging in social VR spaces despite platform age restrictions. The research, published in Proceedings of the ACM on Human-Computer Interaction and conducted by a team from the University of Glasgow, surveyed 149 adults, including 79 parents, about their perspectives on children's use of social VR. The findings highlight widespread concerns among both parents and non-parent adults about harassment, inappropriate interactions, and the absence of sufficient safeguarding tools.

According to the study, 43% of children under 13 are using social VR at least every two weeks, despite most platforms requiring users to be at least 13 years old. Many parents are unaware of the full extent of their children's social VR activities, raising concerns about potential exposure to inappropriate content and interactions with unknown adults.

Differing views on age appropriateness and risks

The study found that parents familiar with social VR were more likely to support lower minimum age requirements for its use compared to non-parent adults. However, while parents were more permissive, they also recognized the risks associated with allowing children into unregulated virtual spaces.

Non-parent adults using social VR platforms frequently reported encountering immature or disruptive behavior from children. More concerning, the study found that children in social VR often face bullying, harassment, and exposure to explicit content. Reports of children self-disclosing personal information to adults they meet in virtual spaces were also noted, raising alarms about online safety and potential grooming risks.

Both parents and non-parent adults expressed concerns about the lack of effective supervision in social VR environments. Unlike traditional social media, where parental controls and content filtering are more advanced, social VR lacks built-in real-time monitoring tools that would allow parents to oversee their children's interactions.

Key risks identified in the study include:

  • Harassment and bullying: Children frequently encounter verbal abuse from both peers and adults in social VR spaces.
  • Exposure to inappropriate content: Reports of children witnessing sexually explicit avatars and conversations, as well as drug-related discussions, were common.
  • Lack of parental oversight: VR headsets occlude reality, making it difficult for parents to observe their child's activities in real time.
  • Over-reliance on self-reported ages: Many platforms rely on users to self-declare their age, making it easy for underage users to gain access to adult-oriented VR spaces.

Parents surveyed in the study expressed frustration over the limited control options available to them, with many relying on physical supervision or restricting headset usage to common areas in the home. However, technical limitations make it difficult for parents to monitor conversations or interactions within the VR environment.

Child safety in social VR: What needs to change?

The study suggests several interventions that could enhance child safety in social VR, including:

  • Stronger age verification measures: Implementing AI-based verification tools to prevent children from accessing age-inappropriate spaces.
  • Real-time monitoring tools: Giving parents the ability to remotely view and control their child’s VR activity through secondary devices.
  • Automated moderation: AI-driven content moderation to detect and block inappropriate language, behaviors, or environments.
  • Private VR spaces for minors: Creating controlled environments where only verified minors can interact, reducing exposure to harmful interactions with unknown adults.

Tech giants such as Meta, Microsoft, and Google have already started exploring AI-powered moderation tools to enhance online safety, but experts warn that without regulatory intervention, platforms may not take sufficient action.

Amidst the growing popularity of social VR among children, researchers stress the need for evidence-based guidelines on age-appropriate usage. Unlike traditional online gaming or social media, VR’s immersive nature makes interactions feel more real, increasing the risk of psychological and emotional harm for young users.

  • FIRST PUBLISHED IN:
  • Devdiscourse