A ROOM WITH A POINT OF VIEW

ARE YOU IN THE DARK ABOUT THE RISKS OF THE METAVERSE?

Image credit: MINH PHAM on Unsplash.

From CYBERTIP.CA

Cybertip.ca, Canada’s tipline for reporting online child sexual abuse and exploitation, is urging parents to be aware of the metaverse and the emerging risks that may come along with children and teens being in the largely unrestricted virtual space.

What is the metaverse? In the broadest sense, the metaverse is an online, 3D world where people can interact as digital characters just as they would in the real world. It’s mainly accessed by using a virtual reality headset, such as the Oculus VR headsets made by Meta (the parent company of Facebook), and downloading virtual and augmented reality apps for games, chatting, or just hanging out.

What are the risks? In the metaverse, children and adults can mix in a predominantly unmoderated world, which increases the risk of tweens/teens being groomed and victimized by those looking to harm and sexually exploit youth.

There have been multiple media reports about the metaverse’s safety shortfalls and risks. One Washington Post reporter stumbled across an apparently nine-year-old child who was using their parent’s VR headset to play in the 18+ app Horizon Worlds, while the Mirror showcased screenshots of a user named “pedo” talking to a girl as they walked her to a private area in the 3D world of VRChat, a popular social game. A BBC researcher posing as a 13-year-old even found strip clubs in the app, was shown sex toys, and was approached by numerous adult men.

In reviewing VRChat, the Center for Countering Digital Hate found that users, including minors, are exposed to abusive behaviour every seven minutes. The group identified 100 potential violations of Meta’s policies for VR, but only 51 met the criteria for reporting offending content. As of when the Center published its findings, Meta had not responded to any of the reports of abusive behaviour, including sexual harassment and grooming of minors.

Some of the potential areas of concern include:

Teens being groomed: Conversations within the metaverse could lead to youth being exploited in the app (e.g., committing virtual sex acts or engaging in sexualized chat). Or conversations that start in the metaverse could lead to teens being moved to other platforms – ones that may have a video chat or livestream component – which increases the risk of exploitation.

Being exposed to graphic sexual content: For example, the Mirror article outlines a room in VRChat where users – including apparent children – were crowded around a SpongeBob SquarePants avatar as it simulated sex acts.

Sexual harassment, cyber-bullying, threats of violence, and racism: Unlike many other platforms, metaverse apps often have no content filters, and many rely on users to regulate their own experience by muting, blocking, or reporting other users.

Often no age verification: For example, Meta’s safety page states, “While we know that children under 13 may want to use Oculus devices, we do not permit them to create accounts or use Oculus devices”. However, no age verification is required; the company simply requires users to sign in with a Facebook account to use an Oculus device. This minor hurdle can easily be bypassed by a child using a shared family device or Facebook account.

Limited parental controls with VR devices: Meta’s Oculus Quest 2 headset doesn’t currently come with the option to turn on parental controls that help to limit access to 18+ or otherwise harmful content (although parents can restrict specific sites through the settings for the web browser on the device). Other companies are adding options. HTC recently rolled out the Vive Guardian app, which allows users to limit app and content access on Vive VR devices and pushes requests for downloads.

Significantly reduced ability for parents to “see” what’s happening: With other video games, parents can watch what’s happening on screen, but with a VR/AR device, where all the activity is self-contained, parents can be in the dark about what’s occurring in the metaverse. A few systems, like Quest and Vive, allow for what’s happening within the device to be “cast” to a phone, tablet, or smart TV, leaving parents and teens to work together on rules around providing visual access.

What can parents do? The metaverse is not intended for children under the age of 13. If you have a VR/AR device, take care that your child does not have access to it. Consider signing out of your account.

Learn about the metaverse, the VR/AR devices, and the games/apps your teen is interested in playing. Does the game/app have any form of privacy controls (e.g., can you limit who your teen plays with or chats with)? Are there options to report inappropriate activity? Does the game/app connect to other platforms/apps/sites outside the metaverse?

Remind your teen that not everyone in the metaverse is who they say they are. Just because a person’s avatar appears to be a peer doesn’t mean they actually are.

Have regular conversations about what they are doing in the metaverse and who they are playing/chatting with. Know your teen’s passwords, screen names, and the people they are playing with.

Discuss how to get out of uncomfortable situations. Due to the immersive nature of the metaverse, it may be harder for teens to “walk” away from a situation or conversation. Reinforce with your teen that you understand how quickly it is possible to encounter an inappropriate or problematic situation, and that you are the right person to go to if they need help or are upset. Remind them that you want to know so you can support and help them.

Ensure your teen understands they can talk to you about anything they encounter that makes them feel uncomfortable without fear of losing phone or gaming privileges.

If you see or read anything sexual directed towards your teen online, report it to Cybertip.ca or your local law enforcement agency.

Cybertip.ca recommends a high level of parental supervision and monitoring when teens use the metaverse, due to the situations they may encounter.

The Canadian Centre for Child Protection (C3P) is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child-serving organizations, law enforcement, and other parties. C3P also operates Cybertip.ca and Project Arachnid, a web platform designed to detect known images of child sexual abuse material (CSAM) on the clear and dark web and issue removal notices to industry.