YouTube to limit teens' exposure to videos about fitness and weight across global markets

YouTube is going to limit teens’ exposure to videos that promote and idealize a certain fitness level or physical appearance, the company announced on Thursday. The safeguard first rolled out in the U.S. last year and is now being introduced to teens globally.

The announcement comes as YouTube has faced criticism over the past few years for potentially harming teens and exposing them to content that could encourage eating disorders.

The type of content that YouTube will limit exposure to includes videos that compare physical features and idealize certain fitness levels, body types and weight. Separately, YouTube will also limit exposure to videos that display “social aggression” in the form of non-contact fights and intimidation.

The Google-owned platform notes that a single video of this type may not be harmful on its own, but that repeated exposure to such content could become problematic for teens. To combat this, YouTube will limit repeated recommendations of videos related to these topics.

Since YouTube’s recommendations are driven by what users watch and engage with, the company is introducing these safeguards to prevent teens from being repeatedly shown this kind of content, even when individual videos adhere to YouTube’s guidelines.

“As a teen is developing thoughts about who they are and their own standards for themselves, repeated consumption of content featuring idealized standards that starts to shape an unrealistic internal standard could lead some to form negative beliefs about themselves,” YouTube’s global head of health, Dr. Garth Graham, said.

Thursday’s announcement comes a day after YouTube introduced a new tool that allows parents to link their accounts to their teen’s account in order to access insights about the teen’s activity on the platform. After a parent has linked their account with their teen’s, they will be alerted to their teen’s channel activity, such as the number of uploads and subscriptions they have.

The tool builds on YouTube’s existing parental controls, which allow parents to set up supervised accounts for children under the age of consent for online services, which is 13 in the U.S. It’s worth noting that other social apps, like TikTok, Snapchat, Instagram and Facebook, also offer supervised accounts linked to young users’ parents.

Snapchat now lets parents restrict their teens from using the app’s ‘My AI’ chatbot

Snapchat is introducing new parental controls that will allow parents to restrict their teens from interacting with the app’s AI chatbot. The changes will also allow parents to view their teens’ privacy settings, and get easier access to Family Center, which is the app’s dedicated place for parental controls.

Parents can now prevent My AI, Snapchat’s AI-powered chatbot, from responding to their teen’s chats. The new parental control arrives nearly a year after Snapchat launched My AI, a rollout that drew criticism for lacking appropriate age-gating features; the chatbot was found chatting with minors about topics like covering up the smell of weed and setting the mood for sex.

Snapchat says the new restriction feature builds on My AI’s existing safeguards, “including protections against inappropriate or harmful responses, temporary usage restrictions if Snapchatters repeatedly misuse the service, and age-awareness.”

In addition, parents will now be able to see their teens’ safety and privacy settings. For instance, a parent can see whether their teen is sharing their Story with all of their friends or with a smaller group of select users. A parent can also see who is able to contact their teen by viewing the teen’s contact settings, and whether the teen is sharing their location with friends on the Snap Map.

As for parents who may be unaware about the app’s parental controls, Snapchat is making Family Center easier to find. Parents can now find Family Center right from their profile, or by heading to their settings.

“Snapchat was built to help people communicate with their friends in the same way they would offline, and Family Center reflects the dynamics of real-world relationships between parents and teens, where parents have insight into who their teens are spending time with, while still respecting the privacy of their personal communications,” Snapchat wrote in a blog post announcing the changes. “We worked closely with families and online safety experts to develop Family Center and use their feedback to update it with additional features on a regular basis.”

Snapchat launched Family Center back in 2022 in response to increased pressure, both in the U.S. and abroad, on social networks to do more to protect young users from harm.

The expansion of the app’s parental controls comes as Snapchat CEO Evan Spiegel is scheduled to testify before the Senate on child safety on January 31, alongside executives from X (formerly Twitter), TikTok, Meta and Discord. Committee members are expected to press the executives on their platforms’ inability to protect children online.

The changes also come two months after Snap and Meta received formal requests for information (RFIs) from the European Commission about the steps they are taking to protect young users on their social networks. The Commission has also sent similar requests to TikTok and YouTube.

Snapchat isn’t the only company to release child safety features this week. Earlier this week, Meta announced that it will start automatically limiting the type of content that teen Instagram and Facebook accounts can see, restricting those accounts from harmful content such as posts about self-harm, graphic violence and eating disorders.

Snap CEO says 20 million US teens use Snapchat, but only 200,000 parents use its Family Center controls

During today’s Senate Judiciary Committee hearing on kids’ online safety, Snap CEO Evan Spiegel shared that 20 million teenagers use Snapchat in the United States and that around 200,000 parents use its Family Center supervision controls. He also shared that approximately 400,000 teen accounts have been linked to a parent’s account through Family Center. Spiegel’s testimony marks the first time that Snap has shared real-world metrics regarding the usage of Snapchat’s parental controls.

Snapchat’s Family Center, which allows parents to see who their teens are friends with on the app and who they have been communicating with, first launched in 2022.

Spiegel shared the numbers after Senator Alex Padilla (D-Calif.) asked the CEOs of Meta, TikTok, X and Discord to disclose how many minors were using their platforms and how many parents were using the parental supervision controls offered by the services.

“We create a banner for Family Center on the user’s profiles,” Spiegel said after being asked what Snapchat was doing to ensure parents and guardians are aware of the tools. “So the accounts we believe may be the age that they can be parents can see the entry point into Family Center easily.”

Snap introduced the parental controls in response to increased pressure on social networks to better protect minor users from harm. Snapchat’s rollout of Family Center followed the launches of similar parental control features across other apps, including Instagram, TikTok and YouTube.

Spiegel was the only CEO to share numbers in response to Senator Padilla’s question.

Meta CEO Mark Zuckerberg said he was unable to provide specific numbers, but said that the company runs “extensive ad campaigns” both on its platform and outside to raise awareness of its parental supervision tools.

X CEO Linda Yaccarino shared that less than 1% of the platform’s 90 million U.S. users are between the ages of 13 and 17, and that the company is discussing parental controls.

“Being a 14-month-old company we have reprioritized child protection and safety measures,” Yaccarino said. “We have just begun to talk about and discuss how we can enhance those with parental controls.”

TikTok CEO Shou Zi Chew said he was unable to share specifics, but that TikTok was “one of the first platforms” to give parents supervision controls. Discord CEO Jason Citron said that Discord raises awareness of its parental controls through promotional videos and in-app prompts.
