What to Know
- AT&T is pulling its ads from YouTube after reports that pedophiles are lurking in the comments sections of videos of young children
- AT&T said it is removing all advertising until Google can protect its brand from offensive content
- YouTube reportedly sent a memo to advertisers outlining changes it's making this week to help protect brands
AT&T is the latest company to pull its ads from YouTube following reports that pedophiles have latched onto videos of young children, often girls, marking time stamps that show child nudity and posting objectifying remarks in the videos' comments sections.
"Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube," an AT&T spokesperson told CNBC. The company originally pulled its entire ad spend from YouTube in 2017 after revelations that its ads were appearing alongside offensive content, including terrorist content, but resumed advertising in January.
On Wednesday, Nestle and "Fortnite" maker Epic Games pulled some advertising. Disney reportedly also paused its ads.
There's no evidence that AT&T ads ran before any of the videos identified in the recent reports. Advertisers such as Grammarly and Peloton, which did see their ads placed alongside the videos, told CNBC they were in conversations with YouTube to resolve the issue.
YouTube declined to comment on any specific advertisers, but said in a statement on Wednesday, "Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments."
On Thursday, Adweek obtained a memo YouTube sent to advertisers that outlines immediate changes YouTube says it's making in an effort to protect its younger audience. CNBC confirmed its authenticity with one of the brands that received the memo.
YouTube said it is suspending comments on millions of videos that "could be subject to predatory comments." It's also changing its discovery algorithms to make it harder for "innocent content to attract bad actors," making sure ads aren't appearing on videos that could attract this sort of behavior, and removing accounts "that belonged to bad actors." YouTube is also alerting authorities as needed.
This story first appeared on CNBC.com.