Google will start vetting the YouTube channels in its Google Preferred premium advertising program more strictly, according to Bloomberg Technology. Anonymous sources tell Bloomberg that the company will use both human moderators and machine learning to identify videos that shouldn’t be part of Preferred bundles. The move is supposedly a response to advertisers’ concerns over inappropriate videos featuring children, as well as offensive behavior from YouTube stars like Logan Paul, who was kicked out of the Preferred program after uploading a video of a dead body in Japan’s Aokigahara forest.
Google touts Preferred as a collection of “the most popular YouTube channels among US 18- to 34-year-olds” and “the most engaging and brand safe content on YouTube,” organized into categories like fashion, pop culture, and recipes. But the recent Logan Paul controversy seemingly caught YouTube flat-footed: Paul removed the video himself only after it had been widely viewed, and copies continued to percolate across the platform. The incident is also part of a larger moderation crisis for Google, which said last month that it was expanding its staff of moderators to 10,000 people. There’s not much detail in this latest report, but the move isn’t a surprising one for the company.
A spokesperson for YouTube tells The Verge and Polygon that “we built Google Preferred to help our customers easily reach YouTube’s most passionate audiences and we’ve seen strong traction in the last year with a record number of brands. As we said recently, we are discussing and seeking feedback from our brand partners on ways to offer them even more assurances for what they buy in the Upfronts.”