After months of complaints, YouTube has updated the guidelines that determine which videos can be monetized. The move follows a clash between creators and advertisers over which content is acceptable to advertise on.

Major brands like McDonald's and Starbucks pulled their ads from the platform, resulting in a huge drop in income for many creators. Google then gave advertisers a way to exclude certain types of videos, causing revenue to fall further.

It was far from a perfect system, and some YouTubers saw fairly benign videos demonetized. This happened with little explanation from Google, making it difficult to predict which videos would lose revenue.

The new guidelines seek to address that, trying to satisfy advertisers' need for control while communicating more clearly with creators.

“We’ve heard loud and clear from the creator community and from advertisers that YouTube needs to broaden our advertiser-friendly guidelines around a few additional types of content,” said YouTube. “While it’s not possible for us to cover every video scenario, we hope this additional information will provide you with more insight into the types of content that brands have told us they don’t want to advertise against and help you to make more informed content decisions.”

The Guidelines

The new guidelines are short and succinct. VP of product management Ariel Bardin says Google will take a "tougher stance" on the following:

  • Hateful content: Content that promotes discrimination or disparages or humiliates an individual or group of people on the basis of the individual’s or group’s race, ethnicity, or ethnic origin, nationality, religion, disability, age, veteran status, sexual orientation, gender identity, or other characteristic associated with systematic discrimination or marginalization.
  • Inappropriate use of family entertainment characters: Content that depicts family entertainment characters engaged in violent, sexual, vile, or otherwise inappropriate behavior, even if done for comedic or satirical purposes.
  • Incendiary and demeaning content: Content that is gratuitously incendiary, inflammatory, or demeaning. For example, video content that uses gratuitously disrespectful language that shames or insults an individual or group.

A new page on YouTube's Creator Academy goes into further detail. Videos that feature controversial issues, inappropriate language, or drugs are also considered "not advertiser-friendly".

However, some questions remain. It's unclear, for example, where gaming videos fit into this model. Gaming is one of the most popular genres on the platform, and many such videos contain gore or touch on discriminatory themes.

Though YouTube will not take down these videos, some argue that limiting income amounts to a form of de facto censorship. Creators will think twice about covering controversial topics when, in many cases, YouTube is their full-time job.

It's clear that YouTubers aren't happy, and many argue for more consideration of context. The current system fails to distinguish between words used in jest or fictional scenarios and real violence. Advertisers widely avoid videos of Call of Duty: WWII, for example.

Hopefully, Google will enforce these guidelines more consistently now that they're in place. Only time will tell whether they satisfy both advertisers and creators.