In a recent update, YouTube decided to change its existing policies to make it easier for gamers and streamers to make money from violent video game content.
A series of changes will affect the site’s content moderation, most of them centred around new distinctions between actual real-world violence and the “scripted or simulated violence” you see in video games.
Previously, content featuring violent games could be age-restricted.
As a result, these videos wouldn’t appear in search results in some cases, and the platform would occasionally prevent creators from monetizing violent gaming videos with ads.
Thanks to this new update, violent video games will be treated in the same manner as scripted movies and TV shows.
A lot of violent content will still be age-restricted, which is only appropriate, but creators won’t have to worry anymore about their videos being flagged or demonetized.
In a tweet announcing the policy change, YouTube explained:
“We’ve heard loud and clear that our policies need to differentiate between real-world vs. simulated violence, and we’re updating our enforcement to reflect that.”
Today, “scripted or simulated violent content found in video games will be treated the same as other types of scripted content,” rather than as depictions of actual violence.
Future uploads that depict video game violence may also be “approved instead of being age-restricted.”
“We may still age-restrict content if violent or gory imagery is the sole focus of the video. For instance, if the video focuses entirely on the most graphically violent part of a video game.”
Nonetheless, you can now expect far fewer videos to be age-gated simply for showing scenes of video game violence.
The new policy went into effect on December 2, and so far creators have been grateful for YouTube’s decision to loosen the restrictions.
What do you think of this more relaxed policy?