The (relative) anonymity that the internet affords has led some people to leave hateful comments on websites, forums, videos, and more: things they might never say to a person face to face.
The good news for YouTube creators who have to deal with these types of comments is that, in an update to its community guidelines enforcement, YouTube has announced that it will be taking toxic comments more seriously by imposing a timeout on users who violate those guidelines.
According to YouTube, users will first be given a chance: if a comment is detected and removed, the user will be notified that they have violated the guidelines. If they continue to post toxic comments, they will then receive a timeout that effectively bans them from leaving further comments for the next 24 hours. In YouTube's own words:
"We're launching a new feature that warns users when we've detected and removed some of their comments for violating our Community Guidelines. Additionally, if a user continues to leave multiple abusive comments, they may receive a timeout and be temporarily unable to comment for up to 24 hours. Our testing has shown that these warnings/timeouts reduce the likelihood of users leaving violative comments again."
YouTube claims that, based on its testing so far, these warnings and timeouts have made users less likely to leave such comments again. Of course, this doesn't completely solve the issue of online toxicity, but if it reduces it and makes people think twice before leaving hurtful and hateful comments, that's a win.