YouTube Will Start Issuing Warnings, 24-Hour Bans For Abusive Comments

YouTube, the popular video-sharing platform, announced on Tuesday that it has introduced new features designed to curb spam and abuse in comments as well as live stream chats.

The latest move by YouTube comes in response to growing concerns from major content creators about the volume of spam and abusive comments on the video streaming platform.

Prominent creators such as Linus Tech Tips, Jacksepticeye, and MKBHD have all made videos about comment spam this year.

To combat comment spam, YouTube will combine automated moderation backed by machine learning with a new warning system: users will be notified when an abusive comment of theirs is removed, and posting ‘multiple abusive comments’ will trigger a 24-hour ban.

According to YouTube’s Community Guidelines page, the comment rules cover a wide range of topics, including hate speech, child safety, nudity and sexual content, vulgar language, harassment and cyberbullying, and COVID-19 medical misinformation.

“We’ve been working on improving our automated detection systems and machine learning models to identify and remove spam. In fact, we’ve removed over 1.1 billion spammy comments in the first six months of 2022. As spammers change their tactics, our machine learning models are continuously improving to better detect the new types of spam,” reads a post signed by “Rob” at TeamYouTube.

The post later adds, “We’ve improved our spambot detection to keep bots out of live chats. We know bots negatively impact the live streaming experience, as the live chat is a great way to engage and connect with other users and creators. This update should make live streaming a better experience for everyone.”


How Does The New Automated Moderation Program Work?

YouTube says it will automatically send users a warning notification when the platform has detected and removed their comments for violating its Community Guidelines.

Further, if the user continues to post abusive comments, they may receive a “timeout” and be temporarily unable to comment for up to 24 hours.

“Our testing has shown that these warnings/timeouts reduce the likelihood of users leaving violative comments again,” TeamYouTube said.

Currently, these notifications are only available for English-language comments; however, the company hopes to expand the feature to more languages in the coming months.

“Our goal is to both protect creators from users trying to negatively impact the community via comments, as well as offer more transparency to users who may have had comments removed due to policy violations and hopefully help them understand our Community Guidelines,” TeamYouTube added.

If a user thinks their comment or post has been incorrectly flagged or removed, they can share their feedback on the issue.

Toxic and hateful comments have long been a persistent problem on YouTube, something the platform itself acknowledges: “Reducing spam and abuse in comments and live chat is an ongoing task, so these updates will be ongoing as we continue to adapt to new trends.”
