An unfortunate side effect of any social media platform’s success is a corresponding increase in cyberbullying and harassment, and TikTok is no exception.
Fortunately, the company has been implementing new anti-bullying strategies to keep its users safe, beginning with stronger anti-harassment and cyberbullying policies introduced in December 2020. The latest part of TikTok’s anti-bullying initiative is a pair of new features aimed at reducing harmful comments.
The first is a new comment approval option. TikTok already lets users filter comments by keyword—so you can prevent users from posting slurs, profanity, or other problematic content on your videos—but the app can now hide all new comments until you review and approve them. The ones you approve will appear publicly in the video’s comments section, and the ones you dismiss won’t be posted at all.
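To make the workflow concrete, here’s a minimal sketch of how a hold-for-review comment queue like this could work. This is purely illustrative (TikTok’s actual implementation isn’t public), and the `CommentQueue` class, its method names, and the keyword list are all hypothetical:

```python
# Illustrative sketch of a "hold all comments for review" queue with a
# keyword filter. Nothing here reflects TikTok's real implementation.

BLOCKED_KEYWORDS = {"slur", "profanity"}  # hypothetical creator-chosen filter list

class CommentQueue:
    def __init__(self):
        self.pending = []   # comments awaiting the creator's review
        self.visible = []   # comments approved for public display

    def submit(self, text: str) -> None:
        # Keyword-filtered comments never reach the queue; everything
        # else waits as pending until the creator reviews it.
        if any(word in text.lower() for word in BLOCKED_KEYWORDS):
            return
        self.pending.append(text)

    def review(self, text: str, approve: bool) -> None:
        # Approving publishes the comment; dismissing discards it entirely.
        self.pending.remove(text)
        if approve:
            self.visible.append(text)
```

In this model, an approved comment moves from `pending` to `visible`, while a dismissed one simply disappears, which mirrors the behavior described above.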
To find and enable the new comment review tool in the TikTok app:
- Open your TikTok profile.
- Tap the “…” icon.
- Go to Privacy and Safety > Who can comment on my videos.
- Select “Comment filters.”
- Turn on “Filter all comments.”
To see a list of all pending comments, go back to the “Comment filters” menu, then tap “Review filtered comments.” Tap a comment, then select whether to delete or approve it.
Preventative anti-bullying prompts
In addition to the new comment review tool, TikTok now automatically screens comments before a user posts them.
If the app’s AI recognizes language that goes against TikTok’s policies, the app warns the user that their comment may break the rules and gives them an opportunity to revise or discard it.
Users won’t be prevented from writing potentially harmful comments, however; they can ignore TikTok’s prompt to reconsider and post their comment anyway. If they do, the expanded comment filter will hopefully prevent those gross responses from ultimately showing up in the comment section. Still, the company hopes the extra step will reduce instances of harassment, and general negativity, on its platform.
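The flow described above, flag a comment, prompt the user, but let them post anyway, can be sketched as follows. This is an assumption-laden illustration: `looks_harmful` stands in for TikTok’s undisclosed AI screening, and `user_confirms` is a hypothetical callback representing the user’s choice at the prompt:

```python
# Illustrative sketch only; TikTok's real screening model is not public.

def looks_harmful(text: str) -> bool:
    # Stand-in for the AI classifier: here, a trivial keyword check.
    return any(word in text.lower() for word in {"stupid", "ugly"})

def post_comment(text: str, user_confirms) -> bool:
    """Return True if the comment ends up being posted.

    Flagged comments trigger a prompt, but the prompt is non-blocking:
    the user may confirm and post anyway, mirroring the behavior above.
    """
    if looks_harmful(text):
        if not user_confirms("This comment may break the rules. Post anyway?"):
            return False  # user chose to revise or discard
    return True  # posted; the creator's comment filter may still hold it
```

The key design point is that the check is advisory rather than a hard block, which is why the comment-review filter remains the backstop for anything posted despite the warning.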