Online harassment remains a serious problem, and social networks are stepping up their efforts to curb it. The latest comes from TikTok: the platform has just announced a new comment filtering and moderation system for creators, as well as an alert shown to users who are about to post a derogatory comment.
Content creators will now be able to apply a filter to all the comments they receive. When the filter is enabled, comments do not appear publicly until they have been approved by the video's creator.
To strengthen its defenses against harassment, TikTok will also display a pop-up alert asking users who are about to post a comment that appears inappropriate to reconsider what they want to say and edit their comment accordingly.
The alert will also remind users of the rules in force on the platform. The system is reminiscent of the one Twitter tested a few months ago and relaunched just a few weeks ago.