Following steps taken by Twitter and Facebook, TikTok announced new measures Wednesday to curb the spread of misinformation on the social media platform. The company will begin adding banners to unverified content and notifying users before they share potentially misleading content.
The new procedures are part of the company’s efforts to advance media literacy among TikTok users. TikTok videos are usually fact-checked when they’re flagged by users for misinformation or when they’re related to Covid-19, vaccines, elections or other topics about which the spread of misleading information is common.
If TikTok is unable to draw a conclusion about the accuracy of the information in a video using readily available information, the company works with partners at PolitiFact and Lead Stories to fact-check.
Under the new guidelines, if TikTok is unable to verify whether certain content is accurate, the content will carry a label stating that it contains unverified information.
In addition, if users try to share unverified content, they’ll get a message with a “caution” icon asking whether they’re certain they want to share it. The user will have to choose whether to cancel or to share it anyway.
The policy is a crucial intervention to stop users from sharing unverified content. Once a video is confirmed to contain misleading information, it is taken off the platform, and users are given a way to appeal the decision. If a previously unverified video is found to be accurate, the banner is removed, and users can share the content without a sharing prompt.