Video-sharing app TikTok has shared a content moderation mechanism for removing any potentially harmful or inappropriate content from its platform, saying that Pakistan is one of the five markets with the largest volume of removed videos.
The response comes after the Pakistan Telecommunication Authority (PTA) last month issued a final warning to the Chinese-owned social media app to clamp down on what it called “immoral, obscene and vulgar” content on the video-sharing platform.
The company said in a statement on Thursday that content moderation is performed through a combination of policies, technologies and moderation strategies to detect and review problematic content and accounts, and to apply appropriate penalties.
Referring to its transparency report, the video-sharing app said: “It demonstrates TikTok’s commitment to remove any potentially harmful or inappropriate content reported in Pakistan.”
TikTok has become a global sensation with its 15 to 60-second video clips and is hugely popular among young Pakistanis, with some users building up millions of followers.
While users enjoy creating content on TikTok, with it comes the responsibility to keep them safe on the platform, the company said in the statement, adding that it has released an updated edition of its Community Guidelines in Urdu to help maintain a supportive and welcoming environment on TikTok for users in Pakistan.
“The Community Guidelines provide general guidance on what is and what is not allowed on the platform, keeping TikTok a safe place for creativity and joy, and are localised and implemented in accordance with local laws and norms.”
TikTok’s teams remove content that violates the Community Guidelines, and suspend or ban accounts involved in severe or repeated violations, it added.
TikTok’s systems automatically flag certain types of content that may violate its Community Guidelines, enabling it to take swift action and reduce potential harm, according to the Chinese-owned social media app.
“These systems take into account things like patterns or behavioral signals to flag potentially violative content.”
The app’s management said content moderation cannot be performed with technology alone, and that it has appointed trained moderators to review the context of the content.
“TikTok has a team of trained moderators to help review and remove content. In some cases, this team removes evolving or trending violative content, such as dangerous challenges or harmful misinformation. Another way the TikTok team moderates content is based on reports that it receives from its users. There is a simple in-app reporting feature for users to flag potentially inappropriate content or accounts to TikTok.”