TikTok Releases Report on Its Efforts to Address Content Moderation Challenges in Kenya
In its recent report on Kenya, TikTok outlined its efforts to address the content moderation issues that have sparked debate in the country. Last year, a petition presented to parliament called for a ban on TikTok, accusing the platform of failing to filter out inappropriate content. In response, TikTok defended its practices during an April appearance before Parliament's Public Participation Committee and promised to strengthen its content moderation measures.
In September, the committee decided against banning the platform but recommended that TikTok improve its moderation efforts. In its Q2 2024 Community Guidelines Enforcement Report, TikTok detailed its moderation activity in Kenya. More than 360,000 videos were removed for violating TikTok's policies, accounting for 0.3% of all videos uploaded in the country during the quarter. The vast majority (99.1%) were taken down proactively, before any user reported them, and 95% of removals happened within 24 hours.
TikTok also suspended 60,465 accounts for policy violations, including 57,262 accounts suspected of belonging to users under 13, in line with its policy of protecting younger audiences from inappropriate content. Given its vast user base and the volume of daily uploads, TikTok continues to invest in automated moderation technology. In June 2024 alone, it removed more than 178 million videos globally, 144 million of which were detected and removed automatically. TikTok puts its proactive detection rate at 98.2%, crediting AI-powered tools that identify and remove harmful content, often before anyone views it.