Kuala Lumpur, September 4, 2025 — Malaysia has issued a strong directive urging TikTok to implement robust age-verification mechanisms to protect underage users from harmful content. Communications Minister Fahmi Fadzil, expressing growing frustration, summoned TikTok’s top management for talks and called on both the Malaysian Communications and Multimedia Commission (MCMC) and the police to collaborate closely with the platform to design an effective system.
During the meeting, Fahmi made clear that current content moderation efforts are inadequate to shield minors from dangerous and inappropriate material. The MCMC and law enforcement will work with TikTok to develop protocols that ensure only age-appropriate access to the platform.
This initiative forms part of Malaysia’s broader crackdown on harmful online content, including scams, cyberbullying, grooming, and content offensive to race, religion, or royalty. Under regulations that took effect in January, social media platforms with more than 8 million users in Malaysia must be licensed or face penalties. TikTok and WeChat have already secured licences, while Meta (Facebook, WhatsApp, Instagram) and other platforms remain under scrutiny.
This move aligns Malaysia with global trends in regulating children’s online safety. Countries like Britain, France, and Australia have introduced laws requiring age checks or have banned users under a certain age from accessing social media.
Source: Reuters