TikTok has announced plans to tighten its age verification measures across Europe, rolling out enhanced technology to better identify and prevent minors from using the platform.
Under the new measures, TikTok's systems will assess a user's likely age based on their profile information and activity. If the platform detects an account that appears to belong to someone under 13, a specialist moderator will review the case to determine whether it should be banned. Users in Europe will receive notifications explaining these measures and be given the opportunity to learn more.
The platform also allows users to report accounts they suspect belong to underage individuals; reported accounts can then be flagged for further review by moderators or specialists.
TikTok has stated that it removes approximately 6 million underage accounts from its platform every month. The company acknowledges that there is no single ideal solution to verifying age and preserving user privacy.
These measures are part of a pilot program launched last year in Europe, which helped the platform identify and remove thousands more underage accounts. TikTok worked with data protection regulators to ensure compliance with EU data protection standards.
Social media bans for under-16s are becoming increasingly common: in Australia, one went into effect last month, prompting platforms to respond in different ways. Reddit has even filed a lawsuit against a proposed ban that would cover its platform.
In the UK, there is growing public pressure and cross-party support for an under-16 social media ban, with Prime Minister Keir Starmer stating that "all options" are on the table. The House of Lords will vote next week on proposals for such a ban, with a binding vote potentially looming in the coming months.