TikTok is set to implement new age-verification technology across the European Union in the coming weeks. This initiative comes as discussions intensify regarding potential social media restrictions for users under 16 years old, particularly in the UK. The pressure on TikTok, owned by ByteDance, aligns with similar scrutiny faced by other platforms popular among young audiences, such as YouTube.
The age-verification system has been quietly tested in the EU over the past year. It evaluates profile data, uploaded videos, and user behavior to determine whether an account likely belongs to someone under 13 years old. Accounts flagged by this system will not face automatic bans; instead, specialized moderators will review them and decide whether to remove them. A pilot of the system in the UK has already led to the removal of thousands of accounts.
Social media platforms are increasingly under the spotlight for their age-verification processes, particularly following Australia's ban on social media for individuals under 16, which took effect on December 10, 2025. Australia's eSafety Commissioner recently disclosed that over 4.7 million accounts had been removed across ten platforms, including TikTok, Instagram, and Facebook, since the ban was enacted.
In the UK, Prime Minister Keir Starmer has expressed openness to similar restrictions for young users, citing concerns over the amount of time children spend on smartphones and pointing to alarming reports of children as young as five spending hours in front of screens each day. Starmer had previously opposed a blanket ban on social media for children, arguing that enforcement would be challenging and could inadvertently drive teenagers towards more dangerous online spaces.
Calls for stronger parental rights regarding access to children's social media accounts have also gained momentum. Ellen Roome, whose 14-year-old son died after apparently taking part in an online challenge, has advocated for reforms that would grant parents access to their children's accounts in the event of a child's death.
In response to increasing regulatory scrutiny, TikTok has stated that the new technology has been developed to comply with the EU’s stringent data protection regulations. The company has collaborated closely with Ireland’s Data Protection Commission, the EU’s lead privacy regulator, to ensure that its age-verification measures meet legal requirements.
A recent investigation by The Guardian revealed concerns regarding moderators being instructed to allow under-13 users to remain on the platform if they claimed parental supervision. This finding has further fueled discussions about the effectiveness of current age-verification practices and the responsibilities of social media companies to protect young users.
The European Parliament is actively pushing for stricter age limits on social media usage, with Denmark also advocating for a ban for individuals under 15. As digital safety continues to be a contentious issue, platforms like TikTok are facing heightened expectations to safeguard young users while navigating complex regulatory landscapes.
