UK Regulator Opens Formal Investigation into X Over CSAM Scandal
The UK's media regulator, Ofcom, has launched a formal investigation into X after receiving reports that its AI chatbot account, Grok, was being used to create and share explicit images of people, potentially amounting to intimate image abuse or child sexual abuse material (CSAM). The probe focuses on whether X has complied with its duties to protect users from content illegal in the UK.
Grok's alleged misuse has raised concerns among regulators worldwide. Malaysia and Indonesia have already taken action, blocking access to Grok due to insufficient safeguards against creating non-consensual deepfakes of women and children. Indonesia described the issue as a "serious violation of human rights, dignity, and safety" in the digital space.
The investigation will examine X's measures to prevent users from accessing priority illegal content, including CSAM and non-consensual intimate images. It will also assess whether X carried out an updated risk assessment before making significant changes to its platform and if it has effective age assurance to protect children from seeing pornography.
Ofcom has asked xAI for clarification on the steps the company is taking to protect UK users and has conducted an expedited assessment of available evidence as a matter of urgency. The regulator emphasized that platforms must protect people in the UK from content that's illegal in the UK, and it will not hesitate to investigate where companies are failing in their duties, especially where there's a risk of harm to children.
The investigation comes amid reports that X has started telling users its image generation tools would be limited to paying subscribers. However, non-paying users can still generate images through the Grok tab on the X website and app.
If Ofcom finds that X has broken the law, it can require platforms to take specific steps to come into compliance or remedy harm caused by the breach. The regulator can also impose fines of up to £18 million ($24.3 million) or 10 percent of "qualifying" worldwide revenue, whichever is higher.