UK's Bold Move: New Laws to Halt AI-Generated Nude Image Scourge

Published 7 hours ago · 3 minute read
Uche Emeka

The United Kingdom is set to implement new laws making it illegal to create online sexual images of individuals without their explicit consent. These measures come in the wake of a significant global outcry concerning the misuse of Elon Musk’s artificial intelligence chatbot, Grok, which was reportedly used to generate sexualized deepfakes of women and children.

Musk’s company, xAI, responded to the controversy by announcing new protocols to prevent Grok from facilitating the editing of photos of real people to depict them in revealing attire in jurisdictions where such actions are unlawful.

British Prime Minister Keir Starmer acknowledged these steps but emphasized that X, Musk's social media platform, must ensure immediate and full compliance with UK law. Starmer affirmed his government’s commitment to remaining vigilant against any transgressions by Grok or its users, asserting that, "Free speech is not the freedom to violate consent."

The chatbot, developed by xAI and integrated with the X platform, faced intense international scrutiny after reports surfaced detailing its use in recent weeks to produce thousands of digitally altered images that effectively "undress" people without their permission. These images ranged from nude depictions to portrayals of women and children in bikinis or sexually explicit poses.

Critics have long argued that laws regulating generative AI tools are overdue and that the UK's legal changes should have come much earlier. Britain's media regulator, Ofcom, has launched an investigation into whether X has violated UK laws in relation to the Grok-generated images, particularly those sexualizing children or depicting individuals undressed. Ofcom stated that such images, along with similar content created by other AI models, could be classified as pornography or child sexual abuse material.

The root of the issue traces back to the launch of Grok Imagine last year, an AI image generator equipped with a "spicy mode" capable of producing adult content through text prompts.

Technology Secretary Liz Kendall highlighted a report from the Internet Watch Foundation, which detailed deepfake images involving the sexualization of 11-year-olds and women subjected to physical abuse. Kendall unequivocally stated, "The content which has circulated on X is vile. It is not just an affront to decent society, it is illegal."

In response, UK authorities are enacting legal changes to criminalize the use and supply of "nudification" tools. This includes fast-tracking provisions within the Data (Use and Access) Act, which will make it a criminal offense to create or request deepfake images. Passed by Parliament last year, this legislation is slated to come into effect on February 6.

Justice Secretary David Lammy issued a stern warning: "Let this be a clear message to every cowardly perpetrator hiding behind a screen: you will be stopped and when you are, make no mistake that you will face the full force of the law."

Furthermore, the government is criminalizing "nudification" apps as part of the ongoing Crime and Policing Bill. This new criminal offense will prohibit companies from supplying tools specifically designed to create non-consensual intimate images, an approach Kendall described as targeting the problem "at its source."

The Ofcom investigation remains active, and Kendall warned that X could face substantial penalties, including a fine of up to 10% of its qualifying global revenue, depending on the investigation’s findings and potential court orders that could block access to the site. Downing Street is also re-evaluating its presence on the X platform.

Despite these developments, Elon Musk reiterated his stance that Grok complies with the law, stating that it will refuse to produce anything illegal and that any unexpected outcomes from "adversarial hacking of Grok prompts" would be immediately addressed as bugs.
