It is not clear whether the image in the ad is of a real person or is itself AI-generated; the ASA told the BBC that making such an assessment had not been part of its investigation.

The regulator said that PixVideo did not actually permit users to remove clothing from digital images to create sexually explicit content, but that viewers may have got the impression that it did.

“Because the ad implied that viewers could use an app to remove a woman’s clothing, we considered it condoned digitally altering and exposing women’s bodies without their consent,” the agency said in a statement.

It added that the ad was “irresponsible, included a harmful gender stereotype and was likely to cause serious offence”.

Saeta Tech, which owns PixVideo, said it understood why the ad was likely to cause offence, but attributed this to its presentation and messaging rather than the intended use of its product.

It said it prohibited the creation of nude or sexually explicit content and had automated detection and blocking tools to prevent such imagery from being generated.

The company has agreed not to show the ad again and has paused all advertising while it carries out an internal review.

The issue of apps that “declothe” women and girls without their consent hit the headlines in January, when Elon Musk’s chatbot Grok was used to flood X with sexualised images.

After a global backlash, Musk subsequently blocked Grok from generating such images in jurisdictions where it is illegal, but X still faces investigations and lawsuits around the world.

The UK government announced in December that it would make it illegal to create and supply AI tools that let users edit images to seemingly remove someone's clothing.

The new offences will build on existing rules around sexually explicit deepfakes and intimate image abuse.
