
18-year prison sentence for man who used AI to create child abuse images


Nelson in August admitted to creating and selling bespoke images of child sexual abuse tailored to customers’ specific requests. He generated digital models of the children using real photographs that his customers had submitted. Police said he also distributed the images he had created online, both for free and for payment.

It comes as both the tech industry and regulators are grappling with the far-reaching social impacts of generative AI. Companies such as Google, Meta, and X have been scrambling to tackle deepfakes on their platforms.

Graeme Biggar, director-general of the UK’s National Crime Agency, last year warned it had begun seeing hyper-realistic images and videos of child sexual abuse generated by AI.

He added that viewing this kind of material, whether real or computer-generated, “materially increases the risk of offenders moving on to sexually abusing children themselves.”

Greater Manchester Police’s specialist online child abuse investigation team said computer-generated images had become a common feature of their investigations.

“This case has been a real test of the legislation, as using computer programs in this particular way is so new to this type of offending and isn’t specifically mentioned within current UK law,” detective constable Carly Baines said when Nelson pleaded guilty in August.

The UK’s Online Safety Act, which passed last October, makes it illegal to disseminate non-consensual pornographic deepfakes. But Nelson was prosecuted under existing child abuse law.

Smith said that as AI image generation improved, it would become increasingly challenging to differentiate between different types of images. “That line between whether it’s a photograph or whether it’s a computer-generated image will blur,” she said.

Daz 3D, the company that created the software used by Nelson, said that its user license agreement “prohibits its use for the creation of images that violate child pornography or child sexual exploitation laws, or are otherwise harmful to minors” and said it was “committed to continuously improving” its ability to prevent the use of its software for such purposes.

© 2024 The Financial Times Ltd. All rights reserved. Not to be redistributed, copied, or modified in any way.
