
Offenders confused about ethics of AI child sex abuse

A charity that helps people worried about their own thoughts or behaviour says an increasing number of callers are feeling confused about the ethics of viewing AI-generated child abuse imagery, the BBC reports.

The Lucy Faithfull Foundation (LFF) says AI images are acting as a gateway.



The charity is warning that creating or viewing such images is still illegal even if the children are not real.

Neil, not his real name, contacted the helpline after being arrested for creating AI images.

The 43-year-old denied that he had any sexual attraction to children.

The IT worker, who used AI software to make his own indecent images of children using text prompts, said he would never view such images of real children because he is not attracted to them. He claimed simply to be fascinated by the technology.

He called the LFF to try to understand his thoughts, and call handlers reminded him that his actions are illegal, regardless of whether or not the children are real.

The charity says it has had similar calls from others who are expressing confusion.

Another caller got in touch after discovering that her 26-year-old partner viewed indecent AI images of children; he insisted they were not serious because the pictures “aren’t real”. He has since asked for help.

A teacher asked for the charity’s advice because her 37-year-old partner was viewing images that seemed illegal, but neither of them was sure if they were.


The LFF’s Donald Findlater says some callers to its confidential Stop It Now helpline think that AI images are blurring the boundary between what is illegal and what is morally wrong.

“This is a dangerous view. Some offenders think this material is in some way OK to create or view because there are no children being harmed, but this is wrong,” he says.


In some cases, real abuse images may be wrongly labelled or advertised as AI-made, and the difference in realism is becoming harder to spot.

Mr Findlater says that deviant sexual fantasy is the strongest predictor of reoffending for anyone convicted of a sexual crime.

“If you feed that deviant fantasy, then you’re making it more likely you’re going to do harm to children,” he said.

The charity says the number of callers citing AI images as a reason for their offending remains low, but is rising. The foundation is urging society to recognise the problem and lawmakers to act to reduce the ease with which child sexual abuse material (CSAM) is made and published online.

Although the charity would not name any specific sites where it has found the imagery, one popular AI art website has been accused of allowing users to publish sexual and graphic images of very young models. When the BBC approached Civit.ai about the issue in November, the firm said it takes potential CSAM on the site “very seriously” and asks the community to report images that users consider to “depict under-age characters/people in a mature or photorealistic context”.

The LFF also warned that young people are creating CSAM without realising the seriousness of the offence. One caller, for example, was concerned about his 12-year-old son who had used an AI app to create inappropriate topless pictures of friends, and then subsequently searched for terms such as “naked teen” online.

Criminal cases in Spain and the US have recently been launched against young boys using declothing apps to create naked pictures of school friends.

In the UK, Graeme Biggar, head of the National Crime Agency, said in December that he wanted to see tougher sentences for offenders who possess child abuse imagery, adding that AI abuse imagery “matters, because we assess that the viewing of these images – whether real or AI-generated – materially increases the risk of offenders moving on to sexually abusing children themselves”.
NEWS CREDIT: BBC



Emediong Silver

Emediong Ekpe is a graduate of English, a professional sports journalist/analyst, and a spoken word artist. He is passionate about disseminating information and putting smiles on people's faces through news writing. WhatsApp: 08088735884
