Drudge Retort: The Other Side of the News
Wednesday, May 22, 2024

Federal prosecutors have charged a Wisconsin man for allegedly using a popular artificial intelligence image generator to create thousands of explicit images of children, marking what is potentially the first federal charge of creating child sexual abuse material applied to images produced entirely through AI. In a statement Monday afternoon, the Justice Department said it has charged Steven Anderegg, 42, of Holmen, Wis., with using the AI image generator Stable Diffusion to create over 13,000 fake images of minors, many of which depicted fully or partially nude children touching their genitals or engaging in sexual intercourse with men.

Comments

Admin's note: Participants in this discussion must follow the site's moderation policy. Profanity will be filtered. Abusive conduct is not allowed.

IMO the AI company is just as guilty for allowing those images to be created in the first place. Sick.

#1 | Posted by qcp at 2024-05-22 08:53 AM | Reply

If this is a derivative work made from existing illegal images, I can see how that's illegal.

If the images are truly novel, and no minors were harmed by their manufacturing, this seems like a stretch.

"The images did not show real children and were made by typing strings of text into an image generator, underscoring what Justice officials have long argued: that a 2003 law banning photorealistic fake and obscene images applies to AI."

#2 | Posted by snoofy at 2024-05-22 08:58 AM | Reply

Seems like it should go the other way. Let the sickos produce as much fake kiddie ---- as they want and maybe it will dry up the market for real exploitation.

#3 | Posted by TFDNihilist at 2024-05-22 11:33 AM | Reply

Let the sickos produce as much fake kiddie ---- as they want and maybe it will dry up the market for real exploitation.
#3 | POSTED BY TFDNIHILIST

Unfortunately that's not how addiction works.

#4 | Posted by oneironaut at 2024-05-22 12:11 PM | Reply

How does addiction work.
And why isn't "fake" stuff a path towards harm mitigation.
Since that's exactly the reason Methadone is a treatment for heroin addicts.

#5 | Posted by snoofy at 2024-05-22 01:11 PM | Reply

---- no, absolutely not. Not on any planet or in any timeline should this be acceptable.

I expected comments but not pro-child porn comments. Yuck.

#6 | Posted by qcp at 2024-05-22 01:16 PM | Reply

Anyone with Photoshop skills can generate this stuff without harming any children.

Why do you think that needs to be criminalized?

#7 | Posted by snoofy at 2024-05-22 01:23 PM | Reply

Why you think it shouldn't be is a better question.

#8 | Posted by qcp at 2024-05-22 01:27 PM | Reply | Newsworthy 1

Because where is the harm?

"Imagine a priest raping a child."

You just depicted child sexual abuse in your mind, should my words in quotes be a crime, did you commit a crime, maybe both?

#9 | Posted by snoofy at 2024-05-22 01:30 PM | Reply

Google "Elizabeth from BioShock NSFW" and tell us if Google is distributing child pornography.

#10 | Posted by snoofy at 2024-05-22 01:32 PM | Reply

I picked that example because the community effort to get that fictional character into salacious poses actually led to advances in computer animation.

#11 | Posted by snoofy at 2024-05-22 01:33 PM | Reply

I think lolicons are gross too. But criminalizing fantasy content is a First Amendment bridge too far.

#12 | Posted by snoofy at 2024-05-22 01:36 PM | Reply

Nope. Imagining an awful scenario is not the same as producing and distributing child porn. Plonked.

Maybe you should bother to read the article before you comment, you sick ----.

"In two other recent cases, investigators said men in North Carolina and Pennsylvania had used AI to superimpose children's faces into explicit sex scenes, creating what's known as a deepfake, or to digitally remove the clothing from children's real photographs."

#13 | Posted by qcp at 2024-05-22 01:36 PM | Reply

"Maybe you should bother read the article before you comment, you sick ----."

Maybe you should learn to control your volatile emotions and not be so easily triggered, you petulant child.

I'm commenting on the generated "fake images" not on pasting a real child's face onto an existing real image of adults.

#14 | Posted by snoofy at 2024-05-22 01:41 PM | Reply

"or to digitally remove the clothing from children's real photographs."

Sony voluntarily stopped selling their HandyCam that could pretty effectively see through clothes. That was what, twenty years ago?

You propose criminalizing that technology?

TSA is using an even higher tech approach to see through clothing right now;
pretty sure they use it on children.

#15 | Posted by snoofy at 2024-05-22 01:45 PM | Reply

How does addiction work.

Addiction is an urge to overcome the current level of stimuli. Images satiate, but create a need for more more more ... eventually leading to the physical.

And why isn't "fake" stuff a path towards harm mitigation.

It satiates, but then becomes normalized and boring as resistance to the current level of stimuli builds up, and the need for physical satisfaction is created.

I don't know how someone could get to the point of sexually abusing a child, but I imagine it's through images first, then physically later. I don't imagine people just have urges and act them out on children without some previous stimuli.

Since that's exactly the reason Methadone is a treatment for heroin addicts.

No, you don't seem to understand Methadone and its use to treat drug addiction. It's hilarious you used "exactly" like you're some expert on the topic.

Methadone is used to block pain. It's used in rehab so withdrawal isn't so painful. People can still be addicted to Methadone.

Methadone would be similar to blocking the internet from people looking at sexually explicit child content.

#16 | Posted by oneironaut at 2024-05-22 01:53 PM | Reply

I'm commenting on the generated "fake images" not on pasting a real child's face onto an existing real image of adults.

There really is no difference.

Fake, as in not real ... AI-generated or human-generated content all has the same source.

#17 | Posted by oneironaut at 2024-05-22 01:55 PM | Reply

"Methadone would be similar to blocking the internet from people looking at sexually explicit child content."

LOL no.

#18 | Posted by snoofy at 2024-05-22 01:59 PM | Reply

"People can still be addicted to Methadone."

Yes but it's better than being addicted to heroin.

Do you even know why the phrase "----- For Pyros" exists, or do you just reject the concept of harm mitigation?

#19 | Posted by snoofy at 2024-05-22 02:01 PM | Reply

"Addiction is an urge to overcome the current level of stimuli. Images satiate, but create a need for more more more ... eventually leading to the physical."

You're saying that everyone who looks at porn will eventually become a sex offender, which is just another reason you are Deplorable.

Tell us about Trump having sex with a porn star, or is that your Exhibit A for why everyone who looks at porn will eventually become a sex offender?

#20 | Posted by snoofy at 2024-05-22 02:05 PM | Reply

Why you think it shouldn't be is a better question.

#8 | POSTED BY QCP

Who is the victim?

#21 | Posted by truthhurts at 2024-05-22 02:10 PM | Reply

I'm not surprised this conversation died out, but I'd be happy to see legislation about making deepfakes of anyone. This doesn't only affect children, and that's just one reason why what the government is doing here is the wrong approach.

I don't think anyone is advocating it's okay to make deepfake porn of someone the moment they turn 18.
But that's what prosecuting this will establish as a legal landscape.
And that's why it's stupid.
And that's worth discussing.

This butts up against Scarlett Johansson's voice being used by Sam Altman without her permission. That's a deepfake. That needs to be criminal.

To get that we need laws. Not stupid kiddie porn prosecutions that will have zero impact on 95% of the human population. And we're talking about a product transmitted over the Internet anyway, so actual interdiction will be done by the major content aggregators, just like it's done now.

#22 | Posted by snoofy at 2024-05-22 08:04 PM | Reply

Google "Elizabeth from BioShock NSFW" and tell us if Google is distributing child pornography.

#10 | Posted by snoofy at 2024-05-22 01:32 PM | Reply | Flag:

Not Googling that. But just the fact that right off the cuff you threw that out there, is concerning in itself.

#23 | Posted by lfthndthrds at 2024-05-22 11:54 PM | Reply

As much fun as it is to see Snoofy pulled under by their own tactics...

It's absurd to argue that a completely, 100% AI-created image of a child in a sex act should be treated the same as an actual image taken of a child sex act. One is basically a really good drawing and the other actually abuses a child.

Do you feel that the illustrator who drew the images, for those books being argued over for being in schools, depicting one boy giving the other a bj should be prosecuted for child pornography? The only difference is in the quality of the artwork produced.

If it were good enough to be indistinguishable from the real thing and flooded the market, and kept kids from being used to produce this... would that be so bad?

#24 | Posted by kwrx25 at 2024-05-23 12:49 PM | Reply

Shut the ---- up you fascist ----.

#25 | Posted by LegallyYourDead at 2024-05-23 08:43 PM | Reply

And Legally showing his deep thinking abilities again, I see.

#26 | Posted by kwrx25 at 2024-05-23 09:47 PM | Reply

If no children were harmed making it it's THOUGHT CRIME.

Too Stupid. One's brain is free to wander.

Snoofy is absolutely 100% correct on this.

No thought crime. The mind at least is free.

If someone pervs on kids and gets his fix digitally, it's way better than him acting out on kids.

Duh?

It's not even A real crime, it's just creepy.

That shouldn't be illegal in a free society.

#27 | Posted by Effeteposer at 2024-05-23 11:47 PM | Reply

It's ridiculous to think that some dude watching AI kid stuff in his basement is going to want to go out and rape some real ones. What's really going to happen is he's going to pop his load and then fall asleep in his chair.

#28 | Posted by TFDNihilist at 2024-05-24 08:40 AM | Reply

Comments are closed for this entry.

Copyright 2024 World Readable