He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
It’s worth mentioning that in this instance the guy did send porn to a minor. This isn’t exactly a cut-and-dried “guy used Stable Diffusion wrong” case. He was distributing it and grooming a kid.
The major concern to me is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues.
For example, websites like novelai make a business out of providing pornographic, anime-style image generation. The models they use are deliberately tuned to produce abstract, “artistic” styles, but they can still generate semi-realistic images.
Now, let’s say a criminal group uses novelai to produce CSAM of real people via the inpainting tools. Let’s say the FBI casts a wide net and begins surveillance of novelai’s userbase.
Is every person who goes on there and types “Loli” or “Anya from spy x family, realistic, NSFW” (that’s an underage character) going to get a letter in the mail from the FBI? I feel like it’s within the realm of possibility. What about “teen girls gone wild, NSFW”? Or “young man, no facial or body hair, naked, NSFW”?
This is NOT a good scenario, imo. The systems used to produce harmful images are the same systems used to produce benign or borderline images. It’s a dangerous mix, and throws the whole enterprise into question.
Is every person who goes on there and types “Loli” or “Anya from spy x family, realistic, NSFW” (that’s an underage character) going to get a letter in the mail from the FBI?
I’ll throw that baby out with the bathwater, to be honest.
Simulated crimes aren’t crimes. Would you arrest every couple that finds healthy ways to simulate rape fetishes? Would you arrest every person who watches The Fast and the Furious or The Godfather?
If no one is being hurt, if no real CSAM is being fed into the model, and if no pornographic images are being sent to minors, it shouldn’t be a crime. Just because it makes you uncomfortable doesn’t make it immoral.
For now, if you read the article, it states that he shared the pictures to form like-minded groups where they got emboldened, could support each other, and could legitimize/normalize their perverted thoughts. How about no, thanks.
What do you mean, focus my energy? How much energy do you think I spend discussing perverts? And what should I spend my time discussing instead, contact sports? It sounds like you are deflecting.
Pedophiles get turned on by abusing minors; they are mentally sick. It’s not like it’s a normal sexual desire, and they will never stop at watching “victimless” images. Fuck pedophiles, they don’t deserve shit, and I hope they eat shit the rest of their lives.
they will never stop at watching “victimless” images.
How is that different from any other dangerous fetish? Should we be arresting adult couples that do Age Play? All the BDSM communities? Do we even want to bring up the Vore art communities? Victimless is victimless.
If no real child is involved in any way, who is hurt?
Maybe you should focus your energy on normalized things that actually affect kids, like banning full-contact sports that cause CTE.