Is there an actual case for outlawing this that isn't based on moral panic? Wouldn't you actually want people to generate those images with AI so they are less incentivized to pay for the real stuff?
As long as you don't need actual CSAM material in the training data and the generated images are different enough from a real person (both of which seem to be very possible technology-wise), that seems to be a good thing.
Or is there any indication that availability of CSAM material actually increases the likelihood that people act on it later?
We don't have (and I doubt we will ever have) tools for distinguishing between real and AI-generated images with guaranteed 100% accuracy (and 0% false-negative and false-positive rates).
Given that, I don't see how you can allow AI-generated CSAM without effectively making "real" CSAM images unprosecutable.
So you think that currently, until this law is implemented, CSAM is effectively unprosecutable because people can just claim they generated the image with AI?
I think there is a >0% probability that an individual case can be unprosecutable (or at least have the image evidence be much less useful) if the person in question actively starts generating CSAM using AI for the purpose of casting doubt on the legitimacy of any individual real image that the prosecutor wants to use as evidence.
The standard is beyond reasonable doubt, and I think that's going to become an increasingly difficult bar to clear if the AI generated versions (either made for their own case or as decoys) are allowed to remain legal.
You could have government-signed models + programs that are approved for generating CP (not CSAM). It's legal if the signature checks out. Something like https://contentauthenticity.org/ but for verifying that something is definitely made by AI.
(You need to sign both the models and the programs to make sure there's no img2img.)
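The signature-check idea above can be sketched in a few lines. This is a minimal illustration only, assuming a single trusted signing authority; the key, function names, and use of an HMAC (as a stdlib stand-in for a real asymmetric scheme such as Ed25519, which an actual deployment would require) are all hypothetical:

```python
import hmac
import hashlib

# Hypothetical authority key. A real system would use an asymmetric key
# pair so that anyone can verify without being able to sign; HMAC is
# used here only because it ships with the standard library.
SIGNING_KEY = b"example-authority-key"

def sign_artifact(artifact: bytes) -> str:
    """Issue a provenance tag for an approved model, program, or output."""
    return hmac.new(SIGNING_KEY, artifact, hashlib.sha256).hexdigest()

def verify_artifact(artifact: bytes, tag: str) -> bool:
    """Check that the tag was issued by the authority for these exact bytes."""
    expected = sign_artifact(artifact)
    return hmac.compare_digest(expected, tag)

image = b"...generated image bytes..."
tag = sign_artifact(image)
assert verify_artifact(image, tag)             # signature checks out
assert not verify_artifact(image + b"x", tag)  # any tampering fails
```

Note that signing outputs alone doesn't address the img2img concern raised above; that is why the proposal signs the models and programs too, so the provenance chain covers how the image was produced, not just the final bytes.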
The level of tech solutionist brain rot you need to reach to propose state sponsored child porn generator... This forum is a parody of itself
You don’t even need to give them a model, just generate some images and publish them. If you find those images, it’s fine, if you find anything else, arrest them.
We can't agree on weed or safe injection sites, you think we'll have government approved CP generation?
I completely agree. Most people were marrying 13- and 14-year-olds less than 100 years ago. Yes, we want people to only become sexually attractive after they turn 18, but that's not reality.
That being said I don’t know if the availability of CSAM would increase or decrease real world abuse.
We should really up the game and completely ban all AI generated images depicting people, because we have no good way of knowing whether an image is AI generated or real, and images depicting people have terrible consequences in society when weaponized.
We really need it possible to push laws faster, 2026 is going to be an insane year for multimodal models and laws are simply not keeping up.
I don’t understand why it needs to be banned. If it is artificial, whether it is a story someone wrote, or an animation someone drew, or a photo-realistic AI-generated thing, it’s just not real. There is no harm committed to a victim. It feels like this is a moralistic crusade, adjacent to age verification laws that are just backdoor porn bans (freely admitted by the conservatives who support such laws).
The bigger issue is that these types of bans feel a lot more like banning speech than banning a real crime, and the precedent it sets can end up being used in far-reaching ways. That’s how it always is.
I can't agree with the photorealistic AI images because they're indistinguishable from an actual photograph.
Everything else I do agree with you on, though.
The problem is, prosecutors are just looking for easier ways to jail people for things they could do, based on what they personally believe. (E.g. "manga causes child abuse.")
The United States already considers artwork that resembles a real minor to be outside the First Amendment and hence illegal. Even like, cartoon artwork. If you're fapping to naked Bart Simpson that's one thing, but if it's a drawing of a real child you are using that child's image as a sexual object, that can be profoundly traumatizing, and it is seen to cross the threshold of "actually abusing a child" that justifies not applying the First Amendment. People's likenesses in general are subject to strong protection in the United States and you can face strong penalties for misusing them, even if porn is not involved; consider White v. Samsung.
> but if it's a drawing of a real child
How would you even prove that, though?
Datasets such as LAION-5B have been found to contain thousands of CSAM images. So real victims are involved indirectly.
> If it is artificial, whether it is a story someone wrote
Already illegal in Australia: https://www.independent.co.uk/news/world/australasia/sydney-... (don't hold your breath on it making any "banned books" lists)
People laughed at Indians for believing photos stole one's soul, and now we have legislated even stupider behavior, without the excuse of ignorance.
Australia also believes that porn featuring women with small breasts causes people to become child abusers...