Is there an actual case for outlawing this that isn't based on moral panic? Wouldn't you actually want people to generate those images with AI so they are less incentivized to pay for the real stuff?

As long as you don't need actual CSAM material in the training data and the generated images are different enough from a real person (both of which seem to be very possible technology-wise), that seems to be a good thing.

Or is there any indication that availability of CSAM material actually increases the likelihood that people act on it later?

We don't have (and I doubt we ever will have) tools that distinguish real from AI-generated images with guaranteed 100% accuracy (zero false positives and zero false negatives).

Given that, I don't see how you can allow AI-generated CSAM without effectively making "real" CSAM images unprosecutable.

So you think that currently, until this law is implemented, CSAM is effectively unprosecutable because people can just claim they generated the images with AI?
I think there is a >0% probability that an individual case becomes unprosecutable (or at least that the image evidence becomes much less useful) if the person in question actively generates CSAM with AI for the purpose of casting doubt on the legitimacy of any real image the prosecutor wants to use as evidence.

The standard is "beyond reasonable doubt", and I think that's going to become an increasingly difficult bar to clear if AI-generated versions (whether made for the defendant's own case or as decoys) remain legal.

Well they could ask the child in the photo...
You could have government-signed models + programs that are approved for generating CP (not CSAM). It's legal if the signature checks out. Something like https://contentauthenticity.org/ but for verifying that something is definitely made by AI.

(You need to sign both the models and the programs to make sure there's no img2img.)
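The attestation idea above can be sketched roughly as follows. This is a minimal illustration, not the actual contentauthenticity.org (C2PA) protocol: it uses a symmetric HMAC tag as a stand-in for a real asymmetric signature, and the `attest`/`verify` helpers and `AUTHORITY_KEY` are hypothetical names invented for the example. A real deployment would use an asymmetric scheme (e.g. Ed25519) so that verifiers never hold signing material.

```python
import hashlib
import hmac

# Hypothetical signing key held by the approving authority.
# A real scheme would use a public/private key pair instead.
AUTHORITY_KEY = b"demo-signing-key"

def attest(image_bytes: bytes, model_id: str) -> bytes:
    """Bind an image to the approved model that produced it."""
    payload = hashlib.sha256(image_bytes).digest() + model_id.encode()
    return hmac.new(AUTHORITY_KEY, payload, hashlib.sha256).digest()

def verify(image_bytes: bytes, model_id: str, tag: bytes) -> bool:
    """Check that the image/model pair carries a valid attestation."""
    expected = attest(image_bytes, model_id)
    return hmac.compare_digest(expected, tag)

image = b"\x89PNG...fake image bytes"
tag = attest(image, "approved-model-v1")
print(verify(image, "approved-model-v1", tag))            # True: untampered
print(verify(image + b"x", "approved-model-v1", tag))     # False: image modified
```

Note that this only proves the image passed through the signing program; it says nothing about the training data, which is why the model itself would also need approval, as the comment suggests.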
