> CSAM does not have a universal definition.

Strange that there was no disagreement before "AI", right? Yet now we have a clutch of new "definitions", all of which dilute and weaken the meaning.

> In Sweden for instance, CSAM is any image of an underage subject (real or realistic digital) designed to evoke a sexual response.

I found no corroboration for this on the web. Quite the contrary, in fact:

"Sweden does not have a legislative definition of child sexual abuse material (CSAM)"

https://rm.coe.int/factsheet-sweden-the-protection-of-childr...

> If you take a picture of a 14 year old girl (age of consent is 15) and use Grok to give her bikini, or make her topless, then you are most definately producing and possessing CSAM.

> No abuse of a real minor is needed.

Even the Google "AI" knows better than that. CSAM "is considered a record of a crime, emphasizing that its existence represents the abuse of a child."

Putting a bikini on a photo of a child may be a distasteful abuse of a photo, but it is not abuse of a child under any current law.
