And also it's an AI.
13k images before AI meant a human with Photoshop or a child actually being abused.
13k images after AI is just forgetting to turn off the CSAM auto-generate button.
Stability AI (the company behind Stable Diffusion) has been distancing itself from this. The model that allows for this was leaked from a different company.
Making CSAM is illegal by itself: https://www.thefederalcriminalattorneys.com/possession-of-lolicon
Title is pretty accurate.
Creating the pics is a crime by itself. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
It would be illegal in the United States. Artistic depictions of CSAM are illegal under the PROTECT Act of 2003.
Asked whether more funding will be provided for the anti-paint enforcement divisions: the backlog is so big, we'd rather just wait for somebody to piss off a politician before we focus our resources.
It seems to me to be a lesser charge: a wide net that catches a larger population, so prosecutors can then go fishing for bigger charges to make themselves look good. Or, as I've heard from others, it's used to simplify prosecution. PedoAnon can't argue "it's a deepfake, not a real kid" to the SWAT team.