Is anyone asking for that though? To make it illegal to have regular pictures of children in these datasets?
I was responding to this part of your comment which directly refers to legality
No, but it is a reason why generating csam should be illegal. You're using a model trained on pictures of real kids
At that point it's still using photos of children to generate csam, even if you could somehow guarantee the model itself is 100% free of csam
If you don’t understand that then I’m done here because you either don’t understand what “ai” does on a fundamental level or you don’t understand how big the difference is between adult and child bodies.
This is a gross conversation to be having about something that is so wrong to do on so many levels.
You can’t make an ethnostate without genocide, so it is wrong and pointless to talk about
You can’t make ai csam without harming a child, so it is wrong and pointless to talk about
Just like how you can’t generate a child without pictures of children to base it on, you can’t generate them naked without pictures of their bodies. There is a reason pedos are attracted to those bodies and not to women with no curves or small men.
I work with children, I see them every day. The difference is so massive that an ai would not be able to approximate it with just photos of adults. Ai doesn’t “know” anything, it just has photos that it uses to approximate what is being asked based on its data. Even if you kept describing in more detail what those bodies looked like, it wouldn’t be able to create it without anything to base it on. It’d be like creating a Van Gogh style picture with no Van Gogh training data: no matter how much you try to describe the details of his style, you’ll never get the ai to make something like it without the training data.
You can keep disagreeing, keep saying “but with more data,” but ai can’t make anything original; that is a fundamental misunderstanding of its abilities. If it doesn’t have the data, it can’t accurately do it.
Now think of the photos that don’t have any matching hashes. Social media has a ton of csam and as long as they scrape from Facebook/insta/twitter or from porn sites with no verification system they will continue to have csam in their training data.
I don’t see a reason to discuss whether it’s possible to do something if the thing being done is morally wrong. If you disagree, then let’s talk about making a white ethnostate, or whether we can do another Holocaust, since morality doesn’t matter when discussing hypotheticals
You can’t generate csam without photos of children to make up the actual child part of the picture. It doesn’t matter if you actually use csam, you’re still using photos of children to make pornography. Unless you think ai could create a Van Gogh style picture without any Van Gogh training data (and if you do, then you don’t know enough about ai generated photos to talk about them with any authority)
It’s obviously accidental, but that doesn’t change that it happened, and it is something that will be near impossible to avoid as long as they continue to scrape data the way they do for their models. They would need humans to filter it out, like they already do for most LLMs.
The bodies of children are not just small versions of adult bodies. There are meaningful differences that an ai wouldn’t be able to just guess. Also, do you not see any problem in using photos of real children to generate csam? Imagine someone used a picture of your child/niece/nephew to generate porn. Does that not feel wrong to you? It’s still using real photos of real children either way, even if it’s abstracted through training data.
Except you can’t know that. CSAM has been found in training data already and as long as they pull from social media they will continue to be trained with more.
I'm just annoyed to see everyone saying so definitively that there isn’t any csam in training data. I’m a victim of CSA and can’t imagine how I would feel if photos of me were used to help get people off like that.
Csam is in the training data. From a few months ago
Has your model seen humans in a profile view? Has it seen armor? Has it seen Van Gogh style paintings? If yes then it can create a combo of those things.
For CSAM it needs to know what porn looks like, what a child looks like, and what a naked pubescent body looks like to create it. It didn’t make your Van Gogh painting from nothing; it had an idea of what those things were.
Using csam in training data causes harm
If you think porn is the reason for declining birth rates and higher rates of loneliness I have a bridge to sell you
Except when the data is trained on csam
She was already a shit person. Don’t forget she very publicly transitioned to distract from her murdering someone.