@lamp I'm noticing a strange pattern where some people are thinking about pedophilia all the time out of the blue and constantly expressing their hate for it.
I guess some of the cases could be a form of deflection, where someone is a pedophile and hates themselves for it, thus goes online and proclaims their hate by writing about their desires to brutally murder pedophiles with a woodchipper (there are few other reasons one would put in the effort of writing such things, after all).
I don't care if someone commits the wrongthink of being attracted to prepubescent children - I care about past actions and planned actions, while also knowing that terrible acts against children are mostly committed by sickos who aren't in fact pedophiles.
@divVerent@Suiseiseki@lamp > And if someone "uses" that... well that is still sick, but know what? I don't care.
i think that has largely been japan's opinion. they have a whole industry of drawn-only hentai for that and while the UN has been pitching a fit the country still has the lowest rate of actual child touching in the first world. (i've heard some musing that it could be distorted by withheld reporting, ymmv)
@Suiseiseki@lamp To be fair, in part the words are simply confused by people who do not know what they mean.
No one wants a child molester. But not every pedophile is a child molester, and - indeed as you say - vice versa.
And if someone distributes real CSAI, they are indeed increasing the damage that has already happened to the children depicted. That does need to be punished.
But we live in 2025. There are not just hand drawn pictures, you can even generate that sick stuff with AI - even using models that have not been trained with real CSAI, but work in a "transfer" like approach. Where nobody is actually being harmed except for the viewer, I guess.
And if someone "uses" that... well that is still sick, but know what? I don't care.
@divVerent@lamp Many people know what the words mean, but use them wrong.
Clearly, no-one wants a predator, but 90% of predators are not pedophiles.
Procedural generation models cannot generate something from nothing - the output is produced as a result of the combination of the inputs, so there is a slim possibility that an identifiable real person is depicted (like a "deepfake").
There is also another potential harm from such models - those images can waste the limited investigative resources available, as investigators can waste time looking for a trafficked minor who doesn't exist, redirecting resources away from finding and rescuing the many trafficked minors who do.
People reporting drawings already waste a lot of investigator time (the FBI specifically notes not to report drawings), but at least a drawing will only waste a few seconds of an investigator's time at most.
@divVerent@Suiseiseki@lamp i think it's all just a deflection from Epstein et al. where you have british royalty being besties with actual child sex traffickers and then getting their records protected by world governments
it's a trick that happens a lot but i forget the name of it. basically the system distracting you from what it is really doing by pointing at a facsimile of it. like if people collectively know they should kill their governments, you get pointed towards movies about it while at the same time public participation continues to be shuttered.
@divVerent@Suiseiseki@lamp if you look into the books on human trafficking, basically nobody cares when children and adults go missing into sex dungeons. there are a couple NGOs and agencies who are tasked with putting a stop to it. half of those are part of the problem. the other half get basically no funding and are administratively kneecapped.
i have the unshakeable notion that CSAM detectors have more to do with getting a consent decree to have AIs spy and snitch on people. since you can't exactly see what that black box is doing, there is no way to prove when they, say, just slip a person-of-interest dataset into the filters and turn it into a universal face scanner a la batman. but it will be to "protect the kids," despite the governments basically cutting the budget of anyone who actually does that.
@icedquinn@divVerent@lamp The child abuse rate in Japan is distorted by withheld reporting, but if I remember correctly, even if you apply a multiplier to account for underreporting, Japan is less bad than England (now where are those statistics).
@Suiseiseki@lamp What I meant is, such models can also be trained just from "harmless" pictures of minors, and "adult" porn. If one really needs that kinda stuff. In that case no real person has been harmed.
If a model has been trained using real CSAI, then yes, the entire model should of course still be illegal to distribute, with massive consequences, as doing so perpetuates harm to the actual children depicted in the training data. As for pictures generated by said models, it is more complicated, as it ultimately depends on whether or not a real person can be recognized (and yes, this applies even where someone thought they "just used AI" and had no intention of harming a real child, but the model was massively overfit and always generated the same child).
But yeah, wasting investigators' resources is also an issue, but should not be a reason to ban something. Get gud at investigating, I guess.
@lamp@divVerent Achieving the highest quality of 漫画-style drawings can do great harm to the artist's wrists and fingers, and also to the rest of their body (as they stay inside for hours or days or weeks in dim light working on such drawings instead of going outside and exercising).
Mentally it seems that few male artists can handle such a long drawing process, but female artists seem to be able to take it better (there are many more female artists than male artists).
Copyright and/or trademarks are only an issue if a derivative work of an existing character is drawn and not a (definitely) original character.
@divVerent@lamp No artwork is truly original. Viewing a work doesn't consume it, and works tend to be unique and cannot be readily substituted for anything else, unless everything is slop (slop just fills whatever with content(s)) - thus "consuming" and "OC" are totally incorrect as terms.
@Suiseiseki@lamp And yes, in the case of OC the copyright/trademark issue is of course taken care of (except that the authors themselves may get their copyright infringed by others).
@lamp@divVerent >This harm is real
I don't see why delivering traffic via the internet is harmful, as that's literally what it's for.
It just increases costs, as there are rent seekers on the internet's fibres who demand a payment for every single byte that is transmitted over the fibres (in either direction).
youtube does not cache on ISPs - google just runs their own massive networks and datacentres and is happy to let any ISP peer with them - the result is the spying data going directly to google's datacentres and the videos coming directly from google's datacentres (whether on premises, or from another google datacentre), bypassing rent seekers.
As this shows, the harm depends a lot on the person. But anyway, whoever consumes it consents to it, so I see no reason for a government-side ban beyond age restrictions that usually are already in place (though care has to be taken not to enforce age restrictions in a way that causes much bigger harm - because the last thing we need is a government database tracking who watches which porn).
In addition, it comprises a major part of Internet traffic, also because unlike e.g. Netflix and YouTube, it is not cached on your ISP's premises and thus actually goes through the backbone and peering. This harm is real, but of course, it is one the ISPs could easily fix if they wanted to.