I don't like the term CSAM because I've seen it lead to such tortured and misleading terms as "CG-CSAM" (computer-generated child sexual abuse material), suggesting that an AI model generating images of naked children is committing sexual abuse. The already-existing concept of "simulated child pornography" actually describes it better.
Do you have sources to back up the claim that "it encourages and is often found coincident with real CSAM"? You can't just assert it's common sense that an AI-generated image of a child encourages real child abuse. Generative AI hasn't even been around long enough to be part of common knowledge.
Again: it's you who has projected the specific "AI" meaning onto "CG".
As I say, this term appears to have been used broadly in the past to include the kind of 3D animations that appear in conventional porn adverts, as well as face-swaps and other Photoshop-type edits.
All I can say is that prosecutions in the UK, for example, have often mentioned such material alongside conventionally shared material.
I don't think I've read about any prosecution where fake material was the only material justifying it. But I could have missed that.
I have absolutely no interest in getting into the rest of the argument, which is tedious and IMO kind of obvious on many grounds.