Unfortunately, it's not an uncommon experience when reading academic papers in some fields to find citations that, when checked, don't actually support the cited claim, or sometimes don't even contain it. The papers exist, but beyond that they might as well be "hallucinations".
Humans can bullshit when they don't want to put in the effort; these LLMs do it all the time. That is the difference. We need to build the part of the process that humans perform when they do the deliberate work of properly finding and checking those sources. That kind of thinking isn't captured in the text, so LLMs don't learn it.