As someone who regularly works with AI and LLMs, I see hallucination as a big issue. In fact, one of the interview questions that gets asked regularly is "How do you avoid hallucination?". This content definitely looks like hallucination. Even if there were a WN, it probably wouldn't have reached this far, as you said. I feel sad for people who believe this. Raoul would probably not leave Theodora for long, and there's also no mention of any 'Rituals of Preparation' in this chapter.
There's almost no consistent mention of ability or spell names anywhere in the entire series. It doesn't fit the style of the series to have some "ritual of preparation" or any of the other horseshit.
But yeah, the reader should have known even without that. The timing of the WN doesn't allow for it. They didn't question any of the prefacing information, let alone the made-up story. That's the insidious thing about offloading your thinking.
I also work with models, and I also vet people who come in for interviews. But my question in this vein is rather different from "how to avoid hallucination". It's more like "if I have AI, why do I need you?" It's a bit of a trick question, because some people only answer by reeling off how they can "add" to the AI output or supervise it.
Sharper candidates realize AI will keep improving and require less supervision. I don't personally believe models will ever stop hallucinating; it's an inherent problem with LLMs. But that's beside the point. Let's say they do stop. Then what do I need employees for whose baseline output came from an AI anyway? I have access to the same AI.
It's a more fundamental issue, one that comes up before I even have to ask how they would vet and fix the output.