I've used GPT for D&D-type stuff. It's really good if you don't have an existing expectation of the world you're in, since everything is supposed to be made up anyway. Within the context window LLMs are pretty internally consistent, so within a single chat session it'll be surprisingly coherent and human-like. But inside a full game world like Skyrim, each hallucination could mean wasted time hunting for a magic scroll the LLM told you about that never existed. If you can think of a style of game where hallucinations aren't a problem, or a way to put the LLM on rails so that it doesn't hallucinate too much, then I think players will have a good time.
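
One way I could imagine "putting it on rails" is grounding the prompt in the engine's actual item list and rejecting any reply that references things the game doesn't know about. Here's a rough, purely hypothetical sketch of that idea in Python; everything in it (WORLD_ITEMS, generate_dm_response, the JSON shape) is made up for illustration, not any real game or LLM API:

```python
import json

# Ground truth: the items the game engine actually knows about (illustrative).
WORLD_ITEMS = {"iron sword", "healing potion", "dragonstone tablet"}

def generate_dm_response(prompt: str) -> str:
    """Placeholder for the real LLM call; returns canned JSON for the sketch."""
    return json.dumps({
        "narration": "On the altar you find a magic scroll beside a healing potion.",
        "items_mentioned": ["magic scroll", "healing potion"],
    })

def narrate(scene: str, retries: int = 2) -> str:
    """Ask for structured narration and reject output that invents items."""
    prompt = (
        f"Narrate the scene: {scene}\n"
        "Respond as JSON with 'narration' and 'items_mentioned'.\n"
        f"Only reference items from this list: {sorted(WORLD_ITEMS)}"
    )
    for _ in range(retries + 1):
        reply = json.loads(generate_dm_response(prompt))
        invented = set(reply["items_mentioned"]) - WORLD_ITEMS
        if not invented:
            return reply["narration"]
        # Feed the violation back so the next attempt stays on rails.
        prompt += f"\nDo not mention: {sorted(invented)}"
    # Safe fallback: never surface a hallucinated quest item to the player.
    return "The chamber is quiet; nothing of note catches your eye."

if __name__ == "__main__":
    print(narrate("The player searches the ruined chapel."))
```

The point isn't this exact scheme, just that the game, not the LLM, stays the source of truth for what exists, so a hallucinated scroll never becomes a goose chase.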