Exactly, AI in games solves a different problem than "traditional" AI: it's about creating the illusion of intelligence. Not unlike rigid body dynamics (approximations, not a full physics simulation), graphics (plenty of tricks here, e.g. displacement maps instead of actual geometry), or animation (e.g. blending canned animations).
One could argue that "traditional" AI is also about creating the "illusion" of intelligence. That is what the Turing test measures -- whether the illusion is good enough to fool the tester into thinking they're communicating with something intelligent.
The difference is in the degree to which the illusion has to hold up. Game "AI" exists in a limited, artificial world, so it can get away with simple algorithms and no learning at all. General AI needs to operate in the real world (even if constrained to a particular domain), where the rules are too numerous and complex to program in advance; that makes learning an essential part of "traditional" AI.
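To make the "simple algorithms without learning" point concrete, here is a minimal sketch (all names and thresholds are illustrative, not from any actual engine) of the kind of hand-authored finite state machine that passes for AI in many games:

```python
# A guard NPC whose entire "intelligence" is a programmer-chosen
# finite state machine: patrol -> chase -> attack, with fixed thresholds.
# Nothing is learned; the behavior is fully scripted in advance.

from dataclasses import dataclass

@dataclass
class Guard:
    state: str = "patrol"

    def update(self, dist_to_player: float) -> str:
        # All the cleverness lives in these hard-coded distance thresholds.
        if dist_to_player < 2.0:
            self.state = "attack"
        elif dist_to_player < 10.0:
            self.state = "chase"
        else:
            self.state = "patrol"
        return self.state

g = Guard()
print(g.update(15.0))  # patrol
print(g.update(5.0))   # chase
print(g.update(1.0))   # attack
```

In the closed world of the game, those three states plus some pathfinding are often enough to sustain the illusion; in the open-ended real world, no fixed table of rules like this could cover all the cases.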
Thanks for the clarification. I couldn't have said it better.
In the real world, you can't live on illusions; an AI really needs to perform (think Google's self-driving cars). Learning (and all the other useful ingredients of AI) is essential.
Thanks, I edited my comment. I work more on the general AI side (AI in the stricter sense of the term). What I usually see in games is the intelligence of a programmer baked into very clever algorithms. I don't see any learning effect, or any persistence: once you shut down the game, it starts all over. I'd love a game that learns from the player (e.g. how to climb over an obstacle). Can anyone point me to such a game?
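For what it's worth, the "persistence effect" being asked for is not hard in principle. Here is a hypothetical sketch (the file name, action names, and reward logic are all made up for illustration) of an NPC that keeps a simple learned value table across sessions instead of starting over on every launch:

```python
# An NPC "memory" that survives shutdown: a per-action value estimate,
# updated bandit-style during play and saved to disk between sessions.

import json
import os

SAVE_FILE = "npc_memory.json"  # hypothetical save location

def load_memory() -> dict:
    # Next session picks up where the last one left off.
    if os.path.exists(SAVE_FILE):
        with open(SAVE_FILE) as f:
            return json.load(f)
    return {}  # first session: nothing learned yet

def update_memory(memory: dict, action: str, reward: float, lr: float = 0.1) -> None:
    # Running value estimate per action: old + lr * (reward - old).
    old = memory.get(action, 0.0)
    memory[action] = old + lr * (reward - old)

def save_memory(memory: dict) -> None:
    with open(SAVE_FILE, "w") as f:
        json.dump(memory, f)

memory = load_memory()
update_memory(memory, "climb_obstacle", reward=1.0)
save_memory(memory)  # survives shutdown, unlike most game AI state
```

The reason most games don't do this is presumably design, not difficulty: an AI that genuinely adapts is harder to balance and test than a scripted one.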
IIRC, there was a survival horror game that claimed to learn from and adapt to the player in order to provide a scarier experience (playing to the player's phobias and such). Not sure if that qualifies.