Again, the compiler is free to optimize that into code that runs in data-dependent time. It's relatively unlikely to do so on current machines (the time saved is unlikely to be worth the risk of a pipeline stall), but it's allowed to.
For instance, I'm pretty sure it's legal for the compiler to insert a strcmp-style fast path at the start (`if input == token: return true`, or rather the strcmp equivalent).
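Roughly, such a transformation might look like this (a hand-written sketch of what the compiler would be allowed to produce, not anything it actually emits; `leaky_equal` and the fixed-length contract are illustrative):

```c
#include <string.h>

/* The accumulator loop alone would be constant-time; the memcmp fast
   path in front of it exits at the first mismatching byte, so the
   overall running time once again depends on the data compared.
   The result is identical either way, which is why the transformation
   would be legal. */
int leaky_equal(const unsigned char *a, const unsigned char *b, size_t len)
{
    if (memcmp(a, b, len) == 0)   /* early exit: data-dependent time */
        return 1;

    unsigned char acc = 0;
    for (size_t i = 0; i < len; i++)
        acc |= a[i] ^ b[i];
    return acc == 0;
}
```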
Something like this should work:
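A minimal sketch of the idea, assuming both buffers have the same known length (`ct_equal` and the length handling are illustrative names, not from any particular library):

```c
#include <stddef.h>

#define CMP_TYPE unsigned char   /* see the note below about widening this */

/* Returns 1 iff the two len-byte buffers are equal. Differences are
   OR-ed into an accumulator, so every element is examined no matter
   where (or whether) a mismatch occurs. */
int ct_equal(const void *a, const void *b, size_t len)
{
    const CMP_TYPE *pa = a;
    const CMP_TYPE *pb = b;
    CMP_TYPE acc = 0;

    for (size_t i = 0; i < len / sizeof(CMP_TYPE); i++)
        acc |= pa[i] ^ pb[i];

    return acc == 0;
}
```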
The reason for the CMP_TYPE #define is so that you can optimise the comparison by replacing char with uint32_t or uint64_t, comparing a word at a time instead of a byte at a time.
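Concretely, the widened variant is the same `ct_equal` recompiled with the define changed; the caveat (an assumption on top of the sketch above) is that the length must then be a multiple of the word size and the buffers must be suitably aligned. A hypothetical caller might look like:

```c
#include <stdint.h>
#include <string.h>

#define CMP_TYPE uint64_t   /* instead of unsigned char */

/* Pad/copy the tokens into uint64_t storage so the 8-byte loads inside
   ct_equal (from the sketch above) are aligned and well-defined. */
int check_token(const unsigned char input[16], const unsigned char secret[16])
{
    uint64_t a[2], b[2];
    memcpy(a, input, sizeof a);
    memcpy(b, secret, sizeof b);
    return ct_equal(a, b, sizeof a);
}
```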