I'm working on a game written in m68k Assembly. In cases where I'm trying to make every CPU cycle count, these kinds of novel bitwise operations really catch my interest. TurnOffLowestBit seems useful for an effect I've had in mind for a while: A graphic gets "erased" one pixel at a time from right to left, but with some random variance for each horizontal line. Reliably turning off the lowest bit might be a good solution. Maybe throwing in TurnOffLowestContiguousBits would help to introduce some variance in a visually interesting way.
Or maybe it wouldn't. But in order for me to consider these operations as potential solutions to a problem, I have to know they exist. Even if they beckon for me to invent a problem they can solve, I'm totally open to that kind of creative inspiration.
"Check out this neat bitwise thing you can do in only 12 CPU cycles!" That's definitely the jam of Assembly and/or demoscene coders, haha.