Not an expert, but my sense is that it's at least in part due to cheap, commodity consumer cameras becoming widespread between the planning of Curiosity and the planning of Perseverance. In earlier missions, cameras were usually big and bulky and designed specifically for their respective research functions. If you're designing something like that from scratch, you design it to maximize research utility first; sending home cool videos is a secondary consideration. So I'm not sure many of those cameras could do video (it's probably not that useful for studying rocks that don't move), and a lot of them didn't even capture a spectral range exactly matching the visible range (I think some of them included parts of the near-IR range, etc.). That's why a lot of the images from earlier missions that were released to the public had color that was recreated in post-processing.
I think the new rovers still have these specialty cameras, but now that there are decently good mass-market cameras from the cellphone/consumer-electronics industry that cost $5 apiece and weigh a couple of grams, it seems like there's no reason not to throw a few of those onboard as well.
I would speculate, also, that video compression might be part of the story. Processors on these vehicles tend to be specialized radiation-hardened chips that are modified versions of several-generations-old general-purpose processors. I think Curiosity's was a rad-hardened 200 MHz PowerPC chip (the RAD750), for example. I would bet that those chips just weren't up to the task of compressing high-quality video enough to make it practical to send, given the bandwidth constraints of transmitting from Mars to Earth.
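To put rough numbers on that, here's a back-of-the-envelope sketch in Python; the bitrates and frame rates are illustrative assumptions on my part, not mission specs:

```python
# Back-of-the-envelope; these numbers are illustrative assumptions, not mission specs.
raw_bits_per_sec = 1920 * 1080 * 24 * 24        # raw 1080p, 24-bit color, 24 fps (~1.2 Gbit/s)
relay_bits_per_sec = 2_000_000                  # optimistic orbiter-relay downlink, ~2 Mbit/s
clip_seconds = 60

raw_clip_bits = raw_bits_per_sec * clip_seconds
print(f"1 min raw 1080p ≈ {raw_clip_bits / 8e9:.1f} GB, "
      f"~{raw_clip_bits / relay_bits_per_sec / 3600:.0f} h to downlink at 2 Mbit/s")

# Something like H.264 at ~5 Mbit/s shrinks that clip roughly 240x, but encoding it is
# exactly the kind of work a 200 MHz rad-hardened CPU would struggle to do onboard.
compressed_bits = 5_000_000 * clip_seconds
print(f"Compressed ≈ {compressed_bits / 8e6:.0f} MB, "
      f"~{compressed_bits / relay_bits_per_sec / 60:.1f} min to downlink")
```

So even with a generous relay link, you really want hundreds-to-one compression before sending anything, and that's exactly where an old, slow, rad-hardened CPU hurts.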
It's not just cameras. What good is an 8K video taken on a rover if it can only upload at 200 bits/sec?
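Just to illustrate the point with that figure (the 8K bitrates below are my own ballpark assumptions, not rover specs):

```python
# Rough illustration of the point above; bitrates are assumptions, not rover specs.
link_bps = 200                               # the 200 bits/sec figure from the comment
raw_8k_bps = 7680 * 4320 * 24 * 24           # raw 8K, 24-bit color, 24 fps (~19 Gbit/s)
compressed_8k_bps = 50_000_000               # ballpark 8K streaming bitrate (~50 Mbit/s)
clip_seconds = 10

raw_years = raw_8k_bps * clip_seconds / link_bps / 86400 / 365
compressed_days = compressed_8k_bps * clip_seconds / link_bps / 86400
print(f"10 s of raw 8K at 200 b/s: roughly {raw_years:.0f} years")
print(f"10 s of compressed 8K at 200 b/s: roughly {compressed_days:.0f} days")
```

At that kind of link rate, even a heavily compressed ten-second clip takes on the order of weeks, so the downlink matters as much as the camera.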
The entire stack has improved. I suspect everything from radiation hardening, semiconductor manufacturing, CCD reliability and cost, image processors, and colorimetry to radio transmission, relay tech, and orbiters has played a part, not to mention the importance of great PR.
One thing to consider is that NASA missions are basically always using older technology, because by the time the mission gets approved, designed, and built, consumer/business tech has already moved on. Space-worthiness adds more 'age' to the technology as well.
I remember when New Horizons flew by Pluto in 2015 taking pictures with a 1 MP camera. At that time I had an 8 MP camera on my phone, but remembered that in 2006 when New Horizons launched, the iPhone didn't even exist.
The current scale of video availability is very recent. Dirt-cheap tiny HD cameras, the storage to hold their output, and fast, reliable networking to match are only a few years old. Put that in the context of a complex system expected to work perfectly under extreme conditions millions of miles away, with early designs starting years ago, and that's why you're only seeing this now.
Commercial cameras have gotten so good over the years that they could put cameras everywhere. They added some local storage to retain images for post-processing, and they compress the videos down to much smaller files before transmitting. So off-the-shelf technology keeps getting better.
It didn't have priority until now. Everything costs time and money and adds more complexity, more parts that can cause a failure. While videos are nice, they don't have as much scientific value as another scientific instrument that could have been on board instead.
> While videos are nice, they don't have as much scientific value as another scientific instrument that could have been on board instead.
While they don't have much scientific value, they have enormous value in a related field: marketing.
I mean, if you want more money for your hot new space rover mission, nothing sells it better than high-quality, high-resolution video of it landing on the surface of another planet.
Did memory write speeds increase? Are the CCD sensors more sensitive? Is it easier to send data back to Earth?
I'm sure there must be some technological reason this wasn't done before because it's simply stunning...