> Earlier, I also said that VGA games more frequently featured parallax, and that VGA was easier to program than EGA. This is primarily because in 256-color VGA mode, each pixel always occupies one byte. So all the complexity caused by the need to address individual bits disappears. VGA still has a planar memory layout, but this applies to bytes instead of bits.
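To make the difference concrete, here's a tiny sketch (names are mine, just for illustration) of how you'd compute where a pixel lives in mode 13h versus a 16-color planar mode. In mode 13h it's one multiply and an add; in planar EGA/VGA modes eight pixels share a byte address on each plane, so you also need a bit mask to pick out the pixel before writing:

```c
#include <stdint.h>

/* 256-color VGA mode 13h: one pixel = one byte at y*320 + x. */
uint16_t vga13h_offset(int x, int y) {
    return (uint16_t)(y * 320 + x);
}

/* 16-color planar EGA/VGA (e.g. 640-wide modes): eight pixels share
   one byte address per plane... */
uint16_t ega_offset(int x, int y) {
    return (uint16_t)(y * (640 / 8) + x / 8);
}

/* ...and the pixel's bit within that byte must be selected (the
   leftmost pixel is the most significant bit). */
uint8_t ega_bitmask(int x) {
    return (uint8_t)(0x80 >> (x & 7));
}
```

All the read-modify-write dance with the bit-mask register (and doing it once per plane, or via the set/reset hardware) is the complexity that simply vanishes in mode 13h.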
My dad picked up an old EGA computer from work in the early 90s. I was always frustrated that almost all games required a VGA at the time. Now I understand why... It was too much effort to be backwards compatible with what was increasingly becoming niche hardware.
It wasn't just coding, it was also a lot of design/art resources. Going from 256->16 colors would look terrible with automatic dithering. My recollection is that there were a few Sierra titles made for VGA that also supported EGA and that it looked like someone had gone through each screen by hand to make the low-color ones look acceptable-ish.
EGA art specifically designed for EGA could look beautiful. For example, I love EGA Monkey Island. Most adventures with EGA support had different graphics for EGA vs VGA.
Hmm, maybe my memory is failing me, but VLB went from 0 to 100% on 486 motherboards pretty much overnight. The ISA bus limits (and hence EISA) were well known, and most of the SVGA cards being sold in the early 1990s (like those Cirrus Logic chips) were "accelerated" to compensate for the lack of ISA bandwidth. That is, they had blit engines, hardware cursors, line drawing, etc., and Windows drivers to utilize that functionality. I'm fairly certain, although too lazy to dig up the tech manual to confirm, that the blit functionality included alpha channels.

Given that 486s were themselves considered high end, I frankly never remember seeing one with an 8-bit EGA card; even 386s in the late '80s and early 1990s were usually shipping with 16-bit (S)VGA cards, because the third-party cards were fairly inexpensive. While EGA lived on in lots of games because it had some convenient modes, actual EGA cards were fairly short-lived in that little window before VGA was released, so by the late 1980s they weren't being sold with new machines. It was also fairly common to overclock the 8 MHz ISA bus to 10 MHz on boards that didn't have VLB.

Anyway, my point is that in 1992, low-end machines were coming with Hercules graphics and monochrome monitors. Anyone who could afford a color RGB monitor wasn't pairing it with EGA, because the monitor was 80% of the price (a few hundred dollars) and a 512K SVGA card was well under a hundred.
Anyway, I guess the benchmarks in there were to show how slow plain VGA was, but really by the time frame of Duke Nukem, 486SXs and VLB were radically changing the PC's performance characteristics, even on fairly low-end machines.
PS: I still have the Cirrus technical manuals, because back in the 1990s you could call up these companies, tell them you were a software developer, and they would send you a 400+ page technical manual detailing the registers and operational/functional characteristics of the boards.
> VLB went from 0 to 100% on 486 motherboards pretty much overnight. [...] actual EGA cards were also fairly short lived in that little window before VGA was released so by the late 1980's weren't being sold with new machines
I know the market moved fast back then, but "sales" != "ownership". Sales may have switched to VGA overnight, but if a game wants to maximise its audience, it needs to target what people already own. So unless people buy computers like they buy milk, that would have included EGA cards.
Also taking place in this time frame was the move towards "multimedia PC" standards. At its onset, MPC Level 1 helped consolidate expectations around the VGA spec and Sound Blaster audio; before that, the assumption was that the PC was primarily an office computer and didn't "do" graphics or sound, so you just had to try to support everything, especially if you were doing shareware and wanted high install rates. So around 1991, when that spec was released, you start to see VGA as a requirement, though shareware did stay behind the curve to support "granny PCs".
Doom really changed this. Lots of people ran out and got new systems for the sake of Doom, when before there was an association between high end machines and complicated simulator games. And while there were still casual games that remained focused on the "granny PC" well into the 2000's, the market really redefined itself in a big way around this new wave of enthusiasts playing shooter games.
Yah, PC games have been known to drive hardware sales; Doom, Quake, and then Unreal all drove me to upgrade various components (486, Pentium, and then Voodoo2). I guess it still happens, but the closest I can remember to buying a new piece of HW for a game was a GTX 1080 to play Witcher 3 a few years back.
The CPU/GPU HW vendors could probably boost their sales by buying rights to some of those games and selling them at a deep discount to drive HW upgrades. That's basically what happened to me with Witcher 3: I bought it in some super sale for $10-15, discovered it didn't play very well on my older graphics card, and ended up spending $500 (or something) as a result.
For the most part, I think people mostly ignored the MPC specs though. Outside of maybe The 7th Guest, which drove CD-ROM sales, most people I knew who were interested in gaming had already purchased or scrounged up a Sound Blaster and a 10BASE2 NIC (for LAN parties).
GPU vendors (called partners now?) have been bundling games with cards since the early 2000s (maybe pre-2000, but I'm too lazy to look for examples now). Here is a Palit Radeon X700 packaged with two 2-year-old games: Xpand Rally (legit good game) and Second Sight (never even heard of it before writing this post, but it actually looks good!) http://ixbtlabs.com/articles2/video/palit-2.html
When it comes to GPU manufacturers, AFAIK AMD started first in 2012 with "Never Settle": Far Cry 3, Hitman: Absolution, and Sleeping Dogs. Nvidia followed a year later with "Gear Up", a garbage promotion of free items for the free-to-play Planetside 2, World of Tanks, and Hawken.
When it comes to bang for buck, giving out free games is a fool's game; the real money is in paying brib^^^ organizing cross promotion and developer support for game studios. Nvidia started with "The Way It's Meant To Be Played" program sponsoring Ubisoft.
"Number one: Nvidia Gameworks typically damages the performance on Nvidia hardware as well, which is a bit tragic really. It certainly feels like it’s about reducing the performance, even on high-end graphics cards, so that people have to buy something new.
"That’s the consequence of it, whether it’s intended or not - and I guess I can’t read anyone’s minds so I can’t tell you what their intention is. But the consequence of it is it brings PCs to their knees when it’s unnecessary. And if you look at Crysis 2 in particular, you see that they’re tessellating water that’s not visible to millions of triangles every frame, and they’re tessellating blocks of concrete – essentially large rectangular objects – and generating millions of triangles per frame which are useless."
Why give out free games when you can influence developers to include your closed-source garbage tech (PhysX, running in x87 FPU mode on SSE-capable CPUs to make it a slow path without an Nvidia GPU), dial a useless feature up to 11 (tessellation), or incorporate something that takes a 50% performance hit for marginal visual gain (ray tracing), forcing users into frequent upgrades!
They weren't even the first to do it. Intel pioneered this practice in their golden age of anticompetitive bribe binges, starting in 1998 by sponsoring Ubisoft to print a huge "Designed for Intel MMX" logo on all POD boxes https://www.mobygames.com/images/covers/l/51358-pod-windows-... despite MMX not influencing game speed at all (it was used for one sound effect). Amazingly, someone working in Intel's "developer relations group" at the time is on HN and chimed in https://news.ycombinator.com/item?id=28237085
"I can tell you that Intel gave companies $1 million for "Optimized" games for marketing such."
$1 million for one optional MMX optimized sound effect.
Bundling though doesn't cause people to decide after a couple hours of gameplay that they want to continue playing but with smoother/higher res/etc graphics.
Which is what drove my upgrades: I liked the games in question, and saw how they ran on other people's hardware enough to want that experience for myself.
(although in the case of the 1080, I also had a monitor setup that didn't work well with the graphics card I was using at the time, so it was that as much as Witcher which drove the upgrade).
Instead of bundling, the game companies would be better served by picking some gotta-have-it title, giving the first couple of levels away as a trial, making sure there was a big jump in perf moving to the latest card, and then bundling the license with the card. That way people are hooked and willing to pay for the experience. I generally avoid bundled card deals because the games don't interest me.
Also upgrades were very very common back then - and you’d often sell back your current card to the store when buying the upgrade. So people would be upgrading to EGA cards as those came back in.
Somewhere in there was MCGA too. I still remember very cool stuff on the 8086 Amstrad PC 1640 that had an EGA card. You had to do palette magic and it was hard to do any animation, but I think games like Xenon 2 and Prince of Persia did a good job of it.
IIRC — I worked in a computer lab full of PS/2 Model 25 at the time — MCGA was just CGA with a couple VGA modes (640x480x2 and 320x200x256) bolted on. In particular, it was not compatible with EGA.
I would be very surprised by this. Raster ops, yes. Alpha-channel image compositing operations, no way; that's a whole other level of complexity entirely.
Not sure why you are downvoted, it's a somewhat fair point.
So I looked it up: the 542x series in this article do support "transparency" blt operations, but it might be fair to call them raster ops rather than full blending. They would be sufficient for the parallax scrolling Duke Nukem was implementing here, which was sorta my original point.
OTOH, full 32-bit ARGB support shows up in the Cirrus line of adapters with the next revision, the 543x, mostly for video overlay, although the way it's wired seems to leave a lot of open doors for interesting effects too.
And full blown alpha blending shows up one generation later in the 546x series.
So there is full hardware support by the mid-1990s in fairly low-end HW/PCs at that point. I've said before on this board that alpha blending was going on all through the 1990s, and there are various ways to "cheat" and speed up what is presented in '84 (https://dl.acm.org/doi/10.1145/964965.808606). Pulling my copy of CGPnP off the shelf, it says under compositing "since it is fairly easy to do". I give you a demo from '92 with real-time apparent blended transparency: https://www.youtube.com/watch?v=pLJhtefcoPE
see about 4 mins in. I sorta doubt this is the first case, but was one I vaguely remembered, since I was myself hacking these kinds of things with my schoolmates around that time as we tried to emulate what we saw others doing/etc. And none of us went into computer graphics or gaming oriented parts of the technology fields.
BTW: That demo notes it needs a 386+VGA, so we are talking late 1980's PC hardware.
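For what it's worth, one classic cheap trick for faking a 50/50 blend on hardware of that era (I'm not claiming this is what that particular demo did) is averaging two packed RGB pixels without unpacking the channels. The mask clears each channel's low bit before the shift, so nothing leaks between channels:

```c
#include <stdint.h>

/* 50/50 average of two packed 0x00RRGGBB pixels, one add and a few
   logic ops instead of per-channel unpack/blend/repack:
   avg = (a AND b) + (((a XOR b) AND 0xFEFEFEFE) >> 1).
   (a AND b) keeps the bits the pixels share; the masked-XOR term
   halves the bits where they differ. */
uint32_t blend50(uint32_t a, uint32_t b) {
    return (a & b) + (((a ^ b) & 0xFEFEFEFEu) >> 1);
}
```

In 256-color palettized modes the equivalent trick was a precomputed 256x256 lookup table mapping any two palette indices to the index of their blended color.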
http://www.vgamuseum.info/index.php/cpu/item/130-cirrus-logi... "D3 Selects the memory read data latches to be eight bytes wide, instead of the normal four bytes.
This bit can be used in Write Mode 1, in order to rewrite 8 latched pixels (64 bits) back into display
memory. This bit should be used in X8 addressing mode only."
= Set/Reset and Compare registers extended to full 8 bits, and read/write mode 1 extended to 8 bytes at a time with extra foreground/background masks. If I'm calculating correctly, this means you can perform internal copies at 12-20 MB/s and lines/pattern fills at 24-40 MB/s.
Things might have looked different with VBE/AF fully defined in early 1992 and subsequent graphics products all providing at least partial support (8/16-bit panning and sprites would be enough).
Instead, everything was too late. Even the VESA LFB never truly got implemented on ISA cards (AFAIK only the ATI Mach64, and maybe experimental support in the ET4000?), and it only really started working once later VBE versions and PCI arrived.
I remember implementing a side-scrolling platformer on my 386 and being boggled by how slow everything was compared to consoles. I didn't know what hardware acceleration was then, and this was before I got Internet access, so I worked it out myself: coded it in C, then hand-optimized the inner drawing loop in assembly, rewriting the pixel writes to use rep movs instructions for each tile. This took a while, but I managed to squeeze out acceptable performance for my game with Keen-like smooth scrolling.
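The shape of that inner loop, in C, is roughly one straight-line copy per tile row; this is what a hand-written `rep movsb`/`rep movsw` version reduces to. All names and sizes here are illustrative, not from the original game:

```c
#include <stdint.h>
#include <string.h>

enum { TILE_W = 16, TILE_H = 16, SCREEN_W = 320 };

/* Copy one opaque 16x16 tile into a byte-per-pixel framebuffer.
   Each memcpy is the C equivalent of loading CX and issuing rep movsb
   with DS:SI at the tile row and ES:DI at the screen row. */
void blit_tile(uint8_t *screen, const uint8_t *tile, int dst_x, int dst_y) {
    for (int row = 0; row < TILE_H; row++) {
        memcpy(screen + (dst_y + row) * SCREEN_W + dst_x,
               tile + row * TILE_W,
               TILE_W);
    }
}
```

The win over naive per-pixel writes is that the copy runs at the bus's streaming speed with no per-pixel loop overhead, which mattered a lot on a 386.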
Reading Masters of Doom some 30 years later was pretty eye opening. This article too - to this day I’ve never heard of latch writes.
Since you mentioned latch writes, check out Michael Abrash’s Graphics Programming Black Book[0], chapter 23 onwards, if you feel like immersing yourself further in well-written VGA lore.
Not recommended for your present-day productivity (unless you’re making retro games!)
Turrican being a robot, and the developer being from Germany, leads me to believe it may have been inspired by the NES Probotector[1] rather than Contra. PS: watching that now, I know the struggle of 50 Hz gamers playing slow games was real... yikes.
That's pretty funny. Although, to be honest, I think a lot of the shareware platform games from that era were seen as low budget knock-offs of more famous games on the consoles and 8/16-bit home computers.
My god, the things programmers had to do back in the day to squeeze every single bit out of the machine, and here I was feeling smug using WeakMap instead of Map for my code optimisation :P
I also remember the fast inverse square root function from Quake III (usually attributed to Carmack). So many amazing tricks. There definitely needs to be a site or sub-reddit dedicated to these.
Ha! You call that parallax scrolling? Then take a look at this (Amiga, 1993): https://www.youtube.com/watch?v=BoiHk_3siYg. In the first level, the bands of clouds are moving in (I think) 6 different levels of parallax, and also the mountain range and swamp below has different levels of parallax. Ok, this was a game that, as one reviewer put it, "squeezes all the graphic power out of the Amiga". Would you believe that this was a computer which was only capable of displaying 32 colors at once? Ok, that was without tricks of course, and this game probably uses every trick in the book...
Not to take anything away from the developers, but the Amiga does have hardware sprites, hardware scrolling, a blitter, the copper, and dual playfield mode to make things a little easier.
Oh R-Type... I had trouble beating the final boss of the first level, but once I got through once, I played on and on and on, getting to level 4 or 5, I think, and I had my hands literally shaking from stress, fearing to lose all the powerups.
The Commodore 64 had smooth parallax scrolling in 1986 without any hardware acceleration. Instead, they used a trick where they rolled the character set definitions, eg. https://www.youtube.com/watch?v=iHrmjP6D1OU
I think it's unfair to say that it's "without any hardware acceleration". These tricks work because the VIC-II can unburden the CPU a lot by letting it work with character cells and character definitions instead of individual pixels, and in Parallax, all the CPU needs to do to create the parallax effect is to redefine the 32 bytes that define the background characters.
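A C model of that trick (the array stands in for the character-set bytes in C64 RAM; there's no VIC-II register fiddling here): because the background band is filled with one repeating character, scrolling it a pixel sideways only means rotating that character's 8 definition bytes, with zero screen data moved:

```c
#include <stdint.h>

/* Rotate each row of an 8x8 character definition one pixel to the
   left, wrapping around. Since the band tiles seamlessly with this
   one character, rewriting these 8 bytes each frame makes the whole
   band appear to scroll. */
void roll_char_left(uint8_t chardef[8]) {
    for (int i = 0; i < 8; i++)
        chardef[i] = (uint8_t)((chardef[i] << 1) | (chardef[i] >> 7));
}
```

Eight bytes per frame per background character is trivially cheap even for a 1 MHz 6510, which is why the effect was feasible at 50 Hz.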
I didn't know that Flimbo's Quest trick, that's cool!
That vid is a very bad example, because the vid itself is not running at 50/60 Hz. And that was the whole thing back then: most great games on the Amiga (and the Atari ST) were running at "full frame rate". Arcade games too (except some still-amazing ones like Outrun, which was 30 fps IIRC).
Parallax at 50 Hz was silky smooth. A video of a game running at 50 Hz that is recorded at 30 fps doesn't do the original game justice.
The link to Shadow of The Beast (Amiga 1989) is much better in that you can configure YouTube to play the vid at 50 Hz.
Don't get why you need to put down what someone else did because someone else did something better at the same time? Nobody claimed it was the best parallax at the time.
That's a stretch. That sketch is about one-upmanship as to who suffered more. (Which is also toxic behavior when taken out of comedic contexts. I know people who do this unironically and it's not healthy.)
Agree with the parent, it's great to develop a richer context on top of the OP and have everyone benefit as a result, but there's no need to put one down in order to raise another. They're both cool and interesting.
In my day there were no graphics cards. You had to write the bits into special parts of memory yourself. And you considered yourself lucky to get 140 pixels on a line.
(Also the only video I could find with the original, heroic-sounding music I remember, as everyone else seems to have somehow ended up with a disk image using an oddly-substituted filler track.)
Not saying that isn't impressive, but there are just two layers of parallax scrolling per horizontal line (= how it's usually counted) in that example.
BTW, many Amiga games could do 3 layers per horizontal line.
Great to see lethal guitar's work getting featured here. Rigel Engine is a very cool and ambitious project to revitalize a retro game. He is also very welcoming; I had fun contributing to it a few years back.
TLDR: The primary scrolling used 8-pixel steps (based on 8x8 tiles), while the parallax scrolling was in 4-pixel steps just by using two (or 2x2) variations of the tiles. Also, the EGA hardware allowed copying tiles within video memory with four times the speed of memory accesses from the CPU, by operating EGA latch registers for the four bitplanes in parallel. The number of tiles on the screen where sprites and tiles and background partially overlapped, and therefore required slower drawing methods, was minimized.
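The latch mechanism from the TLDR can be sketched as a software model (the arrays stand in for the four bitplanes of real video memory; this is a simulation, not actual port I/O): one CPU read loads a byte from each plane into the latches, and one Write Mode 1 store writes all four back, so 8 pixels' worth of plane data moves per byte-sized access.

```c
#include <stdint.h>

typedef struct {
    uint8_t plane[4][0x10000];  /* 64 KiB per bitplane */
    uint8_t latch[4];           /* the four EGA latch registers */
} EgaModel;

/* A CPU read from video memory fills all four latches at once. */
void ega_latch_read(EgaModel *ega, uint16_t addr) {
    for (int p = 0; p < 4; p++)
        ega->latch[p] = ega->plane[p][addr];
}

/* A Write Mode 1 store writes all four latched bytes back. */
void ega_latch_write(EgaModel *ega, uint16_t addr) {
    for (int p = 0; p < 4; p++)
        ega->plane[p][addr] = ega->latch[p];
}

/* Copying tile data inside video memory: one read + one write
   per 8 pixels, i.e. 32 bits of plane data per byte access pair. */
void ega_copy_bytes(EgaModel *ega, uint16_t src, uint16_t dst, int n) {
    for (int i = 0; i < n; i++) {
        ega_latch_read(ega, (uint16_t)(src + i));
        ega_latch_write(ega, (uint16_t)(dst + i));
    }
}
```

This is where the "four times the speed" figure comes from: the CPU touches one byte, but the card moves four.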
Karateka didn't use any of the C64's advanced features (hardware scrolling, sprites) as far as I know, being a game that originated on the Apple II. It was still very impressive. I wonder which C64 game was the first to have smooth parallax scrolling (at 50/60Hz).
The technique of pre-shifting was used very widely, even on consoles with some form of hardware assistance. Sonic 3 uses pre-shifted images to gain an extra parallax layer in some places, like the Launch Base Zone background, in addition to the two hardware tile layers and single sprite layer, which could be combined in interesting ways via priority bits. Many Amiga games would also dedicate two sprites to generating a parallax layer: since the Agnus chip's "copper" could rewrite sprite registers mid-scanline, two sprites were sufficient to cover the whole width of the screen by updating the image pointers on the fly. That was really the endgame of the idea of "racing the beam": dedicating a piece of compute logic to updating registers at extremely high speed.
Karateka is a poor example of parallax on the C64, as it apparently does (slow) bitmap scrolling in front of a static background.
Parallax on the C64 tends to rely on the ability to redefine the character set and effectively use it as a tile set. You can smooth scroll characters using the graphics chip, so you only need to move them every 8 pixels.
Combine that with multiple character sets and you get a variation of the Duke Nukem II approach for "free" - you scroll the characters for the background with the foreground, but you flip to a partially scrolled definition of the tiles.
Another approach often used when the background is limited in complexity, is sprites.
A third option is banding. E.g. if the background you want to parallax scroll is above the normal play area, you can use the raster interrupt to control the scrolling independently for each band.
A fourth option is a bitmap scroll of the tile data. This is "easy" if your parallax bands are limited patterns. E.g. you might fill one band of the background with 0123012301230123 over and over, and then just scroll that as a 32 bit pattern in the character definition.
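That fourth option boils down to treating one definition row of the four repeating characters as a single 32-bit value and rotating it. A hedged sketch in C (the C64 version would do this with 6502 shifts and carries, of course):

```c
#include <stdint.h>

/* One row of a band built from 4 repeating characters = 32 pixels of
   pattern. Scrolling the band one pixel left is a 32-bit rotate of
   that row, written back across the four character definitions. */
void scroll_band_row(uint8_t row[4]) {
    uint32_t v = ((uint32_t)row[0] << 24) | ((uint32_t)row[1] << 16)
               | ((uint32_t)row[2] << 8)  |  (uint32_t)row[3];
    v = (v << 1) | (v >> 31);  /* rotate left = scroll left one pixel */
    row[0] = (uint8_t)(v >> 24); row[1] = (uint8_t)(v >> 16);
    row[2] = (uint8_t)(v >> 8);  row[3] = (uint8_t)v;
}
```

Eight such rotates (one per character row) scroll the whole band, again without touching screen memory.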
In any case, the key to all of these is that few c64 games use the bitmap mode (a few do), and so you get to offset the slow CPU with moving far less data around. Tack on the hardware sprites and raster interrupt and you can do a lot of interesting stuff.
Here's a great page on Parallax in C64 games with some video:
Many of the comments here imply this article is about how to do parallax scrolling in general or think the article is suggesting Duke Nukem II was pioneering parallax scrolling. This is not what the article is about.
This article is about how Duke Nukem II specifically implemented drawing so that it could actually achieve playable framerates on EGA video cards and slow ISA buses, which were too slow to do a naive implementation.