A few weeks ago I had a sudden need to understand certain finer points of the operation of EGA/VGA BIOS. I found common reference materials to be inadequate—they tend to do a good job of documenting the data structures the video BIOS uses, but do not even attempt to explain what those structures are for and how exactly the video BIOS uses them.
Partly it’s IBM’s fault. The technical references covering VGA products did not quite explain everything in detail either. For the EGA, IBM solved the BIOS documentation problem the old-fashioned way—by publishing the complete BIOS listing.
Now, the EGA BIOS listing included fairly decent documentation of the INT 10H interface as part of the source code itself (IBM had previously done the same thing with the PC/XT/AT system BIOS). Decent documentation, but not great. But naturally anyone who needed to know more could just read the BIOS listings! For the VGA, BIOS listings were never published, although the VGA BIOS did not do much new that the EGA BIOS didn't do already.
For answering my questions, the EGA BIOS listings were sufficient. Except the available documents weren’t terribly well OCRed and thus were difficult to search properly. That is mostly IBM’s fault too—the listings were printed in rather small font. It’s not hard to see why IBM did that: Even at about 130 lines per page, the EGA BIOS runs over 60 pages of dense text!
To avoid any problems with searching the BIOS listings, I decided to reconstruct the source code of the EGA BIOS from the published listings, using the poorly OCRed text as a basis.
Verifying the reconstructed source code was trivial: If the reassembled BIOS matches an actual EGA BIOS, it must be close enough to the original.
From the published listings it is not clear what assembler IBM used. It must have been some version of MASM, but we can only guess which. As it turns out, the EGA BIOS was written in a fairly straightforward way without using any MASM “cleverness”, and can therefore be successfully assembled with almost any version of MASM.
I found out that just about any version of MASM from 1.0 up to and including 4.01 produces the same binary. Only MASM 5.0/5.1 is a little smarter and produces different code. Obviously MASM 4.0 is too new to have actually been used (the EGA was announced in 1984), and perhaps even 3.0 is. It's entirely possible or even likely that IBM MASM 1.0 was used for the EGA BIOS, but it could just as easily have been some other version.
The EGA BIOS was split into several separately assembled modules, and the main module with most of the code was further divided into a number of INCLUDEd source files. The names of the .INC files are apparent from the published listings, though the names of the modules are not.
The listings also do not indicate how the modules were linked. Not coincidentally, though, the listings of the modules appear in the same order in which the modules were linked. It is thus not difficult to reconstruct a method to assemble and link the source modules, and then run the result through EXE2BIN.
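For illustration, the whole build boils down to a handful of DOS commands. A minimal sketch, with hypothetical module names (the listings do not preserve the real ones) and, per the above, just about any old MASM:

    rem Hypothetical build script; the module names are invented
    rem for illustration, and any MASM up to 4.01 should do.
    masm vbios.asm;
    masm vfont.asm;
    link vbios+vfont,egabios.exe;
    rem EXE2BIN strips the EXE header, leaving the raw BIOS image
    exe2bin egabios.exe egabios.bin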
Note that the result won’t be an exact match of an EGA BIOS image. To reproduce the EPROM contents, the BIOS image would have to be padded to the appropriate size (16 KB) and equipped with the correct checksum. For my purposes, that was unnecessary because I only needed to make sure that my EGA BIOS binary, which is 14,688 bytes long, matches the first 14,688 bytes of the EGA BIOS dump. And it does.
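As an aside, the option ROM convention is that an 8-bit checksum over the whole padded image must come out to zero, otherwise the POST will not initialize the adapter ROM. A minimal sketch of the check, assuming ES points at the ROM segment:

        ; Verify the option ROM checksum over a 16 KB image; assumes
        ; ES points at the ROM, e.g. segment C000h for the EGA.
        xor     al, al
        xor     si, si
        mov     cx, 4000h       ; 16 KB
chksum: add     al, es:[si]
        inc     si
        loop    chksum
        ; AL must now be zero for the POST to accept the ROM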
I attempted to reproduce the EGA BIOS source code as accurately as possible, which means including any typos, misspellings, or grammatical errors. I ran into one annoying but non-critical problem: The published EGA BIOS listings use a font where capital letter O and the number zero (0) are either identical or, more likely, indistinguishable given the small size and print/scan quality. Most of the time, it’s obvious which is which, but in a couple of instances (mostly function names or labels) it is not clear from context whether capital O or zero was in the original. But whichever it was, the naming is consistent in the reconstructed source and does not hamper searchability.
Source Structure
There are two copies of the bulk of the EGA BIOS source code (basically everything minus fonts). The file EGABIG.ASM is one flat file containing (almost) everything. It has one very important property: The line numbers in EGABIG.ASM exactly match the line numbers in the Tech Ref listing. That way, source location can be easily cross-referenced.
The “real” source code is in VBIOS.ASM, which includes nine other files with the .INC extension. Those files are meant to represent what the source code really looked like.
The source code archive can be downloaded or viewed online. If you find any errors (and I'm sure there are a few), feel free to report them in a comment. But please check the Tech Ref listing first, because there are plenty of typos and misspellings in the original.
BIOS Notes
While reading the EGA BIOS source code, I learned that like the VGA, the EGA can operate as either a color or a mono adapter. But unlike the VGA, which can dynamically switch between mono and color, the EGA's behavior depends on the type of the attached monitor and won't change at runtime. The reason is that the EGA could work with existing IBM MDA and CGA monitors (models 5151 and 5153, respectively), as well as with the “native” EGA monitor (model 5154).
And, again unlike the VGA, the EGA also limits the text and graphics mode choice depending on the attached monitor, which can display either 200 (CGA) or 350 (EGA/MDA) lines and nothing else. The VGA can display either 400/200 (the latter double-scanned) or 350 lines in alphanumeric modes; VGA text modes can thus use 8, 14, or 16 pixels high fonts, whereas EGA has to use either 8 or 14 pixels depending on the monitor.
The EGA, like the VGA, can use 9-pixel wide text mode characters. But unlike VGA, the EGA can do so only when using a monochrome MDA style monitor with 720-pixel horizontal resolution. Like the VGA, the EGA BIOS contains an 8×14 pixel font which is suitable for modes with 8-pixel wide character box; for running with 9-pixel wide characters, there is a “patch table” which the BIOS uses to overwrite a handful of characters that benefit from a bitmap optimized for 9-pixel wide box (particularly wide/complex letters such as ‘M’ or ‘W’).
The real reason I wanted to read the listing was the video parameter/save area usage. The format is well documented, the usage is not. The BIOS initially sets up the pointer at 40:A8 in the BDA to point at the default table in ROM.
When the BIOS executes a mode set, it looks at the pointer in the BDA and uses that to locate the mode parameter tables. Depending on whether it’s setting a text or graphics mode, the BIOS looks at the alphanumeric or graphics font override table. The font overrides are used only during a mode set.
Since the default parameter/save table is in ROM, it is obvious that if anyone wants to customize it, they need to copy the one in ROM (or rather whatever 40:A8 points at), modify it, and update the pointer in the BDA.
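The dance is straightforward. A minimal sketch, assuming the standard save pointer table of seven far pointers (1Ch bytes, with the first entry pointing at the video parameter table and the second at the dynamic save area) and a hypothetical RAM buffer named MYTABLE:

        ; Copy whatever 40:A8 points at into RAM and repoint the BDA.
        ; MYTABLE is a hypothetical 1Ch-byte buffer (7 far pointers).
        push    ds
        mov     ax, seg MYTABLE
        mov     es, ax
        mov     di, offset MYTABLE
        xor     ax, ax
        mov     ds, ax
        lds     si, dword ptr ds:[4A8h] ; DS:SI -> current table
        mov     cx, 0Eh         ; 1Ch bytes = 7 far pointers
        cld
        rep     movsw           ; copy the table, ROM or not
        ; ... patch the copied pointers here, e.g. point entry 0
        ; at a customized video parameter table ...
        xor     ax, ax
        mov     ds, ax
        mov     word ptr ds:[4A8h], offset MYTABLE
        mov     word ptr ds:[4AAh], seg MYTABLE
        pop     ds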
There is also a dynamic save area which is used for storing the contents of palette registers on the EGA/VGA. This save area is written (rather than read) by the BIOS during a mode set if the dynamic save area pointer is non-null. But it is also read and written by some INT 10h/10h subfunctions, again only when the dynamic save area pointer is non-null. And in the default case of a ROM parameter/save table, it will be null.
There was a clear technical reason for the save area: To the chagrin of programmers, most of the EGA registers were write-only. Software which needed to implement any kind of screen switching couldn’t just read the hardware state. The save area kept track of palette state, both during the initial mode set and in INT 10h/0Bh and INT 10h/10h calls. Note that the EGA BIOS did not offer a service to read the current palette state, precisely because the dynamic save area was optional; by default, the BIOS simply had no way to determine the current state.
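To illustrate the kind of state the save area tracks: setting a palette register through the BIOS is a single call, but on the EGA there is no matching read subfunction. A minimal sketch:

        ; INT 10h AH=10h AL=00h: set one palette register. The BIOS
        ; records the value in the dynamic save area, if one exists;
        ; the EGA hardware register itself is write-only.
        mov     ax, 1000h       ; set individual palette register
        mov     bl, 1           ; palette register number (0-15)
        mov     bh, 3Fh         ; new value
        int     10h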
Overall, the BIOS usage of the parameter/save table is fairly straightforward, yet sufficiently non-obvious that it is impossible to figure out from incomplete documentation.
Interesting, this is the kind of content I wish had been in computer magazines back in the '80s.
A side track, re “the EGA could work with existing IBM MDA and CGA monitors”: This is mostly true, but there is a problem with using a CGA monitor. Even if you set the EGA card to use CGA frequencies for all text modes, it will still let software select 640*350, which of course produces a garbled display, and some programs offered no way of overriding this. To me this is an indicator that few users actually used an EGA card with a CGA monitor.
Using the EGA with the CGA monitor results in something that’s functionally very close to CGA. So yeah, I can’t see that being an attractive option; without an EGA monitor, there was little point in upgrading from CGA.
On the monochrome side, the EGA did have a clear advantage over MDA, namely graphics Mode F (640×350, monochrome).
I wonder what the actual problem was with the garbled modes. I will say that the EGA BIOS is, from a modern perspective, not terribly robust. It does not always validate inputs and I can easily see how one could end up with an “impossible” and non-working configuration.
Amstrad’s PC1640 had a Paradise EGA chipset and was sold with (depending on model) either a CGA, EGA or Hercules monitor. So EGA with a CGA monitor wouldn’t have been an uncommon configuration there. If the user used the supplied GEM it would have got them 16 colours in 640×200 mode rather than black and white.
IBM ensured that the base model 5170 kept its price below $5,000, which avoided triggering extensive meetings to approve the purchase order. It does seem that the expectation was that users would start with an older monitor and then upgrade the video card RAM and monitor once software supporting the new EGA modes was available. IBM put a lot of marketing emphasis on the ease of upgrading the EGA card.
I don't know if it was possible to protect against programmers who didn't understand the video subsystem while still keeping the EGA BIOS small enough to fit within the limited confines of an affordable ROM chip. Obviously, a few years later, the VGA could take advantage of larger, cheaper ROMs and of low-cost electronics that could be placed in the monitor and report state.
The biggest advantage of EGA on a 5153 CGA monitor was the 320×200 mode with 16 colors. I don't recall if there was a 640×200 mode available with more colors on stock IBM hardware. IBM also didn't take advantage of interlacing, instead pushing the better monitor for those who wanted a high-resolution productivity video mode.
Michal:
The garbled output was due to the EGA BIOS letting programs select 640*350 even when the user had set the DIP switches for a CGA monitor.
It's arguable whether this is a bug or a feature; I would lean toward calling it a bug. If someone really wanted to run the text modes at 15 kHz like CGA but still be able to select 640*350, it would be reasonable for that fringe case to require a TSR or whatnot, while the more sensible behavior when CGA output is selected would be to only enable modes that a CGA monitor can display.
I wonder if the Amstrad that John mentions had a BIOS that either disabled the 640*350 mode with a CGA monitor, or did something special like using interlace, or possibly running at 50 Hz (since it was sold in Europe, that would have made sense anyway) and displaying more than 200 rows but still not the full 350 rows. A kind of compromise would have been to use interlace but only interlace certain lines (as interlace on a 50 Hz “PAL” monitor would give at least 512 visible rows).
Chris: In order for the EGA monitor to be compatible with a CGA card, it automatically switches to a 16-color mode in hardware when it runs in 15 kHz “CGA” mode, and since the EGA card was of course intended for use with the monitors IBM actually sold, it would generally not allow more than 16 colors at 640*200. Perhaps it might have been possible to use all 64 colors by poking the hardware directly. That in turn could have been usable on some computer that used a nonstandard monitor (i.e. an EGA monitor only able to work properly with an EGA card, or some kind of analogue RGB interface or so).

Btw, a possible misconfiguration of an EGA card was to have an internal jumper set for a CGA monitor while using an EGA monitor. (I'm not referring to the DIP switch that selects whether text modes should be in CGA or EGA frequencies/resolutions, but a separate jumper that was almost always next to the D-sub connector.) CGA specified signal ground on both pin 1 and pin 2, while EGA needed eight signal pins and thus changed pin 2 to be one of the three intensity signals. In order not to short-circuit that output when using a CGA monitor, the jumper selected between outputting the intensity signal on pin 2 or grounding pin 2. With the jumper in the wrong position, the high-intensity colors would be incorrect in EGA modes.
And yes, being able to display 640*200 in 16 colors, rather than having to choose between 640*200 in hardcoded black/white or 320*200 in four colors with a limited set of fixed palettes, was a great improvement. And some programs actually did let you select this mode, instead of either always going for 640*350 (which didn't work) or at best letting you choose between 640*350 and a CGA mode. IIRC the automagic selection of the best available mode in Borland's BGI system would select 640*350; I think it was possible to remove one of the BGI files to force a downgrade to CGA. (As a side track, it seems like the BGI system could never auto-select the “AT&T plasma display” 640*400 mode, which was used in for example the Toshiba T3100e, a mains-power-only 286-based laptop.)
As a side track re EGA BIOSes and compatibility: Back in the day I had at least two clone EGA cards. IIRC one of them had a BIOS that wouldn't work on anything better than a 286, while the other had a BIOS that wouldn't work on an 8088. And, oh, the contraptions I went through… One of those EGA cards was really a combined EGA card and ISA riser/backplane salvaged from some PC with a non-standard chassis form factor. It had a small card edge connector with the EGA monitor signals and DIP switch inputs, which would have been on the motherboard of the system it was salvaged from. It was pure luck that a friend turned out to have that exact system, so I could trace out the pinout of that connector in order to actually use the card 🙂

I kept on using EGA up to the mid '90s. I remember that it was possible to install Windows 3.x with EGA and then upgrade to Windows 95 and run Windows 95 in EGA mode. I had a perfectly fine EGA monitor and a really crappy and scary VGA monitor with a broken power supply that I could run with two external power supplies (one for 24 V DC and the other for IIRC 105 V DC), so I could temporarily use VGA while installing stuff and then downgrade to EGA and swap over to the EGA card and monitor. 🙂
Here is an article from InterAction magazine of Sierra On-Line, the maker of the classic adventure games: How To Get 16-Color EGA Graphics on your IBM or Compatible, Without Buying a New EGA Monitor
http://sierrainteraction.wikidot.com/how-to-get-16-color-ega-graphics-on-your-ibm-or-compatible-w
All those early Sierra games were designed for the 320×200 16-color graphics mode. The combination of an EGA card with a CGA monitor is perfect for these games.
Thanks for the link. It occurred to me earlier that those old Sierra games would have worked with an EGA card on a CGA monitor, and that maybe that was intentional.
BIOS mode E (640×200, 16 colors) was probably meant exactly for that case (EGA with a CGA monitor).
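For what it's worth, using the mode through the BIOS is trivial; a minimal sketch that should run on an EGA with a CGA monitor attached:

        ; Set mode 0Dh (320x200, 16 colors) and plot one pixel through
        ; the BIOS; mode 0Eh (640x200) works the same way.
        mov     ax, 000Dh       ; AH=00h set mode, AL=0Dh
        int     10h
        mov     ah, 0Ch         ; write graphics pixel
        mov     al, 14          ; color: yellow
        mov     bh, 0           ; page 0
        mov     cx, 160         ; column
        mov     dx, 100         ; row
        int     10h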
Perhaps you will find interesting (assuming that you didn't read them previously, which is unlikely) the PCjs blog posts about the Fantasy Land EGA demo:
https://www.pcjs.org/blog/2017/07/03/
https://www.pcjs.org/blog/2018/04/23/
The second one does mention something about font differences when using a CGA monitor on an EGA card, since the card had jumpers for that particular case.
OK, now I understand the problem better — it’s that the EGA BIOS does not prevent modes 0Fh and 10h (640×350 graphics) being set when running with a CGA monitor. Yes, that’s exactly the incomplete input validation.
Interestingly, the BIOS does a much better job when a mono monitor is attached. The logic is that with a mono monitor, if you’re not setting mode F (640×350 monochrome graphics), the BIOS forces mode 7 (MDA text). With a CGA monitor attached, the BIOS does not block modes F/10.
Of course users are not supposed to set those modes, because the result is not going to work no matter what, even leaving the garbled monitor signal aside.
Yes, I read that when Jeff first published it. Re-reading the blog post, it hints that perhaps the Enhanced EGA monitors were initially unavailable or in short supply; in any case the FantasyLand demo assumes a CGA monitor attached to an EGA, because it uses 8×8 fonts rather than 8×14. Update: Yes, the IBM announcements say the EGA was planned to be available in October 1984 and the Enhanced EGA monitor in January 1985. So there was a period of several months when the EGA was available but the monitor was not.
Interestingly, a VGA can be told to optionally use 200-line (CGA-style) text modes, but the EGA can't choose at runtime; it has to use either 200 or 350 lines (resulting in 8×8 or 8×14 text mode fonts) depending on the attached monitor. The VGA BIOS offers the INT 10h/12h, BL=30h subfunction to set the text mode scan lines (200/350/400); on the EGA it was purely a function of the switch settings.
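For reference, the VGA call is simple, and the selection only takes effect at the next mode set. A minimal sketch:

        ; VGA only: request 200-line (CGA-style) text for the next
        ; mode set. AL = 0/1/2 selects 200/350/400 scan lines.
        mov     ah, 12h
        mov     bl, 30h
        mov     al, 0           ; 200 scan lines
        int     10h
        mov     ax, 0003h       ; mode set; text now uses the 8x8 font
        int     10h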
I should add that the Enhanced 5154 monitor did support either 350 or 200 lines, but as far as I can tell, the EGA BIOS could not be easily told to use 200-line text modes if the switches were set up for the Enhanced monitor. But it would have been possible to change the switch settings in the BDA, set a 200-line mode, and then put the original switch settings back.
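Something like the following sketch; the exact value stored into the switch bits is an assumption and would have to be taken from the switch setting tables in the listing:

SW_CGA  equ     3               ; ASSUMPTION: switch value meaning
                                ; "CGA monitor attached"
        push    ds
        mov     ax, 40h
        mov     ds, ax
        mov     al, ds:[88h]    ; EGA info byte; low nibble = switches
        push    ax              ; remember the original value
        and     al, 0F0h
        or      al, SW_CGA      ; pretend a 200-line monitor is attached
        mov     ds:[88h], al
        mov     ax, 0003h       ; mode set now picks the 8x8 font
        int     10h
        pop     ax
        mov     ds:[88h], al    ; put the original switches back
        pop     ds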
I haven’t dug my real hardware out to test with, but I’ve looked at the Paradise EGA BIOS in the PC1640 and as far as I can tell from the disassembly it behaves like the IBM EGA BIOS. If the monitor is mono all modes other than 0Fh are mapped to 7, but there’s no similar check for a CGA monitor. Maybe they were aiming at bug-for-bug compatibility with the IBM card.
Re 16 colors in graphics mode: did the PCjr organize the video display differently from how the EGA did?
If not, it was a bit weird that 320*200 in 16 colors was INT 10h mode 09h on the PCjr but mode 0Dh on the EGA.
https://stanislavs.org/helppc/int_10-0.html
I don’t know off hand how the PCjr organized the memory, but EGA mode D is a planar mode, very similar to the higher resolution EGA/VGA modes. I can’t imagine it would have been organized anything like that on the PCjr.
One can’t map the CGA monitor to just a single screen mode. Part of the EGA design was the creation of new modes or the redefinition of existing modes, some of which would be available on a CGA monitor. If the ROM can’t be sure that a mode would be illegal, it had to be left to the programmer to only choose legal modes for the monitor attached to an EGA card. Charles Petzold’s PC Magazine columns (August and September 16, 1986) cover some of this along with a note of the problems that might be involved in using the feature connector to create additional columns on a monochrome display. I had vaguely remembered those articles.
The high-color modes on the PCjr are similar to the color modes on CGA. CGA stores all the even rows, then all the odd rows, with 8K per group of rows. The PCjr 16-color mode has four banks, each handling every fourth row. One slight oddity of the PCjr is that the full 32K is not available through memory addressing at B800:0000; only 16K was remapped from its actual position to B800. The other 16K requires access through its actual low memory address.
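In other words, a pixel's byte address in the PCjr 320×200 16-color mode works out to bank = row mod 4, then (row/4)*160 + column/2 within the bank. A hypothetical helper, with the 16K remapping caveat above applying:

        ; Hypothetical helper: AX = row (0-199), BX = column (0-319);
        ; returns DI = byte offset of the pixel from B800:0, subject
        ; to the caveat that only 16K of the 32K appears at B800h.
        mov     di, ax
        and     di, 3           ; bank = row mod 4
        mov     cl, 13
        shl     di, cl          ; bank * 2000h (8K per bank)
        mov     cl, 2
        shr     ax, cl          ; row / 4
        mov     cx, 160         ; 160 bytes per scan line, 2 pixels/byte
        mul     cx              ; AX = (row / 4) * 160
        add     di, ax
        shr     bx, 1           ; column / 2
        add     di, bx          ; DI = final byte offset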
Richard: Well, you can. The EGA BIOS reads the DIP switch settings and, based on that, uses CGA or EGA display timings for all the text modes. The 200-line graphics modes will always output a CGA-compatible signal, while the 350-line graphics modes (and also the 43-line text mode) will always output a signal requiring an EGA monitor.
Sure, there could be some rare use case where you set the dip switches to use CGA compatible modes for those text modes even when you have an EGA monitor, like for example if you have a PC with an EGA monitor and a signal splitter and also hook it up to a CGA compatible video projector for classroom presentations and whatnot. However that would be a way rarer use case than plainly using an EGA card with a CGA monitor, so it would be reasonable for the EGA BIOS to use the dip switch settings to disable/enable the display modes (and variations thereof) that require an EGA monitor.
We could also argue that the various PC related computer magazines did a bad job by not testing how various programs behaved when using a CGA monitor with an EGA card.
Side track / bonus: By the end of the era of really low-end PCs that weren't able, or were barely able, to run Windows, at least in Europe it would have been fairly common to have TV sets with an RGB input which, with a simple level adapter (a few resistors), could display CGA modes but not EGA modes. So for home-user oriented PCs (like perhaps the Sinclair 200?) it could also have made sense to have a graphics card more capable than the monitor. Sure, PCjr/Tandy 1000 compatibility might have made more sense, but still.
MiaM: Using the feature connector, it should have been possible (in theory) to set up modes that use neither 200 nor 350 lines, which would have been assigned to an existing mode number. Having the BIOS prevent that newly defined mode from going to a working display, just because the default mode settings wouldn't work, defeats the purpose. Retaining memory contents across a mode switch by setting the high bit of the mode number was another case where IBM chose to risk potential screen corruption for the possibility of undreamed-of functionality from third parties.
The PC magazines did a number of tests with EGA cards and CGA displays when the EGA appeared. Check out InfoWorld for some reported problems that were quickly fixed. They just didn't go back and check later EGA-specific programs that were poorly coded and failed to handle an attached CGA display.
I just thought about this: Since you're working on the EGA VBIOS source itself, could it theoretically be possible to modify it to simply ignore the font size jumpers, then provide an interface to manually change that at runtime via software, with a MODE-like command? Or is there any hardware limitation?
It would be possible. The EGA switches, as far as I can tell, do not affect the function of the hardware in any way; they provide four bits of information to the BIOS, nothing more, nothing less. The switch state is only read by the EGA BIOS POST routine, and after that, software can manipulate the switch state in the BDA, which the BIOS itself in fact does in some situations.
As you may or may not know, in some DOS versions the MODE command manipulates the BDA (though not exactly the EGA switches, AFAIK) when switching between mono and color modes. The EGA and VGA BIOSes do much of that themselves (including changing the value of the EGA switches in the BDA) when going between mono and color.
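A minimal sketch of the BDA side of that, as a MODE-style utility might do it; bits 4-5 of the equipment word at 40:10 select the initial video mode:

        ; Flip the equipment word to "80x25 color" and set mode 3.
        ; Bits 5-4 of 40:10: 01 = 40x25 color, 10 = 80x25 color,
        ; 11 = monochrome.
        push    ds
        mov     ax, 40h
        mov     ds, ax
        mov     al, ds:[10h]
        and     al, 0CFh        ; clear the display type bits
        or      al, 20h         ; 10b = 80x25 color
        mov     ds:[10h], al
        mov     ax, 0003h       ; set 80x25 color text
        int     10h
        pop     ds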
The Petzold article of Sep 16, 1986 shows examples of how font changing and altering the MODE command could work. There was an italic version of the 8×14 font. The 8×8 font was given two blank lines for an 80 by 35 mode, and loses its lowest line for an 80 by 50 mode. The 8×14 font was doubled to cover 28 scan lines, giving a mere 12 rows of characters, which could be combined with the 40-character row mode to provide a 40 by 12 display; that falls to 40 by 7 on a CGA monitor. MODE BW40 gets redefined from 40 by 25 to 120 by 25. The intensity bit allows two different fonts on screen at the same time. IBM designed the EGA to be incredibly flexible in character modes.
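The BIOS side of those tricks is the INT 10h AH=11h character generator interface. A hedged sketch of the 80 by 35 variant, assuming a hypothetical FONT10 table holding the 8×8 glyphs padded with two blank lines to 10 scan lines each (350/10 = 35 rows):

        ; Load a user font and let the BIOS recompute the character
        ; height and row count. FONT10 is hypothetical: 256 glyphs,
        ; 10 bytes per character.
        mov     ax, seg FONT10
        mov     es, ax
        mov     bp, offset FONT10
        mov     ax, 1110h       ; AH=11h AL=10h: user font + CRTC update
        mov     bh, 10          ; scan lines per character
        mov     bl, 0           ; font block 0
        mov     cx, 256         ; load all 256 characters
        xor     dx, dx          ; starting with character 0
        int     10h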
The hardware hackery with supplying a faster clock in order to display 120 characters per row described in the article is very interesting.
I bought an EGA card in 1990 for my CGA monitor. By then, it was a cheap upgrade. Nearly all games and several apps ran fine at 200 lines.
For programs that required 350 lines, I used a BIOS patch with a hotkey to pan inside a 200 line window.
On the same monitor, my (Zenith) CGA card supported an interlaced 640×400 mode. I always wondered if my EGA card could have run a 640×400 interlaced mode for better compatibility with 350 line programs.
640×400 doublescan CGA mode requires a 25 kHz signal. Most standard EGA cards did not have the extra crystal to do that.
Nathan: Do you remember what that TSR was?
The hardware requirement for producing an interlaced signal is to alter the number of lines by one between each field, so the total number of lines, including the invisible lines above/below the visible area, would be odd. Not sure, but I think the 6845 CRTC chip has a bit to turn interlace on/off (though it doesn't do much more than that). Seems like something that wouldn't have been emulated in the EGA's 6845-like implementation. Later on, VGA cards used interlace, but I don't know if any of the EGA and VGA combination cards had the proper hardware for it. Otherwise you would need a vertical sync interrupt (or simulate one using another timer with a software PLL synced to the vertical retrace, if it's even possible to read which line is being output to the monitor at the moment) and change the number of lines up/down/up/down by one for each field, and also change the settings for what to display. As a bonus you'd need to tell the hardware to skip every other line when displaying the picture, or otherwise the picture stored in memory would be very weird.
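For reference, on a real 6845 that bit lives in the Interlace Mode register, R8; a sketch of what enabling it looks like on a CGA (purely to illustrate where the bit is, not something the EGA's CRTC necessarily honors):

        ; Select interlace on a genuine 6845 (CGA CRTC at 3D4h/3D5h).
        ; R8 bits 1-0: 00 = normal, 01 = interlace sync,
        ; 11 = interlace sync and video.
        mov     dx, 3D4h
        mov     al, 8           ; index of R8, the Interlace Mode register
        out     dx, al
        inc     dx
        mov     al, 1           ; 01b: interlace sync mode
        out     dx, al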
Richard: AFAIK double scan of the CGA modes always uses about 31 kHz. Nathan is talking about interlace, though, which would still be 15 kHz.
MiaM: The AT&T 6300’s Model 319 monitor and other 640×400 displays used 24 kHz to 26 kHz signals.
The EGA Wonder was one of the EGA cards that supported an interlaced signal on CGA monitors. The reviews for the interlaced mode were unfavorable with terms like “nervous-looking eye-straining flicker” being used.
Thanks for such a reconstruction!
There are two OCR errors in VCGMN.ASM.
Line 99 (decimal), comment: should be TH_2D instead of TH_20
Line 327 (decimal), comment: should be BT_9B instead of BT_98.
Thanks, I fixed the errors. And found two more very similar errors in the same file.
Could you finish this driver?
https://www.vogons.org/viewtopic.php?f=63&t=55188&start=83
Am I capable of that? Maybe. Do I have time and motivation to do it? No, sorry.
In file V1-5.ASM, comment located at line 20 should be put at line 19;
comment located at line 22 should be put at line 21.
The same in the file EGABIG.ASM (lines 2926-2929).
Compare to page 24 in PDF.
Thanks, fixed. Given the amount of manual work I’m shocked there aren’t far more problems 🙂
Too bad the comments can’t be verified for correctness as easily as the source code.