While trying to work on my DOS 5.0 article, I looked at DOS 5.0 build 224 from June 1990, which is the oldest surviving beta of DOS 5.0. And the README contains the following intriguing text, which reminded me of previous WordStar ruminations:
DOS 5.0 and WORDSTAR

Due to a known problem, some older versions of WORDSTAR don't work correctly with this pre-release version DOS 5.0. We know what the problem is, but the fix was not incorporated in time for this beta release. WORDSTAR 2000 seems to work fine with DOS 5.0.
This is of course maddening because it does not mention which version of WordStar might have trouble. Version 3.x? Version 4.x, which is in fact newer than WordStar 2000? Some other version? There were more than a few. Well, let’s try WordStar 3.24 (early 1983) since that’s what I happen to have on hand:
Yep, there certainly is some kind of problem. But lack of memory? Unlikely. So what is it then?
Maybe my CALL5 utility can shed some light on the problem? Let’s see…
Lucky guess, indeed it did. Notice that in this beta of DOS 5.0, the CALL 5 interface still works (otherwise CALL5.COM wouldn’t produce any output), but the CALL 5 destination is 0000:00C0 (i.e. the INT 30h vector) instead of the usual F01D:FEF0 or so. The upshot is that the reported program segment size is 0C0h (192 bytes) and not the usual 0FEF0h (65,264 bytes). That might conceivably cause problems, and it certainly upsets old WordStar 3.24 because it thinks there isn’t enough memory.
So what does that tell us? I’m not sure. WordStar 3.24 clearly uses the CP/M-compatible information in the word at offset 6 in the PSP, but I see no evidence that it uses CALL 5 (I put a breakpoint on the entry point, and it never triggered while running WS). So if the goal was just to keep WordStar 3.2x going, it would be enough to put the right information into the PSP at offset 6 and forget about the CALL 5 interface. So who needs the CALL 5 interface which requires the address wraparound shenanigans? The mystery remains.
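For the curious, this is roughly the kind of check involved. The following is a hedged sketch, not the actual CALL5.COM source; it assumes a 16-bit real-mode DOS compiler (Turbo C or Open Watcom style) that provides _psp and MK_FP() in <dos.h>.

    /* Hedged sketch, not the real CALL5.COM: inspect the CP/M-compatible
     * fields in the PSP. Assumes a 16-bit DOS compiler providing _psp
     * and MK_FP() in <dos.h> (Turbo C / Open Watcom style). */
    #include <dos.h>
    #include <stdio.h>

    int main(void)
    {
        /* Word at PSP offset 6: the "segment size" a CP/M-style program
         * such as WordStar 3.2x consults to judge available memory. */
        unsigned seg_size = *(unsigned far *)MK_FP(_psp, 6);

        /* Offset 5 holds a far CALL to the DOS dispatcher; its offset
         * word (PSP+6) and segment word (PSP+8) give the CALL 5 target,
         * so the offset word does double duty as the segment size. */
        unsigned call5_off = *(unsigned far *)MK_FP(_psp, 6);
        unsigned call5_seg = *(unsigned far *)MK_FP(_psp, 8);

        printf("Segment size at PSP:0006 = %04Xh (%u bytes)\n",
               seg_size, seg_size);
        printf("CALL 5 target = %04X:%04X\n", call5_seg, call5_off);
        return 0;
    }

On this DOS 5.0 beta, a sketch like the above would report 00C0h and 0000:00C0 instead of the usual 0FEF0h and F01D:FEF0, which is exactly what upsets WordStar’s memory check.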
To recap the current knowledge: WordStar 3.24 (released around February 1983) is the oldest surviving version for the IBM PC that could be thoroughly examined. It was found to rely on the word at offset 6 in the PSP, but not use the CALL 5 interface. So this version of WordStar alone cannot explain the need for address wraparound and A20 line control.
There were earlier versions of WordStar. Versions 3.20 and 3.21 are known to have existed. Currently there is no reason to believe that those were substantially different from version 3.24.
It is known that WordStar 3.02 was the earliest version for the IBM PC, and it was potentially significantly different from version 3.20. The reason to suspect such difference is that version 3.2x also supported CP/M-86 (the overlays are identical between DOS and CP/M-86, only the main executable is different), but 3.02 predates the appearance of CP/M-86.
So maybe WordStar 3.02 is the culprit? Well, no, it isn’t. WordStar 3.02 (released in April 1982 or so) didn’t even work with PC DOS 1.1, so it’s extremely difficult to believe that it would have been a concern by the time the PC/AT came out in 1984. So many question marks.
2022 Update: Although WordStar 3.02 for PC DOS is yet to be found, the WordStar history is now somewhat clearer. Sometime in late 1981, WordStar was ported to the 8086, targeting CP/M-86, perhaps even before CP/M-86 was released to the public. In April 1982, the CP/M-86 version of WordStar was lightly modified to run on top of PC DOS, and the first DOS version of WordStar was probably released around May 1982.
Version 3.20 of PC DOS WordStar has turned up in the meantime, and indeed it’s not substantially different from 3.24. There is good reason to believe that there never was a version of WordStar for DOS that used the CALL 5 interface, since it was derived from the CP/M-86 version using INT E0h, not from the 8080 CP/M version. It’s not clear when WordStar 3.20 for PC DOS was released, but it was probably in July 1982.
As detailed above, WordStar did depend on the word at offset 6 in the PSP, which requires A20 wraparound if CALL 5 is to be supported as well. To put it differently, without A20 wraparound, DOS must break either programs that use CALL 5 or programs that use the segment size at offset 6 in the PSP (WordStar 3.x is in the latter category). With A20 wraparound, DOS can satisfy both classes of programs.
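The arithmetic behind that trade-off can be spelled out. The following is a minimal sketch using the typical F01D:FEF0 target mentioned above (the exact values vary by DOS version):

    /* Sketch of the address arithmetic: the far CALL at PSP:0005 targets
     * roughly F01D:FEF0, so the offset word FEF0h at PSP:0006 also serves
     * as a sensible segment size. The target only reaches the CALL 5
     * entry at 0000:00C0 (the INT 30h vector area) if addresses wrap at
     * 1 MB, i.e. with the A20 line masked. */
    #include <assert.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned long seg = 0xF01DUL, off = 0xFEF0UL;
        unsigned long linear  = (seg << 4) + off;      /* 1000C0h, above 1 MB */
        unsigned long wrapped = linear & 0xFFFFFUL;    /* 000C0h with A20 masked */

        printf("linear = %05lXh, wrapped = %05lXh\n", linear, wrapped);
        assert(wrapped == 0x000C0UL);   /* i.e. 0000:00C0 */
        return 0;
    }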
Finally, in the meantime, other programs that rely on A20 wraparound for reasons entirely unrelated to CP/M compatibility and the PSP have come to light.
MASM v1 was definitely written in Pascal. On the PC, and perhaps micros in general, Pascal had a head start on C. That is evidenced by the fact that IBM offered a Pascal compiler at the IBM PC introduction in August 1981, but no C compiler. I would also say that the early PC Pascal compilers were more professional than the C compilers.
I used Turbo Pascal in the early 1990s and it was an excellent product (not written in Pascal). I would have probably used it for a lot longer if there had been a 32-bit DOS version, but for that C was effectively the only game in town if you didn’t want to do everything yourself.
@Rugxulo: you’re right, we don’t. We use C. Though C is not without its
own problems, either.
Such differences seem to be largely cultural, not technical in nature.
Me’d imagine that bwk’s points have long been resolved in common
implementations. The question is, of course, if his central points of
‘no escape’ and ‘fixes are non-standard extensions’ have been taken
care of, as well; mewouldn’t know.
As for the jargon file, it seems like it hasn’t been updated in quite
a while, medoesn’t know why. There’s a 1995 update to the BASIC entry,
admitting that it ain’t so bad a language anymore: if what you say is
true, me’d argue that something similar should be added to the entry
for Pascal.
@Necasek: Honestly, to me it seems that, while Pascal had been more or
less fully designed and then implemented, C only gradually gained its
most useful features. One might thus argue that C wasn’t mature enough
at the time. That said, where C becomes popular it sure tends to trounce
everything else =)
@zeurkous
I investigated the p-Machine Pascal before; there were versions for CP/M and DOS, but I found that on CP/M it bypassed the CP/M calls, talked directly to the BIOS, and used its own disk format, so I think there is no chance of it using CALL 5.
Hm, that’s interesting. What parts of CP/M *did* it use? Or did it
take total control of the machine?
Yes, Pascal was used a lot in the ’70s and ’80s. I dabbled with it, but not for any significant project. My understanding is that one of the problems was that it was not transparent to port code from one environment to another: the first Pascal had some dependencies on a CDC 6000, later the environment of choice was the p-Machine, and after that Turbo Pascal was the environment of choice. Somewhere I read that Microsoft used it extensively in the early ’80s.
The Microsoft Languages Newsletter, Vol. 1-1 (thanks to PCjs), has this text:
Faster Macro Assembler 4.00 release developed in Microsoft C
By porting the new Macro Assembler 4.00 release to Microsoft C, it assembles programs from 2 to 3 times faster than the previous Microsoft 3.00 and IBM® 2.00 releases.
This newsletter advertises these languages:
Latest DOS Versions:
C 3.00
COBOL 2.10
FORTRAN 3.31
Macro Assembler 4.00
Pascal 3.31
QuickBASIC 1.0
Unfortunately it doesn’t have a date.
@zeurkous
This was my 1984 adaptation of UCSD Pascal II.0 to run on top of a typical CP/M bios:
http://www.math.purdue.edu/~wilker/misc/ucsd/
That’s circa 1984. C 3.00 was Microsoft’s own then brand-new compiler.
I believe Pascal was extensively used for the Macintosh, too. Even the early MS Windows SDKs supported Pascal.
UCSD Pascal needed its own file system because so much of the compiler was tied to the longer filenames and special file type bytes. The DOS-hosted version used DOS functions to access the P-system volumes, which were handled as single DOS files because the largest P-system volume was only about 11 MB. As far as I can tell, PSYSTEM.COM used INT 21h, not CALL 5. I think all the calls in DOSFILER.CODE pass through PSYSTEM.COM; DOSFILER contains the Pascal routines that handle importing and exporting between the DOS file system and the P-system volumes.
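To make the “volume stored as a DOS file” idea concrete, here is a hedged sketch of a block read against a volume image. It uses plain C stdio (which on a DOS compiler ends up as INT 21h calls anyway) rather than raw INT 21h, and the volume file name is made up; it is not how PSYSTEM.COM is actually written.

    /* Hedged sketch: a p-System volume kept as one ordinary DOS file,
     * with a "block" read implemented as a seek + read. */
    #include <stdio.h>
    #include <stdlib.h>

    #define BLOCK_SIZE 512   /* p-System blocks are 512 bytes */

    /* Read p-System block 'blk' from the volume image file. */
    static int read_block(FILE *vol, long blk, unsigned char *buf)
    {
        if (fseek(vol, blk * BLOCK_SIZE, SEEK_SET) != 0)
            return -1;
        return fread(buf, 1, BLOCK_SIZE, vol) == BLOCK_SIZE ? 0 : -1;
    }

    int main(void)
    {
        unsigned char buf[BLOCK_SIZE];
        FILE *vol = fopen("PSYSVOL.VOL", "rb");   /* hypothetical volume image */

        /* Block 2 conventionally holds the start of the p-System directory. */
        if (vol == NULL || read_block(vol, 2, buf) != 0) {
            fputs("cannot read volume\n", stderr);
            return EXIT_FAILURE;
        }
        puts("first directory block read OK");
        fclose(vol);
        return 0;
    }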
The P-system did have the advantage of much more efficient memory utilization. However, the P-system was long limited to a maximum address space of 128 kB, which was rather limiting in 1986. Pecan improved it to use larger memory ranges and Cabot even managed to turn it into a 32-bit version, but by then most UCSD Pascal programmers had moved on to other environments.
A major program using MS Pascal was PC Write and Bob Wallace was certainly close enough to inform MS as to any bugs affecting his program.
Yes, Pascal was very important for Apple.
History of Apple and Pascal:
https://archive.org/details/ApplePascalHistoryDTC92
IIRC (at least 16-bit) Windows uses the Pascal calling convention for its API.
With Microsoft C first released in 1984 and Windows 1.0 released around the same time, that makes sense, as it would probably have been too hard to write and ship such a large product with a compiler still in beta.
(But what compiler was used on Xenix before Microsoft C was released for DOS? Maybe 3.00 in the DOS version isn’t just marketing bullshit but versions 1.x and 2.x were available for Xenix? Just like WordStar 1.x and 2.x for CP/M-80 and 3.x being the first version for DOS.)
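Regarding the calling-convention point above: in the 16-bit Windows SDK, API functions and callbacks were declared with the PASCAL keyword, i.e. arguments pushed left to right and the callee cleaning up the stack. A minimal Win16-style illustration of such a callback (it needs the old 16-bit SDK headers to build; the function name just follows the usual SDK sample convention):

    #include <windows.h>

    /* Hedged Win16-era illustration: FAR PASCAL marks a far function using
     * the Pascal calling convention (arguments pushed left to right, stack
     * cleaned up by the callee), as the 16-bit Windows API expects. */
    long FAR PASCAL MainWndProc(HWND hWnd, unsigned message,
                                WORD wParam, LONG lParam)
    {
        switch (message) {
        case WM_DESTROY:
            PostQuitMessage(0);     /* ordinary Win16 API call */
            return 0;
        default:
            return DefWindowProc(hWnd, message, wParam, lParam);
        }
    }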
“Me’d imagine that bwk’s points have long been resolved in common
implementations. The question is, of course, if his central points of
`no escape’ and `fixes are non-standard extensions’ have been taken
care of, as well; me wouldn’t know.”
He’s only barely correct on a few points. Not a totally inaccurate initial critique, but in hindsight it leaves a lot to the imagination. It’s not worth putting that article on a pedestal. It has been largely debunked several times. In 1981, K&R C wasn’t perfect either, and Modula-2 was already designed and implemented as an improved replacement (ignoring many further offshoots later on, e.g. Extended Pascal or Oberon).
“I used Turbo Pascal in the early 1990s and it was an excellent product
(not written in Pascal). I would have probably used it for a lot
longer if there had been a 32-bit DOS version, but for that C was
effectively the only game in town if you didn’t want to do everything
yourself.”
GNU Pascal started in 1988. FPC was first released sometime around 1995. Both of these had 32-bit DOS versions. I know the world was somewhat smaller before widespread Internet (but we still had BBSes …), so I do understand. Still, there were others (SVS Pascal / 32-bit DPMI, which I’ve never used; 32-bit TMT Pascal). Admittedly, C was and is much more popular.
“Yes, Pascal was used a lot in the ’70s and ’80s. I dabbled with it,
but not for any significant project. My understanding is that one of
the problems was that it was not transparent to port code from one
environment to another: the first Pascal had some dependencies on a
CDC 6000, later the environment of choice was the p-Machine, and after
that Turbo Pascal was the environment of choice.”
There were two ISO standards for Pascal. Both were heavily ignored, especially the latter. Yes, it’s true, even standards are ignored, compilers have bugs, test suites aren’t always widespread. But even C has many implementations, bugs, variations, and tons of non-portable extensions. VLAs and _Complex are now optional in C11, probably owing to lack of compliance of various compilers.
I don’t really claim classic Pascal is 100% better than C or perfect. But the idea that you can’t do anything useful in it is wrong. Compilers are incredibly complex, so anything ambitious hits dark corners. C is no panacea, which is why so many people are still trying (and failing) to replace it (although I still like it, it’s hard not to when most good things rely on it).
“By porting the new Macro Assembler 4.00 release to Microsoft C, it
assembles programs from 2 to 3 times faster than the previous
Microsoft 3.00 and IBM 2.00 releases.”
That may be true, but it’s bad engineering. I don’t totally blame them, but it’s (usually) not a language’s fault for a poor implementation. You can write a slow assembler in any language. FPC is a faster compiler than GCC. What does that tell you? (Not much. Re-reading headers is usually bad. But there are many other differences.)
XENIX 2.0 for 8086 existed, but used a different C compiler. I have not had a chance to examine it so I assume it was a port of one of AT&T’s C compilers (pcc?). Starting with XENIX 3.0 (also IBM PC XENIX 1.0), the C compiler used was more or less the same as Microsoft C for DOS (including the capability to cross-compile DOS executables on XENIX).
I wanted something 32-bit back in 1993 or so. There was Watcom, MetaWare, NDP, and others. And they were well supported with DOS extenders etc. Pascal was in many ways a nicer language, but the industry support for C was just so, so much better. That is hard to fight.
Oh and I totally agree that the “2 to 3 times faster” MASM was probably not the result of a better compiler (MS C 3.0 was not that good) but rather better algorithms and better implementation.
Well, one major difference between C and Pascal was the bounds checking built into Pascal. That could actually have made a difference in something like an assembler. There are loads and loads of small reads from the source and tiny writes to the output buffer(s), so there are plenty of places where bounds checking could hurt performance rather badly.
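To make the cost concrete, here is a rough C rendering of what a Pascal compiler with range checking ({$R+}) effectively inserts before every indexed store; the buffer and routine names are made up for illustration:

    #include <stdio.h>
    #include <stdlib.h>

    #define OUTBUF_SIZE 4096

    static unsigned char outbuf[OUTBUF_SIZE];   /* hypothetical output buffer */
    static size_t outpos;

    /* Emit one byte of generated code into the output buffer. The explicit
     * bounds test below is what a Pascal compiler with range checking
     * enabled inserts implicitly for "outbuf[outpos] := b". In an
     * assembler's inner loop this compare/branch runs on every write. */
    static void emit_byte(unsigned char b)
    {
        if (outpos >= OUTBUF_SIZE) {
            fputs("output buffer overflow\n", stderr);
            exit(EXIT_FAILURE);
        }
        outbuf[outpos++] = b;
    }

    int main(void)
    {
        emit_byte(0x90);    /* e.g. a NOP opcode */
        printf("%u byte(s) emitted\n", (unsigned)outpos);
        return 0;
    }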
At my previous job we did a lot with Pascal. Delphi to be specific. Code was developed in the 90’s with Turbo Pascal for DOS. Borland made it easy to transition to Windows. A lot of this code is still running. This was in near-realtime production environments. Nothing safety rated, just labeling boxes and scanning bar codes.
The downside of the compatibility is that there was little reason to improve coding methods. The old stuff continued to work. When we reached a point where it wouldn’t, it was a big nut to crack.
This topic shows a problem with trying to maintain backward compatibility. Tricky code that uses the same information for multiple purposes can easily be broken if all the functions are not documented. Saved 20 bytes back in 1974 though.
I don’t like partial source code compatibility where much of the old structure is retained but heavy revisions are needed to work with it. Loses out both ways. Give me a simplified function if I need to rewrite code.
CP/M-86’s bolted-on adjustments to the zero page led to a bunch of calculations to figure out segment addresses. About all CP/M-86 accomplished through that was making it harder for MS-DOS programs to launch CP/M-86 small model programs, since the data segment would overwrite already occupied memory locations.
“Maybe 3.00 in the DOS version isn’t just marketing bullshit but versions 1.x and 2.x were available for Xenix? ”
I believe that they came from Lattice.
As far as I can tell, Xenix 3.0 (1984) was the first one which came with an x86 compiler developed by Microsoft. Earlier 8086 Xenix versions used a different compiler. The Xenix compilers had no version numbers.
DOS-based Microsoft C 1.x/2.x were definitely rebranded Lattice compilers, so calling the next one Microsoft C 3.0 really was the logical choice, even if it was the first version of MS’s own; anything else would have been too confusing. It was not like (say) calling the first Watcom C compiler 6.0 even though there had been no prior versions.
I think I should correct a couple of my statements about CP/M-86.
Due to a bug in GENCMD, segments could not exceed 62 kB.
I have not been able to verify whether early CP/M-86 and MP/M-86 had problems running small model programs; the surviving versions date from after the point where I know small model worked correctly. Some early DRI development tools like CBasic-86 only supported 8080 memory model CMDs. In 1983, special extensions were added to newer compilers that allowed support for the memory models, including multiple code segments and multiple data segments, matching DRI’s support for those models in the PC-DOS* versions of their compilers. I only found out about that from a Usenet post quoting a now-vanished DRI post from 1983 discussing undocumented DRI features. It is surprising that DRI kept quiet about that, since it was a major missing ability.
* Yes, PC-DOS. DRI did not have versions of their compilers that would run on MS-DOS on non-IBM compatible systems. Since they were not mentioning improvements to CP/M-86 versions of compilers, DRI was effectively conceding all development on DEC Rainbow and the like to competitors.
How did DRI bolt their tools to only run on PC DOS?
The 8080 model is, if I understand it correctly, the tiny model (essentially no segmentation). That might not have been too much of a concern in 1981, but PCs quickly got enough memory that the 64K limitation would have mattered. Small model quickly became standard, but beyond that it took a while.
Mostly it was just marketing but remember this was happening in 1983, i.e. before Compaq when it was easy to make code that would not run on some systems. PC Magazine’s July 1983 issue has an interview with a DRI rep explaining the pivot to making PC-DOS the primary focus of language development but still playing down the idea of OSes with both CP/M and MS-DOS compatibility.
DRI Micronotes from 1984 indicate that DRI added MS-DOS versions of CBASIC compiler and COBOL but the other languages retained versions listed as for CP/M-80, CP/M-86 and PC-DOS (not MS-DOS) plus a lonely CBASIC for 68K. Micronotes also points to 6 patches of GENCMD and ASM-86 for CP/M-86 1.0 which indicates a lot of flux with the early CMD format. Alas, the other Micronote issues and the early patches seem not to have survived.
Yeah, it is pretty funny how DR often pretended that MS-DOS meant PC-DOS. I wonder why, given that the OAK was not exactly unknown, for example.
(tying up a loose end in this thread…)
@Rugxulo: Don’t worry, menever has regarded that article very highly,
for the tone did (and does) not appeal to me at all. Mehad
assumed, though, that bwk would have his research in order
before taking up such a strong position in public: it now
looks like that assumption is wrong.