Curiosity. The final frontier.
I’m looking at the source code of a program from 1993. I was around 16 back then. What got me into learning PC architecture, DOS, and assembly was pure curiosity. I sought answers with a fervor.
It’s like sipping coffee from a cup versus thinking about how that cup is actually made: if it’s ceramic, how is the material molded? What actually is ceramic? How does it endure heat and wear? What is done to get a smooth, glossy surface finish? How do the painting and ornaments get applied? What happens to the cup once I dispose of it as a broken thing? Is the cup factory a small or a large one, and what kinds of machines does it have? How does the ceramic kiln work? What skills do the people need? What’s the history of pottery?
Same with assembly.
I had used DOS as a tool. Typed commands to list files, run programs, the usual shell stuff.
Then I wanted to know things on a deeper and more precise level. What about that DOS shell? How does it actually work? COMMAND.COM is the MS-DOS command interpreter.
Let’s imagine. A shell has to:
- take an input string from the user
- maintain the current working directory
- be able to call DOS / BIOS functions
- read the file allocation table (FAT) to locate files on disk
- be able to load both .COM and .EXE files and let them run
- alter the attribute bits of files (this time it’s not the FAT, but the access control data in the directory entry)
- read and write files, and rename them (think “COPY” and “REN”)
- delete a file (it calls a DOS service interrupt that marks the file deleted; the contents are not actually overwritten)
- create a new file with zero contents (an empty file)
- support so-called standard input (stdin) and standard output (stdout)
- be able to redirect input into the shell from a file with ‘<’
- be able to redirect output from the shell’s program executions and internal commands to an external file with ‘>’ (very nice for logging output from commands, useful for later inspecting and verifying automated stuff)
- support running special .BAT script files (BAT stands for batch)
That’s a lot of responsibility!
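To make the very first of those duties concrete, here is a minimal sketch of a shell’s input loop, as a tiny .COM program in NASM syntax. The INT 21h function numbers are the real DOS services; the prompt text and buffer size are my own choices, and the actual command parsing is left as a stub.

```nasm
        org 100h                 ; .COM programs load at offset 100h

main:   mov dx, prompt
        mov ah, 09h              ; INT 21h/09h: print '$'-terminated string
        int 21h

        mov dx, buf
        mov ah, 0Ah              ; INT 21h/0Ah: buffered line input
        int 21h

        ; buf+1 now holds the count of characters typed, and the
        ; characters themselves start at buf+2, ending with CR (0Dh).
        ; A real shell would parse and dispatch the command here.

        jmp main                 ; loop forever (Ctrl-Break to escape)

prompt  db 13, 10, 'C:\>', '$'   ; CR, LF, prompt text, '$' terminator
buf     db 127                   ; byte 0: capacity we offer to DOS
        db 0                     ; byte 1: DOS fills in the length read
        times 128 db 0           ; the typed characters land here
```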
DOS itself was not a multitasking operating system. It could only load a process into RAM, execute it, and then come back to the shell.
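That load-execute-return cycle is itself a DOS service: INT 21h function 4Bh, EXEC. Here is a hedged sketch of a parent program using it, again as a .COM in NASM syntax. CHILD.COM is a hypothetical name; the memory-shrinking step is needed because a .COM program owns all free conventional memory at startup, so EXEC would otherwise have no room to load the child into.

```nasm
        org 100h

start:  mov sp, stack_top        ; move our stack into the part we keep
        mov bx, (prog_end + 15) / 16
        mov ah, 4Ah              ; INT 21h/4Ah: shrink our memory block
        int 21h                  ; BX = paragraphs to keep

        mov [params+4], cs       ; patch the segment halves of the far
        mov [params+8], cs       ; pointers at run time (unknown until
        mov [params+12], cs      ; DOS decides where we load)

        mov dx, progname         ; DS:DX -> ASCIIZ program name
        mov bx, params           ; ES:BX -> parameter block
        mov ax, 4B00h            ; INT 21h/4Bh, AL=0: load and execute
        int 21h                  ; (DOS before 3.0 clobbers SS:SP here;
                                 ;  a robust loader saves and restores them)

        mov ax, 4C00h            ; INT 21h/4Ch: exit back to the shell
        int 21h

progname db 'CHILD.COM', 0       ; hypothetical program to run
params  dw 0                     ; environment segment: 0 = inherit ours
        dw tail, 0               ; far pointer to the command tail
        dw 5Ch, 0                ; far pointer to default FCB #1 (in PSP)
        dw 6Ch, 0                ; far pointer to default FCB #2 (in PSP)
tail    db 0, 0Dh                ; empty tail: length 0, then CR

        times 128 db 0           ; a small private stack
stack_top:
prog_end:
```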
The DOS era came with a lot of really interesting 3rd-party utility programs. Norton and Quarterdeck were some of the biggest household names. These utilities did a variety of things, from optimizing RAM usage to keeping the system safe and things generally running smoothly. Quarterdeck’s Manifest is one utility that definitely has a place in my heart (see image)

There was one really strange and neat thing about the shell itself. DOS kept a lingering remnant of the command interpreter code in RAM, and with luck, that could be reused without reloading the command interpreter from floppy or hard drive. This meant the shell came back faster after the main program terminated.
However, if a program wanted to allocate “too much” RAM, it squeezed that memory-resident (RAM-residing) portion of the shell out, and the shell had to reload its main code from disk once it was needed again.
There were all kinds of optimizations and tricks that got done when RAM was scarce. We’re talking about a few hundred kilobytes (kB) of RAM. 8192 kB was a figure often reported for high-end PCs of the 1990s. That’s 8 megabytes.
We nowadays have 16384 megabytes, 32768 megabytes (32 gigs, GB), or even more of RAM. That’s a whopping difference.
But remember: plot the figures on a logarithmic scale, using Moore’s law, and there’s a rather surprising finding: things aren’t that extraordinary. Moore’s law says that the number of transistors, or basically any physical electronics primitive, doubles every 18 months. In other words, in around 30 years, which is 30*12/18 = 20 Moore cycles, we should have 2^20 times as many resources. Let’s check:
2 to the power 20 is 1048576, or roughly a million. Well, what happened to memory? From 8 megabytes to 32 gigabytes is a factor of just 32768/8 = 4096. So RAM amounts roughly went up by a factor of 4000 (four thousand) in 30 years. That’s less than what Moore would have anticipated.
I wrote a couple of utilities myself.
What I wanted to see was an X-ray of DOS’s living RAM state. Since DOS did not support multitasking at all, you had to fake it yourself. The way to enable this virtual multitasking, a TSR (terminate-and-stay-resident) program, goes like this:
- load a program so that it reserves permanent RAM, memory which doesn’t vanish when execution terminates
- use a special TSR service (interrupt) instead of the normal process exit. This asks DOS kindly to keep the software in memory; anything you’ve set up is kept as-is
- your software takes responsibility for the situation
- hook the keyboard handler routine to inject an inspection (check) of the keys pressed. If your desired magic key combo is pressed, capture it (remove it from the keyboard queue) and do the things you want to
- in the case of my RAM utility, the software showed an ASCII menu on top of DOS and let the user navigate it with the cursor arrows
- restore DOS execution after each session with your resident utility has ended. Note: the software still remains in memory, and the user can call it up as many times as they wish
- remember: don’t leak memory. If your resident routine does dynamic memory allocation, things might get unstable under DOS. It was better to allocate memory only once (before staying resident) and later, when your interrupt was invoked and you were live again, use the already allocated static RAM areas, if they were indeed needed
- memory was allocated in 16-byte chunks (a ‘paragraph’ in DOS parlance)
- so if your program took 1000 bytes, you would allocate up to the next even 16-byte boundary: use the “ceiling” (round up to the nearest integer) math function to keep things on the safe side. Example: 1000 bytes => allocate ceil(1000/16) = ceil(62.5) = 63 paragraphs, which actually reserves 63*16 = 1008 bytes. Very little waste: only 8 bytes lost as inefficiency, which is negligible! (A skeleton of the whole recipe follows below.)
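Putting the recipe together, here is a skeleton of such a pop-up TSR in NASM syntax. The interrupt numbers and DOS functions are real; the hotkey (F10, scancode 44h) and the elided pop-up body are my stand-ins, and a production-quality TSR would also guard against re-entering DOS while DOS is busy, which is a story of its own.

```nasm
        org 100h

start:  jmp install              ; everything below 'install' is
                                 ; discarded after we go resident
old_int9 dd 0                    ; saved far pointer to old handler

new_int9:
        push ax
        in al, 60h               ; read the scancode from the keyboard port
        cmp al, 44h              ; 44h = F10 make code, our "hotkey"
        jne pass                 ; not ours: chain to the old handler

        ; pop-up work would go here, drawing the ASCII menu using
        ; buffers that were allocated BEFORE going resident

        in al, 61h               ; consume the key ourselves: pulse
        or al, 80h               ; bit 7 of port 61h to acknowledge
        out 61h, al              ; the keyboard (classic XT method)
        and al, 7Fh
        out 61h, al
        mov al, 20h              ; send end-of-interrupt to the PIC
        out 20h, al
        pop ax
        iret                     ; the old handler never sees the key

pass:   pop ax
        jmp far [cs:old_int9]    ; chain to the original INT 9

install:
        mov ax, 3509h            ; INT 21h/35h: get current INT 9 vector
        int 21h
        mov [old_int9], bx       ; returned in ES:BX
        mov [old_int9+2], es

        mov dx, new_int9
        mov ax, 2509h            ; INT 21h/25h: point INT 9 at DS:DX
        int 21h

        mov dx, (install + 15) / 16
        mov ax, 3100h            ; INT 21h/31h: terminate and stay
        int 21h                  ; resident, DX = paragraphs to keep
                                 ; (PSP + resident code, rounded up with
                                 ;  exactly the ceil(bytes/16) rule above)
```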