In today’s article, I want to take the opportunity to talk about running 16-bit DOS applications in a Windows Server 2003 Terminal Services environment. It may seem strange to talk about applications written for an operating system that has been obsolete for the last ten years, but believe it or not, a mind-blowing number of companies still depend on legacy DOS applications for mission-critical tasks. Since that is the case, and because DOS applications don’t play nicely in a Terminal Server environment, I want to walk through some of the issues you may encounter.
Before I Begin
Before I get started, I want to point out that not all DOS applications are created equal. Depending on what an application does and how it is coded, a DOS application may run perfectly in a thin client session, or it may not work at all. My advice is that if you can avoid running DOS applications in a thin client environment, you should. Of course, if a mission-critical application was designed for DOS and your corporate security policy mandates running it in a thin client environment, you may not have a choice.
Why Are DOS Applications so Finicky?
There are many reasons why DOS applications don’t perform well in a Windows Terminal Services environment. To really understand those reasons, though, you need to understand what sets Windows apart from DOS.
When Microsoft released Windows, the public perception was that Windows simply provided a pretty graphical user interface and perhaps the ability to run multiple applications simultaneously. There is more to it than that, though. What made Windows so revolutionary is that it allowed applications to be written in a way that let them function regardless of the system’s hardware. Think back to the early 1990s: most DOS applications shipped with a dozen or so video drivers and a huge number of sound card drivers. Whoever developed the application also had to supply drivers that allowed it to work with various hardware devices. If you bought an application and it didn’t include a driver for your hardware, you were out of luck.
What Windows did was to allow drivers to exist at the operating system level rather than at the application level. Think about it for a minute. When you buy a new video card, you simply install a device driver and all of your Windows applications use that driver. You don’t have to worry about individual applications not being compatible with the video card.
The difference in the way that DOS and Windows applications use device drivers accounts for one of the biggest problems with running a legacy DOS application in a thin client environment. Think about the way Terminal Services works: the application actually runs within a virtual session on the server, and the screen image is then transmitted to the client. So if the application has a built-in set of drivers, which drivers do you choose? Drivers that match the server? Drivers that match some of the clients? And what are the odds that an ancient application even includes drivers that work with modern hardware?
Generally speaking, if the application runs in text mode, you will be OK (from a driver standpoint) in a thin client environment, because text mode applications do not require video drivers. If users need to print from the application, you may still need printer drivers, but many modern laser printers remain backward compatible with drivers from many years ago.
If a DOS application does use a graphical interface, you may still be able to get it to work. Check whether the application includes a CGA or an EGA graphics driver; many of the CGA and EGA drivers used with DOS applications are universal. The interface may not look as polished as it would with a higher resolution driver, but at this point the goal is simply to make the application work. Cosmetics aren’t really a consideration initially.
Resource Hungry Applications
Another common problem with running old DOS applications in a Windows environment is that when DOS was the dominant operating system, most applications were not designed to multitask. Sure, there was the occasional TSR (Terminate and Stay Resident) program, but those were the exception rather than the rule. The point is that when a programmer developed a DOS application, they could safely assume that the application would not be sharing the system’s resources with anything other than DOS itself and a few DOS components such as HIMEM.SYS and EMM386.EXE. As such, the programmer had free rein to do anything they wanted. I was a developer myself in the days of DOS, and I know from firsthand experience that programmers used all kinds of unconventional tricks to squeeze extra resources out of the system.
At the time, such programming practices were in vogue. Systems weren’t very powerful back then, and writing an application that used every bit of a system’s resources was almost considered an art form. For example, a common practice at one time was to address video memory as system RAM to compensate for systems with inadequate memory. DOS applications that make creative use of system resources often will not even run in a Windows environment, because Windows manages memory very differently than DOS did.
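To make this concrete, here is a minimal sketch of the kind of trick described above. On a real DOS machine, a program could obtain a pointer to the text-mode frame buffer at segment B800h and write character/attribute byte pairs into it directly; here a Python bytearray stands in for that buffer (modern operating systems forbid the actual poke), and the function and variable names are illustrative, not from any real application.

```python
# Sketch of the classic DOS trick of writing straight into text-mode
# video memory. On real hardware the buffer lived at segment B800h
# (80 columns x 25 rows x 2 bytes per cell: character, then attribute).
# A bytearray stands in for it here, since no modern OS allows the poke.

COLS, ROWS = 80, 25
video_memory = bytearray(COLS * ROWS * 2)  # simulated B800h buffer

def poke_string(buf, row, col, text, attr=0x07):
    """Write text at (row, col) as char/attribute byte pairs, the way a
    DOS app bypassed DOS and the BIOS for speed."""
    offset = (row * COLS + col) * 2
    for ch in text:
        buf[offset] = ord(ch)    # character byte
        buf[offset + 1] = attr   # attribute byte (0x07 = grey on black)
        offset += 2

poke_string(video_memory, 0, 0, "HELLO")
```

An application that also used spare regions of this buffer as scratch storage was, in effect, addressing video memory as system RAM. Under Windows there is no such buffer to poke, which is one reason these programs fail outright.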
Even if an application is not memory hungry, some fairly innocent programming techniques can be problematic in a Terminal Services environment. For example, we have all seen programs that stop and wait for a key to be pressed. In the days of DOS, input commands usually required the user to press Enter in order to make the program resume. Often, programmers wanted the user to be able to press any key to continue, rather than having to press Enter. To accomplish this, programmers called a function named INKEY$ inside a tight loop. INKEY$ checks the keyboard buffer and returns immediately, so the loop polled the keyboard on every pass; once a key press occurred, the loop was broken and the program moved on to the next line of code.
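The pattern looks roughly like the following Python sketch. The simulated keyboard and the wasted-pass counter are illustrative stand-ins; the point is that the loop consumes CPU on every pass in which no key is waiting.

```python
# Sketch of an INKEY$-style busy-wait. poll() stands in for INKEY$:
# it returns a key if one is waiting, or None (where INKEY$ would
# return an empty string) otherwise.

def wait_for_any_key(poll):
    """Spin until poll() reports a key press, counting wasted passes."""
    wasted = 0
    while True:
        key = poll()         # check the keyboard buffer; never blocks
        if key is not None:
            return key, wasted
        wasted += 1          # no key yet: burn a CPU pass and try again

# Simulated keyboard: nothing for 10,000 polls, then the user hits "q".
presses = iter([None] * 10_000 + ["q"])
key, wasted = wait_for_any_key(lambda: next(presses))
```

On a single-user DOS machine those ten thousand wasted passes cost nothing, because the CPU had nothing better to do; the trouble starts when the machine is shared.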
Using INKEY$ this way was fine at the time, but it can wreak havoc in a Terminal Services environment. Think about it for a moment: a Terminal Server must divide its CPU resources among its own operating system and each guest session. Normally, CPU access occurs in round robin fashion, ensuring that each session receives adequate CPU time. An INKEY$ polling loop, though, essentially tells the CPU to waste processor cycles until a key is pressed. Such a loop can dominate the server’s processor, depriving other sessions of CPU time.
The good news is that there is a way around this problem. Although I have never personally used it, a Web site called www.tamedos.com offers a utility called Tame that is designed to keep keyboard polling routines such as INKEY$ loops from dominating a system’s processor. Windows NT 4.0 Terminal Server Edition included a utility called DOSKBD that did essentially the same thing, but Microsoft did not carry the utility forward into newer versions of Windows, and the Windows NT version does not work with newer Windows operating systems.
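I have no inside knowledge of how Tame or DOSKBD are implemented, but the general remedy they apply is easy to sketch: after a run of consecutive empty polls, yield the processor for a few milliseconds before polling again, so the loop stops monopolizing its time slice while remaining responsive to a key press. The function and thresholds below are hypothetical, purely to illustrate the idea.

```python
import time

def tamed_wait_for_any_key(poll, idle_threshold=50, nap=0.001):
    """Same busy-wait as an INKEY$ loop, but after idle_threshold
    consecutive empty polls, sleep briefly so other sessions get the
    CPU. (Roughly what keyboard-polling tamers do on the app's behalf.)"""
    empty = 0
    while True:
        key = poll()
        if key is not None:
            return key
        empty += 1
        if empty >= idle_threshold:
            time.sleep(nap)   # yield the processor instead of spinning

# Simulated keyboard: 200 empty polls, then the user hits "q".
presses = iter([None] * 200 + ["q"])
key = tamed_wait_for_any_key(lambda: next(presses))
```

The user still perceives an instant response, because a few milliseconds of latency is imperceptible at the keyboard, but the scheduler is no longer forced to hand the loop a full time slice of pure spinning.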
Conclusion

In this article, I have explained how differences in the way DOS and Windows use device drivers, along with creative programming techniques, can keep DOS applications from working well in thin client environments. Although I have discussed some potential solutions to these issues, be aware that some DOS applications simply cannot be made to work in a Terminal Services environment. Furthermore, this article has only begun to scratch the surface of DOS / Windows compatibility issues; many other architectural differences can cause conflicts between DOS applications and a Windows operating system.