Virtual Memory

The downside of virtual memory is that it's hundreds of times slower than real RAM. VM works best when you run several small applications at the same time (versus one big application that eats RAM to open big documents). With the small apps, the frontmost program dwells in real RAM, while background (inactive) applications get loaded into virtual memory. Avoid running background tasks in those background applications, because that slows the whole system (with frequent swapping between RAM and the virtual memory section of your hard drive).

When you run with virtual memory turned on, there's a bigger burden on the memory management unit (also called the MMU--it's built into the more modern CPUs). If you open a program and there's not enough room left for it in RAM, the MMU gets a "no vacancy" message. Then the MMU searches RAM for the least recently used information and copies it onto the hard drive (removing it from RAM at the same time). This means what used to be in RAM now gets stored at the speed the hard drive turns--much slower than electron-fast RAM. Now there's space in RAM, so the new program gets loaded. Later, when you switch back to the earlier program, the MMU swaps it back in from the virtual memory space on your hard drive (and into RAM, while it moves other information out of RAM and back onto the hard drive).
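The swap-out/swap-in cycle above can be sketched in a few lines of Python (a toy model with made-up names, not Apple's actual code--real VM works on fixed-size pages and runs down in hardware and OS territory):

```python
from collections import OrderedDict

class VirtualMemory:
    """Toy model of MMU swapping: real RAM holds only a few pages;
    the least recently used page gets written out to 'disk' (swap space)."""

    def __init__(self, ram_pages):
        self.ram = OrderedDict()   # page name -> data, oldest-used first
        self.disk = {}             # swapped-out pages live here (slow!)
        self.capacity = ram_pages

    def touch(self, page, data=None):
        if page in self.ram:                 # already resident: mark as recently used
            self.ram.move_to_end(page)
        else:
            if page in self.disk:            # "page fault": bring it back from disk
                data = self.disk.pop(page)
            if len(self.ram) >= self.capacity:
                victim, vdata = self.ram.popitem(last=False)  # least recently used
                self.disk[victim] = vdata    # written out at hard-drive speed
            self.ram[page] = data
        return self.ram[page]

vm = VirtualMemory(ram_pages=2)
vm.touch("word-processor", "WP data")
vm.touch("spreadsheet", "SS data")
vm.touch("paint-app", "Paint data")   # RAM full: word-processor goes to disk
print("word-processor" in vm.disk)    # True
```

Touching "word-processor" again would pull it back from the disk dictionary and push "spreadsheet" out in its place--exactly the back-and-forth shuffle described above.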

VM uses hard drive space. If you have 16 Megs of RAM and add 16 Megs of virtual memory, VM creates a 32-Meg file on your hard drive--big enough to back the entire memory space, real RAM included, which makes it easier to keep track of where everything's stored in the virtual map of the RAM.
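The arithmetic is trivial but worth spelling out (a one-liner of my own; the function name is made up):

```python
def vm_file_megs(ram_megs, vm_megs):
    """Classic Mac OS VM backs the *whole* memory space with one file,
    real RAM included -- so the file is RAM + VM, not just the extra."""
    return ram_megs + vm_megs

print(vm_file_megs(16, 16))   # 32
```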

What this boils down to is a bunch of extra steps: read a program off the hard drive, store it in RAM, later put it back on the hard drive in the VM space, and swap it back and forth between RAM and hard drive depending on when you use it. Every time you hear your hard drive chugging away (especially when you switch from one application to another), that's virtual memory slowing the world. It makes more space at a dire cost in speed.

CRAZY FACT: with VM turned on, a 32-bit machine could have as much as 4 Gigabytes of virtual RAM to play with (granted you have a 4 Gig hard drive to put it on). Don't try it though! The hard drive'll be so busy spinning, you'll wonder if you were better off with the old 512K Macintosh!


Disk Caches

There's another setting in the Memory control panel, one that speeds up memory access instead of trading speed for space. It's the Disk Cache.

The word cache comes from the French cacher, "to hide". The cache sits between the CPU and a source of memory storage (hard drive, CD-ROM, or RAM). It serves as a hidden receptacle for faster data access. On a hard drive, the built-in cache speeds access to the drive's data and ultimately means less time waiting for the drive to spin.

The built-in Disk Cache (in Macs with System 7 or higher) is always on. Apple used to recommend setting it to 32K for every meg of RAM (so for 16 Megs, it would be at 512K). Nowadays (with System 7.5 and above) you can set it anywhere higher than that, but you'd be eating up precious RAM space. Here's how it works:

The amount of Disk Cache you specify in the Memory control panel gets reserved for use as cache on the RAM chips (so with 16 Megs of RAM, only 512K of it is used for the Disk Cache, still leaving 15.5 Megs for normal RAM use).
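That arithmetic is simple enough to check (a quick sketch of the 32K-per-meg rule of thumb; the function name is my own):

```python
def recommended_cache_k(ram_megs):
    """Apple's old rule of thumb: 32K of Disk Cache per meg of RAM."""
    return 32 * ram_megs

ram_megs = 16
cache_k = recommended_cache_k(ram_megs)   # 512K reserved as cache
left_k = ram_megs * 1024 - cache_k        # 15872K, i.e. 15.5 Megs remain
print(cache_k, left_k)                    # 512 15872
```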

When a program needs data stored on the hard drive, the CPU sends out the request. The Disk Cache intercepts this request and looks within itself first. If it can't find the data, the DC passes the request to the hard drive and reads the requested data, along with other data nearby (it figures this extra bit might come in handy real soon, and since it's accessible without making the drive spin to a different location, why not grab it while it's there?). Only the requested data gets copied (at the electron-speed of RAM) from the DC to the regular RAM, and then delivered to the CPU.

The next time the program asks for data, the CPU again sends the request to the DC. If the cache already happens to have this data, there's no need to go back to the hard drive at all (which may have spun to a different location by now)! The cached data goes directly into RAM and then to the CPU, which speeds the whole process of fetching info off the hard drive.
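The hit-or-miss logic of the last two paragraphs can be modeled in a short Python sketch (a simplified toy under my own assumptions--a real cache works on fixed-size blocks and evicts old entries when the reserved RAM fills up, which this version skips):

```python
class DiskCache:
    """Toy disk cache: on a miss, fetch the requested block plus a few
    neighbors (read-ahead), since the drive head is already there."""

    def __init__(self, disk, readahead=2):
        self.disk = disk          # simulated drive: block number -> data
        self.cache = {}           # the reserved chunk of RAM
        self.readahead = readahead
        self.hits = self.misses = 0

    def read(self, block):
        if block in self.cache:          # hit: no drive access needed
            self.hits += 1
        else:                            # miss: go to the (slow) drive
            self.misses += 1
            for b in range(block, block + 1 + self.readahead):
                if b in self.disk:       # grab nearby blocks while we're here
                    self.cache[b] = self.disk[b]
        return self.cache[block]

disk = {n: f"block {n}" for n in range(10)}
dc = DiskCache(disk)
dc.read(3)                  # miss: fetches blocks 3, 4, and 5
dc.read(4)                  # hit: block 4 was prefetched for free
print(dc.hits, dc.misses)   # 1 1
```

The read-ahead is the whole trick: sequential reads (the common case for loading a program or document) turn into cache hits instead of separate trips to the drive.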

Another boon to speed: RAM cache can be added with a RAM cache card. (Not every machine supports one, but if yours does, you might look into it if speed ever becomes a serious problem.) RAM cache cards use static RAM chips, which run at least three times faster than the dynamic RAM on SIMMs and DIMMs. So even small amounts of static RAM (it's expensive) like 32 or 64K can boost performance by 30 to 60 percent.

© David Pierre Ostwald, 1997--all rights reserved.