Memory Use By Windows

From: "jeno" <jeno@example.com>
Newsgroups: corel.PaintShopPro9

Subject: Re: Screen Saver
Date: Wed, 15 Jun 2005

The real truth about Win 98SE Low Resources



The major components of the Windows API (Application Programming Interface) are Kernel, User, and GDI. The five system resource memory "heaps" are located in USER (the input manager) and GDI (the Graphics Device Interface manager).

USER has one 16-bit heap and two 32-bit heaps. GDI has one 16-bit heap and one 32-bit heap. The USER 32-bit heaps are used to store WND (window) structures--one WND for every window in the system--and menus. The GDI heaps store fonts, brushes, palettes, bitmaps, pens, and other graphic items. The 32-bit heaps provide a capacity greater than the system will ever require. They don't cause resource usage problems, so for the purposes of this discussion, they will be ignored.
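
To see what those "graphic items" in the GDI heaps mean in practice, here is a minimal C sketch using standard Win32 calls (an illustration only; the drawing itself is omitted). Every pen or brush a program creates occupies a slot until it is deleted, which is why leaked GDI objects drain System Resources:

#include <windows.h>

int main(void)
{
    /* Each of these calls takes up a slot in the GDI heaps. */
    HBRUSH brush = CreateSolidBrush(RGB(255, 0, 0));
    HPEN   pen   = CreatePen(PS_SOLID, 1, RGB(0, 0, 255));

    /* ... a real program would select these into a device context and draw ... */

    /* Forgetting these calls is the classic GDI resource leak. */
    DeleteObject((HGDIOBJ)pen);
    DeleteObject((HGDIOBJ)brush);
    return 0;
}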

Windows allocates the remaining two 64K blocks of memory--the 16-bit heaps--to programs for tracking purposes. These blocks are referred to as User Resources and GDI Resources. The System Resources figure reported by Windows is the lesser of these two values. You can view all "three" of these values with the Windows Resource Meter, which can be run by going to Start | Programs | Accessories | System Tools | Resource Meter and, after it has loaded, double-clicking its icon in the System Tray.

The following is excerpted from a post on Delltalk, a user forum, by Kickaha Ota, who wrote it in response to a question about low-memory warnings from a user with a large amount of installed RAM.

"In order to understand why resources are limited, we first have to understand a bit about what resources are, and how they work. Resources are Windows objects that a program can manipulate. For example, every window on the screen is a resource. Every picture that's displayed on the screen is probably a resource. If an application opens a file on disk, that open file is a resource. And so on, and so on.

"If an application needs to use a resource, it asks the operating system to create or load it. For example, a Program can say, "Hey, Windows, I need to create a window that's 300 pixels wide by 200 pixels high, okay?" Windows then goes ahead and creates or loads that resource, and gives the application back a magic number that represents it. "Okay, I've created your window, and it's #38710." Then the application can use that magic number to ask windows to do other things related to that resource. "Okay, Windows; could you please display #38710 in the Upper-left corner of the screen?" "Gotcha." Finally, when an application is through with a resource, it tells Windows to dispose of it. "Okay, please delete #38710." "Gotcha."

"So, what format do these magic numbers take? Well, on most operating systems, it would be what's called a "pointer". You can think of memory as being like a post office, a huge collection of little boxes stretching off into the distance; every box can hold one piece of information. And just like every post office box has a number, every memory location has an address--a number that's used to access it. A pointer to something in memory is simply the address of the area in memory where it's stored. So, if I were a regular OS, and an application asked me to load a window, and I loaded that window into memory starting at memory address #12345678, I would tell the application "OK, I've loaded that window; it's #12345678."
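
In plain C, the post-office-box idea looks like this (a tiny illustration, not from the quoted post):

#include <stdio.h>

int main(void)
{
    int value = 42;         /* one "box" holding a piece of information */
    int *ptr  = &value;     /* the pointer is just that box's address   */

    printf("stored at address %p, contents %d\n", (void *)ptr, *ptr);
    return 0;
}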

"On an Intel machine, these pointers are four bytes long. So if an application needs to hold a pointer to something, it needs to use up four bytes of memory in order to do it. That presented a problem to the original designers of Windows. Remember, memory was very limited back then; an 8MB machine was huge, and 4MB was more typical. And an application can use thousands and thousands of resources. So if resources were referred to by pointers, so that an application needed to use up four bytes of memory every time it wanted to refer to a resource, it could wind up using huge chunks of memory just for these resource pointers.
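
A quick back-of-the-envelope in C, just to illustrate the cost being described (the 10,000-resource figure is an assumption, not from the post):

#include <stdio.h>

int main(void)
{
    /* On the 32-bit Intel machines of the day, a pointer is 4 bytes. */
    unsigned long pointer_size = 4;
    unsigned long resources    = 10000;

    /* 10,000 resources x 4 bytes = 40,000 bytes spent on references
       alone--a real cost on a 4MB machine. */
    printf("%lu bytes of pointers\n", resources * pointer_size);
    return 0;
}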

"So, instead, the Windows designers used a different scheme. They created the resource table. The resource table is essentially a big list of information about all the resources that are in memory at any given time. So if an application tells Windows to load a resource, Windows finds an empty spot in this resource table, and fills it in with the information about the resource that was just loaded. Now, instead of giving the application a four-byte pointer to the resource, Windows can just tell the application where the resource is in the table. If I tell Windows to load a window, and that window winds up taking the 383rd slot in the resource table, Windows will tell me "Okay, I've loaded the resource, and it's #383." Since these 'index numbers' are much smaller than memory addresses, a resource's number can be stored in only two bytes instead of four; when you only have a few megabytes of memory to work with, and lots of resources being used, that's a huge improvement.
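
Here is a toy version of that resource-table idea in C. It illustrates the technique only; it is not the real Windows data structure:

#include <stdint.h>

#define TABLE_SIZE 65536          /* every value a 2-byte index can hold */

typedef struct {
    int   in_use;
    void *object;                 /* the actual resource data */
} TableEntry;

static TableEntry resource_table[TABLE_SIZE];

/* Returns a 2-byte identifier ("#383"), or 0 if the table is full
   (slot 0 is reserved as the failure value). */
uint16_t load_resource(void *object)
{
    uint32_t i;
    for (i = 1; i < TABLE_SIZE; i++) {
        if (!resource_table[i].in_use) {
            resource_table[i].in_use = 1;
            resource_table[i].object = object;
            return (uint16_t)i;
        }
    }
    return 0;                     /* out of resources, however much RAM is free */
}

void *get_resource(uint16_t id)  { return resource_table[id].object; }
void  free_resource(uint16_t id) { resource_table[id].in_use = 0; }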

"There's a problem with this scheme. There are only so many different possible values that you can store in a given number of bytes of computer memory, just as there are only so many different numbers you can write down if you aren't allowed to use more than a certain number of digits. If you have four bytes of memory to work with, you can store billions of different possible values in those four bytes. But if you only have two bytes, there are only 65,536 different numbers that you can store in them. So if you use two-byte numbers as your resource identifiers, you can't have more than 65,536 resources loaded into memory at one time; if you loaded more than that, there'd be no way for programs to tell them apart. But on the computers of the day, there'd be no way to fit more than a few thousand resources into memory at one time anyway. So this limitation wasn't seen as a problem, and the Windows designers went ahead and used the resource table and two-byte resource identifiers.
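
And a tiny self-contained C demonstration of why two bytes top out at 65,536 identifiers:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint16_t id = 0;              /* a two-byte resource identifier */
    unsigned long distinct = 1;

    /* Step through every value the identifier can take until it wraps. */
    while (++id != 0)
        distinct++;

    printf("%lu distinct identifiers\n", distinct);   /* prints 65536 */
    return 0;
}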

"Now, we leap ahead to the present day. Memory is incredibly cheap; the memory savings from using two-byte resource numbers instead of four-byte pointers simply aren't significant anymore. There'd be more than enough memory to hold hundreds of thousands of resources at one time. But there are still only 65,536 different possible resource identifiers, so only that many resources can be loaded into memory at once. Beyond that, you're out of resources, no matter how much memory you have left."

***End excerpt***

The number and type of applications running determine what portion of System Resources is being used. Known resource "hogs" include 16-bit (Windows 3.x) applications: if any are running, Windows 9x/ME treats the System Resources allocated to all of them as a single block and won't release that block until every 16-bit app has been closed.

When an application is loaded, it is common for it to require additional Windows components to be loaded as well. When the application is closed, Windows retains those components because they are likely to be needed again, so not all of the resources allocated when the application was opened are released when it is closed, although most are.

"As we know,
There are known knowns.
There are things we know we know.
We also know
There are known unknowns.
That is to say
We know there are some things
We do not know.
But there are also unknown unknowns,
The ones we don't know
We don't know."


And you wanted to program computers?
