RE: nserver.exe process is taking up over 1.1 GB of memory | Julian Robichaux | 2.Jun.01 02:55 PM | a Web browser | Domino Server -- Performance | 5.0.5 | Windows NT
Well, it certainly looks like you've got a nice beefy server there, so there's probably no chance the problem is a lack of hardware.
First, a few more things about page files:
The only issue you'll generally have with multiple page files is if one of them is very small. I've heard of people hitting paging problems like yours because they ended up with a tiny page file (like 2 Meg) on one drive and normal-sized ones on the other drives. I think this is because the operating system then has to keep going back and forth between the drives/files to page memory. Whatever the reason, as long as they're all pretty big (yours sound fine), you're okay.
I personally like to have a single page file, but that's just me. I've heard arguments both ways (for multiple and for single page files). As long as the total size is at least 2 Gig on a server like yours, you should be all right.
The point of setting the minimum and maximum page file sizes to the same value is to keep your page file contiguous (i.e. -- one big, uninterrupted stretch of disk, as opposed to little pieces of the file scattered all over the place). The initial page file gets created as a contiguous file on the drive (if possible), based on the minimum size you specify. If the OS later decides the page file needs to grow, it will try to grab more space on the drive; however, it often can't get contiguous space right after the original page file, so the page file can end up fragmented. A fragmented page file can cause excessive paging, like what you described. If you set the minimum and maximum sizes the same, the file never has to grow, and you don't have to worry about any of that.
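For what it's worth, NT keeps these settings in the registry, which is where "minimum" and "maximum" come from. A sketch of what an equal min/max setup might look like for a two-drive configuration like yours (the drive letters and the 2048 MB sizes are just example values, not a recommendation):

```
; HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management
; The "PagingFiles" value is a REG_MULTI_SZ of "path minimum-MB maximum-MB" entries.
; Making the two numbers equal keeps the file from ever growing (and fragmenting).
C:\pagefile.sys 2048 2048
D:\pagefile.sys 2048 2048
```

Normally you'd change this through Control Panel -> System rather than editing the registry directly; the registry view just makes it clear what the two numbers mean.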
With a tool like Diskeeper (there are plenty of other defragmentation tools out there that perform just as well -- that's simply the tool I'm familiar with), you can analyze the current structure of the files on your disk and find out what the level of fragmentation is. Specifically, you can see if your pagefile is fragmented. If it is, you should set the pagefile size to zero for that drive (since you have 2, you can do it one drive at a time and not get any errors), reboot, run your defrag utility, and then reset the pagefile size and reboot, which will recreate the file in a new block of contiguous space.
Now then, as for your question about which process is causing the excessive paging (and probably all the page faults): you can determine that with Performance Monitor by creating a log file and analyzing it later. Add the "Process" object to your PerfMon log and start logging. After you've collected some good data in your log (at least an hour), open a chart (using Options -> Data From the log file that you created), and then add to the chart (Edit -> Add To Chart) the Page Faults/sec counter for all of the Instances listed. You'll end up with a chart with a huge number of lines, but I would guess that one or two of them will jump out and be much higher than the others. Those lines correspond to the process or processes causing all the page faults.
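The "jump out at you" part of that chart reading can be put in plainer terms: average each process's fault rate over the logging window and flag the ones far above the pack. Here's a rough sketch of that logic (the process names and counter values are invented for illustration -- you'd be eyeballing real numbers off the PerfMon chart):

```python
# Hypothetical Page Faults/sec samples per process instance, the kind of
# per-instance data a PerfMon log of the "Process" object would contain.
from statistics import mean, median

samples = {
    "nserver":  [850, 920, 1100, 980],
    "nhttp":    [40, 55, 60, 48],
    "services": [5, 8, 6, 7],
    "explorer": [2, 1, 3, 2],
}

# Average each process's fault rate over the logging window.
averages = {proc: mean(vals) for proc, vals in samples.items()}

# Use the median as a baseline for "normal", then flag anything
# well above it -- those are the lines that jump out on the chart.
baseline = median(averages.values())
suspects = [proc for proc, avg in averages.items() if avg > 5 * baseline]
print(suspects)
```

With these made-up numbers the one suspect is nserver, which is exactly the kind of answer you're after: one or two processes, not ten.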
You should also be able to use this technique to determine which process is doing all the paging (errors or no errors), with the Page File Bytes counter. Chances are it's the same process, but you never know.
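The Page File Bytes comparison is even simpler, since there you're just looking for the biggest consumer rather than a rate outlier. A quick sketch (again with invented names and byte counts):

```python
# Hypothetical average Page File Bytes per process from the same log.
page_file_bytes = {
    "nserver":  900_000_000,
    "nhttp":    60_000_000,
    "services": 4_000_000,
}

# Whichever process is backed by the most page file space is your
# top pager -- compare it against the page-fault suspect from before.
top_pager = max(page_file_bytes, key=page_file_bytes.get)
print(top_pager)
```

If the top pager and the page-fault offender turn out to be the same process, you've found your culprit; if not, you've got two things to look at.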
I've got a few more things that you could look at, although I'm going to do that in a separate post -- this one's getting a bit long...