I've been trying to resolve a high-CPU issue on my WAS (Web Application Server). Every time I check, I find the WAS running at around 35% CPU. So I restart the WAS and watch, and it kicks right back up to 35%. Here are a few things I found:
In this specific case, the culprit was a server process named RDPCLIP.EXE; both it and the WAS were eating CPU. RDPCLIP is started when you connect with Remote Desktop and include 'local resources' such as access to the client hard disk. I killed the rdpclip.exe process, and the WAS returned to normal.
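If you'd rather kill it from a command prompt than hunt it down in Task Manager, the standard Windows taskkill command will do it (the /F switch forces the process to end):

    taskkill /F /IM rdpclip.exe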
------------
The second thing I looked at was my RAW log. I found one A5W page that, as shown in the RAW log, had hundreds of blank lines before the last two HTML tags. I reported this as a bug months ago, but they have been unable to fix it (during development, an A5W page will suddenly toss in dozens to hundreds of blank lines just before the last two HTML tags). When this page is accessed on the website, it causes the CPU to jump. In my case it was the FOOTER.A5W page, so it sat at the bottom of essentially all of my pages.
I know that these mega-blank-line pages, when opened in A5 in developer mode, can cause Alpha to crash. So I waited until I had one that did cause Alpha to crash, uploaded that page to my WAS as a normal A5W page, and accessed it via the web. It drove the CPU to 45% and kept it there until I restarted the WAS. The solution is to check all of your A5W pages for excessive blank lines in front of the last two HTML tags, remove the blank lines, and republish. I've had to edit those files in Notepad to keep Alpha from crashing.
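If you have a lot of pages to check, a small script can flag the offenders before you open anything in Alpha. This is just a rough sketch in Python, not anything built into Alpha; the C:\A5Webroot folder and the 50-blank-line threshold are assumptions you'll want to adjust for your own site.

    import glob

    # Flag .a5w pages that have a long run of blank lines just before the closing HTML tags.
    # The folder path and the 50-line threshold are assumptions - change them to suit your site.
    for path in glob.glob(r"C:\A5Webroot\*.a5w"):
        with open(path, errors="ignore") as f:
            lines = f.readlines()
        blanks = 0
        for line in reversed(lines):
            text = line.strip().lower()
            if text in ("</body>", "</html>"):
                continue            # step past the closing tags at the bottom of the file
            if text == "":
                blanks += 1         # count the blank lines sitting above them
                continue
            break                   # stop at the first real line of content
        if blanks > 50:
            print(path, "-", blanks, "blank lines before the closing tags")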
---------
Third is the issue of ROBOTS.TXT. Google and other web spiders may hit your website looking for pages to index. They will first look for a ROBOTS.TXT file. Unless you have done something specific to allow this, the attempt will produce an error in your Error Log (if you have logging turned on at the server) such as: [Mon Feb 11 05:42:22 2008] [Forbidden] your security credentials do not allow access to this resource. In addition, you will see a "403" error on a line with robots.txt in your Access Log.
To rectify this (you don't have to), you need to 1) add TXT as always allowed in Web Security > Page Security > Types Always Allowed, and 2) create a ROBOTS.TXT file and place it in the root directory of your application. There's plenty of help on the web on creating a robots.txt file. The preceding assumes you are using the Security Framework.
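For reference, the simplest possible robots.txt just tells every spider it may crawl everything; add Disallow lines for anything you want kept out of the index:

    User-agent: *
    Disallow: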
By the way, I recommend going to www.google.com/webmasters and setting up an account for each of your websites to analyze how Google indexes your site, or if it does at all.
--------
Fourth is the issue with FAVICON.ICO. When someone accesses your website with Firefox or Safari (not so with IE), the browser will look for a FAVICON.ICO file. Just as with robots.txt above, if the browser does not find the file, or the Security Framework prohibits access to ICO files, it will produce an error record plus a 403 record in the Access Log. I found one method of telling browsers not to look for this file, but it requires Apache, so it is not applicable to most of us. The only alternative I can see is to actually make a FAVICON.ICO file and put it on your website (and include ICO as an Always Allowed file type in the Security Framework).
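Once the file is in place, you can also point browsers at it explicitly with a link tag in the head of your pages (a header include is a handy spot for it). The path here assumes the icon sits in your web root:

    <link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">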
---------
The last thing I will mention is your own bad processes. If you find the CPU kicking up and none of the above apply, you have to watch the performance log and the other Alpha logs as you run through each process, looking for the one that kills performance. It can be anything, but typically I've found it's on pages that are executing my own Xbasic, and I've done something wrong. One thing I have discovered: if I have a process that takes a long time, and one or more users are sitting there waiting, and eventually triple-clicking the Submit button so the action is repeated, it will really drive up the CPU and start to back-log the processing.
The solution I have started to put in place for this (any user-initiated process that takes too long) is to use two third-party utilities to take the processing "offline" as far as the user is concerned.
With this change, the process the user initiates just saves parameters to a text file, such as 1) the name of the A5W processing page, 2) all of the entries in the dialog form, and any other needed information. This text file goes to a particular folder on the server, where it is detected by a FolderWatch program. The FolderWatch program takes the text file and uses CURL.EXE to execute the same processing the user would have initiated, but does it in the background, perhaps even waiting until the evening when activity is low.
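Just to illustrate the kind of command the watcher ends up firing, here is a made-up CURL.EXE call; the server address, page name, and job parameter are placeholders for whatever gets pulled out of the text file:

    curl.exe -s -o NUL "http://localhost/process_orders.a5w?job=20080211_0542.txt"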
1 comment:
Steve, Thanks for this bit of info. The favicon issue has been driving me crazy. I can't believe the solution was so simple. It has definitely taught me the lesson of testing what is accessible from a browser.