Windows – Does Windows performance degrade past a certain level of CPU utilization?

central-processing-unit, cpu-usage, performance, windows, windows-server-2003

Is there a recommended average CPU threshold for Windows boxes, based on experience in other shops?

Background:
We are running the Windows Server 2003 32-bit OS. The servers handle a major enterprise-level web application suite with a high frequency of small transactions mixed in with much larger ones – overall average transaction time is 13 ms.

Our average overall CPU utilization across these Windows servers is ~60% during prime shift. At what utilization level does the Windows OS begin to struggle with CPU scheduling?

Thanks.

Best Answer

OK, there are a few things you need to watch out for here.

Firstly, redundancy/failover. If you have 5 machines running at 90% capacity and one of them fails, the other 4 machines have to pick up the slack ... whoops ... that takes them over 100% capacity, and you will likely have a cascade failure on your hands.
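The failover arithmetic above is easy to check for any cluster size. This is a minimal sketch (assuming load redistributes evenly across the survivors, which is a simplification):

```python
def load_after_failure(n_servers, utilization):
    """Per-server load after one of n identical servers fails,
    assuming its load spreads evenly over the survivors."""
    return utilization * n_servers / (n_servers - 1)

# 5 servers at 90%: each survivor would need 112.5% of a machine -> overload
print(load_after_failure(5, 0.90))
# 5 servers at 60%: survivors land at 75% -> headroom remains
print(load_after_failure(5, 0.60))
```

The general rule of thumb that falls out: to survive one failure, average utilization should stay below (n-1)/n of capacity.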

Secondly, if you're running multiple processes, remember that the OS burns compute cycles to switch between processes too. If the system load gets too high, the system can start spending more and more time loading and suspending tasks, and less and less time actually executing them.
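To make the switching overhead concrete, here's a back-of-the-envelope sketch. The per-switch cost and switch rate below are illustrative assumptions, not measurements (on Windows you can watch the real rate via the System\Context Switches/sec performance counter):

```python
def effective_cpu_fraction(switches_per_sec, switch_cost_us):
    """Fraction of one core's time left for real work after paying
    a hypothetical fixed cost per context switch."""
    overhead = switches_per_sec * switch_cost_us / 1_000_000  # seconds per second
    return max(0.0, 1.0 - overhead)

# Illustrative numbers only: 50,000 switches/s at ~5 µs each
# burns a quarter of the core on switching alone.
print(effective_cpu_fraction(50_000, 5))
```

The point isn't the exact numbers; it's that overhead scales with switch rate, so a heavily oversubscribed box loses useful capacity just shuffling tasks.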

Thirdly, if you're running MS SQL Server, for goodness sake configure it correctly or get someone to do it for you. MS SQL Server will suck up all available RAM for cache and can bog down the machine if you don't limit its RAM usage. I have had clients complain about RAM usage on a server, double the RAM, and see no performance gain because SQL Server sucked it all up again!
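The actual cap is set in SQL Server via the `max server memory (MB)` option (through `sp_configure` or the server properties dialog). How much to reserve for the OS is a judgment call; the sketch below is one common rule of thumb, not official Microsoft guidance, and the 2 GB reservation is my assumption:

```python
def suggested_sql_max_memory_mb(total_ram_mb, os_reserve_mb=2048):
    """Rough rule of thumb (an assumption, not vendor guidance):
    hold back a fixed slice of RAM for the OS and other processes,
    let SQL Server have the rest, with a sane floor."""
    return max(1024, total_ram_mb - os_reserve_mb)

# An 8 GB box would get a 6 GB cap under this rule.
print(suggested_sql_max_memory_mb(8192))
```

Whatever number you pick, the key is that *some* cap exists, so the OS and the rest of your application suite aren't starved.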

Hope those help :-)
