Windows – Why would a program require a server operating system versus a workstation operating system

windows, windows-7, windows-server-2008

A vendor is stating that their software requires Windows Server 2003/2008 versus Windows 7 Professional because it doesn't perform well on a non-server OS. The specific reasoning was that the number of network connections is limited on Windows 7 Pro. I have 10 client workstations and the maximum number of peer-to-peer connections on Windows 7 Pro is 20 (according to some random post I found on the web).

The application doesn't deal with domains, and it doesn't use IIS, Exchange, or other server software. The executable resides on the "server" machine, and clients use a peer-to-peer connection (mapped drive) to connect to the server and run the executable (think \\servername\folder\program.exe).
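For illustration, the client-side setup described above usually amounts to nothing more than a mapped drive and a shortcut. The share name, drive letter, and executable name below are placeholders, not the vendor's actual paths:

    :: Map a drive letter to the share on the host machine (names are hypothetical)
    net use P: \\servername\folder /persistent:yes

    :: Launch the executable from the mapped drive
    P:\program.exe

    :: Or run it directly via the UNC path without mapping a drive
    \\servername\folder\program.exe

Either way, each client that opens the executable holds an SMB session against the host for as long as the program is running.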

The hardware requirements state that a Core2Duo or better processor is recommended, as well as 8 GB of RAM. I have an i7 processor with 12 GB of RAM, but am running Windows 7 Pro (which they don't support).

What would be some reasons that could cause poor performance when the program is hosted on Windows 7 Pro versus Windows Server 2003/2008?

Thanks

EDIT 1: First of all, thanks for the feedback. I know it goes against the vendor's requirements, and I don't plan to implement the program on a workstation OS, but what I really wanted to know were the technical details. What could cause poor performance on a workstation OS versus a server OS when both are running on the same hardware?

Best Answer

Windows 7 Professional enforces a limit of 20 concurrent inbound connections, and that limit applies to SMB (file-sharing) connections as well. You're really not thinking about scalability if you cap yourself at a maximum of 20 clients.
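If you want a rough idea of how close the host is to that cap, you can check from an elevated command prompt on the machine hosting the share; this is a quick sketch using standard built-in commands, and the exact output format varies by OS version:

    :: Lists the inbound SMB sessions currently counting against the connection limit
    net session

    :: Shows the server service configuration, including the maximum logged-on users
    net config server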

The bigger issue is that you wouldn't be following your vendor's requirements. They don't support what you want to do, so don't do it. They likely don't QA it against non-server OSes, so you shouldn't run it on one.

Not to mention that using a workstation to perform a server's function makes systems admins cry.