Is having a [high-end] video card important on a server

graphics-processing-unit · terminal-server

My application is quite interactive, with lots of colors and drag-and-drop functionality, but no fancy 3D, animation or video, so I only use plain GDI (no GDI+, no DirectX).
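
To illustrate (a simplified sketch, not my actual code — the window and class names are just placeholders), the drawing is all straightforward GDI calls against the window's device context, along these lines:

    #include <windows.h>

    // Simplified sketch of the kind of plain-GDI drawing I mean:
    // solid fills and text only, no GDI+ and no DirectX.
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
    {
        switch (msg)
        {
        case WM_PAINT:
        {
            PAINTSTRUCT ps;
            HDC hdc = BeginPaint(hwnd, &ps);

            // Flat-color fill, the bread and butter of a GDI-only UI.
            HBRUSH brush = CreateSolidBrush(RGB(70, 130, 180));
            FillRect(hdc, &ps.rcPaint, brush);
            DeleteObject(brush);

            // Plain text output on top of the fill.
            SetBkMode(hdc, TRANSPARENT);
            SetTextColor(hdc, RGB(255, 255, 255));
            TextOut(hdc, 10, 10, TEXT("Plain GDI drawing"),
                    lstrlen(TEXT("Plain GDI drawing")));

            EndPaint(hwnd, &ps);
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProc(hwnd, msg, wp, lp);
    }

    int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int nCmdShow)
    {
        // Minimal window setup so the sketch actually runs.
        WNDCLASS wc = {};
        wc.lpfnWndProc   = WndProc;
        wc.hInstance     = hInst;
        wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
        wc.lpszClassName = TEXT("GdiSketch");
        RegisterClass(&wc);

        HWND hwnd = CreateWindow(TEXT("GdiSketch"), TEXT("GDI sketch"),
                                 WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                                 640, 480, NULL, NULL, hInst, NULL);
        ShowWindow(hwnd, nCmdShow);

        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0))
        {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        return 0;
    }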

In the past my applications ran on desktops or laptops, and I advised my customers to invest in a decent video card, with:

  • a minimum resolution of 1280×1024
  • a minimum color depth of 24 bits
  • X Megabytes of memory on the video card

Now my users are switching more and more to terminal servers, hence my question:

What is the importance of a video card on a terminal server?

  • Is a video card needed anyway on the terminal server?
  • If it is, is the resolution of the remote desktop client limited to the resolutions supported by the video card on the server?
  • Can the choice of a video card in the server influence the performance of the applications running on the terminal server (but shown on a desktop PC)?
  • If I start using graphics libraries (like Qt) or things like DirectX, will that influence the choice of video card on the terminal server?
  • Are calculations in that case 'offloaded' to the video card? Even on the terminal server?

Thanks.

Best Answer

Only if you're running graphically intensive stuff on the server itself (e.g. playing games in the server room during maintenance downtime). Display settings such as resolution and color depth for a terminal server session are set by the client, not by the server's video card. The server won't do graphical processing for the clients.
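
For what it's worth, here's a rough sketch (standard Win32 calls, nothing specific to the app in the question) of how a program can tell at run time that it's inside a remote session and what resolution and color depth that session actually got. In an RDP session those numbers come from the client's settings, not from the server's video card:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        // Non-zero when the process runs inside a Remote Desktop
        // (terminal server) session rather than on the console.
        BOOL remote = GetSystemMetrics(SM_REMOTESESSION) != 0;
        printf("Remote session: %s\n", remote ? "yes" : "no");

        // Device context for the session's primary display. Inside an RDP
        // session its capabilities reflect the client-side display settings.
        HDC hdc = GetDC(NULL);
        if (hdc != NULL)
        {
            printf("Resolution : %d x %d\n",
                   GetDeviceCaps(hdc, HORZRES), GetDeviceCaps(hdc, VERTRES));
            printf("Color depth: %d bits per pixel\n",
                   GetDeviceCaps(hdc, BITSPIXEL) * GetDeviceCaps(hdc, PLANES));
            ReleaseDC(NULL, hdc);
        }
        return 0;
    }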

There are some edge cases where people write software that uses the GPUs on high-end graphics cards to do actual computation, but 99% of the time the graphics card just drives the monitor that is attached to the computer. That's it. I haven't seen a server with more than basic onboard graphics... well... ever. Sun used to make some that could do fancy graphical crap, but I think that was more about looking pretty than actual functionality.
