.net – Running a 32-bit .NET application on a 64-bit OS: is it really bad?

Tags: 64-bit, .net, performance

I understand that we can compile a .NET application targeting AnyCPU, which causes it to run as a 32-bit process on a 32-bit OS and as a 64-bit process on a 64-bit OS.
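For reference, here is a minimal sketch (assuming .NET 4.0 or later, where Environment.Is64BitProcess is available) that shows which mode an AnyCPU build actually ended up running in:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // AnyCPU build: prints 8 / True on a 64-bit OS, 4 / False on a 32-bit OS.
        // x86 build: always 4 / False for the process, even when the OS reports 64-bit.
        Console.WriteLine("Pointer size (bytes): " + IntPtr.Size);
        Console.WriteLine("64-bit process:       " + Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS:            " + Environment.Is64BitOperatingSystem);
    }
}
```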

However, there was a reported bug* that made my app fail on a 64-bit OS, and the fix was to target x86.
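For context, targeting x86 is just a build setting. A hedged sketch of the relevant project-file property (standard MSBuild property names; where exactly it goes depends on your .csproj layout):

```xml
<!-- In the .csproj, inside the relevant PropertyGroup -->
<PropertyGroup>
  <!-- Forces the assembly to load as a 32-bit process, even on a 64-bit OS (under WOW64). -->
  <PlatformTarget>x86</PlatformTarget>
</PropertyGroup>
```

The command-line equivalent is the compiler's /platform:x86 switch.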

Now my question: is it really bad to target x86 even though your code is going to run on x64? What sort of performance difference are we talking about? (My application is quite CPU-intensive, but it's really hard to come up with concrete numbers.)

After all, the .NET Framework will then run my code as a 32-bit process, which sounds bad to me compared to using the full addressing power of an x64 CPU**.

* I can't remember the exact bug, but targeting x86 specifically solved the problem.

** I'm not sure whether it matters, but my application doesn't use any Int64 variables.

Best Answer

No, it's not bad; in fact, for an application I'm working on, I have to target x86 (it brings in COM objects for which the vendor doesn't support x64).
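To illustrate why that matters: a 32-bit-only in-process COM server can't be loaded into a 64-bit process, so an AnyCPU build running as x64 fails where an x86 build works. A rough sketch, where "Vendor.LegacyComponent" is a hypothetical ProgID standing in for whatever the vendor actually ships:

```csharp
using System;

class ComBitnessDemo
{
    static void Main()
    {
        // Hypothetical ProgID of a 32-bit-only in-process COM server.
        Type comType = Type.GetTypeFromProgID("Vendor.LegacyComponent");

        if (comType == null)
        {
            // The ProgID isn't visible to this process; one common cause is a
            // bitness mismatch between the process and the registered server.
            Console.WriteLine("COM class not found - check that the build targets x86.");
            return;
        }

        // From a 64-bit process this may still throw COMException 0x80040154
        // ("Class not registered") when only the 32-bit server is installed.
        object instance = Activator.CreateInstance(comType);
        Console.WriteLine("COM object created: " + instance.GetType().FullName);
    }
}
```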