CIL and CLR in .NET – Why They Are Required


I saw this nice image here. I learned that all the compilers that support a .NET language convert the source code to CIL format. But Microsoft is never going to bring .NET to every operating system by writing a CLR for each of them. So why keep such an intermediate code format, and a CLR to execute that CIL? Isn't that a headache to deal with? Why did Microsoft choose to do it this way?

EDIT: This kind of architecture has its price. It reduces performance, doesn't it? Java does this to maintain platform independence; for what reason does .NET do it? Why not keep a simple, plain C-like compiler? Adding a new language will require writing a compiler either way; the only difference is the target language, CIL instead of native code. That's all.

Best Answer

Because they only need to write one compiler for C# to CIL - which is the hard part. Making an interpreter (or, more often, a just-in-time compiler) for CIL on each platform is relatively easy compared to writing a compiler from C# to native executable code for every platform.
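To make that concrete, here is a trivial C# method together with a sketch (in the comment) of the stack-based CIL a C# compiler emits for it. The exact disassembly varies by compiler version and settings, but the shape is the same, and this is all the per-platform JIT ever has to understand:

```csharp
public class Calculator
{
    // Plain C# source: the C#-to-CIL compiler does all the hard
    // language-level work (parsing, type checking, overload resolution).
    public int Add(int a, int b)
    {
        return a + b;
    }

    /* Approximate CIL for Add, as a disassembler such as ildasm would
       show it (details vary):

       .method public hidebysig instance int32 Add(int32 a, int32 b) cil managed
       {
           .maxstack 2
           ldarg.1    // push argument 'a' onto the evaluation stack
           ldarg.2    // push argument 'b'
           add        // pop both, push the sum
           ret        // return the value left on the stack
       }

       The JIT on each platform only needs to turn these simple
       stack-machine instructions into native code. */
}
```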

Beyond that, the runtime can handle anything that compiles to CIL. If you want a new language (like F#), you only have to write one compiler for it, and you auto-magically get all the platform support that .NET provides.
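This also means an assembly is just CIL plus language-neutral metadata, so tooling and code written in any .NET language can load and inspect it the same way. A minimal C# sketch (the file name SomeLibrary.dll is hypothetical; any managed assembly would do, whether it was originally written in C#, F#, or VB.NET):

```csharp
using System;
using System.Reflection;

class ListExportedTypes
{
    static void Main()
    {
        // By the time it is a .dll, the source language no longer matters:
        // the assembly is CIL plus metadata that the CLR understands.
        Assembly asm = Assembly.LoadFrom("SomeLibrary.dll");

        // Print every public type the assembly exposes.
        foreach (Type type in asm.GetExportedTypes())
        {
            Console.WriteLine(type.FullName);
        }
    }
}
```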

Oh, and I can take a .NET DLL and run it on Windows or on Linux via Mono without recompilation (assuming all of my dependencies are satisfied).

As for performance, it's debatable. There are "pre-compilers", essentially, that take the CIL and produce native binaries ahead of time. Others argue that just-in-time compilers can make optimizations that static compilers simply cannot. In my experience, it depends a lot on what your application is doing and what platform you're running it on (mostly on how good the JITter is on that platform). It's extremely rare for me to run into a scenario where .NET wasn't good enough.
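If you want a rough feel for the JIT cost, one way is to time the first call to a method (which includes JIT-compiling it) against a later call to the same, already-compiled method. This is only a sketch and the numbers depend heavily on the machine and runtime; ahead-of-time tools such as NGen or ReadyToRun images exist precisely to move that first-call cost to build or install time:

```csharp
using System;
using System.Diagnostics;

class JitWarmupDemo
{
    // A small method: the first call pays the one-time JIT cost,
    // later calls reuse the native code the JIT already produced.
    static long SumTo(int n)
    {
        long total = 0;
        for (int i = 1; i <= n; i++)
        {
            total += i;
        }
        return total;
    }

    static void Main()
    {
        var sw = new Stopwatch();

        // First call: includes the time the CLR spends JIT-compiling SumTo.
        sw.Start();
        SumTo(1_000_000);
        sw.Stop();
        Console.WriteLine($"First call:  {sw.Elapsed.TotalMilliseconds:F3} ms");

        // Second call: the compiled code is already there, so this is
        // usually faster (exact numbers vary a lot by machine).
        sw.Restart();
        SumTo(1_000_000);
        sw.Stop();
        Console.WriteLine($"Second call: {sw.Elapsed.TotalMilliseconds:F3} ms");
    }
}
```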