Electronic – MCU Frequency – Detail Explanation

microcontroller

I have been writing code for embedded applications for a while, but this question struck me and I can't seem to answer it myself.

Can I get more details on the reasons for selecting an MCU with a particular frequency XX Hz? For example, when a part is described as an ARM® Cortex®-M3 revision 2.0 running at up to 48 MHz, what is the implication for the overall performance of the chip and the connected circuit as a whole? Why should I choose this frequency and not a lower or higher one instead?

I understand that it determines the speed of the MCU, but could that be all?

Also, what effect does the external crystal oscillator have on the clock frequency?

Thanks
Paul.

Best Answer

You don't start out choosing a particular frequency. That eventually falls out of other requirements. The frequency spec alone, across a broad range of processors, is pretty meaningless.

The real spec is some minimum performance or latency requirement the processor has to meet. In general, for any one microcontroller, processor performance is proportional to clock speed. A large portion of the supply current is also proportional to clock speed, so that is one reason not to make it wildly higher than needed in power-sensitive applications. For high-end general-computing processors, performance is not necessarily proportional to clock speed because of issues like cache hit rate, memory latency, etc. Small microcontrollers intended for self-contained embedded applications don't usually have these kinds of advanced architectures, so performance is pretty much linear with clock speed.

However, clock speed is a poor indicator of performance across different microcontroller architectures. Some microcontrollers, like low-end PICs for example, require 4 clock cycles per instruction cycle, some 2, and some just 1. Then there are differences in what each architecture can accomplish in an instruction cycle. Comparing clock frequency between anything other than related processors in the same family is largely meaningless.

Another issue is that some micros have fancy internal clock chains, including PLLs and dividers. The purpose is to let them run at a variety of speeds from easy-to-use, easy-to-find crystals. 8-16 MHz is a nice frequency range for a crystal. You can certainly use crystals well outside that range, but 8 MHz is about the limit where really small packages become available, and having the external clock be otherwise as slow as reasonable is a good thing. That then raises the question of what the "clock speed" really is. Is it the external clock frequency you actually feed into the chip, or what the chip derives from that internally before using it? Each is relevant in different ways.

In short, focusing on microcontroller "clock speed", whatever that really means, is like obsessing about piston displacement and turbo boost pressure when all you really want to know is horsepower and fuel economy. You have little reason to care how they got there, only what the result is.