Does anyone even care about what you're developing on?
Yes and no. I've been developing on the AVR32 for a particular project, and the development environment (in particular the compile/program/debug cycle) is horrendous compared to, for instance, PIC32.
The customers don't care, except about cost and maintenance, and in the case of an Arduino-like system the programmers wouldn't care either, because the Arduino environment and development cycle are leaps and bounds better than the current AVR32 setup.
I just wonder because there is such a strong contingent for AVRs in the Arduino family. I understand that they are the official processor, but there isn't a reason the code couldn't be ported to an ARM or a Freescale architecture other than cost, right? As long as there is onboard memory, I figured there could be easy migration into those parts.
There's no reason another processor couldn't be used, but there's a very good reason they chose a low-end 8-bit device rather than an ARM, MIPS, or PowerPC device: ease of use.
If you've looked at the setup for even the low-end ARMs, it's an order of magnitude more complex (memory mapping, caching, etc.) than for an 8-bit processor. But even more importantly - at the time there were no DIP ARM processors, and these boards were meant to be used and built by artists and hackers, not necessarily electronics technicians and engineers who feel comfortable even with a 48-pin TQFP.
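To give a rough sense of the gap, here's the same trivial job - configure one pin and toggle it - on both families. These are illustrative fragments, not complete programs; I'm assuming an ATmega328P on the AVR side and an STM32F103 (Cortex-M3) on the ARM side, with register names as they appear in avr-libc and ST's stm32f10x headers:

    /* AVR (ATmega328P): the whole peripheral model fits in your head */
    DDRB  |= (1 << DDB5);    /* set PB5 as an output */
    PORTB ^= (1 << PB5);     /* toggle it            */

    /* Cortex-M3 (STM32F103): peripheral clocks are gated off by default,
       and each pin has a 4-bit mode/config field to get right first */
    RCC->APB2ENR |= RCC_APB2ENR_IOPCEN;   /* ungate the GPIOC clock      */
    GPIOC->CRH   &= ~(0xFu << 20);        /* clear PC13 mode/config bits */
    GPIOC->CRH   |=  (0x2u << 20);        /* output, push-pull, 2 MHz    */
    GPIOC->ODR   ^=  (1u << 13);          /* finally, toggle PC13        */

And that's before you touch the vector table, clock tree, or linker script that a real Cortex-M project also needs.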
The reason the AVR was chosen over the PIC is that, among other things, the PIC doesn't really have a widely used, open-source, free C compiler (the SDCC port isn't mature).
I see a lot of ARM in industry (seems like every vendor is pushing one into their designs) and was wondering why there wasn't more uptake in the Arduino developer world. Thoughts?
Mainly it's ease of use - low complexity, easy soldering, low cost - and the fact that there's not much need for more. Developers like the idea of having a lot of power, but at the end of the day, when all you need to do is move some servos and flash some lights with a low-end FFT, an 8-bit processor is just fine.
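For a feel of how little code "move some servos" actually takes on the 8-bit part, here's a minimal sketch, assuming an ATmega328P at 16 MHz (the usual Arduino setup) built with avr-gcc. Once Timer1 is configured, the hardware generates the servo pulse train with no further CPU involvement:

    #include <avr/io.h>

    int main(void)
    {
        DDRB |= _BV(DDB1);                 /* OC1A (Arduino pin 9) as output */
        /* Fast PWM, TOP = ICR1, prescaler 8 -> 0.5 us per timer tick */
        TCCR1A = _BV(COM1A1) | _BV(WGM11);
        TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS11);
        ICR1  = 39999;                     /* 20 ms period (50 Hz)           */
        OCR1A = 3000;                      /* 1.5 ms pulse -> servo centered */
        for (;;) { }                       /* the timer does the rest        */
    }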
Even the low-end Cortex ARMs coming out in 28-pin packages are still SOIC, not DIP.
So the AVR had all the right features:
- Easy to solder
- Easy to get via mail order all over the world
- Free GCC C compiler
- Easy to understand the processor and peripheral setup and usage
- Cheap
- Ubiquitous - lots of people and experience surrounding the AVR family
Largely this is still true - I don't know of an ARM in a DIP format, and adapters make one significantly more expensive than the AVR. For the most part, manufacturers don't think a DIP-packaged 32-bit processor is going to be very profitable.
The rate at which these gates are turned on and off seems to have no relation to the power used.
This is where you are wrong. Each gate is effectively a capacitor with an incredibly tiny capacitance. Switching it on and off by "connecting" and "disconnecting" the voltage moves a tiny electrical charge into or out of the gate - that's what makes the transistor conduct or not.
And a moving electrical charge is a current, which uses power. All those tiny currents from billions of gates being switched billions of times per second add up quite a bit.
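The usual back-of-envelope formula for this is P = N * a * C * V^2 * f (gate count, activity factor, capacitance per gate, supply voltage, clock frequency). Plugging in some purely illustrative numbers - none of these are measured values - shows why switching dominates:

    #include <stdio.h>

    int main(void)
    {
        /* All of these values are assumptions, chosen only for scale */
        double n_gates  = 1e9;      /* a billion gates                     */
        double activity = 0.05;     /* 5% of gates switch on a given tick  */
        double cap      = 0.5e-15;  /* ~0.5 fF per gate                    */
        double volts    = 1.0;      /* 1 V supply                          */
        double freq     = 3e9;      /* 3 GHz clock                         */

        double watts = n_gates * activity * cap * volts * volts * freq;
        printf("dynamic power ~ %.0f W\n", watts);   /* prints ~75 W */
        return 0;
    }

Tens of watts from femtofarad capacitors - which is also why lowering the clock (f) or, better, the voltage (V, which counts twice) is how chips save power.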
Best Answer
Assuming "n bit" refers to the size of the general purpose registers all arm processors are 32 or 64 bit.
Prior to designing the ARM processor, Acorn's computers had been designed around the MOS Technology 6502 and variants. They decided to leapfrog the 16-bit generation and go straight to a 32-bit design. Instructions were 32 bits wide and so was the data path. Initially some bits of the program counter were used as flag bits, limiting addresses to 26 bits, but these flags were later done away with, allowing a fully 32-bit address space.
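For the curious, the original 26-bit ARMs packed the status flags into R15 alongside the PC - roughly like this, as I recall the ARM2 layout:

    bit: 31 30 29 28  27 26  25 ................. 2  1 0
          N  Z  C  V   I  F   program counter (word)  mode

The PC only needed bits 25..2 because instructions are word-aligned; moving the flags out into a separate status register (the CPSR) later freed the full 32 bits for addressing.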
Acorn's computers failed to maintain a viable position in the market, but before that happened ARM had been spun off into a separate company. ARM was a low-power architecture and started to see a lot of use in mobile/embedded computing. Eventually, as transistors got cheaper (in both cost and power terms), it worked its way down into the microcontroller market.
ARM later introduced a mode called "Thumb" (sometimes referred to as "Thumb1") in which the instructions were only 16 bits wide. The registers, however, remained 32-bit. Thumb in its original form was an incomplete instruction set: certain important operations could only be performed by switching back to ARM mode.
More recently, ARM introduced "Thumb2". This extended Thumb into a complete instruction set using a mixture of 16-bit and 32-bit instructions. Again, the data registers remained 32-bit. Modern ARM microcontrollers support only Thumb2 mode (sometimes with only a subset of the full Thumb2 instruction set), not traditional ARM mode.
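You can see this in the toolchain: a Cortex-M part is always built for Thumb. With the GNU tools (these are standard arm-none-eabi-gcc flags), for example:

    arm-none-eabi-gcc -mcpu=cortex-m3 -mthumb -c blink.c -o blink.o

There is no ARM-mode build for an M-profile core, and on the smaller ARMv6-M parts like the Cortex-M0 the compiler is further restricted to the subset of Thumb2 those cores implement.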