Right, so we have 8-bit, 16-bit and 32-bit microcontrollers in this world at the moment, and all of them are in wide use. How different is it to program 8-bit and 16-bit microcontrollers? I mean, does it require different techniques or skills? Let's take Microchip for example. What new things does a person need to learn if they want to transition from 8-bit microcontrollers to 32-bit microcontrollers?
Digital design does not have much in common with software development (except perhaps that Verilog syntax looks a bit like C, but it only looks like it). That makes it very hard to answer this type of question adequately. But as someone who walked the path from software development to hardware design, I'll give it a shot. Looking back, here is how I would have advised myself back then if I had known what I know now:
Start from scratch
Forget everything you know about software development, especially programming languages. Those principles do not apply in digital design. It would probably be easy for someone who has designed a CPU to program it in assembly or even C, but an assembly programmer won't be able to design a CPU.
On your learning path, do not try to solve what seems to be an easy problem with your existing knowledge from software. One of the classic examples is the "for loop". Even though you can write a for loop in, say, Verilog, it serves a different purpose: it is mainly used for code generation. It can also behave like the for loop software developers know, but then it is only good for simulation (i.e. you won't be able to program an FPGA like that).
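As a minimal sketch of what "code generation" means here: in synthesizable Verilog, a for loop is unrolled by the synthesizer into parallel hardware, not executed step by step. A hypothetical parity module, for example:

```verilog
// The synthesizer unrolls this loop into eight XOR gates;
// nothing "iterates" at run time.
module parity8 (
    input  wire [7:0] d,
    output reg        p
);
    integer i;
    always @* begin
        p = 1'b0;
        for (i = 0; i < 8; i = i + 1)
            p = p ^ d[i];  // becomes a chain/tree of XOR gates in hardware
    end
endmodule
```

The loop here is a compact way to describe eight parallel gates, which is a very different mental model from a loop that runs over time.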
So for every task you want to tackle, don't assume you already know how to do it. Do some research instead: check books and examples, ask more experienced people, and so on.
Learn hardware and HDL language
The most popular HDLs are Verilog and VHDL. There are also vendor-specific ones such as AHDL (Altera HDL). Since these languages are used to describe hardware components, they all express pretty much the same things in similar fashions, just with different syntax.
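To illustrate "same thing, different syntax", here is a minimal sketch of the same D flip-flop in both languages (module and signal names are my own invention):

```verilog
// D flip-flop in Verilog
module dff (
    input  wire clk,
    input  wire d,
    output reg  q
);
    always @(posedge clk)
        q <= d;   // update q on the rising clock edge
endmodule
```

```vhdl
-- The same D flip-flop in VHDL
library ieee;
use ieee.std_logic_1164.all;

entity dff is
    port (clk, d : in  std_logic;
          q      : out std_logic);
end entity;

architecture rtl of dff is
begin
    process (clk)
    begin
        if rising_edge(clk) then
            q <= d;   -- update q on the rising clock edge
        end if;
    end process;
end architecture;
```

Both describe exactly the same hardware; only the notation differs.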
Some people recommend learning Verilog because it looks like C. Yes, its syntax is a mix of C and Ada, but that doesn't make it easy for a software developer to learn. In fact, I think it may even make things worse, because there will be a temptation to write C in Verilog. That's a good recipe for having a very bad time.
With that in mind, I'd recommend starting with VHDL. Verilog is also OK, as long as the above is taken into account.
One important thing to keep in mind is that you must understand what you are expressing with the language: what kind of hardware is being "described" and how it works.
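A small Verilog sketch of why this matters: the same if statement describes very different hardware depending on whether every branch assigns the output (names here are hypothetical).

```verilog
module mux2 (
    input  wire sel, a, b,
    output reg  y
);
    // Every path through this block assigns y, so it describes a
    // pure combinational 2-to-1 multiplexer.
    always @* begin
        if (sel)
            y = a;
        else
            y = b;  // drop this else and y must "remember" its old value,
                    // so synthesis infers a level-sensitive latch instead
    end
endmodule
```

If you don't know what hardware the code implies, accidental latches like this are a very common beginner bug.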
For that reason, I'd recommend you get yourself some book on electronics in general, plus a good book like HDL Chip Design (also known as the blue book).
Get a simulator
Before you start doing anything on real hardware or using any vendor-specific features, get yourself a simulator. I started with Verilog and used Icarus Verilog along with GTKWave; both are free, open-source projects. Run the examples you see in books, and practice by designing your own circuits to get a taste of it.
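A typical session with those tools looks something like this (file names are hypothetical, and the testbench is assumed to call $dumpfile/$dumpvars so a VCD waveform is produced):

```shell
# Compile the design and its testbench into a simulation binary
iverilog -o counter_tb counter.v counter_tb.v

# Run the simulation; this writes dump.vcd if the testbench dumps waves
vvp counter_tb

# Inspect the resulting waveform
gtkwave dump.vcd
```
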
Get a development board
When you feel like going forward, get a development board. If you know that your employer wants to go with Lattice, then get a Lattice board.
The programming methods are very similar, but the details differ: different tools, different options, different interfaces. Usually, if you have experience with one vendor, it is not hard to switch, but you probably want to avoid that extra learning curve.
I'd also make sure that the board comes with the components you are planning to use, or is extendable. For example, if you want to design a network device like a router, make sure the board has an Ethernet PHY or can be extended through, say, an HSMC connector.
Boards usually come with a good reference, user guide and design examples. Study them.
Read books
You will need to read books. In my case, I had no friends who knew digital design, and this site wasn't very helpful either, for one simple reason: I didn't even know how to phrase my questions. All I could come up with was something like "Uhm, guys, there is this thing called dcfifo, and I've heard something about clock domain crossing challenges; what is it, and why doesn't my design work?".
I personally started with these:
- Advanced Digital Design with the Verilog HDL.
- 100 Power Tips for FPGA Developers
- Advanced FPGA Design - Architecture, Implementation and Optimization.
FPGA vendors have a lot of cookbooks with best practices. Study them along with reference designs. Here is one from Altera, for example.
Come back with more specific questions
While you go through your books, simulate designs, and blink some LEDs on your development board, you will most likely have a lot of questions. Before asking them here, make sure the answer isn't on the next page of the book or already available online (e.g. in the Lattice-specific forum).
Intel HEX files are always byte-addressed. This does not mean they can't handle information for other word sizes, only that there needs to be a convention about how those words are mapped to the bytes of the HEX file.
Just like with all the other non-byte-addressed PICs (PIC10, PIC12, and PIC16), the addresses are doubled in the HEX file. PIC programmer software knows this and interprets the HEX file addresses accordingly. This is, of course, all well documented in the programming spec for whatever part you want to program.
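The address-doubling convention is just arithmetic: each program word occupies two bytes in the HEX file, so the HEX byte address is twice the PIC word address. A minimal sketch (helper names are my own, not from any programming spec):

```python
# Mapping between Intel HEX byte addresses and PIC program-word addresses
# for parts whose program memory is word-addressed (each word stored as
# two bytes in the HEX file).

def word_address(hex_byte_address: int) -> int:
    """Convert a HEX-file byte address to a PIC program-word address."""
    return hex_byte_address // 2

def hex_address(word_addr: int) -> int:
    """Convert a PIC program-word address to its HEX-file byte address."""
    return word_addr * 2

# The reset vector at word address 0 sits at HEX byte address 0;
# word address 0x0004 appears at byte address 0x0008 in the file.
assert hex_address(0x0004) == 0x0008
assert word_address(0x0008) == 0x0004
```
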
You say you want to make your own programmer. That's fine, as long as you understand it will take far more time and frustration than just getting a known working one. If the point is the experience and learning of making your own, then fine; otherwise, go buy one.
If you really do want to make your own, you should look at the code for my PIC programmers. All the host code and firmware is open and available in the Development Software release at http://www.embedinc.com/picprg/sw.htm. By looking through the host source code, you can see how flags indicate whether HEX file addresses are doubled for various parts of the PIC's memory.
If you make your programmer compatible with my PIC programmers' protocol, then you can make use of all my host-side tools. This can be very helpful when bringing up your system, since you'll have known working code on the other side. The protocol spec may look intimidating at first, but look carefully and you will see that much of it is optional, especially if you plan to support only a single PIC.