Judging from a quick look at it, the double buffering looks like a good approach. However, I believe you can still get an "invalid" value returned: theoretically the T0 interrupt could fire multiple times while you access the timestamp in the timebase_now() function (if execution is delayed by more than 1 ms by another ISR), which would render your double buffering useless.
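For reference, the simplest alternative to double buffering on an 8-bit MCU is to make the 4-byte read atomic by briefly masking interrupts. A minimal sketch, with cli()/sei() stubbed as no-ops so it compiles standalone (on a real AVR they come from <avr/interrupt.h>), and timer0_isr() standing in for the real ISR body:

```c
#include <stdint.h>

/* Stubbed as no-ops so the sketch compiles standalone; on a real AVR
   these come from <avr/interrupt.h>.  If timebase_now() may itself be
   called from an ISR, save and restore SREG instead of blindly
   re-enabling interrupts with sei(). */
#define cli() ((void)0)
#define sei() ((void)0)

static volatile uint32_t timestamp_ms;  /* incremented once per T0 tick */

/* Placeholder for the real T0 ISR body. */
void timer0_isr(void) { timestamp_ms++; }

uint32_t timebase_now(void)
{
    cli();                        /* T0 can't fire during the copy */
    uint32_t now = timestamp_ms;  /* 4-byte copy: a few cycles     */
    sei();
    return now;
}
```

The interrupts-off window is only the few cycles of the 4-byte copy, which is far shorter than one TWI byte time at 400 kHz.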
Are you sure it is even required to make your Timer0 ISR non-blocking? Since the hardware handles all the low-level TWI functions and the maximum data transfer speed is 400 kHz, there should be enough time to handle TWI data. Updating the 4-byte timestamp variable only takes a few clock cycles. On what assumption are you expecting to lose TWI interrupts?
You say that timestamp accuracy is not critical, so another solution could be to just set a flag (or a 1-byte counter) in the T0 ISR and update the timestamp in the main loop. However, this only works if timebase_now() is not supposed to be callable from within any ISR.
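A minimal sketch of that counter-based approach (the names timebase_poll() and timer0_isr() are mine; the latter stands in for the real T0 ISR body):

```c
#include <stdint.h>

static volatile uint8_t t0_ticks;  /* written only by the ISR       */
static uint32_t timestamp_ms;      /* touched only by the main loop */

/* Placeholder for the real T0 ISR body: a single 1-byte increment,
   keeping the ISR as short as possible. */
void timer0_isr(void) { t0_ticks++; }

/* Call this from the main loop.  The read-and-clear of the 1-byte
   counter is the only spot that would need cli()/sei() on a real
   AVR (shown as comments here), since it is two separate accesses. */
void timebase_poll(void)
{
    /* cli(); */
    uint8_t pending = t0_ticks;
    t0_ticks = 0;
    /* sei(); */
    timestamp_ms += pending;
}

/* Safe to call from the main loop only, per the caveat above. */
uint32_t timebase_now(void) { return timestamp_ms; }
```

Note that using a counter rather than a flag means no ticks are lost even if the main loop polls less often than once per millisecond, as long as fewer than 256 ticks accumulate between polls.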
What is the motivation in using hardware description languages (HDL) such as Verilog and VHDL over programming languages like C or some Assembly?
C and assembly are good languages for telling a CPU what to do. They describe actions to be done sequentially by a single state machine.
HDLs are good languages for describing or defining an arbitrary collection of digital circuits. They can express operations done in parallel in ways that programming languages can't. They can also describe timing limitations for the interfaces between blocks in ways that programming languages can't.
I was surprised to see discussions expressing doubts about whether to write firmware in C or Assembly (how is Assembly appropriate if you don't necessarily have a CPU?)
In that question, what's asked is, "If you are writing code for a microcontroller is there a real difference if you write in assembly or C or some other high level language?".
Since he's specifically asking about systems with a microcontroller (a CPU with peripherals), C and assembly are both reasonable choices for firmware development, and HDLs are not.
Can firmware really be written either in an HDL or in a software programming language, or is one just another way to perform the same mission?
It depends what kind of hardware you have. If you have a CPU, use a programming language. If you have an FPGA or you're designing an ASIC, use an HDL. If you are designing a very large amount of digital logic, you can look to one of the in-between languages like SystemVerilog.
I've read that firmware is mostly burned on ROM or flash. How is it represented there? In bits, like software? If so, what's the profound difference? Is it the availability of adapted circuits in the case of firmware?
I think you are getting hung up on the term "firmware". This word originally meant code that ran on an embedded system and wasn't accessible for the end user to change. If you sold somebody a PC, there's a very high chance that the user would change what software is run on it. If you sold them an oscilloscope, you wouldn't want them to change the code that runs on the internal microprocessor, so you called it firmware.
FPGA users appropriated the word "firmware" for the output of their designs, because it is more changeable than hardware (stuff that's soldered together). But really the "firmware" that configures an FPGA is different from the "firmware" that runs on a uC. uC firmware directs the uC through a series of states to perform its function. FPGA firmware defines a set of interconnections between logic elements, and values to be stored in look-up tables.
In either case, the firmware is typically stored as bits on an EEPROM (or on disk on a host machine that will download it whenever the embedded system is restarted). But that doesn't make them similar to each other.
Best Answer
This is something I deal with on an ongoing basis. We have very complex hardware that's been in the field for almost 10 years, with different versions of various subsystems. Some of the subsystems have a 2-bit code, but as you mentioned, that's not always enough.
The EEPROM suggestion is a good one, but it requires programming the EEPROM and populating the board with the right version.
What I would suggest is an 8-bit parallel-in/serial-out shift register like the 74HC166. The version number can be set by the PC board itself, strapping the inputs HIGH or LOW, and then you only need 3 pins to load and read the shift register from an MCU.
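A sketch of reading that version code with three GPIOs. The pin-helper names are mine, and the '166 is simulated in software here so the snippet is self-contained; on real hardware the three helpers would drive the load (/PE), clock (CP), and serial-out (Q7) pins, using the NXP datasheet's pin names:

```c
#include <stdint.h>

/* Simulated 74HC166 so this sketch is self-contained; on real hardware
   the three helpers below would toggle/read the MCU GPIOs instead. */
static uint8_t sr_inputs = 0xA5;  /* value strapped on D0..D7 by the PCB */
static uint8_t sr_state;          /* internal shift register contents    */
static int load_level = 1, clock_level = 0;

static void set_load_pin(int level)  { load_level = level; }

static void set_clock_pin(int level)
{
    if (level && !clock_level) {                     /* rising edge */
        if (!load_level) sr_state = sr_inputs;             /* load  */
        else             sr_state = (uint8_t)(sr_state << 1); /* shift */
    }
    clock_level = level;
}

static int read_serial_pin(void) { return (sr_state >> 7) & 1; }

/* Read the 8-bit board revision: latch the strapped inputs on one
   clock edge with /PE low, then clock the bits out MSB first. */
uint8_t read_board_version(void)
{
    uint8_t version = 0;

    set_load_pin(0);             /* /PE low: next clock edge loads */
    set_clock_pin(1);
    set_clock_pin(0);
    set_load_pin(1);             /* back to shift mode */

    for (int i = 0; i < 8; i++) {
        version = (uint8_t)((version << 1) | read_serial_pin());
        set_clock_pin(1);        /* shift next bit toward Q7 */
        set_clock_pin(0);
    }
    return version;
}
```

Since the register only needs to be read once at boot, the three pins can even be shared with other slow signals if you're short on GPIOs.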