This is causing me a bit of confusion. I'm assuming the timer period is the time it takes for one increment of the timer. Am I correct?
It works the same way for 8-bit and 16-bit timers, right?
In that case, I suppose that if both increment at the same rate, the only difference is that one counts up to 255 before overflowing while the other counts up to 65535.
Does a timer always count up to its maximum value (255 for 8-bit) before it overflows, or does that depend on the timer period or preload value we set?