Electrical – What is the timer period of a timer on a PIC?

microchip microcontroller pic timer

This is causing me a bit of confusion. I'm assuming that the timer period of a timer is the time it takes for one increment of the timer. Am I correct?

[Image: MCC timer configuration screenshot]

It works the same way for 8-bit and 16-bit timers, right?
In that case, I suppose that if both increment at the same rate, the only difference is that one will count up to 256 while the other counts to more than 65500.

Does the timer always count to its maximum value (256 for an 8-bit timer) until it overflows, or does that depend on the time period that we set?

Best Answer

I'm answering based on the MCC image you posted, which relates to a PIC16F rather than your chosen PIC32MX; that mismatch may be the source of your confusion. You should run MCC with your PIC of choice to fully understand what it is doing with that device.

The timer period is the time it takes for the timer interrupt to occur. MCC just works out hidden variables so you don't need to worry about the timer's inner workings.

For example:

Let's say you have an 8 MHz clock going into the timer, with prescale and postscale both set to 1. Typically the clock feeding the timer is first divided by 4 (Fosc/4); again, this is device-dependent. That leaves 2 MHz going into the timer.
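The clock chain above can be sketched as a quick calculation (the 8 MHz oscillator and the divide-by-4 are the example figures from this answer, not fixed values; check your PIC's datasheet for its actual divider):

```python
# Sketch of the example clock chain: Fosc -> /4 -> prescale -> timer input
fosc = 8_000_000          # 8 MHz oscillator (example value)
instruction_divider = 4   # Fosc/4, typical for many PICs but device-dependent
prescale = 1              # both scalers set to 1 in this example
postscale = 1

timer_clock = fosc // (instruction_divider * prescale)
print(timer_clock)        # 2 MHz going into the timer
```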

Some PICs issue an interrupt when the timer overflows; others issue one when the timer matches a preload register value. If the PIC does not have a hardware preload register for the timer, MCC will generate a software seed for the TMR value instead.

For an 8-bit timer it takes 256 clocks to overflow. With 2 MHz going into the timer, that is a maximum of 0.000128 s (128 µs) until overflow or interrupt. So if you don't use a preload/seed value, you only need to work out the prescale/postscale to get your desired interrupt rate, based on that maximum time. MCC may also be smart enough to use the preload register, if one exists, or to create a seed value that it reloads on each interrupt, to hit a desired interrupt rate (timer period) anywhere between the minimum and maximum time. The minimum period with this setup is one tick of the 2 MHz (Fosc/4) clock, i.e. 0.5 µs. The maximum and minimum periods are shown in MCC.
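The overflow arithmetic above works out as follows (a sketch using the 2 MHz figure from this example):

```python
# Minimum (one tick) and maximum (full overflow) periods of an 8-bit timer
timer_clock = 2_000_000     # ticks per second, from the 8 MHz / 4 example
tick = 1 / timer_clock      # one increment takes 0.5 µs
max_period = 256 * tick     # full 8-bit count to overflow: 128 µs
print(tick, max_period)
```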

[Image: MCC showing the minimum and maximum period]

So when you change the requested period, MCC may work out a preload value (either in hardware, if supported, or as a software seed) between the max and min periods. The max and min periods themselves can only be changed by adjusting the pre/postscale manually in MCC.
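As a rough sketch of the kind of arithmetic involved (the exact register handling differs per PIC, and the 100 µs request here is an assumed example): for an overflow-interrupt timer, the preload/seed is chosen so that only the remaining counts elapse before overflow.

```python
# Hypothetical seed calculation for an 8-bit, overflow-interrupt timer
timer_clock = 2_000_000       # ticks per second (from the example above)
requested_period = 100e-6     # desired 100 µs interrupt period (assumed)

counts = round(requested_period * timer_clock)  # ticks needed before overflow
seed = 256 - counts           # preload so the timer overflows after 'counts' ticks
print(seed)                   # value reloaded into TMRx on each interrupt
```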

So, in summary...

I'm assuming that timer period of a timer is the time that it takes for an increment of timer

That's a no. The period is the time it takes for the timer interrupt to occur. The time for one increment of the timer is typically the source clock period multiplied by the prescale; the postscale then divides how often the interrupt fires, not how often the timer increments. The source, prescale and postscale values of the timer may be changed and may differ from PIC to PIC. Please refer to the datasheet of the PIC you are using for more detailed information on these values.
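The distinction can be made concrete with a Timer2-style calculation, where the interrupt period is (PR + 1) × increment time × postscale (all values here are assumed examples, not read from MCC):

```python
# Increment time vs. interrupt period, Timer2-style (values assumed)
fosc = 8_000_000
prescale = 4
postscale = 2
pr = 99                                     # hypothetical period-match value

increment_time = prescale / (fosc / 4)      # time per timer tick: 2 µs
interrupt_period = (pr + 1) * increment_time * postscale   # 400 µs
print(increment_time, interrupt_period)
```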

Do the timer always count to its maximum value, in the case of 8-bit 256, till it overflows or it depends of the time period that we set?

In software, MCC may generate code that resets the TMR value to a seed value to achieve the period you set; it may instead use the PR (period) register to the same effect where the hardware supports it. In hardware this again depends on the PIC you are using (refer to the datasheet), as the timer may reset when it matches the PR value.

It works the same way for 8-bit and 16-bit timers, right?

Yes. An 8-bit timer will count up to 255 and overflow at 256 (resetting to 0); a 16-bit timer will count up to 65535 and overflow at 65536 (resetting to 0).

The actual period achieved is shown in MCC, because the integer registers introduce error into the timer. A 16-bit timer gives you better resolution and less error than an 8-bit one. Again, if the PIC supports both timer widths, read the datasheet to see how to implement each.
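That rounding error can be illustrated with a quick sketch (the requested period here is an arbitrary assumed value, not from MCC): the tick count must be an integer, so the achieved period is quantized to the nearest tick.

```python
# Requested vs. actual period when tick counts must be integers
timer_clock = 2_000_000                   # ticks per second (example figure)
requested = 123.4e-6                      # 123.4 µs requested (arbitrary example)

counts = round(requested * timer_clock)   # 246.8 ticks rounds to 247
actual = counts / timer_clock             # achieved period: 123.5 µs
error = actual - requested                # quantization error, here 0.1 µs
print(counts, actual, error)
```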

Your chosen PIC32MX, however, uses 16-bit timers and supports preload.