Electronic – Making variable delay routines


I know how to write accurate delay routines in assembly to run on a PIC microcontroller. The problem I have is creating delays of an arbitrary length.

What I want to do is accept some data that tells me how long to delay, and then have the PIC drive a port pin high or low for that period of time. So if I have a 1 µs delay routine and a 1 ms delay routine, I can break up the data and work out how many times I need to call each one. For example:

0.123567 seconds means I would call Delay_1uS 567 times, then Delay_1mS 123 times.
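A minimal sketch of that split, in C rather than assembly, might look like the following. Delay_1uS and Delay_1mS are the question's own routines and are only declared here, not defined; the function name variable_delay_us is made up for illustration:

```c
#include <stdint.h>

extern void Delay_1uS(void);   /* your existing cycle-accurate 1 us routine */
extern void Delay_1mS(void);   /* your existing cycle-accurate 1 ms routine */

/* Split a requested delay, given in microseconds, into whole-millisecond
 * and leftover-microsecond counts, then call the fixed routines. */
void variable_delay_us(uint32_t total_us)
{
    uint32_t ms_count = total_us / 1000u;   /* whole milliseconds    */
    uint32_t us_count = total_us % 1000u;   /* leftover microseconds */

    while (us_count--) Delay_1uS();         /* e.g. 567 calls        */
    while (ms_count--) Delay_1mS();         /* e.g. 123 calls        */
}
```

For 0.123567 s you would pass 123567, giving 123 milliseconds and 567 microseconds. Note that the loop bookkeeping and the call/return instructions in this sketch are exactly the extra, unaccounted-for delay described next.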

The problem is that all these calls create delays of their own, which means there will be an error in the total delay time.

To combat this I can calculate the delay incurred by the calls and subtract it from the total delay time. However, now the total number of calls is smaller, so the incurred delay is smaller too… This seems like some sort of recursive optimization problem that I do not know how to solve. Is there a "best practice" way of doing this?
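One way to sidestep the recursion, rather than iterating on a correction, is to work entirely in instruction cycles and fold the per-call and per-loop overhead into a single, known cost per pass through the loop. The decomposition then becomes one integer division plus a remainder that you pad with NOPs or a short cycle-counted loop. The sketch below assumes a mid-range PIC at 4 MHz (1 instruction cycle = 1 µs); the constant values and names are purely illustrative and would come from your assembler listing:

```c
#include <stdint.h>

/* Illustrative figures only.  CYCLES_PER_MS_PASS is the total cost of ONE
 * pass through the millisecond loop: the Delay_1mS body plus call/return
 * plus the loop bookkeeping, with the routine trimmed internally so the
 * whole pass is exactly 1000 cycles.  FIXED_OVERHEAD is the one-off setup
 * cost before the loop starts. */
#define CYCLES_PER_MS_PASS  1000u
#define FIXED_OVERHEAD      12u

void plan_delay(uint32_t target_cycles,
                uint32_t *ms_passes, uint32_t *pad_cycles)
{
    /* Charge the one-off setup cost first, then split what remains into
     * whole-millisecond passes plus padding.  No iteration is needed:
     * the overhead is a known constant per pass, so the division already
     * accounts for it. */
    uint32_t remaining = (target_cycles > FIXED_OVERHEAD)
                             ? target_cycles - FIXED_OVERHEAD
                             : 0u;

    *ms_passes  = remaining / CYCLES_PER_MS_PASS;
    *pad_cycles = remaining % CYCLES_PER_MS_PASS;
}
```

With the illustrative constants above, a 0.123567 s target (123567 cycles at 1 µs per cycle) works out to 123 millisecond passes plus 555 cycles of padding. The key point is that the overhead is accounted for up front, per pass, so there is no circular correction to converge on.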

Best Answer

Delay loops are nasty, and you shouldn't be using them very often.

Alternatives include timers, periodic interrupts, and the like.
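As a rough idea of what the timer alternative can look like, here is a polled-Timer1 busy wait in XC8-style C. It assumes a mid-range PIC16 running at 4 MHz with Timer1 already configured to clock from Fosc/4 at a 1:1 prescale (so 1000 ticks = 1 ms); register and bit names follow common PIC16 conventions and may differ on your device, and the function name is made up:

```c
#include <xc.h>
#include <stdint.h>

/* Busy-wait for 'ms' milliseconds using Timer1 overflows instead of
 * counted instruction cycles. */
void timer1_delay_ms(uint16_t ms)
{
    while (ms--) {
        uint16_t preload = (uint16_t)(65536UL - 1000UL); /* overflow after 1000 ticks */
        TMR1H = (uint8_t)(preload >> 8);
        TMR1L = (uint8_t)preload;
        PIR1bits.TMR1IF  = 0;            /* clear the overflow flag   */
        T1CONbits.TMR1ON = 1;            /* start the timer           */
        while (!PIR1bits.TMR1IF)
            ;                            /* wait for overflow         */
        T1CONbits.TMR1ON = 0;
    }
}
```

The reload code adds a few cycles of error per millisecond; if that matters, nudge the preload value up to compensate, or let the timer free-run and compare against a target count instead.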

That said, they're occasionally useful. Be sure to clear the WDT somewhere inside the loop, or you could run into some nasty unexpected resets.
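A small sketch of servicing the watchdog inside the loop, assuming XC8 (CLRWDT() is its wrapper around the clrwdt instruction); Delay_1mS is the question's routine and the function name is hypothetical:

```c
#include <xc.h>
#include <stdint.h>

extern void Delay_1mS(void);    /* your existing cycle-accurate routine */

/* Millisecond delay loop that also services the watchdog.  With a
 * typical WDT period of ~18 ms or longer, clearing it once per
 * millisecond pass stays comfortably inside the timeout. */
void delay_ms_wdt_safe(uint16_t ms)
{
    while (ms--) {
        Delay_1mS();
        CLRWDT();               /* pet the watchdog every pass */
    }
}
```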

There's an assembly language code generator here that generates cycle-perfect assembly delay routines (the delay time is fixed at assembly time, not variable at run time).

There's no reason why you couldn't make a cycle-perfect programmable delay generator in asm, but I think it would be a bit complex. Interrupts would still cause inaccuracy, but often you only need to guarantee a minimum time, and if it runs for (say) 11 ms rather than 10 ms it's not a big deal.