Electrical – ATmega32 Clock Frequency Issue

atmega | clock-speed | microcontroller

I'm currently using an ATmega32 with an external 16 MHz clock. The problem: when I request a delay of, for example, 16,000 ms in software, it actually delays only 1,000 ms in the real world. It is as if the CPU divides the requested delay by the clock frequency in MHz (16,000 ms / 16 MHz). Likewise, with an 8 MHz clock the same requested delay produces a 2,000 ms delay.
What could be the reason for that?

Edit: here is my code.

#include <avr/io.h>
#include <util/delay.h>
#include <stdio.h>
#include <stdlib.h>

#define F_CPU 16000000
#define BAUD  9600
#define BRC   ((F_CPU/BAUD/16)-1)


void USART_Init(unsigned int brc)

{   

// set baud rate

    UBRRH = (unsigned char) (brc >> 8)  ;
    UBRRL = (unsigned char) brc ;

    UCSRB = (1 << TXEN ) | (1 << RXEN) ; // enable receiver and transmitter
    UCSRC = (0 << UMSEL); // setting Asynchronous mode
    UCSRC = (1 << UCSZ1) | (1 << UCSZ0); // set communication number of bits (8 in this case)
}

int main(void)
{


    USART_Init(BRC) ;
    while(1)
    {
        char c;

        _delay_ms(32000);
        for (int i=0 ; i<=3 ; i++)
        {
            if (UCSRA & (1 << UDRE))
            {

                UDR = i;
                _delay_ms(8000);
            }
        }

    }
}

Best Answer

You cannot use _delay_ms to wait for that long. According to the avr-libc documentation of the function:

The maximal possible delay is 262.14 ms / F_CPU in MHz.

When the user request delay which exceed the maximum possible one, _delay_ms() provides a decreased resolution functionality. In this mode _delay_ms() will work with a resolution of 1/10 ms, providing delays up to 6.5535 seconds (independent from CPU frequency). The user will not be informed about decreased resolution.

Also, check that the CKSEL (and, on devices that have it, CKDIV8) fuse bits are set correctly for your system.