Electrical – Delay doesn’t work (AVR ATtiny26, delay.h)

attiny · avr · delay · interrupts · microcontroller

I am trying to learn how to write programs for microcontrollers, and I started with very simple things like blinking an LED at a given frequency, or controlling it with a button.
However, I have problems using _delay_ms() from util/delay.h. As far as I understand, this function only works correctly when optimization is turned on. When I flash the program to the microcontroller, instead of the LED blinking, the LED is constantly on at low brightness. When I just turn it on without any delays, the brightness is much higher, so it looks like for some reason the microcontroller outputs some intermediate low voltage, and I am wondering what the problem could be. Below I describe exactly what I did.
The code (main.c) looks like this (I am using an ATtiny26 and assume it runs at 1 MHz).

#define F_CPU 1000000UL // 1 MHz clock speed
#include <avr/io.h>
#include <util/delay.h>
int main(void)
{
    DDRA |= (1<<PA0);       // A0 as output
    while (1)               // infinite loop
    {
        PORTA |= (1<<PA0);  // LED at A0 ON
        _delay_ms(1000);    // 1 second delay
        PORTA &= ~(1<<PA0); // LED at A0 OFF
        _delay_ms(1000);    // 1 second delay
    }
}

I create an object file with

avr-gcc -mmcu=attiny26 -O1 -c main.c

where -O1 enables optimization level 1. Then

avr-objcopy -O ihex main.o main.hex

and flash it to the microcontroller:

sudo avrdude -c usbasp -p t26 -B 100 -U flash:w:main.hex:i

All the steps complete without any errors or warnings. As I said above, the resulting behaviour is weird: the LED is constantly on at low brightness. Meanwhile, if I flash a program that just turns the LED on (with all the delays removed), the LED is on at full brightness. I wondered whether the optimization itself could be the problem, but without the delays the program works correctly in either case – with optimization or without.

Controlling anything with a button would require debouncing, which relies on the same _delay_ms() function that doesn't work for me.

UPD: Commenting out #define F_CPU 1000000UL or increasing the argument of _delay_ms() doesn't help. But increasing the total delay with a loop, for (i = 0; i < 100000; i++) { _delay_ms(1000); }, leads to the LED being constantly on at full brightness. Smaller iteration counts that I tested (10, 100, 1000, 10000) do not give full brightness.

Best Answer

The problem is that you are creating an object file but never linking it. Without linking, branch instructions in the machine code are not filled in with their destination addresses, so the code simply executes linearly until it 'falls off the end' into unprogrammed memory. Eventually the program counter reaches the end of flash and wraps around to zero, executing your code again. The net result is that the LED flashes with a very low duty cycle, which looks like constant dim light.

The simplest fix is to remove the -c option from your compile command, i.e.:

avr-gcc -mmcu=attiny26 -O1 main.c

This compiles and links in one step, producing an ELF file called 'a.out', which you convert to HEX with:

avr-objcopy -O ihex a.out main.hex