Electrical – Garbage data when using UART on ATmega2560

avr, firmware, uart

(updates at bottom)

I am using an Arduino Mega with the SoftwareSerialExample sketch on it to send and receive bytes over UART to my ATmega2560 (which is itself on an Arduino Mega board).

Terminology:

  • MySerial Arduino = the Arduino running the SoftwareSerialExample sketch
  • ATmega2560 = the MCU on the Arduino Mega board that I'm programming with Atmel Studio 7

Some background: I'm using a different Arduino Mega running "Arduino as ISP" to flash my ATmega2560 from Atmel Studio 7, so I'm working without a debugger for now. Please don't tell me to buy a debugger; it is not an option right now. My clock values and fuses should also be correct, since I've already generated and measured a 300 Hz PWM signal with a different program.

Problem: I can send and receive data through UART on the ATmega2560, but all of the data is garbage.

Below is the code I have on the MySerial Arduino. All I did was change the baud rates to match what I have set up on my ATmega2560. Yes, I have the baud rate set correctly in the serial monitor of the Arduino IDE. I also tried typecasting the return value of mySerial.read(), with no success.

#include <SoftwareSerial.h>

SoftwareSerial mySerial(10, 11); // RX, TX

void setup() {
  // Open serial communications and wait for port to open:
  Serial.begin(9600);
  while (!Serial) {
    ; // wait for serial port to connect. Needed for native USB port only
  }


  Serial.println("Goodnight moon!");

  // set the data rate for the SoftwareSerial port
  mySerial.begin(9600);
  mySerial.println("Hello, world?");
}

void loop() { // run over and over
  if (mySerial.available()) {
    Serial.write(mySerial.read());
  }
  if (Serial.available()) {
    mySerial.write(Serial.read());
  }
}

Below is my AVR code on the ATmega2560. I have a USART RX interrupt working, since I can blink an LED when I send a byte from the MySerial Arduino. I can also transmit from the ATmega2560, since I see garbage bytes on the MySerial Arduino's serial monitor at the appropriate times. I say garbage bytes because they are always empty squares (character code 176, to be specific). The data is always garbage and I am not sure why. I want a short LED pulse when I receive a 0 on the ATmega2560 and a long LED pulse when I receive a 1, so I know I am sending actual data through UART.

Question: Why is the data I'm sending/receiving always garbage?

#include <avr/io.h>
#include <avr/sleep.h>
#include <avr/interrupt.h>
#define F_CPU 16000000UL //must be defined before including util/delay.h
#include <util/delay.h>

//USART defines
#define BAUDRATE 9600
#define UBRR ((F_CPU/(BAUDRATE*16UL))-1) //UBRR=CPUclock/16/baud - 1 (from datasheet)
#define RX_COMPLETE       (UCSR1A & (1<<RXC1))
#define DATA_REG_EMPTY    (UCSR1A & (1<<UDRE1))

/* Function Prototypes */
void usart_init(void);
void usart_send(unsigned char ch);

/* Global variables */

/* UART variables */
volatile uint8_t Rx_Flag = 0;
volatile uint8_t data;

/* UART Interrupt Service Routines */
ISR(USART1_RX_vect){
    Rx_Flag = 1;
    UCSR1A |= (1<<UDRE1);
    data = UDR1;        
}//uart1_rx_isr


int main(void) {
    usart_init(); //enable usart1
    sei();        //enable global interrupts    

    /* Blink LED at startup */
    DDRB |= (1<<PB7);//sets PB7 as GPIO output
    PORTB |= (1<<7); //LED on
    _delay_ms(2000);
    PORTB &= ~(1<<7);
    _delay_ms(2000);
    /************************/

    while (1) {

        usart_send(0);  //Tx char to MySerial Arduino
        _delay_ms(1000);
        PORTB &= ~(1<<7); //keep LED off

        if(Rx_Flag == 1){ //if RX flag set by ISR
            if(data==0){  //short blink if receive 0    
                PORTB |= (1<<7);
                _delay_ms(500);
            }  else if (data==1){ //long blink if receive 1
                PORTB |= (1<<7);
                _delay_ms(2000);    
            } else {}//This block of code never runs since I am never receiving 0's or 1's WHY??????
            /*this commented code works with the ISR. My LED blinks when I send a byte through the Serial terminal on the MySerial Arduino*/
            //PORTB |= (1<<7); 
            //_delay_ms(500);   
            //Rx_Flag=0; //clear RX flag
            /******************************************************************************************************************************/
        } else{}
    }
} //MAIN END!!!!!!!!!!!!!!!!!!!!!!

/*  UART FUNCTIONS*/
void usart_init(void) {
    //set baud rate
    UBRR1L = (unsigned char) UBRR;
    UBRR1H = (unsigned char)(UBRR >> 8);
    //enable rx/tx for USART1 and enable receive interrupt
    UCSR1B = (1<<RXEN1) | (1<<TXEN1) | (1<<RXCIE1);
    //set frame format: 1 stop bit, 8bit data
    UCSR1C = (1<<USBS0) | (3<<UCSZ11);
} //usart_init

void usart_send(uint8_t byte){
    while(!DATA_REG_EMPTY);
    UDR1=byte;  
}//usart_send

Update 1: As a sanity check, I reflashed the Arduino bootloader onto the ATmega2560 and uploaded the SoftwareSerial sketch to verify that I could communicate back and forth between the MySerial Arduino and the ATmega2560. I also double-checked the fuse values I'm using to configure the avrdude external tool that flashes the ATmega2560 from Atmel Studio 7; they are the same ones avrdude uses to flash the Arduino bootloader.

Update 2: I can't figure out how the baud rate generator setting could be wrong. [The original post included a screenshot of the datasheet's baud rate equation, UBRR = F_CPU/(16 × BAUD) − 1, which is the formula I'm using to define UBRR.] I've also been going through trial and error on a different program that just uses polling, with an init function identical to what's in the datasheet. I even held a stopwatch next to the serial terminal, and the garbage comes in every second (matching the 1 s delay in my loop). I can't change the baud rate since 9600 is a requirement, and I'm at my wits' end: I agree that garbage data usually means a baud rate issue, but nothing in my code suggests I'm setting the baud rate incorrectly.
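
Working through that macro by hand at 16 MHz and 9600 baud shows the UBRR arithmetic itself checks out (plain integer division, using the UBRR macro defined in the code above):

UBRR = F_CPU/(16*BAUD) - 1
     = 16000000/153600 - 1
     = 104 - 1             (104.17 truncated by integer division)
     = 103

actual baud = F_CPU/(16*(UBRR+1))
            = 16000000/(16*104)
            ≈ 9615         (about 0.16% error, well within UART tolerance)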

Best Answer

So I fixed the issue and can now send data. I had configured UCSR1C wrong because I didn't understand how the bitwise left-shift operator works when you're shifting a multi-bit value.
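
To spell out the mistake (a walk-through based on the UCSR1C bit positions in the ATmega2560 datasheet: UCSZ10 = 1, UCSZ11 = 2, USBS1 = 3):

// What the original line actually evaluates to:
UCSR1C = (1<<USBS0) | (3<<UCSZ11);
//     = (1<<3)     | (3<<2)        // USBS0 has the same numeric value (3) as USBS1
//     = 0b00001000 | 0b00001100
//     = 0b00001100
// That sets USBS1 (2 stop bits) and UCSZ11 but not UCSZ10 (7-bit data),
// so the frames were 7 data bits + 2 stop bits instead of 8N1 -- hence the garbage.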

I commented out the line that sets the UCSR1C register so it stayed at its default, which is what I wanted in the first place, and it started working.

I experimented with other ways of setting it to help my understanding, and the two that worked for hard-coding the default config were:

  • UCSR1C = 0b00000110
  • UCSR1C = (1 << UCSZ11) | (1 << UCSZ10)
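
For completeness, here is the init function from the question with only that one line changed (everything else as in the code above); this gives the intended 8N1 frame format:

void usart_init(void) {
    //set baud rate (UBRR = 103 for 9600 baud at 16 MHz)
    UBRR1L = (unsigned char) UBRR;
    UBRR1H = (unsigned char)(UBRR >> 8);
    //enable RX/TX for USART1 and the receive-complete interrupt
    UCSR1B = (1<<RXEN1) | (1<<TXEN1) | (1<<RXCIE1);
    //frame format 8N1: 8 data bits (UCSZ11:10 = 11, UCSZ12 = 0), no parity, 1 stop bit
    UCSR1C = (1<<UCSZ11) | (1<<UCSZ10);
} //usart_init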

FRob was really close and pointed me toward the location of the issue, and I thank him for that.