ASCII value error during communication (UART) in AVR


I am trying to implement the following program on an ATmega328P.

    #define F_CPU 1000000UL
    #include <avr/io.h>
    #include <util/delay.h>

    #define BAUD 9600
    #define BSR ((F_CPU / 16 / BAUD) - 1)

    void initUart(void)
    {
        UBRR0H = (BSR >> 8);                     // set the baud rate
        UBRR0L = BSR;
        UCSR0B = (1 << TXEN0) | (1 << RXEN0);    // enable the transmitter and receiver
        UCSR0C = (1 << UCSZ01) | (1 << UCSZ00);  // frame: 8 data bits, 1 stop bit
    }

    void sendByte(uint8_t data)
    {
        do {} while (!(UCSR0A & (1 << UDRE0)));  // wait until the transmit buffer is empty
        UDR0 = data;
    }

    uint8_t receiveByte(void)
    {
        do {} while (!(UCSR0A & (1 << RXC0)));   // wait until a byte has been received
        return UDR0;
    }

    int main(void)
    {
        char share;
        DDRB = 0xff;                             // PORT B drives the LEDs
        initUart();
        while (1)
        {
            share = receiveByte();
            PORTB = share;                       // show the received byte on the LEDs
        }
        return 0;
    }

What this program basically does is take a character from the terminal (I tried Bray's Terminal and Tera Term), display it in the terminal, and show the character's ASCII value on eight LEDs connected to PORT B.

I find it hard to state my problem in words, so please bear with me. The problem is as follows:

When I send 'a', I expect the LEDs to light up as 0b01100001, but they show 0b11100001. The highest clear bit of the upper nibble ends up set, which is not what I expect.

I send 'a' but receive 'á' (0b11100001). Similarly, when I send '0' (0b00110000), I receive 'p' (0b01110000). Whenever I send something, the highest clear bit in the upper nibble gets set, so both the terminal and the LEDs show the wrong character.

I feel like something goes wrong when the chip receives the character from the terminal, but I can't figure out what exactly.

Best Answer

A 1 MHz clock cannot achieve a 9600 bps rate with 16x oversampling: the closest rate the baud-rate generator can produce is about 10417 bps, an error of roughly 8.5 %, which is far outside the receiver's tolerance and corrupts the sampled bits. Use a faster clock, a slower baud rate, or set the U2X0 bit to enable 8x oversampling.