I'm asking this basic question for my general knowledge.
Why do serial protocols often send the least significant bit first?
For example, in UART, the data byte 10101100b (0xAC) is sent bit-reversed, so read left to right in time it appears on the wire as 00110101b (0x35).
What might the protocol designers have considered?
Electronic – Asynchronous serial bit transmission/reception
serial uart
Best Answer
If you are building up a number in binary, the 'right-hand' bit is the least significant, and the receive process usually counts 'up' the index, or 'bit position', moving left toward the most significant bit. This is of course entirely related to the "endianness" of the system architecture, which determines whether the 7th or the zeroth bit is treated as most significant.
Pseudocode for the receiver shift register of a UART might be:
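A minimal C sketch of that receive loop, assuming the start bit has already been detected and that a hypothetical helper `sample_bit()` reads the RX line at the middle of each bit period:

```c
#include <stdint.h>
#include <stdbool.h>

/* Assemble one data byte LSB-first, the way a UART receiver does.
   sample_bit() is a hypothetical hook that returns the RX pin level
   sampled at mid-bit time; num_bits is the frame's data-bit count. */
uint8_t uart_receive_byte(bool (*sample_bit)(void), int num_bits)
{
    uint8_t data = 0;
    for (int i = 0; i < num_bits; i++) {
        if (sample_bit())
            data |= (uint8_t)(1u << i);  /* first bit on the wire lands in bit 0 */
    }
    return data;  /* here a real UART would set its "data ready" flag */
}
```

Because the bit index only ever counts up, the first bit received naturally settles into the lowest bit position, which is exactly the LSB-first ordering the question describes.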
After the x data bits (commonly 8, though serial frames can carry anywhere from 5 to 9), the process finishes and sets a flag telling the microcontroller to read the receive data register.
All of this is followed, of course, by 1 or 2 stop bits, and then the data line is held at the idle level (high, the "mark" state, for standard UART) until the next start bit is detected.
The same style of thinking is used when bit-banging a software serial port: as the pin level changes, increase the bit index and set or clear that bit in a "received byte" variable. It's essentially based on the endianness of the system (little or big endian) and the fact that programmers like 0-indexing and starting at the low end of a variable. Hardware UART transmit shift registers therefore shift data out "backwards" from what you might expect, as you noted in your question.
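The transmit side of a bit-banged software serial port can be sketched the same way; here `set_pin()` and `delay_one_bit()` are hypothetical hardware hooks standing in for a GPIO write and a one-bit-period delay:

```c
#include <stdint.h>

/* Bit-bang one 8N1 UART frame: start bit (low), 8 data bits sent
   LSB first, then one stop bit (high). set_pin() drives the TX line;
   delay_one_bit() waits one bit period. Both are hypothetical hooks. */
void uart_send_byte(uint8_t data,
                    void (*set_pin)(int level),
                    void (*delay_one_bit)(void))
{
    set_pin(0);                      /* start bit pulls the idle-high line low */
    delay_one_bit();
    for (int i = 0; i < 8; i++) {
        set_pin((data >> i) & 1);    /* bit 0 goes out first (LSB first) */
        delay_one_bit();
    }
    set_pin(1);                      /* stop bit returns the line to idle high */
    delay_one_bit();
}
```

Note that the loop simply shifts `data` right and masks off bit 0 each time, so the transmitter needs no knowledge of the frame's width beyond the loop count; that simplicity is one practical reason LSB-first is convenient in both hardware shift registers and software loops.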