Converting an ASCII string to bit stream

8051, microcontroller, serial

I am doing a project in which I read data from some switches and push buttons on the pins of an 8051 microcontroller. Now I have to transmit this data serially over the port pins to a PC.
At the PC end, I receive the data in a C# .NET application using the SerialPort class.
The issue is that I want the data received in the C# application to be in the form of a bit string instead of ASCII.

Can someone guide me on how this can be achieved?

Best Answer

It's not clear what you are asking, since ASCII sent over a serial line is already a stream of bits.

In any case, especially when you have your own application on the PC, it makes sense to send data in binary over the serial line. ASCII is for humans, but you have two machines communicating with each other. The PC can display the data in any form suitable for the users, but there is no need for that to be anything like the format of the data sent from the microcontroller.

Since the most limited end of the communications link is the microcontroller, use the format that is easiest for it. That means just sending raw bytes. I usually use packets that start with an opcode byte, followed by whatever data is defined for that opcode. That is simple to send and receive on a little micro, with no need for bulky and slow ASCII-to-binary and binary-to-ASCII conversion routines.
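If it helps, here is a minimal sketch of that idea for the micro side, in C. The names uart_send_byte() and OPC_SWITCHES are hypothetical, standing in for whatever UART routine and opcode assignments your firmware actually uses:

    #define OPC_SWITCHES 0x01   /* hypothetical opcode: switch state follows */

    /* Assumed to exist elsewhere in the firmware: writes one raw byte to the
       UART (typically by loading SBUF and waiting for TI on an 8051). */
    void uart_send_byte(unsigned char b);

    /* Send one packet: the opcode byte, then the data defined for that opcode
       (here a single byte holding the current switch/button bits). */
    void send_switch_packet(unsigned char switch_bits)
    {
        uart_send_byte(OPC_SWITCHES);
        uart_send_byte(switch_bits);
    }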

On the PC side you have essentially infinite compute power relative to the speed of the serial line, so it can accommodate any format. However, raw binary is about as easy as it gets there too.
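Receiving such packets on the host can be as simple as a loop that reads the opcode byte and then whatever data that opcode implies. A rough sketch, again in C and assuming a hypothetical get_byte() that blocks until the next serial byte arrives (in the C# application that role would be played by SerialPort.ReadByte()):

    #include <stdint.h>

    /* Hypothetical: blocks until one byte has been read from the serial
       port, then returns it (0..255). */
    extern int get_byte(void);

    /* Hypothetical handler for the switch-state packet. */
    extern void handle_switches(uint8_t bits);

    /* Read packets forever: one opcode byte, then the data it defines. */
    void process_stream(void)
    {
        for (;;) {
            int opcode = get_byte();
            switch (opcode) {
            case 0x01:              /* OPC_SWITCHES: one data byte follows */
                handle_switches((uint8_t)get_byte());
                break;
            default:
                /* Unknown opcode: ignore it, or resynchronize per your protocol. */
                break;
            }
        }
    }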

About the only thing to watch for is not to make implicit assumptions about the byte order the host machine uses for multi-byte values. Define whatever is easiest for the microcontroller, then work with that on the PC end. For example, say the micro stores multi-byte data in least-to-most-significant byte order. Don't do something stupid like defining a union in C and writing the received bytes into byte-field overlays of wider values; that makes your host program machine-dependent. Instead, do the shifting and all will be OK. For example, to assemble a 16-bit quantity in a wider integer, write the first byte into it directly, then OR in the second byte after shifting it left 8 bits. That will always work regardless of the host machine's byte order.
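For instance, a portable way to rebuild a 16-bit value that was sent least-significant byte first might look like this; u16_from_bytes is just an illustrative name, with lo and hi being the two bytes in arrival order:

    #include <stdint.h>

    /* Assemble a 16-bit value from two bytes received least-significant
       byte first. The shift-and-OR gives the same result on little- and
       big-endian hosts, unlike overlaying the bytes in memory. */
    uint16_t u16_from_bytes(uint8_t lo, uint8_t hi)
    {
        return (uint16_t)lo | ((uint16_t)hi << 8);
    }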

As for converting binary values to ASCII, there are various facilities for that in any language. That is a pure language problem and out of scope here, and besides is trivial anyway.