Character Encoding – Impact of 8-bit, 16-bit, and 32-bit Characters

character encoding, unicode

Well, I am reading Programming Windows with MFC, and I came across Unicode and ASCII characters. I understand the point of using Unicode over ASCII, but what I don't get is how and why it matters whether a character is 8-bit, 16-bit, or 32-bit. What good does it do the system? How does the operating system's processing differ for characters of different bit widths?

My question here is: what does it mean for a character to be an x-bit character?

Best Answer

It relates to the number of distinct letters, digits, and symbols a character set can represent. An 8-bit character can take only 256 possible values, a 16-bit character 65,536, and a 32-bit character 4,294,967,296. A set that large can store every character in every writing system in the world, whereas an 8-bit character set only stores enough characters for English and a handful of other languages. 8-bit character sets were the preferred standard in the early days of computing, when memory was measured in bytes or, at best, kilobytes. But with multi-core processors and gigabytes of RAM it is not such a concern anymore (except in some rare cases).
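A quick C++ sketch (not from the book, just an illustration) that makes the arithmetic concrete: it prints how many distinct values an 8-, 16-, and 32-bit code unit can hold, and shows that characters outside the ASCII range simply need wider fixed-width types (`char16_t`, `char32_t`) to store their code points.

```cpp
#include <cstdint>
#include <iostream>

int main() {
    // Number of distinct values an n-bit code unit can hold: 2^n.
    std::cout << "8-bit : " << (1ULL << 8)  << " possible values\n"; // 256
    std::cout << "16-bit: " << (1ULL << 16) << " possible values\n"; // 65,536
    std::cout << "32-bit: " << (1ULL << 32) << " possible values\n"; // 4,294,967,296

    // The same idea from the type system's point of view:
    char     a = 'A';            // U+0041 fits comfortably in 8 bits (ASCII)
    char16_t b = u'\u00E9';      // 'é' (U+00E9) needs more than 7 bits, fits in 16
    char32_t c = U'\U0001F600';  // emoji (U+1F600) is beyond 16 bits, needs 32
    std::cout << sizeof(a) << ", " << sizeof(b) << ", " << sizeof(c)
              << " bytes per character\n"; // typically prints: 1, 2, 4
    return 0;
}
```

The trade-off the answer describes falls out directly: the wider the code unit, the more of the world's characters you can represent directly, at the cost of more memory per character.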
