Electronic – What’s happening when an LCD is initialized


It seems that the function set command is repeated multiple times, with differing delays. Why can't it be a single function set command with all the delays combined? What is characteristically special about each stage?

[Datasheet excerpt: initialization sequence (image)]

Best Answer

It's all to do with the 4-bit / 8-bit interface switching. It's really quite clever when you think about it.

The 4-bit interface uses only data lines D4-D7 (D0-D3 are unused): each 8-bit instruction is split into two 4-bit nibbles, which are sent in sequence, high nibble first. So, given that, the sequence:

0b00110000
0b00110000
0b00110000

in 4-bit mode says:

0b00110000 // First nibble of "Set to 8 bit mode"
0b00110000 // Second nibble of "Set to 8 bit mode"
// Is now in 8 bit mode
0b00110000 // Set to 8 bit mode (in 8 bit mode).
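
(As a side note, this is roughly what a normal 4-bit transfer looks like in code once everything is set up. This is only a sketch; lcd_write_nibble() is a hypothetical helper that puts a 4-bit value on D4-D7 and pulses the E line.)

#include <stdint.h>

extern void lcd_write_nibble(uint8_t nibble);  /* hypothetical: drive D4-D7, pulse E */

/* Send one full byte over the 4-bit interface as two nibble writes. */
void lcd_send_byte(uint8_t value)
{
    lcd_write_nibble(value >> 4);    /* high nibble first */
    lcd_write_nibble(value & 0x0F);  /* then low nibble */
}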

Now, if the controller happens to be halfway through a nibble pair when it is reinitialized (say the MCU reboots mid-transfer), the same sequence is interpreted as this:

0b00110000 // Second nibble of previous command
0b00110000 // First nibble of "Set to 8 bit mode"
0b00110000 // Second nibble of "Set to 8 bit mode"
// Is now in 8 bit mode

However, if it's in 8 bit mode, the sequence is interpreted as:

0b00110000 // Set to 8 bit mode (in 8 bit mode).
// Is now in 8 bit mode
0b00110000 // Set to 8 bit mode (in 8 bit mode).
// Is now in 8 bit mode
0b00110000 // Set to 8 bit mode (in 8 bit mode).
// Is now in 8 bit mode

So no matter what mode or state the display is in, it receives the "set to 8 bit mode" instruction properly.

The delays are there to let each command finish executing, leaving enough time for the longest possible instruction that might still be running from before the reset.
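
To make that concrete, here is a minimal sketch of the initialization-by-instruction sequence for a 4-bit hookup. The delay values follow the usual HD44780 datasheet figures (more than 4.1 ms after the first write, more than 100 microseconds after the second); lcd_write_nibble(), delay_ms() and delay_us() are hypothetical platform helpers, not part of any particular library.

#include <stdint.h>

extern void lcd_write_nibble(uint8_t nibble);  /* hypothetical: drive D4-D7, pulse E */
extern void delay_ms(unsigned int ms);         /* hypothetical busy-wait delay */
extern void delay_us(unsigned int us);         /* hypothetical busy-wait delay */

void lcd_init_4bit(void)
{
    delay_ms(40);            /* wait for the controller after power-on */

    lcd_write_nibble(0x3);   /* "8 bit mode" - may land as 1st or 2nd nibble of a stale command */
    delay_ms(5);             /* > 4.1 ms */

    lcd_write_nibble(0x3);   /* "8 bit mode" again */
    delay_us(150);           /* > 100 us */

    lcd_write_nibble(0x3);   /* by now the controller is definitely in 8 bit mode */
    delay_us(150);

    lcd_write_nibble(0x2);   /* now switch to 4 bit mode */
    delay_us(150);

    /* From here on, full commands are sent as two nibbles each:
     * function set, display on/off, clear display, entry mode, etc. */
}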