The device I am developing is an SD card storage add-on for a retro computer. The card is a SanDisk Ultra II 1.0 GB in an SD card socket (I have no idea whether the brand is significant; it just happens to be what I have).
The SD card is the only device on the SPI bus; a level shifter ensures the card receives its signals at 3.3 V, and MISO feeds back through a bus switch to a 3.3 V FPGA.
The FPGA implements a clocked shift register that runs for eight clocks per byte, shifting output data out on MOSI and capturing input data from MISO.
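In software terms, the per-byte exchange the FPGA performs is equivalent to this model (a sketch only; MSB-first, SPI mode 0 with sampling on the rising edge is my assumption about the setup, and `spi_exchange_model` is an illustrative name, not my real code):

```c
#include <stdint.h>

/* Software model of one 8-clock SPI byte exchange, MSB first,
 * mode 0 (CPOL=0, CPHA=0): both sides present their next bit while
 * SCK is low and sample on the rising edge.  `card_reg` models the
 * byte the card is shifting out on MISO. */
static uint8_t spi_exchange_model(uint8_t mosi_byte, uint8_t *card_reg)
{
    uint8_t miso_byte = 0;
    for (int i = 0; i < 8; i++) {
        int mosi = (mosi_byte >> 7) & 1;   /* host drives MSB of its register */
        int miso = (*card_reg >> 7) & 1;   /* card drives MSB of its register */
        /* rising edge: both sides sample and shift */
        miso_byte = (uint8_t)((miso_byte << 1) | miso);
        *card_reg = (uint8_t)((*card_reg << 1) | mosi);
        mosi_byte = (uint8_t)(mosi_byte << 1);
    }
    return miso_byte;
}
```

After the eight clocks, the host holds the card's byte and the card holds the host's byte, which is why one extra or one missing clock edge would shift every subsequent bit by one position.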
Many of these devices, attached to many different computers, work fine.
I use SPI mode. I initialise the card, raise the clock to 4 MHz, and send the read single block command (CMD17).
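For reference, every command in SPI mode is a fixed 6-byte frame: 0x40 ORed with the command index, a 32-bit big-endian argument, then a CRC7 byte with the end bit set (only CMD0 and CMD8 strictly need a valid CRC once the card is in SPI mode). A sketch of what I mean by "send the read single block command" (`sd_frame` is an illustrative helper name, not my exact code):

```c
#include <stdint.h>

/* Build the 6-byte SPI-mode SD command frame:
 * 0x40|cmd, 32-bit argument big-endian, then CRC7 plus end bit. */
static void sd_frame(uint8_t buf[6], uint8_t cmd, uint32_t arg, uint8_t crc7)
{
    buf[0] = (uint8_t)(0x40 | cmd);      /* start bit 0, transmission bit 1 */
    buf[1] = (uint8_t)(arg >> 24);
    buf[2] = (uint8_t)(arg >> 16);
    buf[3] = (uint8_t)(arg >> 8);
    buf[4] = (uint8_t)arg;
    buf[5] = (uint8_t)((crc7 << 1) | 1); /* CRC7 shifted left, end bit set */
}
```

For example, CMD0 with argument 0 and CRC7 of 0x4A produces the familiar frame 0x40 00 00 00 00 95.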
I observe that the first data byte returned occasionally has a one-bit corruption. Typically bit 0 is set when it shouldn't be, but I have also seen bit 1 set when it shouldn't be. All the other bytes are always perfectly correct.
Multiple devices were tested against multiple host computers. The problem is only seen with one particular device attached to one particular computer; that same device and that same computer each work fine in other combinations.
I suspected timing errors in my driver code, but I am sure I complete the eight clock cycles (sending 0xFF while reading the first data byte) before I read it.
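To be concrete about the framing I rely on: after CMD17's R1 response the card clocks out some number of 0xFF filler bytes, then the 0xFE data-start token, and only then the 512 data bytes (followed by a 16-bit CRC). A software model of that framing, with an array standing in for successive MISO bytes (a sketch; `read_block_model` is illustrative, not my driver):

```c
#include <stdint.h>
#include <stddef.h>

/* Model of CMD17 data framing: scan successive MISO bytes for the
 * 0xFE data-start token, then copy out `len` data bytes.  Returns
 * the number of data bytes copied, or 0 if no token was seen. */
static size_t read_block_model(const uint8_t *miso, size_t n,
                               uint8_t *out, size_t len)
{
    size_t i = 0;
    while (i < n && miso[i] != 0xFE)   /* skip 0xFF filler before token */
        i++;
    if (i == n)
        return 0;                      /* token never arrived */
    i++;                               /* consume the 0xFE token itself */
    size_t copied = 0;
    while (copied < len && i < n)
        out[copied++] = miso[i++];
    return copied;                     /* the 16-bit CRC would follow */
}
```

My driver does wait for the 0xFE token before treating subsequent bytes as data, so the corrupted byte is the first byte after the token.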
I'm wondering: could this be related to voltage or current levels, which might differ subtly from device to device or computer to computer?
And does an SD card suddenly need to draw extra current just as it starts to output data?