Electronic – How to isolate an ATTiny from an I2C slave so ICSP will work

attiny, i2c, icsp

I'm building a small circuit to blink some LEDs with an ATTiny. I'm planning to use a 16-channel LED driver and talk to it from the ATTiny over I2C.

I'd like to use an SMD ATTiny and add some test pads for programming the microcontroller. I'm concerned that the I2C interface to the LED driver may prevent ICSP from working correctly.

I'm considering putting a diode on the ICSP power pin so that the programmer won't power the LED driver. I'm thinking that without power, the LED driver's I2C pins are probably high-Z.

I could also use a transistor to turn off the LED driver.

First, would this even work? Second, should I use a diode or a transistor? Third, what kind of diode or transistor should I use?

Here's where I'm headed...

(schematic of the proposed circuit)

Basically, I was thinking I could use the 4th test pad and a transistor to switch off power to the rest of the circuit.

Thanks a lot!!

Best Answer

When programming the chip over ISP, just make sure you only update the MOSI line while SCK is LOW.

Any valid i2c message must begin with a start condition: a falling edge on the data line while the clock line is high.

(diagram of the i2c start condition, from https://i2c.info/i2c-bus-specification)

The i2c slave state machine on the LED driver will never see a start and therefore should ignore all of the ISP programming operations happening on those pins.
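To make the rule concrete, here is a minimal sketch of the two conditions an i2c slave watches for (plain C; the function names are my own, not from any library):

    #include <stdbool.h>

    /* A start is SDA falling while SCL is high; a stop is SDA rising
       while SCL is high. Everything else is just data or clock. */
    bool is_i2c_start(bool sda_prev, bool sda_now, bool scl) {
      return scl && sda_prev && !sda_now;  /* falling edge on SDA */
    }

    bool is_i2c_stop(bool sda_prev, bool sda_now, bool scl) {
      return scl && !sda_prev && sda_now;  /* rising edge on SDA */
    }

A programmer that only moves MOSI (shared with SDA) while SCK (shared with SCL) is low can never make either condition true, so the slave stays idle.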

So for this to work, you need the programmer that is driving the ISP MOSI and SCK lines to obey this "no i2c start" rule. My guess is that programmers do this naturally: in the AVR serial programming protocol it doesn't make sense to change the MOSI line while the SCK line is high, since the chip could be sampling the data value during that time. But it's easy enough to confirm empirically!

For example, the ArduinoISP sketch (which is a software ISP programmer) only changes the MOSI line when the SCK line is low...

    uint8_t transfer (uint8_t b) {
      for (unsigned int i = 0; i < 8; ++i) {
        // MOSI only changes here, while SCK is still low...
        digitalWrite(PIN_MOSI, (b & 0x80) ? HIGH : LOW);
        digitalWrite(PIN_SCK, HIGH); // ...then SCK rises while MOSI holds steady
        delayMicroseconds(pulseWidth);
        b = (b << 1) | digitalRead(PIN_MISO); // shift in the reply bit
        digitalWrite(PIN_SCK, LOW); // slow pulse
        delayMicroseconds(pulseWidth);
      }
      return b;
    }

https://github.com/arduino/Arduino/blob/master/build/shared/examples/11.ArduinoISP/ArduinoISP/ArduinoISP.ino#L201
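If you want to confirm this empirically on your own board, a passive sniffer on the shared lines will do. Here's a rough sketch of the idea (the pin numbers are arbitrary, and simple polling with digitalRead only keeps up with slow bit-banged programmers; a logic analyzer is the more reliable tool at higher clock rates):

    // Hypothetical sniffer: watch the shared SDA/SCL (MOSI/SCK) lines
    // during ISP programming and report any i2c start condition.
    const int PIN_SDA = 2;  // tap onto the MOSI/SDA net
    const int PIN_SCL = 3;  // tap onto the SCK/SCL net

    void setup() {
      pinMode(PIN_SDA, INPUT);
      pinMode(PIN_SCL, INPUT);
      Serial.begin(115200);
    }

    void loop() {
      static int sdaPrev = HIGH;
      int sda = digitalRead(PIN_SDA);
      int scl = digitalRead(PIN_SCL);
      // Start condition: SDA falls while SCL is high.
      if (scl == HIGH && sdaPrev == HIGH && sda == LOW) {
        Serial.println("i2c start seen!");
      }
      sdaPrev = sda;
    }

If nothing prints over a full programming cycle, the programmer is respecting the rule.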

My guess is that it is no accident that this happens to work. The serial programming protocol was likely intended to be compatible with i2c slaves that happen to be hanging off these same pins.