Quick Intro to I2C
Along with USART and SPI, I2C is one of the most common interfaces a microcontroller uses to communicate with peripherals. To implement an I2C bus all you need is two open-collector (or open-drain) pins, one for the SCL (clock) line and one for the SDA (data) line. They have to be open-collector because there are times during the protocol when two devices drive the clock line at the same time, which could cause a short circuit if one device drove it high while the other drove it low. This way, the bus lines are high by default thanks to the pull-up resistors – if a device wants a line to go low, it just shorts it to ground via an internal transistor. There is no path from VCC to GND that does not contain a high-valued resistor.
So if a microcontroller manufacturer wants to let you use the I2C protocol without external transistors, they must at least provide you with two open-drain IOs. This is exactly what Atmel does in, for example, the ATtiny85. What they don't do, however, is provide you with any useful hardware to manage the bus transactions; instead they provide their Universal Serial Interface (USI), which is pretty useless for I2C.
What do I want from my I2C bus hardware peripheral?
If all I had was two open-drain pins and no hardware peripheral, my microcontroller would have to baby-sit the bus: set the clock high, wait, set the clock low, wait, update the data line, wait, over and over again just to send data, which leaves no time to do anything else. This is obviously undesirable, but it may be acceptable if your microcontroller isn't doing anything else, or anything time-critical.
What I want is for my microcontroller to be able to tell the peripheral, “Go send <message> on the bus, and interrupt me if you hit any errors or you finish – I’ll be doing something else”. This way, no clock cycles are wasted and we can focus on more important tasks – this means the process is non-blocking because it does not block the processor from doing other things.
So what does the USI provide?
The USI provides multiple features, most of which have caveats making them unsuitable for I2C.
The USIDR Shift Register
The USIDR can be loaded with data which can then be shifted, one bit at a time, onto the data bus. This means the processor doesn't have to store this data in normal memory and then extract the most significant bit, put it on the bus and shift the data left manually every clock cycle. Great, we are saving memory and clock cycles. But wait, the USIDR register is only 8 bits long and the most basic I2C transaction requires 18 bits! So we can put 8 bits into the register, but we still have to store at least 10 more in memory. In fact, reading a single byte from many devices requires clocking 36 bits (not including start/stop conditions) – see the register read sequence in the TVP7002 datasheet.
<sarcasm> Great, we are saving 8 bits of memory </sarcasm>
The 4-bit Counter
A 4-bit counter is also available, which is incremented each time a new bit is shifted onto the data bus. This is good because it saves us the effort of incrementing our own counter in memory each time we shift data onto the bus. Oh, but wait, it can only count to 16 because it's only 4 bits long… Didn't I just say that the minimum useful I2C transaction is 18 bits?
Automatic Clocking on Timer Interrupt
If we want our processor to keep processing while the I2C transaction occurs, something else needs to keep time for us. Luckily, the USI can be clocked by a hardware counter in the chip: each time the counter reaches a certain value it resets, and the next bit is shifted onto the bus. But it doesn't also toggle the clock! So the processor has to catch the counter interrupt and manually toggle the clock line, at which point we might as well be shifting the data bits ourselves too – but at least everything is now interrupt-driven, right?
But remember, the I2C protocol insists that the data line can only change while the clock line is low. The clock line and the data line therefore need to toggle on alternating cycles, out of phase with each other. So we either need a second counter, 180 degrees out of phase with the data counter (hard to do and wasteful), or we just manually toggle everything ourselves on every second counter interrupt.
Suddenly we find ourselves spending a significant number of clock cycles on each interrupt. Given that I2C often runs at 100kHz, and we need the timer to trigger 4 times per clock cycle to catch all the edges on the data line and clock line, we need to interrupt 400,000 times a second. An ATtiny's internal oscillator is only 8MHz, so this gives us just 20 clock cycles per interrupt. Even at the maximum clock speed of 20MHz we only get 50 clock cycles per interrupt – not a lot if you want to achieve anything else.
With all of these things combined, the USI saves you practically no clock cycles or memory. This is probably why every implementation of I2C using the USI that I've seen is a blocking implementation – which is fine in many cases.
To be fair, the USI does provide a means of detecting stop and start conditions which may be useful if you are writing an I2C slave rather than a master – but if you’re the master you generate these conditions yourself anyway.
Although the datasheet never says outright that the USI is meant to implement an I2C bus, it heavily hints at it by talking about "Two-wire Synchronous Data Transfer". I wish they would make it clearer that it isn't really suitable.