Do SD cards operating at 1.8V save any power compared to 2.7 – 3.6V

low-power sd switch-mode-power-supply switching-regulator

Excuse me if I'm repeating my previous question, but it wasn't getting any answers probably because it was too broad.

So, let me narrow the question down to just UHS SD cards that can operate at 1.8V (internally, they still use a charge pump to raise the voltage for writes).

How much power can you expect to save or lose compared to operating at the standard 2.7 – 3.6V?

I was hoping to use it for my battery-powered wildlife camera, but after some investigation, I'm seeing 2 huge setbacks:

  1. the complexity – all SD cards must be initialized at 2.7 – 3.6V, presumably for backwards compatibility. After that, the host requests 1.8V operation (the S18R bit in ACMD41, answered by S18A, followed by CMD11). Then the signaling voltage to the SD card is lowered to 1.8V. Likewise, the processor needs to either lower its I/O voltage or use a level translator.
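For reference, that negotiation could be sketched like this (Python pseudocode; `send_cmd` and `set_io_voltage` are hypothetical host helpers, and per the SD Physical Layer spec the switch uses the S18R bit of ACMD41 followed by CMD11):

```python
def switch_to_1v8(send_cmd, set_io_voltage):
    """Simplified UHS-I signaling-voltage negotiation (not a real driver)."""
    send_cmd("CMD0")                   # reset; card powers up with 3.3V signaling
    send_cmd("CMD8")                   # SEND_IF_COND: check interface condition
    resp = send_cmd("ACMD41", s18r=1)  # init, S18R = "please switch to 1.8V"
    if not resp.get("s18a"):           # card left S18A clear: 1.8V not accepted
        return False
    send_cmd("CMD11")                  # VOLTAGE_SWITCH
    set_io_voltage(1.8)                # host lowers its I/O rail (or level shifter)
    return True
```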

To me, this is hopelessly complicated, hardware-wise. I did find some chips that conveniently include both a configurable DC-DC converter (to select 1.8V or 3V) and level shifters:

ST6G3244ME

Similar: NXP IP4855CX25

But I don't want to use another chip since I'm only working on this as a hobby and am already very constrained by my 1 layer PCB.

Also note that the voltage converter on that chip is an inefficient linear regulator, which brings us to the next issue.

  2. It seems using 1.8V will consume the same or even more power than 2.7V, due to voltage-conversion losses.

Why? Clearly, 1.8V will reduce the I/O power between the card and the processor. But that requires either an extra regulator to step 2.7V down to 1.8V, or a single regulator that supplies 1.8V to both the SD card and the processor (a microcontroller in my case).
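To put a rough number on the first part of that: CMOS bus switching power scales as C·V²·f, so dropping the swing from 3.3V to 1.8V cuts dynamic I/O power to about 30% of what it was (an idealized estimate with made-up numbers, ignoring static pull-ups and driver overhead):

```python
def dynamic_power(c_farads, v_swing, f_hz, activity=1.0):
    """Idealized CMOS switching power: P = a * C * V^2 * f."""
    return activity * c_farads * v_swing**2 * f_hz

# Example: 20 pF of total bus capacitance toggling at 25 MHz (illustrative only)
p_33 = dynamic_power(20e-12, 3.3, 25e6)
p_18 = dynamic_power(20e-12, 1.8, 25e6)
print(p_18 / p_33)  # (1.8/3.3)^2, roughly 0.30
```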

I've looked at some switched-capacitor regulator (charge pump) data sheets, and the efficiency decreases as the absolute difference between the output and input voltage increases. This makes sense: the closer the input and output voltages are, the less the regulator has to charge/discharge its capacitor(s), and it's well known that charging a capacitor from a fixed voltage wastes half the energy.
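That "half the energy" fact is quick to verify: charging a capacitor from 0 to V off a fixed supply draws Q·V = C·V² from the supply but only stores ½·C·V², regardless of the series resistance (component values below are arbitrary):

```python
C, V = 100e-9, 3.3            # e.g. a 100 nF flying capacitor (made-up values)
e_supplied = C * V * V        # Q*V = C*V^2 drawn from the fixed supply
e_stored = 0.5 * C * V * V    # energy actually stored on the capacitor
print(e_stored / e_supplied)  # 0.5: the other half is dissipated in the switch
```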

Therefore, it seems you're actually wasting power by using 1.8V, since the conversion from 3.7V (battery) to 1.8V is now more wasteful. Furthermore, won't the charge pump inside the SD card also become less efficient?
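As a sanity check on the linear-regulator case: an LDO passes the load current straight through, so its best-case efficiency is simply Vout/Vin (illustrative numbers, 3.7V nominal Li-ion assumed):

```python
v_batt = 3.7  # nominal Li-ion cell voltage

def ldo_efficiency(v_out, v_in):
    """Linear regulator: load current passes through, so eta = Vout/Vin."""
    return v_out / v_in

print(round(ldo_efficiency(3.3, v_batt), 2))  # 0.89
print(round(ldo_efficiency(1.8, v_batt), 2))  # 0.49
```

The flip side is that an LDO wastes (Vin − Vout)·Iload, so if the 1.8V load genuinely draws less current (I/O current scales roughly with swing), the battery-side power Vbatt·Iload can still come out lower despite the worse-looking efficiency figure.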

Are there alternate solutions that avoid these problems?
Are there any SD cards that don't require you to start at 2.7 – 3.6V?

If 1.8V doesn't save power, then what's the purpose?

Note: there's 1 externality that I'm considering. Using 1.8V can reduce the conversion losses for the microcontroller, which runs at 1.3V internally. Not sure if it's a net gain.

Best Answer

I think 1.8V signaling was introduced mainly to allow higher clock frequencies, and thus more speed, on the data lines. If I read the specification correctly, VDD itself isn't actually reduced to 1.8V; only UHS-II cards have a separate VDD2 rail at 1.8V. So probably the only benefits you get are higher speeds and slightly lower current consumption because the I/O lines have a smaller voltage swing, but if you need level shifters there won't be a benefit (on the contrary).

3.3V signaling is specified up to 50MHz, whereas 1.8V signaling can go up to 208MHz.

I might be entirely wrong though; SD cards take way too much power to be used in my projects, so I haven't worked with them in detail.