This is a tale as old as time. So let’s do it once again, shall we?
SD cards always power up using 3.3V logic to maintain backward compatibility with the older SDIO/SPI standards. However, to go any faster than 25MB/s, the signalling voltage must drop down to 1.8V.
It’s not quite as easy as that might seem, though. To get from default speed mode (which, as the name implies, is the default mode of all SD cards at power-up) to one of the UHS-I modes (i.e. any of the SDR or DDR ones), there must be a handshake confirming that both devices are cool to take it down to 1.8V. Specifically, this is CMD11 plus some timing requirements. This process is detailed in ST RM0433, section 55.8, page 2430.
Most of the details there are firmware-related, so we’re really just interested in the fact that we must a) start at 3.3V logic and then b) drop down to 1.8V at some arbitrary point once commanded. This will require one or more voltage level translators that are chill with both 3.3V ↔ 3.3V operation AND 3.3V ↔ 1.8V operation.
On the STM32H7, there are two SDMMC interfaces, SDMMC1 and SDMMC2. The #1 interface is much more capable, but both can run in UHS-I mode.
<aside> ⚠️
NEED TO REVIEW THIS
</aside>
Critically, SDMMC1 has direction pins, which tell the voltage translator which way data is flowing. Without these pins, you must rely on auto-direction sensing, which can be finicky. The old plan was to use the STM32H743ZI, whose pinout only left SDMMC2 available while the DCMI peripheral was in use, but we’ve decided to use the higher pin-count STM32H743II, which lets both peripherals coexist. However, for timeline safety, I will be implementing both methods in the hopes that at least one will work.
On SDMMC2, there are no direction output pins, so the direction must be inferred by the translator IC itself. We’re using the same circuit as on DevKitSat, which is built around the NXB0108.