65.12 I/O Calibration

The need for output impedance calibration arises at higher data rates. As the data rate increases, transmission line effects can cause undershoot and overshoot, degrading signal quality.

To avoid these transmission problems, an I/O calibration cell is used to adjust the output impedance of the driven I/Os.

The I/O calibration sequence is mandatory when one of the SD/SDIO UHS-I modes (SDMMC_HC2R.VS18EN = 1) or e.MMC HS200 mode (HS200EN = 0xB) is selected. It must be performed periodically to compensate for output impedance drift. Once the calibration is complete, the I/O calibration cell provides two 4-bit control words (CALP[3:0] and CALN[3:0] in the Calibration Control Register (SDMMC_CALCR)) that tune the output impedance and thus achieve the best transmission performance.
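
As an illustration only, a minimal C sketch of reading back the CALP[3:0] and CALN[3:0] control words is shown below; the register address and field positions are assumptions, as this section names the fields but not their layout.

    #include <stdint.h>

    /* Hypothetical address and field positions for SDMMC_CALCR;
     * only the field names CALP and CALN come from this section. */
    #define SDMMC_CALCR          (*(volatile uint32_t *)0xF8000240u)
    #define SDMMC_CALCR_CALN_Pos 16u
    #define SDMMC_CALCR_CALP_Pos 24u

    /* Read the 4-bit output impedance codes provided by the calibration cell. */
    static void sdmmc_read_calibration(uint8_t *calp, uint8_t *caln)
    {
        uint32_t calcr = SDMMC_CALCR;

        *calp = (uint8_t)((calcr >> SDMMC_CALCR_CALP_Pos) & 0xFu);
        *caln = (uint8_t)((calcr >> SDMMC_CALCR_CALN_Pos) & 0xFu);
    }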

The I/O calibration sequence can be started manually by writing a ‘1’ to SDMMC_CALCR.EN. This bit is cleared automatically at the end of the calibration.
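
A minimal C sketch of the manual start and completion polling, assuming a hypothetical register address and bit position for SDMMC_CALCR.EN (only the field name comes from this section):

    #include <stdint.h>

    #define SDMMC_CALCR    (*(volatile uint32_t *)0xF8000240u) /* hypothetical address */
    #define SDMMC_CALCR_EN (1u << 0)                           /* hypothetical bit position */

    /* Start a manual I/O calibration sequence and wait for it to complete. */
    static void sdmmc_start_manual_calibration(void)
    {
        /* Writing a '1' to EN starts the calibration. */
        SDMMC_CALCR |= SDMMC_CALCR_EN;

        /* EN is cleared by hardware at the end of the calibration. */
        while (SDMMC_CALCR & SDMMC_CALCR_EN) {
            /* busy-wait; a timeout may be added in real driver code */
        }
    }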

The I/O calibration sequence can also be performed automatically if SDMMC_CALCR.TUNDIS is cleared. In this case, the calibration starts automatically at the beginning of the tuning procedure when writing a ‘1’ to SDMMC_HC2R.EXTUN.
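
A minimal C sketch of the automatic path, again with hypothetical addresses and bit positions for TUNDIS and EXTUN:

    #include <stdint.h>

    #define SDMMC_CALCR        (*(volatile uint32_t *)0xF8000240u) /* hypothetical */
    #define SDMMC_CALCR_TUNDIS (1u << 12)                          /* hypothetical */
    #define SDMMC_HC2R         (*(volatile uint16_t *)0xF800003Eu) /* hypothetical */
    #define SDMMC_HC2R_EXTUN   (1u << 6)                           /* hypothetical */

    /* Run the tuning procedure with automatic I/O calibration. */
    static void sdmmc_execute_tuning_with_auto_calibration(void)
    {
        /* Keep TUNDIS cleared so calibration runs automatically. */
        SDMMC_CALCR &= ~SDMMC_CALCR_TUNDIS;

        /* Writing a '1' to EXTUN starts the tuning procedure; calibration
         * is triggered at its beginning because TUNDIS is cleared. */
        SDMMC_HC2R |= SDMMC_HC2R_EXTUN;
    }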

The I/O calibration cell requires a startup time, defined by SDMMC_CALCR.CNTVAL. CNTVAL must therefore be configured before starting the calibration sequence. If SDMMC_CALCR.ALWYSON is set to ‘1’, the startup time is required only for the first calibration sequence, as the analog circuitry is not shut down at the end of the calibration. To reduce power consumption, the analog circuitry can be shut down at the end of the calibration sequence by clearing ALWYSON. In this case, the startup time applies each time a calibration sequence is started.
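
A minimal C sketch of the startup time configuration, with hypothetical field positions for CNTVAL and ALWYSON and a caller-supplied CNTVAL value:

    #include <stdbool.h>
    #include <stdint.h>

    #define SDMMC_CALCR            (*(volatile uint32_t *)0xF8000240u)  /* hypothetical */
    #define SDMMC_CALCR_ALWYSON    (1u << 4)                            /* hypothetical */
    #define SDMMC_CALCR_CNTVAL_Pos 8u                                   /* hypothetical */
    #define SDMMC_CALCR_CNTVAL_Msk (0xFFu << SDMMC_CALCR_CNTVAL_Pos)    /* hypothetical */

    /* Program the calibration cell startup time and its power mode
     * before any calibration sequence is started. */
    static void sdmmc_configure_calibration(uint32_t cntval, bool always_on)
    {
        uint32_t calcr = SDMMC_CALCR;

        calcr &= ~SDMMC_CALCR_CNTVAL_Msk;
        calcr |= (cntval << SDMMC_CALCR_CNTVAL_Pos) & SDMMC_CALCR_CNTVAL_Msk;

        if (always_on)
            calcr |= SDMMC_CALCR_ALWYSON;   /* analog cell kept on: startup time only once */
        else
            calcr &= ~SDMMC_CALCR_ALWYSON;  /* analog cell shut down after each calibration */

        SDMMC_CALCR = calcr;
    }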