I/O Calibration

HSIO and GPIO have a built-in, per-bank I/O calibration feature (excluding bank 3). The I/O calibration circuitry is completely self-contained and requires no external reference resistors. Calibration optimizes device performance by compensating for process, voltage, and temperature (PVT) variations. The calibration controller achieves impedance control for the GPIO and HSIO output buffer drive, termination, and HSIO slew rate by calibrating the I/O drivers. Calibration is initially performed at power-up, initiated by power-on detectors on the VDDI and VDDAUX power supplies. At power-up, the internal calibration engine initializes the I/Os with internal approximation register settings. On-demand calibration can be invoked by the user after the initial I/O calibration. The PF_INIT_MONITOR FPGA IP is used to control I/O recalibration or to monitor the initial I/O calibration. For more information about calibration requirements for proper start-up, initialization, recalibration, and usage of the PF_INIT_MONITOR module, see the PolarFire FPGA and PolarFire SoC FPGA Device Power-Up and Resets User Guide.
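As an illustration of monitoring the initial calibration, user logic can gate its reset on the monitor's status outputs. The sketch below assumes a generated PF_INIT_MONITOR component that exposes a DEVICE_INIT_DONE output and per-bank calibration status outputs (shown here as BANK_0_CALIB_STATUS); confirm the exact port list in the component that Libero SoC generates for your IP version.

```verilog
// Hypothetical sketch: hold user logic in reset until initial I/O
// calibration completes. Port names (DEVICE_INIT_DONE,
// BANK_0_CALIB_STATUS) are assumptions -- check the generated component.
module io_ready_gate (
    output wire user_resetn   // release user logic when I/Os are ready
);
    wire device_init_done;
    wire bank0_calib_status;

    PF_INIT_MONITOR init_monitor (
        .DEVICE_INIT_DONE    (device_init_done),
        .BANK_0_CALIB_STATUS (bank0_calib_status)
    );

    // User logic stays in reset until device initialization and the
    // bank's I/O calibration are both reported complete.
    assign user_resetn = device_init_done & bank0_calib_status;
endmodule
```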

The ODT and output drive features of HSIO and GPIO are calibrated according to the I/O standard used in a Libero SoC design. The calibration logic is initially in a reset state at power-on. In this pre-calibration state, the device defaults to maximum calibration settings. The maximum settings are applied to the I/Os to ensure that the buffers are functionally operational as soon as power-on is complete.

The buffers use the maximum settings temporarily until the initial startup is complete. The optimized calibration values are then distributed to the associated I/Os within the bank and used for PVT compensation. GPIO and HSIO use the calibrated values for both drive strength and termination strength. The GPIO differential termination is also calibrated, and HSIO buffers are calibrated for output slew rate control.

HSIO and GPIO initially power on with default maximum settings. Maximum pre-calibrated settings are defined as strong drive strength (low output impedance) and low termination values. Because of these initial pre-calibration settings, a transient current occurs on the VDDI of the associated bank during the pre-calibration phase. The transient current poses no long-term reliability concern and diminishes when the device exits the pre-calibration phase.

The initial transient current caused by pre-calibration can be mitigated if it is undesirable in the system. Transient current caused by ODT can be managed using the ODT control capabilities in the I/O (see On-Die Termination (ODT)). The training IP (TIP) normally associated with high-speed DDR interfaces can be used to disable the I/O termination until calibration is complete. For untrained termination interfaces, the ODT_DYN interface can be used to disable the pre-calibrated termination.
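For example, a design using dynamic ODT could keep termination disabled until a calibration-done indication asserts. The signal names in this sketch (calib_done, odt_request, odt_en) are illustrative, not actual IP port names; adapt the gating to the dynamic ODT control of the buffer configuration actually used in your design.

```verilog
// Hypothetical sketch: keep dynamic ODT disabled during the
// pre-calibration phase to avoid the initial transient current on VDDI.
// Signal names are illustrative assumptions.
module odt_gate (
    input  wire calib_done,   // e.g., derived from PF_INIT_MONITOR status
    input  wire odt_request,  // controller request to enable termination
    output wire odt_en        // drives the buffer's dynamic ODT control
);
    // Termination is enabled only after calibration has completed.
    assign odt_en = odt_request & calib_done;
endmodule
```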