1.2 Signal Impedance and Acquisition Time
Input signals to the ADC are measured in two stages(1) – an acquisition stage and a conversion stage. During the acquisition stage, the input is connected to an internal Sample-and-Hold (S/H) capacitor inside the ADC. During the conversion stage, the capacitor is disconnected from the input and internally connected to the ADC's conversion circuit.
To measure an input signal correctly, the internal S/H capacitor must be charged to within 0.5 Least Significant Bits (LSbs) of the input voltage. The value of one LSb is the reference voltage divided by the total number of output codes, 2^n, where n is the number of bits, as shown below.

LSb = VREF / 2^n
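As a quick illustration, the following sketch computes the LSb size and the resulting 0.5 LSb error budget, assuming a 12-bit ADC with a 3.3 V reference (placeholder values, not taken from any particular device):

```c
#include <stdio.h>

/* Assumed example values - substitute the resolution and reference of the actual device. */
#define ADC_BITS    12u     /* number of valid bits, n        */
#define VREF_VOLTS  3.3     /* ADC reference voltage in volts */

int main(void)
{
    /* One LSb equals the reference voltage divided by the number of codes (2^n). */
    double lsb = VREF_VOLTS / (double)(1uL << ADC_BITS);

    /* The S/H capacitor must settle to within half of one LSb. */
    printf("LSb size     = %.6f V\n", lsb);
    printf("Error budget = %.6f V (0.5 LSb)\n", 0.5 * lsb);
    return 0;
}
```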
However, if not all bits are needed, the number of valid bits (n) can be set to a lower value. This reduces the required acquisition time (TACQ)(2). To find the minimum TACQ, the input to the ADC can be modeled as an RC low-pass filter, as shown in the figure below.
With an RC approximation, the voltage on the S/H capacitor follows an exponential charging curve, given by the following formulas(3):

VCAP(t) = VIN - VDIFF * e^(-t / (R*C))
VERR = VDIFF * e^(-TACQ / (R*C))
Where:
- R is the sum of the source impedance and the internal ADC impedance
- C is the S/H capacitance inside the ADC
- t is time
- VIN is the input voltage
- VPREV is the initial voltage on the capacitor
- VDIFF = VIN - VPREV is the voltage change the capacitor must charge through
- VERR is the remaining error voltage at the end of acquisition, which must be no more than 0.5 LSb
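As a rough illustration of this model, the sketch below evaluates the remaining error VERR at multiples of the RC time constant, using assumed placeholder values for the resolution, reference, impedance, capacitance and voltage step (none of them come from a specific device):

```c
#include <stdio.h>
#include <math.h>

#define ADC_BITS     12u      /* assumed number of valid bits, n              */
#define VREF_VOLTS   3.3      /* assumed reference voltage                    */
#define R_OHMS       10e3     /* assumed source + internal ADC impedance      */
#define C_FARADS     10e-12   /* assumed S/H capacitance                      */
#define VDIFF_VOLTS  3.3      /* assumed step: VIN - VPREV (worst case: VREF) */

int main(void)
{
    double half_lsb = 0.5 * VREF_VOLTS / (double)(1uL << ADC_BITS);
    double tau = R_OHMS * C_FARADS;   /* RC time constant */

    /* Evaluate VERR = VDIFF * exp(-t / RC) at a few multiples of the time constant. */
    for (int k = 1; k <= 12; k++) {
        double t = (double)k * tau;
        double verr = VDIFF_VOLTS * exp(-t / tau);
        printf("t = %2d * RC: VERR = %.6f V %s\n",
               k, verr, (verr <= half_lsb) ? "(within 0.5 LSb)" : "");
    }
    return 0;
}
```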
Solving for VERR = 0.5 LSb, the acquisition time can be estimated(3):

TACQ = R*C * ln(VDIFF / VERR) = R*C * ln(2^(n+1) * VDIFF / VREF)
The worst-case acquisition time occurs when VDIFF = VREF, or:

TACQ(worst case) = R*C * ln(2^(n+1)) = R*C * (n + 1) * ln(2)
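Combining the expressions above, the sketch below estimates the minimum acquisition time for an assumed set of impedance, capacitance, resolution and reference values (all placeholders), including the worst case VDIFF = VREF:

```c
#include <stdio.h>
#include <math.h>

#define ADC_BITS     12u      /* assumed number of valid bits, n         */
#define VREF_VOLTS   3.3      /* assumed reference voltage               */
#define R_OHMS       10e3     /* assumed source + internal ADC impedance */
#define C_FARADS     10e-12   /* assumed S/H capacitance                 */

/* Minimum acquisition time: TACQ = R*C * ln(VDIFF / VERR), with VERR = 0.5 LSb. */
static double acquisition_time(double vdiff)
{
    double verr = 0.5 * VREF_VOLTS / (double)(1uL << ADC_BITS);
    return R_OHMS * C_FARADS * log(vdiff / verr);   /* valid only for vdiff > verr */
}

int main(void)
{
    /* Example: the input changed by 1.0 V since the previous sample (assumed). */
    printf("TACQ (VDIFF = 1.0 V) : %.1f ns\n", acquisition_time(1.0) * 1e9);

    /* Worst case VDIFF = VREF, i.e. TACQ = R*C * (n + 1) * ln(2). */
    printf("TACQ (worst case)    : %.1f ns\n", acquisition_time(VREF_VOLTS) * 1e9);
    return 0;
}
```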
The device data sheet gives information about the ADC timings.
Notes:
1. More stages may occur before acquisition and conversion when using features such as Capacitive Voltage Division (CVD), a Programmable Gain Amplifier (PGA) or Double Sampling. The time delays associated with each stage can be added to find the total sampling time, as illustrated in the sketch after these notes.
2. On some ADCs, the number of bits generated can be reduced, which may shorten the conversion time (TCNV).
3. Valid only if VDIFF > VERR; otherwise, the S/H capacitor is already charged to within 0.5 LSb of the input.
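As a minimal sketch of Note 1, the example below adds assumed per-stage delays to the acquisition and conversion times to obtain a total sampling time; the stage names and every timing value are illustrative assumptions, not data sheet figures:

```c
#include <stdio.h>

/* Assumed per-stage delays in nanoseconds (placeholders, not data sheet values). */
#define T_PRECHARGE_NS   200.0   /* e.g. CVD precharge stage, if used             */
#define T_PGA_NS         500.0   /* e.g. PGA settling stage, if used              */
#define T_ACQ_NS         901.0   /* worst-case acquisition time estimated above   */
#define T_CNV_NS        1000.0   /* conversion time taken from the data sheet     */

int main(void)
{
    /* Total sampling time is the sum of all stage delays plus acquisition and conversion. */
    double total_ns = T_PRECHARGE_NS + T_PGA_NS + T_ACQ_NS + T_CNV_NS;
    printf("Total sampling time = %.1f ns\n", total_ns);
    return 0;
}
```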