Conversion Time

The total conversion time for a single result is calculated by:

Total Conversion Time(12-bit) = Initialization + (SAMPDUR + 15.5)/fCLK_ADC
Total Conversion Time(8-bit) = Initialization + (SAMPDUR + 11.5)/fCLK_ADC

For example, given Initialization = 60 µs, SAMPDUR = 2, and fCLK_ADC = 1 MHz, the 8-bit total conversion time is:

Total Conversion Time(8-bit) = 60 µs + (2 + 11.5)/1 MHz = 73.5 µs

With the Low Latency (LOWLAT) bit written to ‘1’ in the Control A (ADCn.CTRLA) register, the initialization time is needed only once, when the ADC is enabled. After that, the example above gives a total conversion time of 13.5 µs.
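A minimal sketch of enabling Low Latency mode, assuming an AVR device header that provides the ADC_LOWLAT_bm and ADC_ENABLE_bm masks for the Control A register:

```c
#include <avr/io.h>

/* Sketch: pay the initialization time only once by keeping the ADC
   initialized between conversions (mask names are assumptions based on
   the register description above). */
static void adc_enable_low_latency(void)
{
    ADC0.CTRLA |= ADC_LOWLAT_bm;  /* write LOWLAT before enabling the ADC */
    ADC0.CTRLA |= ADC_ENABLE_bm;  /* enable the ADC; initialization runs once */
}
```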

The sampling period of the ADC is configured through the Sample Duration (SAMPDUR) bit field in the Control E (ADCn.CTRLE) register as (SAMPDUR + ½ ) CLK_ADC cycles.
ADC0.CTRLE = 2; /* Sample duration: (2 + 1/2) CLK_ADC cycles */
If PGA is used, the input sample duration is (SAMPDUR + 1) CLK_ADC cycles, while the ADC PGA Sample Duration (ADCPGASAMPDUR) bit field in the PGA Control (ADCn.PGACTRL) register controls how long the ADC samples the PGA.
ADC0.PGACTRL = ADC_ADCPGASAMPDUR_15CLK_gc; /* 15 CLK_ADC cycles */
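Putting the two sample durations together, a sketch of a PGA configuration, assuming the ADC_PGAEN_bm mask from the device header enables the PGA:

```c
#include <avr/io.h>

/* Sketch: configure both sample durations for a PGA conversion. */
static void adc_pga_sampling_init(void)
{
    ADC0.CTRLE   = 2;                          /* input sampled for (2 + 1) CLK_ADC
                                                  cycles when the PGA is used */
    ADC0.PGACTRL = ADC_ADCPGASAMPDUR_15CLK_gc  /* ADC samples the PGA for 15 cycles */
                 | ADC_PGAEN_bm;               /* enable the PGA (assumed mask) */
}
```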