1 Noise Suppression Theory

Many MCU applications involve measuring analog signals. In the ideal case of a completely noiseless signal, achieving a high-quality digital representation is simple: single ADC conversions triggered at a fixed time interval are sufficient. In reality, most analog signals are affected by noise, and modern Microchip ADCs provide functionality that can be used to increase the signal-to-noise ratio.

Noise can be defined as undesirable electrical signals that interfere with the original (or desired) signal. Noise reduction, the recovery of the original signal from the noise-corrupted one, is a very common goal in the design of signal processing systems.

Every sample from an ADC can be a combination of signal (S) and noise (N). Noise suppression is the process of reducing the noise with minimal impact on the desired signal. One way this can be achieved is by computing the average of many samples of the noisy signal, which reduces the fluctuations and leaves the desired signal visible.
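
As a simple illustration of the principle, the sketch below averages m raw conversions in software. The helper adc_read_single() is a hypothetical placeholder for whatever routine returns one raw conversion result on the target device.

#include <stdint.h>

/* Hypothetical helper: returns one raw ADC conversion result. */
extern uint16_t adc_read_single(void);

/* Average m single conversions in software. A 32-bit accumulator
 * holds the sum of up to 65535 16-bit results without overflow. */
uint16_t adc_average(uint16_t m)
{
    uint32_t sum = 0;
    for (uint16_t i = 0; i < m; i++) {
        sum += adc_read_single();
    }
    return (uint16_t)(sum / m);
}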

Figure 1-1 illustrates a noisy signal. Single conversions spaced in time result in a noisy digital representation of the signal. A potential solution could be to filter the acquired samples in software, but this would require additional CPU resources. A better option is to use the ADC Computation modes built into the ADC.

Figure 1-2 illustrates how a single ADC conversion trigger results in a burst of multiple successive ADC conversions.

Each conversion is accumulated in hardware. In Accumulation mode, the average value of the accumulated samples can be calculated by dividing the accumulated result by the burst size, whereas in Average mode, Burst Average mode, and Low Pass Filter (LPF) mode the filtered result is available directly in the built-in Filter register. Because the sampled noise has a zero mean, the averaged result will be close to the actual signal value.
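
For example, with a burst size of m = 32 in Accumulation mode, the average is the accumulated result divided by 32; because 32 is a power of two, firmware can obtain it with a simple right shift (average = accumulator >> 5). In the other Computation modes the equivalent scaling is handled by the peripheral, so the firmware only reads the Filter register.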

In this way, the ADC Computation modes can be used for averaging by configuring the ADC to accumulate m samples automatically.
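
A possible firmware flow is sketched below, assuming Burst Average mode with a burst size of 32. The wrapper functions (adc_set_computation_mode(), adc_set_burst_size(), adc_start_burst(), adc_burst_done(), adc_read_filter()) are hypothetical stand-ins for the device-specific register accesses; the actual register and bit-field names are defined in the device data sheet.

#include <stdint.h>

/* Hypothetical wrappers around the device-specific ADC registers. */
typedef enum { ADC_MODE_ACCUMULATE, ADC_MODE_AVERAGE,
               ADC_MODE_BURST_AVERAGE, ADC_MODE_LPF } adc_mode_t;

extern void     adc_set_computation_mode(adc_mode_t mode);
extern void     adc_set_burst_size(uint8_t m);   /* samples accumulated per trigger */
extern void     adc_start_burst(void);           /* one software trigger            */
extern uint8_t  adc_burst_done(void);            /* result-ready flag               */
extern uint16_t adc_read_filter(void);           /* filtered/averaged result        */

uint16_t adc_burst_average_read(void)
{
    adc_set_computation_mode(ADC_MODE_BURST_AVERAGE);
    adc_set_burst_size(32);     /* accumulate m = 32 samples automatically        */
    adc_start_burst();          /* single trigger -> 32 back-to-back conversions  */
    while (!adc_burst_done()) {
        ;                       /* wait for the burst to complete                 */
    }
    return adc_read_filter();   /* hardware-averaged result                       */
}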

In Burst Average mode, the ADC sampling rate is affected by the number of samples accumulated: the total sampling time for m samples is the product of the sampling time for a single sample and m, the number of samples taken.
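
For example, if a single conversion takes a time t_s, a burst of m = 32 samples occupies roughly 32 × t_s, so averaged results become available at roughly 1/32 of the single-conversion rate.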

Figure 1-1 illustrates such a noisy signal: a DC signal mixed with random noise.

Figure 1-1. DC Signal Mixed With Noise

Zooming in on the signal, consider the ADC samples taken as indicated in Figure 1-2, which illustrates how 32 ADC samples are accumulated within a short time window.

Figure 1-2. Averaging Zero Mean Noise

Each individual noise sample differs from zero by a random value, with equal probability of being positive or negative. The sum of the accumulated noise samples will therefore approach zero, and the noise is successfully suppressed. If this noise is superimposed on a non-zero signal, the accumulated value will approach a scaled version of the signal's average.
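
Numerically, if each sample is S + n_i, the accumulated value over m samples is m × S + Σ n_i. With zero-mean noise the sum Σ n_i tends toward zero, so the accumulator approaches m × S, and dividing by m recovers an estimate of S.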

Because oversampling accumulates multiple samples, the average of all the sampled values will be approximately equal to the original DC signal: the zero-mean noise averages out.

Increasing the burst size (accumulating more samples) flattens out the noise peaks further and results in greater noise suppression.
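
The effect can be demonstrated with a small host-side simulation, a sketch rather than device code: it adds uniform zero-mean noise to a fixed DC value, averages bursts of increasing size, and prints the RMS error of the averaged result. For uncorrelated noise the residual error is expected to shrink roughly as the square root of the burst size.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define DC_LEVEL 512.0   /* "true" signal value                 */
#define NOISE_PK  16.0   /* peak amplitude of the uniform noise */
#define TRIALS   1000    /* averaged results per burst size     */

/* One noisy sample: DC level plus uniform noise in [-NOISE_PK, +NOISE_PK]. */
static double noisy_sample(void)
{
    double n = ((double)rand() / RAND_MAX) * 2.0 - 1.0;  /* uniform in [-1, 1] */
    return DC_LEVEL + n * NOISE_PK;
}

int main(void)
{
    for (int m = 1; m <= 64; m *= 2) {          /* burst sizes 1, 2, 4, ... 64 */
        double sum_sq_err = 0.0;
        for (int t = 0; t < TRIALS; t++) {
            double sum = 0.0;
            for (int i = 0; i < m; i++) {
                sum += noisy_sample();
            }
            double err = sum / m - DC_LEVEL;    /* deviation of the averaged result */
            sum_sq_err += err * err;
        }
        printf("burst size %2d: RMS error %.3f\n", m, sqrt(sum_sq_err / TRIALS));
    }
    return 0;
}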