Many embedded systems use digital-to-analog converters (DACs) to generate signals or system excitation. In many cases, these signals are periodic and subject to harmonic distortion. A major cause of distortion is DAC updates that do not occur at exactly the correct time because of interrupt latencies or other system delays. For example, to create a 1-kHz sine wave from a 32-sample data table, the DAC must update its output every 31.25 µsec. If some samples are not updated at exactly the correct time, the signal becomes distorted.
Using more samples in the data table can reduce the effects of the distortion, but it also adds to the processing load required to generate the signal. Provided the CPU has enough headroom to write every sample to the DAC without missing any, the more samples used, the less an occasionally late sample distorts the signal. Using fewer samples reduces the processing load but makes the timing of each sample more critical. Getting both benefits at once — few samples and low distortion — requires a timer-triggered DAC.
The MSP430F15x/16x microcontrollers (MCUs) integrate dual DACs with a timer-triggered update feature that combats the signal distortion caused by mistimed samples. With the timer-triggered method, software writes each data sample to the DAC ahead of time, and the DAC then updates its output at exactly the correct time. The next DAC value can be written at any point between sample points, because the DAC does not apply the new value to its output until the next timer trigger arrives.
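A minimal sketch of this configuration follows, using the register and bit names from TI's MSP430x16x device header (DAC12LSEL_2 selects the Timer_A OUT1 rising edge as the DAC12 latch trigger). The 1-MHz SMCLK, the 125-cycle period, the internal 2.5-V reference, and the 8-entry table values are assumptions for illustration, not taken from the article's test setup:

```c
/* Sketch: timer-triggered DAC12 on an MSP430F16x.  Assumes a 1-MHz
 * SMCLK and the internal 2.5-V reference; adjust for a real system. */
#include <msp430x16x.h>

static const unsigned int sine8[8] = {   /* 8-point, 12-bit sine table */
    2048, 3495, 4095, 3495, 2048, 601, 1, 601
};
static unsigned int i = 0;

void main(void)
{
    WDTCTL = WDTPW + WDTHOLD;            /* stop the watchdog           */

    ADC12CTL0 = REF2_5V + REFON;         /* internal 2.5-V reference    */

    /* DAC12: full-scale = Vref; the latch loads on the rising edge of
     * Timer_A OUT1 (DAC12LSEL_2), not on writes to DAC12_0DAT.         */
    DAC12_0CTL = DAC12IR + DAC12AMP_5 + DAC12LSEL_2 + DAC12ENC;

    /* Timer_A, up mode from SMCLK: CCR0 sets the sample period; OUT1
     * in reset/set mode supplies the DAC's rising-edge trigger.        */
    TACCR0  = 125 - 1;                   /* 125 µsec at 1-MHz SMCLK     */
    TACCR1  = 62;                        /* OUT1 edge within the period */
    TACCTL1 = OUTMOD_7;                  /* reset/set output mode       */
    TACCTL0 = CCIE;                      /* interrupt to stage next value */
    TACTL   = TASSEL_2 + MC_1 + TACLR;

    _EINT();                             /* enable interrupts           */
    for (;;) { }                         /* all work happens in the ISR */
}

/* This ISR may run a few cycles late, but the value written here is
 * not applied until the next OUT1 edge, so the output stays on time. */
#pragma vector = TIMERA0_VECTOR
__interrupt void timer_a0_isr(void)
{
    DAC12_0DAT = sine8[i];
    i = (i + 1) & 7;
}
```

The key line is the `DAC12LSEL_2` selection: with it, a write to `DAC12_0DAT` only stages the value, and the timer hardware performs the actual output update.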
To show the advantage of the timer-triggered update, a test generated a 4.2-kHz sine wave from an eight-sample data table with a single-pole RC reconstruction filter. First, a test was performed without the timer-triggered update and the distortion of the signal was examined; then the timer-triggered feature was enabled and the distortion was examined again. In each test, the DAC update value was written to the DAC inside an interrupt service routine, with the sample interrupt generated by Timer_A of the MSP430. The tests were constructed so that the interrupt service routine had up to six CPU cycles (at 1 MHz) of latency uncertainty, for a maximum sample-timing uncertainty of 6 µsec.
In the first test, the next DAC output value was written to the DAC inside the interrupt service routine, and the DAC updated its output immediately. A fast Fourier transform (FFT) of the resulting sine wave shows the distortion caused by the update-timing uncertainty: the harmonics rise well above the noise floor, indicating significant distortion.
The next test used the same Timer_A 125-µsec interrupt and six-cycle latency, but enabled the timer-triggered update feature of the DAC. Instead of updating its output immediately on a write, the DAC waited for the next timer trigger. With this approach, each sample was written to the DAC a full sample period before it was needed and then latched to the output exactly on time. This technique reduced the amplitude of the harmonics almost to the noise floor.
The benefits of the timer-triggered DAC update are clear when it comes to reducing signal distortion, but the feature brings other improvements that may matter more in some applications: fewer data points are needed to generate the signal, and the reconstruction filter can be simpler. This example used only eight data points and a single-pole reconstruction filter to generate the sine wave, whereas a typical system often uses 32 or 64 sample points with a double-pole reconstruction filter. The overall result of the timer-triggered DAC feature is a purer signal, fewer required data points, and simpler filtering.