??? 05/15/08 18:29 Read: times
#154799 - Erm ... not quite true. Responding to: ???'s previous message
There's no time constant you could ever choose that would inhibit the LSB flickering.
If you apply proper rounding, the LSB will stabilize. Otherwise you could not increase ADC precision by downsampling at all. If your LSB keeps flickering, you've messed up the numerics of the calculation.

Ok, what about a moving average? We assumed a data rate of one sample every 10 msec. A moving average takes a handful of samples in a row, let's say 8, adds them together and divides the result by 8. Usually you also decimate, so the data rate drops by a factor of 8 as well, meaning you get a new byte every 80 msec.

Is the LSB flickering removed? No!

Ok, I'll bite. First set of 8 samples:

   0111 1111
   1000 0000
   0111 1111
   1000 0000
   0111 1111
   1000 0000
   0111 1111
   1000 0000
   ------------
 11 1111 1100

Divide by 8: 0111 1111

Second set of 8 samples:

   0111 1111
   1000 0000
   0111 1111
   1000 0000
   0111 1111
   1000 0000
   0111 1111
   1000 0000
   ------------
 11 1111 1100

Divide by 8: 0111 1111

The LSB does not flicker anymore. How could it? A moving average of 8 samples has a zero at 1/2 the sampling rate, so it completely eliminates the flicker frequency.

(Oh, and this also works if you don't decimate. None of the output samples will have the LSB flicker: they will all be 0111 1111 if you truncate, and 1000 0000 if you round before the division by 8.)

While your programming advice is usually very sound, you could benefit from reading up on digital signal processing.