??? 05/29/08 01:21
#155243 - For your A/D Subsystem.... Responding to ???'s previous message
If you are using your A/D subsystem to monitor signals that change over time, and the monitoring rate (i.e. the sampling rate) is in the same league as the rate at which the signal itself changes, then you must low-pass filter the analogue signal so that its highest frequency content is at least 2.1 times lower than the rate at which you will sample it. (The Nyquist criterion requires the sampling rate to be at least twice the highest signal frequency; the factor of 2.1 gives a little margin.)
If you use a DSO-type scope you may already be familiar with how the sample rate can make a true signal look like a different frequency (aliasing) when the sample rate is not sufficiently faster than the signal frequency itself.

Michael Karas
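The aliasing effect described above can be demonstrated numerically. The sketch below is illustrative only; `alias_frequency` is a hypothetical helper (not from the original post) that computes the apparent frequency of an undersampled sine using the standard folding relation.

```python
import numpy as np

def alias_frequency(f_signal, f_sample):
    """Apparent (aliased) frequency of a sine of frequency f_signal
    when sampled at f_sample, from the standard folding relation."""
    return abs(f_signal - round(f_signal / f_sample) * f_sample)

# A 9 Hz sine sampled at only 10 Sa/s shows up as a 1 Hz signal:
print(alias_frequency(9.0, 10.0))  # 1.0

# Numerical check: at 10 Sa/s, the samples of a 9 Hz sine are
# identical to the samples of a (negated) 1 Hz sine.
t = np.arange(20) / 10.0  # 2 seconds at 10 Sa/s
assert np.allclose(np.sin(2 * np.pi * 9 * t), np.sin(-2 * np.pi * 1 * t))
```

This is why the anti-alias filter must remove content above half the sampling rate before conversion: once the samples are taken, the 9 Hz and 1 Hz interpretations are indistinguishable.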