#80205 - Help - Digital calibration of analog I/P
11/01/04 19:02
Hello everybody,
Can anyone tell me how to digitally calibrate an analog input, if possible without resorting to floating point? Is there a special algorithm for this, or do you simply add/subtract an offset and then multiply/divide for the full scale? At present I am about to use an EPOT for this purpose. My A/D will be a 10-bit one, so I think that if I do the calibration in software I will lose some of my 10-bit resolution: for example, if I need to offset about 50% of the input and then multiply/divide the remainder, I will lose half the range and hence a bit of resolution. Can anybody give me a better suggestion? My requirement is a minimum of 10 bits, and nothing is speed critical; a sample time of about 10 ms is OK.

Thanks,
S. Sheik Mohamed
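For what it is worth, the add/subtract-then-scale step described above can be done entirely in integer arithmetic with a fixed-point gain factor, so no float library is needed. Below is a minimal sketch in C of a two-point calibration under these assumptions: the gain is stored in Q1.15 format (32768 = 1.0, good for factors up to about 2.0), a 32-bit intermediate holds the product, and all names (cal_t, cal_from_two_points, cal_apply) are illustrative, not from any particular library.

    #include <stdint.h>

    typedef struct {
        int16_t  offset;   /* raw counts to subtract (zero-point error) */
        uint16_t gain;     /* Q1.15 scale factor, 0x8000 == 1.000       */
    } cal_t;

    /* Derive the constants from two reference measurements:
     * raw_lo/raw_hi are the ADC readings at known low/high inputs,
     * ideal_lo/ideal_hi are the counts a perfect ADC would return. */
    static cal_t cal_from_two_points(int16_t raw_lo, int16_t raw_hi,
                                     int16_t ideal_lo, int16_t ideal_hi)
    {
        cal_t c;
        /* gain = (ideal span / measured span), converted to Q1.15 */
        c.gain   = (uint16_t)(((int32_t)(ideal_hi - ideal_lo) << 15)
                              / (raw_hi - raw_lo));
        /* choose offset so the low reference maps exactly to ideal_lo */
        c.offset = raw_lo - (int16_t)(((int32_t)ideal_lo << 15) / c.gain);
        return c;
    }

    /* Apply the calibration to one 10-bit sample.  The 32-bit
     * intermediate keeps all 10 bits through the multiply; the
     * +0x4000 term rounds to nearest instead of truncating. */
    static uint16_t cal_apply(const cal_t *c, uint16_t raw)
    {
        int32_t x = (int32_t)raw - c->offset;
        int32_t y = (x * (int32_t)c->gain + 0x4000) >> 15;
        if (y < 0)    y = 0;       /* clamp to the 10-bit range */
        if (y > 1023) y = 1023;
        return (uint16_t)y;
    }

Because the scaling is done in a 32-bit intermediate and only shifted back down at the end, the full 10-bit resolution survives the correction; resolution is lost only where the transfer function itself demands it (e.g. a gain below 1.0 compresses the output range), not as an artifact of the arithmetic.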