Posted: 05/03/01 07:13
#11292 - RE: 16bit to BCD conversion |
What is the problem with a 16-bit value being held in two 8-bit variables? That's the natural way for an 8-bit micro to handle larger numbers.

Conversion can be done by subtracting 10,000 from the number as often as you can; the number of times you can is the leftmost BCD digit. Then subtract 1,000 as often as you can to get the second BCD digit, then 100 for the third, then 10 for the fourth. The final remainder is the rightmost (fifth) BCD digit. Why start at 10,000, you may ask? Because 65,535 is the largest number a 16-bit value can represent, so five decimal digits are enough. Happy programming.