#33046 - RE: Char vs short unsigned int Array
Posted: 11/24/02 00:10
"int: 2 or 4 bytes depending on system"

In theory, an int should have the "natural" word size of the underlying architecture: on a 16-bit machine, an int would be 16 bits; on a 32-bit machine, an int would be 32 bits; and so on. In practice, though, an int is always at least 16 bits: the ANSI/ISO standards require that an int be able to represent at least the range -32767 to +32767, so even on an 8-bit machine an int cannot be 8 bits - thus Keil C51, for example, has 16-bit ints on the 8-bit 8051. Note that the standards do not fix the exact sizes of the types; they only specify minimum ranges, plus the ordering that an int must be no smaller than a short, a long must be no smaller than an int, etc, etc.

"enum: 2 or 4 bytes depending on the system"

Most compilers - especially for embedded targets - have at least the option to put "small" enums into a single byte. Keil C51 certainly does; so do Borland C++ Builder and (I think) MSVC.

Note that all these things are implementation-defined. Therefore, the only safe place to obtain these details is the manual for your specific compiler. You will also need the manual for other data-representation details, such as byte ordering, structure packing/padding, floating-point format, etc, etc...