#101220 - the full story
Posted: 09/19/05 12:19, modified: 09/19/05 12:26
Responding to: Craig's previous message
Craig posted:
"My point simply being that I don't think that it is necessarily reasonable to imply that using the large memory model is an indication of a system that hasn't been properly engineered."

The full context was: "A properly engineered system will work under all circumstances, even when the worst case execution times hit you."

I totally agree; however, I can think of no case where using the LARGE model represents "a properly engineered system". So, OK, my statement should have been: I can think of no case, in a system where execution time matters, where using the LARGE model represents "a properly engineered system".

When defaulting to XDATA, the "misses" in specifying the memory area of critical variables always happen. Whereas, working with a DATA default, when a memory crunch happens you select what to move out of DATA, and in that process no critical variables get moved to XDATA.

Erik
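To make the point concrete, here is a minimal sketch of my own (not from the thread), using SDCC's memory-space qualifiers since the original question was about SDCC on an 89C52. It assumes a build with the default small model (sdcc --model-small main.c), so anything left unqualified lands in fast internal DATA; only the bulky, non-critical objects are pushed out to XDATA by hand, and the variable names are purely illustrative:

/* Sketch only. Built with the default small model, so unqualified
   variables go to direct-addressed internal DATA. */

#include <stdint.h>

/* Bulky, non-time-critical objects: deliberately moved to external RAM. */
static __xdata uint8_t rx_buffer[256];
static __xdata char    banner[64];

/* Time-critical state: kept in internal DATA.  With a DATA default it
   stays fast even if you forget the qualifier; under --model-large an
   unqualified variable would silently end up behind slow MOVX accesses. */
static __data volatile uint8_t tick_count;

void timer0_isr(void) __interrupt(1)
{
    tick_count++;   /* direct-address INC, because tick_count sits in DATA */
}

With the small model the "memory crunch" is handled exactly as described above: you pick the large, slow-path objects and qualify them __xdata one by one, and nothing time-critical can drift out of DATA by accident.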