07/31/07 14:17
#142547 - Performance slows down when enabling serial ints.
Hello,
I have a serial communication application running in my Atmel 89LV52 8052-micro controller. The communication and the entire application works fine otherwise, but whenever I enable the serial interrupts like this: IP = 16; // serial = high priority, others = low priority IE = 128 + 16; // Enable all + serial, THIS LINE SLOWS DOWN THE PERFORMANCE the performance of the application slows down, but otherwise everything looks OK. The slow down does not appear in my 8051 simulator, but in a real HW environment it does. I have tried the code in 2 8052 chips and with 2 boards which makes 4 combinations, and same symptoms in every case. The system has still worked some time ago, but I don't understand what change has brought this problem. I initialize the TI and RI like this: setb _TI ; ti is normally set in this program clr _RI ; ri is normally cleared I transmit bytes like this: void serial_out(unsigned char* str, unsigned char n) { while (n != 0) { _asm 00001$: jnb ti, 00001$ clr ti _endasm; SBUF = *str++; n--; } // finally new line: _asm 00002$: jnb ti, 00002$ clr ti mov _SBUF, #0x0d 00003$: jnb ti, 00003$ clr ti mov _SBUF, #0x0a _endasm; } I receive bytes like this: void serial_receive_interrupt_handler() interrupt INT_SERIAL _naked { _asm jb ri, 00001$ reti ; no operation when sent a byte, seems to never come here 00001$: ; clr ea clr es ; storing of the byte into a buffer located into indirect RAM at 0x80 -> ; setb ea clr ri setb es reti _endasm; } It is like serial receive interrupt handler got called continuously. Any help will be appreciated. - Ari |
Replies:
- clr TI
- comments
- No TX interrupt servicing
- this is why we have FAQs here
- replies to all