

01/21/05 04:55


 
#85455 - Calculating time required for interrupt
I wrote a basic program last night to calculate the speed of an object as it passes through two infrared emitter/detector pairs. The first detector starts a 16-bit timer, and the second detector stops the timer and triggers the speed calculation from the elapsed time.

I am using an 89C2051 and the Bascom-8051 compiler.
The chip doesn't have a built-in auto-reload mode for a 16-bit timer, but the Bascom software will emulate a 16-bit timer in auto-reload. The timer triggers an interrupt on each overflow, and here is where I found my problem:

The first time the timer interrupt was triggered, it took a few thousand clock cycles; the second time, a few hundred. Every timer interrupt thereafter took between 200 and 300 clock cycles. This obviously makes it difficult to measure the speed of the object.

Could this be due to poor programming? (Sorry, I rewrote the program without the interrupt, so I have no code to display.)

Or could it be a problem with the compiler?

Regards

James Krushlucki



List of 9 messages in thread
Calculating time required for interrupt
   Distance between two infrared e/r?
      Program Simulator
         Simulation
   counter overflow problem?
      No overflow problem
   BSIC for microseconds ???
      Sloppy compilers
   Clock cycles
