microsecond event scheduling in an application

From: Junhee Lee
Date: Tue Sep 08 2009 - 10:27:48 EST


I am working on an event scheduler that handles events at microsecond
resolution. This program is actually a network emulator built on simulation
code. I would like the network emulator to reproduce the simulation's
behavior, so a high-resolution timer interrupt is required.
But a high-resolution timer interrupt obtained by raising the tick frequency
(jiffies clock) is bound to hurt system performance.
Are there any comments or suggestions on how to support microsecond event
scheduling without performance degradation?

Regards


--
To unsubscribe from this list: send the line "unsubscribe linux-kernel" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html
Please read the FAQ at http://www.tux.org/lkml/