Gu, John A. (US SSA) wrote:
I have an ADC (Analog-to-Digital Converter, a Maxim MAX197) integrated with an
SBC (Single Board Computer) running the Linux kernel. The ADC has a
sample rate of 10K samples per second (100 usec interval). The ADC can only
operate in a single trigger-and-read fashion. Creating a driver is a
good way to handle it, but I don't see any proper timer or delay logic
available for my situation. Using udelay() will block all the other
processes.
As you guys mentioned, applying an RT patch may be the only way
out.
OK, but how often must you read it? The best way may simply be
to trigger a read in a device driver, then just hang and wait
for 100us, or until the sample is ready. If your time budget is
not too stringent, this may work OK. If you need to read the A/D
only every 100ms, say, then eating 100us out of every 100ms
(about 0.1% CPU) is not too much load.
Are there any other devices which cannot accept a 100us interrupt
latency? IOW, will you blow some other time constraint, even
though the CPU load may not be too great using this technique?
Mike
--
p="p=%c%s%c;main(){printf(p,34,p,34);}";main(){printf(p,34,p,34);}