Les Mikesell wrote:
Is there a reason why rsync cannot be used for this?
Unfortunately, yes, due to the way I receive the files, i.e. from
another application that has its own mechanism to feed the files
to client machines. I really wish this weren't the case, but I have to
live with what I've got.
If you request the resends with http you could use wget with the option
to only transfer if the server's copy is newer than yours, and just ask
for all of them every time.
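For example, a minimal sketch (the URL and the list file here are just
placeholders):

    # re-request every expected file; wget's --timestamping (-N) only
    # downloads a file when the server's copy is newer than the local one
    while read -r name; do
        wget -N "http://server.example.com/data/$name"
    done < /path/to/list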
Or, if you can construct the (sorted) list of all the names you expect to
have, you can:
ls | comm -13 - /path/to/list
and get the names that are in the list but not in the directory.
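Spelled out (the paths are placeholders), and remembering that comm
wants both inputs sorted the same way:

    # names expected but not yet present in the drop directory
    sort /path/to/list > /tmp/expected.sorted
    ls /data/incoming | sort > /tmp/have.sorted
    comm -13 /tmp/have.sorted /tmp/expected.sorted   # prints the missing names

Each name that falls out of comm can then be fed back into whatever
mechanism requests the resends.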
With apologies to the two Les', the situation isn't like that, and I
apologize if I haven't been clear. The application I'm working with
runs on a server that simply relays the data to all our customers; it
doesn't store a copy of the files and then feed them. The NWS weather
data requires performance as close to real-time as the 'series of
tubes' allows. That said, I'm running another server with the same
application that is designed to pull the data feed and then store the
files locally. I /can/ store the files on the primary server, and I
have, but this is a production server that feeds 13 MB/hr for each of
the 60 or so radar sites it handles, 24/7, so I don't like asking it to
do more than it does.
So, in essence, I'm stuck with these files being dumped on a server via
a proprietary method, and I need to sort the files and check for
missing ones on the filesystem.
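For illustration, the kind of check I'm after looks roughly like this,
assuming (purely as a placeholder, not the feed's real format) files
named SITE_YYYYMMDD_HHMM arriving on a five-minute cadence:

    site=KGSP                                    # hypothetical site ID
    dir=/data/incoming                           # hypothetical drop directory
    hour=$(date -u -d '1 hour ago' +%Y%m%d_%H)   # GNU date
    for min in 00 05 10 15 20 25 30 35 40 45 50 55; do
        f="${site}_${hour}${min}"
        [ -e "$dir/$f" ] || echo "missing: $f"
    done

The real naming scheme is different, but the shape of the problem is
the same.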
The early suggestions were great and I'm trying each one and tweaking to
see if I can make them work with what I have. But any additional bash
tips would be helpful as I am pressed for an answer to this issue.
--
Recedite, plebes! Gero rem imperialem!
Mark Haney
Sr. Systems Administrator
ERC Broadband
(828) 350-2415
Call (866) ERC-7110 for after hours support