David Timms wrote:
What is the most effective and robust way to allow servers that are
quite distant, on slow networks, to "appear" to have the same content?
In my case, the content is read/write at 4 sites. Ideally, the
system would cache files that originated at another site; if a file
were not already cached, it would be fetched across the slow network.
Red Hat Global File System appears to be designed to do this:
http://linux.sys-con.com/read/166309_2.htm
but the article talks about storage area network or LAN connections rather
than slow WAN links.
http://www.drbd.org/ - RAID across machines?
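From the docs it looks like DRBD mirrors a block device between two hosts
over the network, so it's more like network RAID-1 than a shared, cached
filesystem. A minimal resource definition seems to be roughly this
(hostnames, devices and addresses are just placeholders I made up):

  resource r0 {
      protocol A;               # asynchronous replication, supposedly better for slow links
      on alpha {
          device    /dev/drbd0;
          disk      /dev/sda7;
          address   10.1.1.31:7789;
          meta-disk internal;
      }
      on bravo {
          device    /dev/drbd0;
          disk      /dev/sda7;
          address   10.1.1.32:7789;
          meta-disk internal;
      }
  }

Since it pairs exactly two machines, I'm not sure it fits read/write at
4 sites.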
I did see a few other projects designed to solve this sort of problem,
but I am having trouble finding them now (search hints?).
http://www.coda.cs.cmu.edu/ljpaper/lj.html
Has anybody used or appraised Coda?
David Timms.
OpenAFS (http://www.openafs.org) is the perfect piece of software for
this. We used the commercial version of AFS back in the '90s when I
worked for NIH. It's designed for WANs and uses a local cache for
content. It is somewhat complicated, though, so it's not an
install-it-and-run-it application.
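To give a feel for the client side (a rough sketch from memory; paths and
sizes are only examples), the local cache is configured in the cacheinfo
file, which lists the AFS mount point, the cache directory and the cache
size in 1 KB blocks:

  # /usr/vice/etc/cacheinfo -- mountpoint:cachedirectory:cachesize-in-KB
  /afs:/usr/vice/cache:200000

and you can inspect or resize it on a running client with the fs command:

  fs getcacheparms           # show how much of the cache is in use
  fs setcachesize 400000     # grow the cache to ~400 MB until the next restart

Most of the complexity is on the server side (cells, volumes,
authentication), which is where the setup effort goes.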
Gordon