What is the most effective and robust way to make servers that are
geographically distant, connected by slow networks, appear to have the
same content?
In my case, the content is read/write at four sites. Ideally the system
would cache files that originated at another site; if a file is not
already cached locally, it would be fetched across the slow network on
first access (roughly the behaviour sketched below).
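In pseudo-Python, the read path I am imagining looks something like
this (all names here are made up for illustration, not from any real
system):

    import os
    import shutil

    CACHE_DIR = "/var/cache/sitefs"   # hypothetical local cache at this site

    def fetch_from_origin_site(path, dest):
        # Placeholder for the expensive WAN transfer; in reality this
        # would pull the file from whichever site currently owns it.
        shutil.copy(os.path.join("/mnt/origin", path.lstrip("/")), dest)

    def open_local(path):
        # Return a local copy of `path`, going across the slow network
        # only on a cache miss.
        cached = os.path.join(CACHE_DIR, path.lstrip("/"))
        if not os.path.exists(cached):
            os.makedirs(os.path.dirname(cached), exist_ok=True)
            fetch_from_origin_site(path, cached)
        return cached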
Red Hat Global File System (GFS) appears to be designed to do this:
http://linux.sys-con.com/read/166309_2.htm
but the article then talks about storage area network (SAN) or LAN
connections rather than slow WAN links.
http://www.drbd.org/ - essentially RAID 1 across machines?
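As far as I understand it, DRBD mirrors a block device between two
nodes, and its asynchronous "protocol A" is the mode meant to cope with
slower links. A resource definition would look roughly like this
(hostnames, devices and addresses are invented):

    resource r0 {
      protocol A;                 # asynchronous replication, for slow links
      on siteA {
        device    /dev/drbd0;     # the replicated block device
        disk      /dev/sda7;      # local backing storage
        address   10.1.0.1:7788;
        meta-disk internal;
      }
      on siteB {
        device    /dev/drbd0;
        disk      /dev/sda7;
        address   10.2.0.1:7788;
        meta-disk internal;
      }
    }

If I read the docs right, though, that gives two nodes with a single
writer at a time, not four read/write sites.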
I did see a few other projects designed to solve this sort of problem,
but I am having trouble finding them again now (search hints?).
http://www.coda.cs.cmu.edu/ljpaper/lj.html
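From what I can gather, Coda's venus client caches files locally, and
its hoarding mechanism can keep chosen trees in the cache for
disconnected or weakly-connected operation, which sounds close to what
I want. If I have read the docs correctly, a hoard profile is just
lines like the following (path and priority invented):

    # hypothetical hoard profile fed to the hoard utility
    add /coda/projects/shared 600:d+   # keep this tree and its descendants cached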
Has anybody used or appraised Coda?
David Timms.