Re: wget


On Thu, 15 Apr 2004, Tom 'Needs A Hat' Mitchell wrote:

> On Thu, Apr 15, 2004 at 10:41:39AM -0700, Gunnar vS Kramm wrote:
> > Original e-mail from: Matthew Benjamin (msbenjamin@xxxxxxxxx):
> > 
> > > Does anyone know how to use wget to drill down to all of the folders and
> > > subdirectories in a website. I can mirror my website however it does not
> > > grab all of the folders which contain data that the links go to. The
> > > site is password protected.
> > >  
> > > mattB.
> ....
> 
> > You should be able to use the -r switch to wget, as such:
> > wget -r http://YourWebSite 
> 
> Also, does his web site have a robots file?
> 
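Putting those two points together: since the site is password protected and may have a robots file, a plain `wget -r` can stop short. A sketch of a fuller invocation (YourWebSite, USER, and PASS are placeholders, not real values):

```shell
# Recursive retrieval with no depth limit (-l inf) and no climbing
# above the start directory (-np).
# --user/--password supply HTTP basic-auth credentials for the
# password-protected site.
# -e robots=off tells wget to ignore robots.txt -- only do this on
# a site you own or have permission to mirror.
wget -r -l inf -np \
     --user=USER --password=PASS \
     -e robots=off \
     http://YourWebSite/
```

If the login is a web form rather than HTTP auth, wget's `--user`/`--password` won't help; you'd need to pass a session cookie instead (e.g. via `--load-cookies`).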

The other thing that I don't think came up in this thread is: if you have 
control over the machine on which the website sits, it's a heck of a lot 
more efficient to use rsync to mirror all the files than it is to use 
wget.
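
For example, something along these lines (the server name and paths here are placeholders for illustration):

```shell
# Mirror the site's document root over SSH.
# -a  archive mode: recurse and preserve permissions, times, symlinks
# -v  verbose, -z compress data in transit
# --delete  remove local files that no longer exist on the server,
#           so the copy stays an exact mirror
rsync -avz --delete user@server:/var/www/html/ /local/mirror/
```

Unlike wget, rsync copies the actual files rather than re-fetching pages over HTTP, and on subsequent runs it only transfers what changed.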
 

-- 
-------------------------------------------------------------------------- 
Joel Jaeggli  	       Unix Consulting 	       joelja@xxxxxxxxxxxxxxxxxxxx    
GPG Key Fingerprint:     5C6E 0104 BAF0 40B0 5BD3 C38B F000 35AB B67F 56B2



