RE: wget


 



Why not just use wget -m http://url/path and add -np if needed? For example (and see also the note on the robots question, after the quoted message below):
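Since the site is password protected, credentials have to be supplied as well. A sketch, assuming HTTP basic auth (USER, PASSWORD, and the URL are placeholders; depending on the wget version the password option is spelled --http-passwd or --http-password):

    # -m  mirror: recursive retrieval with infinite depth and timestamping
    # -np don't ascend to the parent directory while recursing
    wget -m -np --http-user=USER --http-passwd=PASSWORD http://url/path

If the server uses form/cookie logins rather than basic auth, this won't be enough on its own, but for a basic-auth protected directory tree it should pull everything the links reach.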

> -----Original Message-----
> From: fedora-list-bounces@xxxxxxxxxx 
> [mailto:fedora-list-bounces@xxxxxxxxxx] On Behalf Of Tom 
> 'Needs A Hat' Mitchell
> Sent: Friday, April 16, 2004 12:14 AM
> To: For users of Fedora Core releases
> Subject: Re: wget
> 
> On Thu, Apr 15, 2004 at 10:41:39AM -0700, Gunnar vS Kramm wrote:
> > Original e-mail from: Matthew Benjamin (msbenjamin@xxxxxxxxx):
> > 
> > > Does anyone know how to use wget to drill down to all of the
> > > folders and subdirectories in a website? I can mirror my website,
> > > however it does not grab all of the folders which contain data
> > > that the links go to. The site is password protected.
> > >  
> > > mattB.
> ....
> 
> > You should be able to use the -r switch to wget, as such:
> > wget -r http://YourWebSite 
> 
> Also, does his web site have a robots file?
> 
> 
> 
> 
> -- 
> 	T o m  M i t c h e l l 
> 	/dev/null the ultimate in secure storage.
> 
> 
> -- 
> fedora-list mailing list
> fedora-list@xxxxxxxxxx
> To unsubscribe: http://www.redhat.com/mailman/listinfo/fedora-list
> 
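On the robots question above: wget honors robots.txt by default during recursive retrieval, so anything excluded there gets silently skipped. If that is what's blocking parts of the mirror, wget can be told to ignore robots rules, e.g. (a sketch; the URL is a placeholder, and ignoring robots exclusions is only appropriate on a site you own or have permission to crawl):

    # -e robots=off  ignore robots.txt exclusions while mirroring
    wget -m -np -e robots=off http://url/path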



