An example of a PULL script:

#!/bin/sh
# wget_pull.sh -- mirror the Fedora Core 1 updates tree through a proxy

URL=http://download.fedora.redhat.com/pub/fedora/linux/core/updates/1/i386/
BASE=/home/mirror/fedora/linux/core/updates/1/i386/
PROXY_USER="user_name"
PROXY_PASS="password"

# Create the local mirror directory if it does not exist yet
[ ! -d "$BASE" ] && mkdir -p "$BASE"
cd "$BASE" || exit 1

# Mirror the remote tree, refusing to ascend above URL and cutting the
# seven leading path components so files land directly under BASE
wget --proxy-user="$PROXY_USER" --proxy-passwd="$PROXY_PASS" \
     --mirror --no-parent --no-host-directories --cut-dirs=7 \
     -o pull.log "$URL"

# Drop the navigation images and index pages the server sends along
rm -f bc_folder.png bc_parent.png bc_rpm.png blank.gif
rm -f index.html headers/index.html SRPMS/index.html \
      debug/index.html debug/headers/index.html

Rgds
Gary

-----Original Message-----
From: fedora-list-bounces@xxxxxxxxxx
[mailto:fedora-list-bounces@xxxxxxxxxx] On Behalf Of Dan Horning
Sent: Friday, April 16, 2004 12:19 PM
To: 'For users of Fedora Core releases'
Subject: RE: wget

Why not just use

    wget -m http://url/path

and add -np if needed?

> -----Original Message-----
> From: fedora-list-bounces@xxxxxxxxxx
> [mailto:fedora-list-bounces@xxxxxxxxxx] On Behalf Of Tom
> 'Needs A Hat' Mitchell
> Sent: Friday, April 16, 2004 12:14 AM
> To: For users of Fedora Core releases
> Subject: Re: wget
>
> On Thu, Apr 15, 2004 at 10:41:39AM -0700, Gunnar vS Kramm wrote:
> > Original e-mail from: Matthew Benjamin (msbenjamin@xxxxxxxxx):
> >
> > > Does anyone know how to use wget to drill down into all of
> > > the folders and subdirectories of a website? I can mirror my
> > > website, but it does not grab all of the folders containing
> > > the data that the links point to. The site is password
> > > protected.
> > >
> > > mattB.
> ....
>
> > You should be able to use the -r switch to wget, as such:
> >
> >     wget -r http://YourWebSite
>
> Also, does his web site have a robots file?
>
> --
> T o m  M i t c h e l l
> /dev/null the ultimate in secure storage.

--
fedora-list mailing list
fedora-list@xxxxxxxxxx
To unsubscribe: http://www.redhat.com/mailman/listinfo/fedora-list
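
P.S. For the password-protected case mattB describes, a variant of the
PULL script above might look like the sketch below. It is only a sketch:
it assumes the site uses HTTP basic auth (not a login form), and the URL,
user name, and password are placeholders.

    #!/bin/sh
    # wget_pull_auth.sh -- recursive pull of a password-protected site
    # Assumes HTTP basic auth; URL and credentials below are placeholders.
    URL=http://www.example.com/
    HTTP_USER="user_name"
    HTTP_PASS="password"

    # -r             recurse into linked folders
    # -np            never ascend above the start URL
    # -l 0           no recursion depth limit
    # -e robots=off  do not let a robots file prune the recursion
    wget --http-user="$HTTP_USER" --http-passwd="$HTTP_PASS" \
         -r -np -l 0 -e robots=off "$URL"

The -e robots=off switch is there because of Tom's question: wget honours
a site's robots file by default, so folders excluded there are silently
skipped even though the links to them exist.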