Subject: Re: wget?
From: Adam Elkins (i-robot@gci.net)
Date: Mon Mar 17 2003 - 23:54:45 AKST


Yeah Greg, I got it. In my case, I wanted a series of books. All the books
were in the /books directory of the website.
Using wget:
# wget -r -l 0 http://whatever/books
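Since you only want the books and not the rest of that site, the -np
(--no-parent) switch might also help; it keeps wget from wandering up out of
/books into the rest of the tree. Something like this (untested, same
placeholder hostname):
# wget -r -np -k http://whatever/books/
The -k (--convert-links) switch rewrites the links in the saved pages so they
work when you browse them offline.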
Also, in my case, I had to have a passwd, so:
# wget -r -l 0 --http-user=username --http-passwd=passwd http://whatever.com/media
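If the site is using plain HTTP basic auth, I believe you can also put the
credentials straight into the URL, though they'll end up in your shell history:
# wget -r http://username:passwd@whatever.com/media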

You might want to check the man page on the -l switch; it takes a number that
tells wget how many directory levels deep to go. If you don't pass -l at all,
it defaults to 5 levels deep, and -l 0 tells it to do the whole thing.
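For example, to grab the /books page and anything it links to one level down,
but no deeper (the hostname is just a stand-in again):
# wget -r -l 1 http://whatever/books/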
Hope that helped you...

Adam

On Monday 17 March 2003 11:28 am, aklug@techalaska.com wrote:
> Adam,
> Any luck figuring out how do this? I've run into the same issue and would
> be interested in how you solved it.
> Greg
>
> Adam Elkins writes:
> > I found a website with some online books I'd like to keep to view
> > offline. How would I go about doing this? I tried File > Save Page As in
> > Mozilla, but it only grabs one page at a time.
> > What tools can I use to mirror the site, I guess? I don't want the whole
> > site, it's got over 1000 gigs of media on the server, I only want the
> > books.
> >
> > Adam

---------
To unsubscribe, send email to <aklug-request@aklug.org>
with 'unsubscribe' in the message body.



This archive was generated by hypermail 2a23 : Mon Mar 17 2003 - 23:52:02 AKST