[aklug] Re: recursive HTTP file downloading...

From: blair parker <blair@cmjv.com>
Date: Thu Jun 10 2010 - 11:33:11 AKDT

I am getting a robots.txt file, but I'm not savvy enough to know if it
is getting in the way... How to tell?
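A quick way to check is to fetch the file (e.g. `curl -s http://www.dot.state.ak.us/robots.txt`) and look for a `Disallow:` line whose path is a prefix of the directory you want. The sketch below illustrates the prefix test; the robots.txt content shown is made up for illustration, not the real file:

```shell
# Hypothetical robots.txt content for illustration only; fetch the real
# one with: curl -s http://www.dot.state.ak.us/robots.txt
robots='User-agent: *
Disallow: /cgi-bin/
Disallow: /creg/design/highways/Specs/'

# A rule "gets in the way" if a Disallow path is a prefix of the path
# you want to crawl:
path='/creg/design/highways/Specs/'
rules=$(printf '%s\n' "$robots" | sed -n 's/^Disallow: *//p')

blocked=no
for rule in $rules; do
    # case pattern "$rule"* matches when $path starts with $rule
    case "$path" in "$rule"*) blocked=yes ;; esac
done
echo "$blocked"
```

If this prints "yes" for the path you care about, a robots-respecting tool like wget will skip it by default.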

Blair Parker
Crazy Mountains Joint Venture
2000 E. Dowling Road, Suite 6
Anchorage, AK 99507
Voice: 907.561.0406
Fax: 907.561.0788
Cell: 907.350.8320

Shane R. Spencer wrote:
> I forgot to ask if robots.txt was getting in the way:
>
> http://www.dot.state.ak.us/robots.txt
>
> On 06/10/2010 11:14 AM, blair parker wrote:
>
>> Ok... Maybe somebody out there can help me with a recursive download
>> issue...
>>
>> The state DOT has a bunch of specs that my wife wants to download:
>>
>> http://www.dot.state.ak.us/creg/design/highways/Specs/
>>
>> She wants all of the files, subdirectories included.
>>
>> I can't seem to get 'wget' to download any of the files listed, and
>> 'curl' only downloads files individually. Am I missing something, or is
>> there some relatively simple, recursive command to download all of these
>> files?
>>
>> Thanks.
>>
>>
>
>
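One likely explanation for the original symptom: wget's recursive mode honors robots.txt by default, so a Disallow rule covering that directory would make `wget -r` fetch nothing. A sketch of an invocation that might work (standard wget 1.x flags; only override robots.txt with `-e robots=off` if you've confirmed the site permits your use):

```shell
# Recursive fetch sketch: -r recurse, -np don't ascend to parent dirs,
# -nH drop the hostname directory, --cut-dirs=3 strip the three leading
# path components (creg/design/highways) so files land under Specs/.
url='http://www.dot.state.ak.us/creg/design/highways/Specs/'
cmd="wget -r -np -nH --cut-dirs=3 -e robots=off $url"
echo "$cmd"
# Uncomment to actually run it:
# $cmd
```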
---------
To unsubscribe, send email to <aklug-request@aklug.org>
with 'unsubscribe' in the message body.
Received on Thu Jun 10 11:33:05 2010

This archive was generated by hypermail 2.1.8 : Thu Jun 10 2010 - 11:33:05 AKDT