Linux – wget and connection errors / timeouts

linux wget

I was using wget over the last week to recursively download a whole website of HTML pages.
I used it this way:

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --domains XXXX.com --no-parent http://www.XXXX.com

The issue is that, since the download took a couple of days, there were occasional connection timeouts, network disconnections, etc., and when that happened wget seems to have skipped the HTML pages it couldn't fetch, which is not good in this case.

I wonder if there is a flag (I've been looking in the man page to no avail…) to tell wget to keep retrying failed fetches indefinitely. Even if my computer is disconnected from the web for 10 hours, I want it to keep trying to get the page until it succeeds (once the computer is online again, obviously).

thanks,

Best Answer

I suppose this is the option you are asking for:

-t number
--tries=number
    Set number of retries to number. Specify 0 or inf for infinite retrying.
    The default is to retry 20 times, with the exception of fatal errors
    like "connection refused" or "not found" (404), which are not retried.