Increase number of retries with WGet

Ask for help about FlashGot, no registration needed to post
Miyuki

Increase number of retries with WGet

Post by Miyuki »

Dear fabulous FlashGot friends,

I am having difficulty downloading a reasonably large (500 MB) MP4 file from a slow server. FlashGot is successful up to a point: using Wget and a series of retries it downloads about half of the file, but then it gives up.

While it is downloading the first half of the file, I can simultaneously play the mp4 file in a variety of media players, and seek up to the point that is currently being downloaded.

After the failure I can restart the download, which then appears to reach completion, but the file is corrupt. None of the media players will play beyond the point where FlashGot first gave up, and once the download eventually completes the file becomes completely unplayable (media players no longer recognise it as an MP4 file and say it contains "text/html" content).

I think if the FlashGot download succeeded the first time, and didn't have to be restarted, it would probably work. How can I increase the number of times FlashGot will retry the download with Wget?

Many thanks,

Miyuki
GµårÐïåñ
Lieutenant Colonel
Posts: 3369
Joined: Fri Mar 20, 2009 5:19 am
Location: PST - USA

Re: Increase number of retries with WGet

Post by GµårÐïåñ »

FG is not a downloader itself; it grabs the links and sends them to your downloader. If you are unable to finish downloads, that is your downloader's doing, and you would need to change to something better. Having FG try over and over doesn't change anything when it comes to the outcome of a successful download; FG isn't responsible for that part.
~.:[ Lï£ê ï§ å Lêmðñ åñÐ Ì Wåñ† M¥ Mðñê¥ ßå¢k ]:.~
________________ .: [ Major Mike's ] :. ________________
therube
Ambassador
Posts: 7969
Joined: Thu Mar 19, 2009 4:17 pm
Location: Maryland USA

Re: Increase number of retries with WGet

Post by therube »

URL to this file?

"Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off."

Code:

  -t,  --tries=NUMBER           set number of retries to NUMBER (0 unlimits).
       --retry-connrefused      retry even if connection is refused.
You could also output a log file that might give you some indication as to what happened.
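Putting those options together, a manual run might look something like this (a sketch only; the URL and file names are placeholders, not anything from this thread):

```shell
# Hedged example: the URL and file names below are placeholders.
# -c                   resume the partial file instead of starting over
# -t 0                 retry an unlimited number of times
# --retry-connrefused  keep trying even if the server refuses the connection
# -o download.log      write a log you can read if the transfer dies again
wget -c -t 0 --retry-connrefused \
     -o download.log \
     -O video.mp4 \
     'http://example.com/video.mp4'
```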

Could be that, depending on the length of time, your cookie (or a similar token) from the server goes stale, so the download stops.

Then when you refresh the link (get a new cookie), the server could be starting from the wrong spot; or, when the download initially died, the server may have spat some textual (HTML) data into the file, corrupting it.

I would think an MP4 player would be fairly lax when it finds an error, simply trying to read past it. MPlayer is like that, ignoring textual headers on an otherwise "valid" MP3.

Have a look at your download when it stops (a file viewer would be best) & see what the tail end of the file looks like. If it's relatively random-looking "gibberish", then that part is probably OK. If it's reasonably legible, straight text or HTML or the like, then when the download dies the server is throwing garbage into the file. You might be able to use a hex editor to cut that out, then see whether, after refreshing the link & completing the download, it will play through.
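For example (a sketch, nothing here is from the thread: it fabricates a "partial download" whose tail is an HTML error page, just to show what to look for; replace demo.mp4 with your real file):

```shell
# Fabricate a stand-in partial download: a few fake MP4 header bytes,
# some binary "video" data, then an HTML error page like a server
# might leave behind when a transfer dies.
printf '\000\000\000\030ftypmp42' > demo.mp4
head -c 1024 /dev/urandom >> demo.mp4
printf '<html><body>Server error</body></html>' >> demo.mp4

# Look at the last 64 bytes: readable HTML here means the server dumped
# an error page into the file (-a makes grep treat binary data as text,
# -o prints only the matched marker).
tail -c 64 demo.mp4 | grep -a -o -i '<html>'
```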
beepbeep

Re: Increase number of retries with WGet

Post by beepbeep »

One good thing would be to have

Code:

--timeout 10 --tries inf
as default parameters for wget, and better still to have a way to configure extra parameters to be added to the wget command line.
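For what it's worth, a way to approximate this today (a sketch, assuming GNU Wget; whether the wget that FlashGot spawns reads the user's wgetrc is an assumption): the same settings can live in a wgetrc file, which wget reads on every invocation. The example writes to a local wgetrc.example so it doesn't touch anything; the real file would be ~/.wgetrc.

```shell
# Sketch: write retry-friendly defaults to a local example file.
# The real file would be ~/.wgetrc (deliberately not touched here).
cat > wgetrc.example <<'EOF'
tries = inf
timeout = 10
continue = on
EOF
cat wgetrc.example
```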