yt-dlp doesn't use aria2c for subtitles and download is slower than videos #9919
Comments
For faster download, you may try --concurrent-fragments 20 |
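For reference, a minimal form of that invocation (the URL is a placeholder, and 20 is just the value suggested above; -N is the short form of the same option):

```shell
# Fetch up to 20 fragments in parallel with yt-dlp's native downloader.
# The URL is a placeholder; replace it with the actual video URL.
yt-dlp --concurrent-fragments 20 "https://example.com/watch?v=..."
```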
Thanks, this reduced the time to:
And it does in effect solve the problem. I still don't understand why aria2c is not involved in the download of subtitles. |
On second thought, are you suggesting not using aria2c at all? |
It's up to you, but an external downloader is not needed for speeding up this subtitle download. |
75B/s?? With a UK domestic ADSL2 connection (not the limiting factor) and no external downloader, I get 10x that speed for the subtitle download (41s) and well over 1MiB/s for the mp4 download. |
On Wed, 15 May 2024 at 14:43, dirkf ***@***.***> wrote:
[download] 100% of 27.84KiB in 00:06:20 at 74.94B/s
75B/s ??
There's more to this than whether aria2c is being used.
This must be false reporting from yt-dlp: 6 seconds for 27.84KiB can't be 74.94B/s.
|
Minutes: 00:06:20 is 6 minutes 20 seconds, not 6 seconds. |
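As a quick sanity check (a throwaway calculation, not yt-dlp output), reading 00:06:20 as minutes:seconds makes the reported rate add up:

```shell
# 27.84 KiB over 00:06:20 (= 380 s): does 74.94 B/s add up?
awk 'BEGIN {
    size = 27.84 * 1024        # bytes
    secs = 6 * 60 + 20         # 00:06:20 read as minutes:seconds
    printf "%.2f B/s\n", size / secs
}'
# ~75.02 B/s, matching the log; read as 6 plain seconds it would be ~4.6 KiB/s
```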
It reports now:
[download] 100% of 25.91KiB in 00:00:16 at 1.60KiB/s
Much better, but still suspiciously slow. From reading other bug reports, it looks like yt-dlp is very fast at downloading video files, less so audio files and text files. I am not sure if adding an external downloader mixes things up. Adding "--concurrent-fragments" seems to do the trick: it will use the native downloader for subtitles and aria2c for both audio and video files.
|
Gotcha. |
Fragments are the key. If the site offers a single URL for a resource, it can be downloaded as fast as the site allows. If the site breaks up video/audio/subtitles into chunks (fragments) to facilitate navigation through the media (e.g. skip to 30s before the end), the resource has to be fetched chunk by chunk and reassembled, so instead of one (set up connection) + (transfer) + (close connection) we have one per fragment. As video has a large bit-rate compared with audio/subtitles, the overhead of multiple (set up connection) + (close connection) is relatively much smaller. Fetching fragments concurrently is another way of reducing the overhead. |
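A toy shell sketch of that overhead argument (nothing here is yt-dlp's actual code: the sleep stands in for per-fragment connection setup/teardown, and the file names are made up):

```shell
# Pretend each fragment costs 1s of connection setup/teardown (sleep 1).
fetch_fragment() { sleep 1; echo "frag$1"; }

# Sequential: the overhead is paid once per fragment, back to back.
t0=$(date +%s)
for i in 1 2 3 4; do fetch_fragment "$i"; done > seq.txt
t_seq=$(( $(date +%s) - t0 ))

# Concurrent (the idea behind --concurrent-fragments): overheads overlap.
t0=$(date +%s)
for i in 1 2 3 4; do fetch_fragment "$i" > "part$i.txt" & done
wait
cat part1.txt part2.txt part3.txt part4.txt > conc.txt   # reassemble in order
t_conc=$(( $(date +%s) - t0 ))

cmp -s seq.txt conc.txt && same=yes || same=no
echo "identical=${same} sequential=${t_seq}s concurrent=${t_conc}s"
rm -f seq.txt conc.txt part1.txt part2.txt part3.txt part4.txt
```

Either way the reassembled output is identical; only the total connection overhead differs.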
On Thu, 16 May 2024 at 16:55, dirkf wrote:
Fragments are the key.
When using "--concurrent-fragments", should the parameter "--http-chunk-size" also be used, or are they independent of each other?
|
Unrelated (a different level of chunking). |
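In other words (my reading of that answer; the URL is a placeholder and the values are arbitrary examples), the two options act at different levels and can be set independently:

```shell
# --concurrent-fragments: how many DASH/HLS fragments to fetch in parallel
# --http-chunk-size: split a single HTTP download into ranged chunks
yt-dlp --concurrent-fragments 20 --http-chunk-size 10M "https://example.com/watch?v=..."
```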
Checklist
Provide a description that is worded well enough to be understood
I have noticed that yt-dlp, despite being configured to use aria2c for downloads, on many sites uses the native downloader for subtitles, and in that case the download speeds are atrociously slow.
In the fragment below, it took 06:20 to download 27.84KiB of subtitles but only 50 secs to download 534.48MiB of video and 3 seconds to download 20MB.
Provide verbose output that clearly demonstrates the problem
Run yt-dlp with the -vU flag (yt-dlp -vU <your command line>); if using the API, add 'verbose': True to YoutubeDL params instead. Copy the whole output (starting with [debug] Command-line config) and insert it below.
Complete Verbose Output