Resume data too large? (both with fastresume files and sqlite) #20792
Does this "File size exceeds limit" error happen on qBittorrent 4.6.3 as well?
@androgeosone, can you provide a .torrent file or hash (if it's not a private torrent) for testing?
I have a very similar issue after I upgraded from 4.6.3-2.0.9 to 4.6.4-2.0.10: none of my fastresume files can be opened. Mine looks like this:
@ivan-yu, your issue is unrelated to this issue.
@stalkerok, OK, very well. @androgeosone, just curious about your error.
Sadly I can't share the original 1.4TB torrent that caused this issue, but researching this issue has led to another one (approx 700GB) that causes the same issue on a Windows Server host with 2x Xeon Gold 6150 CPUs and 768GB of RAM. The torrent initially starts and works just fine, but eventually the fastresume data (or corresponding blob in sqlite) grows too large and the torrent becomes impossible to resume without a force-recheck on each startup of QB. Needless to say with multi terabyte torrents that can take a long time.
Please be aware that the contents of the above magnet link are pornographic in nature.
The .fastresume file size at one point was 1033073767 bytes. The current sqlite stats:
Prior to adding the large torrent, torrents.db was around 24 MB; it has bloated by more than 2 GB due to the large torrent. No others have been added since. Edit: fix typo
I added the test torrent to the main client (on lt 1.2). I'll have to watch it longer; a quick test didn't reveal anything (only used .fastresume).
The issue is on
I don't use lt2.0 on a regular basis and rarely test anything because it's garbage, but a quick test showed that lt2.0 is indeed the problem. |
@androgeosone
Just want to point out that this torrent is a hybrid torrent, which might be relevant.
I believe it is the content of the merkle tree that is saved in the resume data for v2/hybrid torrents.
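If the merkle tree hypothesis is right, the file sizes reported in this thread are plausible. A back-of-envelope estimate, assuming BitTorrent v2's 16 KiB leaf blocks and 32-byte SHA-256 digests (per BEP 52) and that the full verified tree ends up in the resume data — the function below is an illustration, not qBittorrent or libtorrent code:

```python
# Estimate merkle tree size for a BitTorrent v2 torrent.
# Assumptions: 16 KiB leaf blocks, 32-byte SHA-256 digests (BEP 52),
# and that the whole verified tree is serialized into resume data.
BLOCK_SIZE = 16 * 1024
HASH_SIZE = 32


def merkle_tree_bytes(total_size: int) -> int:
    """Bytes of hash data in a full merkle tree over `total_size` payload bytes."""
    leaves = -(-total_size // BLOCK_SIZE)  # ceil division: one hash per block
    nodes = 0
    level = leaves
    while level > 1:                       # sum every level above the root
        nodes += level
        level = -(-level // 2)
    return (nodes + 1) * HASH_SIZE         # +1 for the root node


# A ~700 GiB torrent: the full tree alone is on the order of gigabytes,
# the same order of magnitude as the ~0.4-1 GB .fastresume files reported
# above (smaller on disk if only part of the tree has been verified).
print(f"{merkle_tree_bytes(700 * 1024**3) / 1024**3:.1f} GiB")  # ≈ 2.7 GiB
```

This would also explain why the .torrent file itself stays small (574 KB): the metadata only carries the piece layers, while the resume data accumulates per-block hashes as pieces are verified.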
@stalkerok @glassez |
It works, the file size of .fastresume is 172 MB. |
Do you really not see that they are completely different? In your case, it gets an incorrect file size. In this case, the file size does actually exceed the set limit.
@glassez |
@androgeosone Did you compile the build yourself? The reason I ask is that I noticed you were looking into compiling/wiki - #20100
These dependencies are not part of the official releases.
We might need to raise the default?!
Up to 200MB maybe? The .torrent file itself is only 574KB in size. |
It makes sense to limit the size of some input files, such as torrent files, but does it make sense for files created by qBittorrent itself? It leads to annoying behavior where qBittorrent cannot load files it created earlier, so this has always looked like a bad idea to me.
qBittorrent & operating system versions
qBittorrent v4.6.4
Windows Server 2022 Standard (Version 10.0.20348.2402)
Qt: 6.6.1
Libtorrent: 2.0.11.0
Boost: 1.84.0
OpenSSL: 3.2.1
zlib: 1.3.1
What is the problem?
Attempting to download a large (>1 TB) torrent results in errors saving/restoring fast resume data in qBittorrent.
qbittorrent.log is filled with errors like this:
(C) 2024-04-29T01:10:37 - Failed to resume torrent. Torrent: "REDACTED". Reason: "File size exceeds limit. File: "M:/qbittorrent/qBittorrent/data/BT_backup/REDACTED.fastresume". File size: 402353016. Size limit: 104857600"
and
(W) 2024-04-28T23:07:04 - File error alert. Torrent: "REDACTED". File: "REDACTED". Reason: "REDACTED file_mmap (REDACTED) error: The paging file is too small for this operation to complete"
Please note that this is running on a server with over 1TB of RAM (physical) as well as a 1TB paging file. Actual memory usage was not unusually high, but the commit size according to task manager was huge (>1TB).
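For context on the first log line: 104857600 bytes is exactly 100 MiB, so this is a fixed pre-load size cap on the .fastresume file rather than a memory problem. A minimal sketch of that kind of guard — the function name and error wording here are illustrative, modeled on the logged message, not qBittorrent's actual implementation:

```python
from pathlib import Path

# Hypothetical guard illustrating the logged failure: refuse to load any
# resume file larger than a fixed cap (104857600 bytes = 100 MiB in the log).
FASTRESUME_SIZE_LIMIT = 100 * 1024 * 1024


def read_fastresume(path: Path, limit: int = FASTRESUME_SIZE_LIMIT) -> bytes:
    size = path.stat().st_size
    if size > limit:
        raise ValueError(
            f'File size exceeds limit. File: "{path}". '
            f"File size: {size}. Size limit: {limit}"
        )
    return path.read_bytes()


# Demo with a throwaway file and a tiny limit so the check trips immediately.
demo = Path("demo.fastresume")
demo.write_bytes(b"\x00" * 2048)
try:
    read_fastresume(demo, limit=1024)
    guard_error = None
except ValueError as exc:
    guard_error = str(exc)
demo.unlink()
print(guard_error)
```

With a cap like this, a resume file of 402353016 bytes (the size in the log) is rejected outright no matter how much RAM the host has, which matches the behavior described above.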
Since I was getting this error with fastresume, causing the file to be entirely checksummed again on each startup of qBittorrent, I decided to try the sqlite-based fast resume alternative in the advanced options.
That just changed the error: qBittorrent was still unable to save or restore fast resume data.
(C) 2024-04-30T21:41:44 - Couldn't store resume data for torrent 'REDACTED'. Error: string or blob too big Unable to bind parameters
Steps to reproduce
Additional context
No response
Log(s) & preferences file(s)