
sometimes hangs when FTP destination refuses login #5113

Open

ts678 opened this issue Mar 5, 2024 · 0 comments


ts678 commented Mar 5, 2024

  • I have searched open and closed issues for duplicates.
  • I have searched the forum for related topics.

Environment info

  • Duplicati version: 2.0.7.1 Beta, 2.0.7.100 Canary, probably more
  • Operating system: Windows 10
  • Backend: FTP

Description

An FTP server that starts refusing connections after the dblock upload finishes seems to cause a permanent hang, requiring a process kill.
Interestingly, Duplicati does not seem to be aware that the dblock upload finished. This was a 1-dblock backup; it needs more tests.
The odd test steps here were invented to try to induce a different bug that I hit on Backblaze, where the dblock made it but the dindex didn't.

Steps to reproduce

  1. Prepare FileZilla Server (or similar) that lets one quickly change the limit on a user's simultaneous connections.
  2. Set the connection limit to 2, then tie up one connection with another login, e.g. from a command-line FTP client.
  3. Back up a Duplicati .zip release file with a Remote volume size of 100 MB and a throttle-upload of 1 MB/second.
  4. During the roughly 50 seconds of dblock upload, reduce the connection limit to 1 and apply it.
  • Actual result:
    Hang
  • Expected result:
    Error
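The expected error follows from the FTP reply-code convention (RFC 959): 4yz replies, like the 421 seen in the log below, are transient negative completions, so a client can map them to a retryable failure instead of waiting forever. A minimal classifier sketch (a hypothetical helper, not Duplicati's actual code):

```python
# Hypothetical helper, not Duplicati's code: map an FTP reply line to an
# outcome so a refused login surfaces as an error instead of a hang.
def classify_ftp_reply(reply: str) -> str:
    first_digit = reply[:1]
    if first_digit in ("1", "3"):
        return "in-progress"      # positive preliminary / intermediate reply
    if first_digit == "2":
        return "ok"               # e.g. "230 Login successful."
    if first_digit == "4":
        return "transient-error"  # e.g. "421 Too many active connections"
    return "permanent-error"      # 5yz, e.g. "530 Login incorrect."
```

With a mapping like this, the 421 from FileZilla would surface as a retryable failure, and exhausting the retries would end the backup with an error, which is the expected result above.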

Screenshots

Debug log

The FileZilla log shows Duplicati's communication (starting with the dblock upload):

<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 [Response] 220-FileZilla Server 1.8.1
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 [Response] 220 Please visit https://filezilla-project.org/
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 [Command] USER duplicati
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 [Response] 331 Please, specify the password.
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 [Command] PASS ****
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 [Response] 230 Login successful.
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Command] OPTS utf8 on
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Response] 202 UTF8 mode is always enabled. No need to send this command
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Command] PWD
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Response] 257 "/" is current directory.
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Command] TYPE I
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Response] 200 Type set to I
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Command] PASV
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Response] 227 Entering Passive Mode (127,0,0,1,234,199)
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Command] STOR test 1/duplicati-ba7a7d44f30b14872a6e26d12ed63a82b.dblock.zip
<3/5/2024 2:36:14 PM> FTP Session 19 127.0.0.1 duplicati [Response] 150 Starting data transfer.
<3/5/2024 2:37:02 PM> FTP Session 19 127.0.0.1 duplicati [Response] 226 Operation successful
<3/5/2024 2:37:02 PM> FTP Session 19 127.0.0.1 duplicati [Command] QUIT
<3/5/2024 2:37:02 PM> FTP Session 19 127.0.0.1 duplicati [Response] 221 Goodbye.
<3/5/2024 2:37:02 PM> FTP Server [Status] Session 19 ended gracefully.
<3/5/2024 2:37:02 PM> FTP Session 20 127.0.0.1 [Response] 220-FileZilla Server 1.8.1
<3/5/2024 2:37:02 PM> FTP Session 20 127.0.0.1 [Response] 220 Please visit https://filezilla-project.org/
<3/5/2024 2:37:02 PM> FTP Session 20 127.0.0.1 [Command] USER duplicati
<3/5/2024 2:37:02 PM> FTP Session 20 127.0.0.1 [Response] 331 Please, specify the password.
<3/5/2024 2:37:02 PM> FTP Session 20 127.0.0.1 [Command] PASS ****
<3/5/2024 2:37:03 PM> FTP Session 20 127.0.0.1 [Error] User «duplicati» has reached the maximum allowed concurrent sessions limit (1). Further authentication attempts denied until active sessions are released.
<3/5/2024 2:37:03 PM> FTP Session 20 127.0.0.1 [Response] 421 Too many active connections, try again later
<3/5/2024 2:37:03 PM> FTP Server [Status] Session 20 ended gracefully.
<3/5/2024 2:37:03 PM> FTP Session 21 127.0.0.1 [Response] 220-FileZilla Server 1.8.1
<3/5/2024 2:37:03 PM> FTP Session 21 127.0.0.1 [Response] 220 Please visit https://filezilla-project.org/
<3/5/2024 2:37:03 PM> FTP Session 21 127.0.0.1 [Command] USER duplicati
<3/5/2024 2:37:03 PM> FTP Session 21 127.0.0.1 [Response] 331 Please, specify the password.
<3/5/2024 2:37:03 PM> FTP Session 21 127.0.0.1 [Command] PASS ****
<3/5/2024 2:37:03 PM> FTP Session 21 127.0.0.1 [Error] User «duplicati» has reached the maximum allowed concurrent sessions limit (1). Further authentication attempts denied until active sessions are released.
<3/5/2024 2:37:03 PM> FTP Session 21 127.0.0.1 [Response] 421 Too many active connections, try again later
<3/5/2024 2:37:03 PM> FTP Server [Status] Session 21 ended gracefully.

The live log at Retry level ends with

Mar 5, 2024 2:36 PM: Backend event: Put - Started: duplicati-ba7a7d44f30b14872a6e26d12ed63a82b.dblock.zip (47.02 MB)

The dblock is sitting at the Destination and seems to pass a test.

The profiling log ends with

2024-03-05 14:36:14 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: UpdateChangeStatistics took 0:00:00:00.006
2024-03-05 14:36:14 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Starting - Uploading a new fileset
2024-03-05 14:36:14 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ba7a7d44f30b14872a6e26d12ed63a82b.dblock.zip (47.02 MB)
2024-03-05 14:36:14 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitUpdateRemoteVolume
2024-03-05 14:36:14 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitUpdateRemoteVolume took 0:00:00:00.308
2024-03-05 14:36:14 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Uploading a new fileset took 0:00:00:00.318
2024-03-05 14:36:14 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitUpdateRemoteVolume
2024-03-05 14:36:14 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitUpdateRemoteVolume took 0:00:00:00.001

From the above, one can see that the hang is at the very end of the backup, which seems to be a popular spot for some of the currently reported Duplicati hangs.

About --> System info has

lastPgEvent : {"BackupID":"4","TaskID":2,"BackendAction":"Put","BackendPath":"ParallelUpload","BackendFileSize":49307023,"BackendFileProgress":49307023,"BackendSpeed":12302,"BackendIsBlocking":false,"CurrentFilename":null,"CurrentFilesize":0,"CurrentFileoffset":0,"CurrentFilecomplete":false,"Phase":"Backup_WaitForUpload","OverallProgress":0,"ProcessedFileCount":1,"ProcessedFileSize":49426745,"TotalFileCount":1,"TotalFileSize":49426745,"StillCounting":false}
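Decoding that lastPgEvent JSON shows the contradiction directly: the backend progress says every byte of the file was transferred, yet the phase is still Backup_WaitForUpload. A quick check (values copied from the JSON above, abridged to the relevant fields):

```python
import json

# Abridged copy of the lastPgEvent payload shown above.
last_pg_event = json.loads(
    '{"BackupID":"4","TaskID":2,"BackendAction":"Put",'
    '"BackendPath":"ParallelUpload","BackendFileSize":49307023,'
    '"BackendFileProgress":49307023,"BackendSpeed":12302,'
    '"BackendIsBlocking":false,"Phase":"Backup_WaitForUpload"}'
)
# Every byte is reported transferred...
print(last_pg_event["BackendFileProgress"] == last_pg_event["BackendFileSize"])  # True
# ...but Duplicati is still waiting for the upload to finish.
print(last_pg_event["Phase"])  # Backup_WaitForUpload
```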

Status bar has "Waiting for upload to finish ...".

Sysinternals Process Explorer shows no Duplicati FTP connection. FileZilla Server closes its connection to the CLI client after 10 minutes:

<3/5/2024 2:45:53 PM> FTP Session 15 ::1 duplicati [Response] 421 Activity timeout.
<3/5/2024 2:45:53 PM> FTP Session 15 ::1 duplicati [Error] Control channel closed with error from source 0. Reason: ETIMEDOUT - Connection attempt timed out.

Clicking Stop now puts a CancellationRequested in the profiling log, but that's about it.

Quit from the Tray Icon's right-click menu closes the visible part, but the process stays up.

EDIT 1:

I tested with factor-of-10 changes to get multiple dblocks: a 10 MB Remote volume size with a 100 KB/second throttle.
I also set asynchronous-concurrent-upload-limit to 1; it was at its default above, and at 1 in the initial find on Canary.

2024-03-05 16:35:18 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitAddBlockToOutputFlush
2024-03-05 16:35:18 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitAddBlockToOutputFlush took 0:00:00:00.124
2024-03-05 16:35:18 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitAddBlockToOutputFlush
2024-03-05 16:35:18 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitAddBlockToOutputFlush took 0:00:00:00.152
2024-03-05 16:35:18 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: INSERT INTO "Remotevolume" ("OperationID", "Name", "Type", "State", "Size", "VerificationCount", "DeleteGraceTime") VALUES (1, "duplicati-bb2433e22613d451bb6ffeb71488228fe.dblock.zip", "Blocks", "Temporary", -1, 0, 0); SELECT last_insert_rowid();
2024-03-05 16:35:18 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: INSERT INTO "Remotevolume" ("OperationID", "Name", "Type", "State", "Size", "VerificationCount", "DeleteGraceTime") VALUES (1, "duplicati-bb2433e22613d451bb6ffeb71488228fe.dblock.zip", "Blocks", "Temporary", -1, 0, 0); SELECT last_insert_rowid(); took 0:00:00:00.000
2024-03-05 16:35:18 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd2258df0dba44b059a8214a717604645.dblock.zip (9.98 MB)
2024-03-05 16:35:18 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - CommitAddBlockToOutputFlush
2024-03-05 16:35:18 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: CommitAddBlockToOutputFlush took 0:00:00:00.262

Mar 5, 2024 4:35 PM: Backend event: Put - Started: duplicati-bd2258df0dba44b059a8214a717604645.dblock.zip (9.98 MB)

The FileZilla log showed the usual two "Too many active connections" errors, but I don't know how Duplicati decides to make just two attempts.
As before, a dblock made it up and lastPgEvent looks caught up, but this time (as expected) it is still processing the source file.
It seems just as stuck as last time (Stop, Quit, kill).

EDIT 2:

FTP (Alternative) wouldn't break with this test: it uploaded a series of dblock and dindex files on one login, then the dlist gave an error.
Attempting to test the Local folder destination on Windows with ACL tricks got messy. Maybe I'll test on Linux with a chmod.

EDIT 3:

On Linux, Duplicati running as an ordinary user and backing up duplicati-2.0.7.100_canary_2023-12-27.zip to a local folder survives a root chmod 555 on the folder during the dblock upload. The dindex then retries and errors out as expected. Both the 1-dblock and multi-dblock cases were tested.
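The mechanism behind that Linux test can be sketched as follows (hypothetical paths and filenames; the key step is root flipping the destination folder to mode 555 mid-backup):

```python
import os
import stat
import tempfile

# Stand-in for the backup destination folder (hypothetical path).
dest = tempfile.mkdtemp(prefix="dup-dest-")

# Simulate the dblock that already made it to the destination.
open(os.path.join(dest, "duplicati-example.dblock.zip"), "wb").close()

# What root's `chmod 555` does mid-backup: the folder becomes
# read+execute only, so creating the next file (the dindex) fails
# with a permission error, which Duplicati should surface after retries.
os.chmod(dest, 0o555)
print(oct(stat.S_IMODE(os.stat(dest).st_mode)))  # 0o555
```

Unlike the FTP case, this permission failure produced retries and then a clean error rather than a hang, which suggests the problem is specific to how the FTP backend handles the refused login.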
