Substantial slowdown while downloading thousands of files #43

Open
coezbek opened this issue Apr 11, 2024 · 0 comments
coezbek commented Apr 11, 2024

I am running gphotos-cdp under WSL2 and have noticed that performance degrades as the download folder fills up: each download takes roughly 1 s longer for every 1000 files already present.

I traced this to a readDir call in the download function, which rescans the entire download directory on every download and therefore takes longer and longer once thousands of entries have accumulated there.

As a quick fix I modified the moveDownload function to move completed files/folders into a subdirectory called 'results' (see the newDir := line):

```go
func (s *Session) moveDownload(ctx context.Context, dlFile, location string) (string, error) {
	log.Printf("Move Download start")
	parts := strings.Split(location, "/")
	if len(parts) < 5 {
		return "", fmt.Errorf("not enough slash separated parts in location %v: %d", location, len(parts))
	}
	newDir := filepath.Join(s.dlDir, "results", parts[4]) // keep the top-level download dir small
	if err := os.MkdirAll(newDir, 0700); err != nil {
		return "", err
	}
	// (snippet was truncated here; the rest completes the move in the obvious way)
	newFile := filepath.Join(newDir, filepath.Base(dlFile))
	if err := os.Rename(dlFile, newFile); err != nil {
		return "", err
	}
	return newFile, nil
}
```