First some background info:
Linux (Ubuntu 20.04), 2b2t 256² spawn download, 678GiB zip, NTFS partition on a HDD, Sparse Allocation
I also had the same issue when downloading a 678GiB file: "Trying to save!" for every piece, pieces taking close to an hour to actually save, and only 10-30 pieces downloading every 15-30 minutes. After lots of troubleshooting and searching online (including finding this post) I didn't find a solution. A few months went by, but still no solution.
But then I noticed the mount process for my NTFS partition was constantly at 25% CPU usage, so I went looking for a reason why. I found this comment on an AskUbuntu question: https://askubuntu.com/questions/520992/what-causes-high-cpu-usage-by-mount-ntfs#comment2455046_1332891
So basically: NTFS does not work well for very large compressed files. (On Ubuntu 20.04, NTFS writes go through the userspace ntfs-3g FUSE driver, which is what was eating the CPU.)
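If you want to check whether you're hitting the same thing, just watch the NTFS mount process while the torrent is running; this is a generic process check, nothing specific to my setup:

```
# ntfs-3g runs as its own userspace process. If it sits at a constant
# chunk of CPU while the torrent is downloading, you have the same symptom.
ps -eo pid,pcpu,comm | grep -i ntfs

# Or watch it live, sorted by CPU usage:
top -o %CPU
```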
I then moved the file to an external HDD (because I don't have 700GiB of free space lying around) so I could format my internal HDD and make an XFS partition.
I then started the torrent from scratch on the XFS partition, this time with fast allocation. Meanwhile I continued the torrent on the external HDD so it could finish its file check, then moved that partially downloaded file to the XFS partition and merged it with the one I had just initialised with fast allocation (roughly as sketched below).
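For reference, the move itself was nothing fancy; the mount points and file name below are placeholders for whatever your setup uses:

```
# /mnt/external = external HDD with the checked, partially downloaded file
# /mnt/xfs      = new XFS partition where the client pre-allocated the file

# Overwrite the freshly allocated copy with the partial download:
mv /mnt/external/spawn-download.zip /mnt/xfs/spawn-download.zip

# Then point the torrent client at the XFS location and force a recheck
# so it picks up the pieces that were already downloaded.
```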
Right now it is downloading like any other torrent, without the "Trying to save!" messages.
How to create an XFS partition on Linux: https://www.xmodulo.com/create-mount-xfs-file-system-linux.html (XFS is only for Linux as far as I know.)
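If you don't want to read the whole article, the short version is something like this; /dev/sdb1 is a placeholder, so double-check your actual device name, because mkfs wipes the partition:

```
# Install the XFS userspace tools (Ubuntu/Debian)
sudo apt install xfsprogs

# Create the XFS filesystem on the target partition
# (this destroys everything on it!)
sudo mkfs.xfs /dev/sdb1

# Mount it
sudo mkdir -p /mnt/xfs
sudo mount /dev/sdb1 /mnt/xfs
```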
Why did I use XFS and not another filesystem? According to this article: https://www.salvagedata.com/btrfs-zfs-xfs-ext4-how-are-they-different, XFS works extremely well with large files.
I did, however, use sparse allocation when I first started the torrent, but I don't think that was the root cause, after seeing the huge performance hit from NTFS.
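If you're unsure whether your client actually created a sparse file, you can compare the apparent size with the space really allocated on disk; the file name is again a placeholder:

```
# ls shows the apparent size; du shows the blocks actually allocated.
# For a sparse file, the du number is (much) smaller than the ls number.
ls -lh spawn-download.zip
du -h spawn-download.zip
```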
TL;DR: NTFS does not work well for very large compressed files; using an XFS partition solved the problem. (XFS is only for Linux as far as I know.)