Help and Support

"Trying to save" bug causing program lock

by Guest on 2022/01/11 04:59:54 AM    
I've had this problem for a little while now, on 2.81, and having just upgraded to 2.87 it's the same, sadly.

Scenario: a big torrent is not yet fully allocated, but torrent data is already coming in from eager peers or seeds. I don't know on what basis Tixati allows some torrents to accept data before they're allocated, while most seem to prevent any activity until the files are written out, but it's the big ones that are proving a problem for me. Because there's no allocation yet, the incoming pieces get backed up in the cache folder.

At some point Tixati stops accepting any more data and puts "Trying to save" messages on the pieces. I would have thought it would be LOGICAL for this early piece data to be written out immediately, in situ, as the file is allocating, rather than writing out full empty files and only putting this data in once that's all complete. Very inefficient, but par for the course where Tixati's shoddy file handling is concerned: anything that interferes with a huge atomic operation like writing hundreds of GB of empty file is just blocked until it's all finished... Anyway, normally that's fine, and once the file is fully allocated the backed-up pieces get written out and the file continues to download.
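Just to illustrate what I mean by writing it in situ (a rough Python sketch, nothing to do with Tixati's actual code; the file name, sizes and piece size here are made up): once the file exists at its full logical length, an early piece can go straight to its final offset instead of sitting in the cache folder.

import os

PIECE_SIZE = 4 * 1024 * 1024      # made-up 4 MiB piece size
TOTAL_SIZE = 256 * PIECE_SIZE     # small stand-in for a multi-hundred-GiB payload

# Give the file its full logical length up front (sparse on most filesystems),
# so a piece can land at its final position even before the rest exists.
with open("payload.bin", "wb") as f:
    f.truncate(TOTAL_SIZE)

def write_piece(path: str, index: int, data: bytes) -> None:
    """Write an incoming piece straight to its final offset in the file."""
    with open(path, "r+b") as f:
        f.seek(index * PIECE_SIZE)
        f.write(data)

# An early piece arriving while "allocation" is still in progress:
write_piece("payload.bin", 3, b"\x01" * PIECE_SIZE)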

The problem is that for some big files recently, this "trying to save" state sticks. Even when the file is fully allocated, the accumulated pieces never get written out. It may be triggered if there are more torrents queued to allocate, perhaps even on the same disk - it's pretty normal for me to end up with many queuing at once.

Worse, if you try to STOP the torrent, to give it a kick if you will and get it to restart, or even remove it, Tixati crashes hard and becomes completely unresponsive. Obviously that's bad, because it's so paranoid that on restart it has to recheck everything that was downloading - try that with a number of incomplete 200GB files... :( Even worse, the state it comes back up in means that not all your recent changes (removes, stops etc.) were recorded, so the bad torrent is still in there, live, still blocking.

The only workaround I've found is to be extremely quick and stop the torrent once Tixati comes back up, before any data has downloaded for it, or perhaps before it has connected out - I don't know which is the key. Doing a filecheck then lets you bring it back up normally. But that still forces a destructive restart of Tixati. And, wastefully, the pieces that were blocking (and complete) are now lost, even though they should still exist in the cache and LOGICALLY be applied as the file is checked. It's all EXTREMELY annoying when it happens.

This needs examining urgently to find out how this block on saving is getting applied, and stuck. Maybe a saner way of managing it, if you can't write the data into the file as it is allocating, would simply be to prevent any data coming in until the allocation is complete, which would be a MUCH better solution. But preventing the complete program crash would be a start, so that a blocking torrent can at least be stopped, or removed.
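(Rough Python sketch again, purely illustrative and not how Tixati is actually structured: the "don't take data until allocated" idea is basically one per-torrent flag that the request scheduler checks, so nothing can arrive that has nowhere to be written and nothing can get stuck "trying to save".)

import threading

class AllocationGate:
    """Toy model of the gating idea above - not Tixati internals."""

    def __init__(self) -> None:
        self._allocated = threading.Event()

    def may_request_pieces(self) -> bool:
        # The peer/request scheduler would check this before asking peers
        # for data on this torrent.
        return self._allocated.is_set()

    def on_allocation_complete(self) -> None:
        self._allocated.set()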

Thanks.
by Guest on 2022/01/11 09:16:51 AM    
what OS?

are you getting crash reports? if so, send them in. that's the best way to get things fixed.

you should post the magnet link to the torrent you are having trouble with. the mods will send it to the devs.
by Guest on 2022/01/11 12:04:46 PM    
Win 7, FWIW, but I'd guess this is system-agnostic: it's a logic bug in whatever routine handles the "trying to save" state of a torrent when it's being blocked from saving data normally.

This is a Tixati problem, not a torrent problem. The latest torrent to do this is fine - it's downloading normally now. As said, once it's been stopped and checked and restarted, whatever was blocking it stops. The difficulty is getting to that point: the torrent is so locked up and unresponsive that it causes a complete Tixati crash once you try to stop or remove it.
by notaLamer on 2022/01/14 10:10:44 AM    
I've never had it, but you should mention the file allocation setting (Settings > Files > File allocation). I use Fast Allocate, even though I suspect it's not working as advertised.
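For anyone unsure what those options roughly correspond to, here's a small Python sketch of the three usual allocation strategies (my guess at what the setting names map to, not confirmed Tixati behaviour; file names and sizes are made up):

import os

SIZE = 1024 * 1024 * 1024  # made-up 1 GiB file

# "Sparse": only set the logical length; blocks are allocated lazily as pieces are written.
with open("sparse.bin", "wb") as f:
    f.truncate(SIZE)

# "Fast allocate" (roughly): ask the filesystem to reserve the blocks up front
# without writing them. POSIX-only call; ext4/XFS support it, ntfs-3g may not.
fd = os.open("fast.bin", os.O_RDWR | os.O_CREAT, 0o644)
try:
    os.posix_fallocate(fd, 0, SIZE)
finally:
    os.close(fd)

# Full allocation: actually write zeros end to end - the slow path that can
# block for a long time on files that are hundreds of GB.
with open("full.bin", "wb") as f:
    chunk = b"\0" * (8 * 1024 * 1024)
    written = 0
    while written < SIZE:
        f.write(chunk)
        written += len(chunk)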
by Guest on 2022/06/18 12:31:23 PM    
First some background info:
Linux (Ubuntu 20.04), 2b2t 256² spawn download, 678GiB zip, NTFS partition on a HDD, Sparse Allocation

I also had the same issue when downloading a 678GiB file: "Trying to save" for every piece, pieces taking close to an hour to eventually save, and only 10-30 pieces downloading every 15-30 minutes.
After lots of troubleshooting and looking online (including finding this post) I didn't find a solution. A few months went by, but still no solution.

But then I noticed the mount process for my NTFS partition was constantly at 25% CPU usage, so I went looking for a reason. I found this comment on an AskUbuntu question: https://askubuntu.com/questions/520992/what-causes-high-cpu-usage-by-mount-ntfs#comment2455046_1332891 - so basically NTFS does not work well for very large compressed files.

I then moved the file to an external HDD (because I don't have 700GiB of free space lying around) so I could format my internal HDD and create an XFS partition.
I started the torrent from scratch on the XFS partition, with fast allocation. Then I continued the torrent on the external HDD to do the file check, moved that copy to the XFS partition, and merged it with the one I had just initialised with fast allocation.
Right now it is downloading like any other torrent, without the "Trying to save"!

How to create an XFS partition on Linux: https://www.xmodulo.com/create-mount-xfs-file-system-linux.html  (XFS is only for Linux as far as I know)
Why did I use XFS and not another filesystem? According to this article, XFS works extremely well with large files: https://www.salvagedata.com/btrfs-zfs-xfs-ext4-how-are-they-different
I did, however, use sparse allocation when I originally started the torrent, but I don't think that was the root cause, given the huge performance hit from NTFS.
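In case it saves someone a search, the format-and-mount step boils down to something like this (a small Python wrapper around the usual commands; the device path and mount point are made up, it needs root, and mkfs.xfs wipes the partition, so check lsblk first):

import os
import subprocess

DEVICE = "/dev/sdX1"          # made-up partition - check lsblk for the real one
MOUNTPOINT = "/mnt/torrents"  # made-up mount point

subprocess.run(["mkfs.xfs", "-f", DEVICE], check=True)   # destroys existing data!
os.makedirs(MOUNTPOINT, exist_ok=True)
subprocess.run(["mount", DEVICE, MOUNTPOINT], check=True)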

TL;DR NTFS does not work well for very large compressed files. Using an XFS partition solved the problem. (XFS is only for Linux as far as I know)



