BT Sync using up a lot of CPU and RAM after update

  • I updated everything on my OMV 2.x install over the weekend. Ever since then, btsync has been using a large share of the CPU: anywhere from 30% to 70% when I check the Diagnostic -> System Info -> Processes tab. The other process showing high usage is mount.ntfs. It's been like this for about three days now, and restarting OMV a couple of times hasn't helped. I've turned off BT Sync for now, since it was slowing down my load times when streaming and I don't want it to wear out my drive. CPU usage is now around 1% with an occasional jump to 30%, and memory usage has dropped from about 50% to 26% (out of 996.26 MiB). I don't know what else I can do to gather more information to troubleshoot this issue. This was working fine before the update.

  • There's no other good replacement. Syncthing has also been reported to use a lot of CPU, so I'd rather not switch to that. BT Sync was working perfectly fine before; I should have backed up the OS. I'll have to reinstall the original version of OMV I downloaded just to get this working again. I need it to sync my work projects across a few machines.

  • I resolved the issue; I'm documenting it here in case someone finds it useful.


    1. I connected to OMV via SSH and ran lsblk, which lists all block devices and their partitions. The ones we are interested in are the partitions btsync uses.
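    A quick way to narrow lsblk's output to just the mounted partitions (a sketch, assuming a util-linux lsblk; the awk filter keeps rows whose MOUNTPOINT column is non-empty):

    ```shell
    # -n drops the header, -r prints raw space-separated columns,
    # so awk can keep only the rows that have a mount point.
    lsblk -nr -o NAME,MOUNTPOINT | awk '$2 != "" {print $1, $2}'
    ```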


    2. I then ran lsof <path to media bt sync uses>. For example, lsof /media/B20AA9E80AB9AA2D.
    This showed me the process (btsync) and which files it was accessing. The process was stuck on a large 26 GB image file I had.
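    Step 2 can be wrapped in a one-liner that pulls just the open file paths out of lsof's output (the mount point is the example from above; the awk/sort part skips lsof's header row and de-duplicates):

    ```shell
    # Files currently open under the mount point, one path per line.
    # NAME is lsof's last column; NR > 1 skips the header row.
    lsof /media/B20AA9E80AB9AA2D | awk 'NR > 1 {print $NF}' | sort -u
    ```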


    3. I removed the problematic folder from btsync and then re-added it. The high CPU usage continued at first, but the BT Sync GUI showed that it was indexing the re-added folder, so this was expected. After letting it run for a few hours, CPU usage dropped back down to normal.


    You can confirm that BT Sync is no longer stuck by periodically re-running the lsof <file path> command and checking that it doesn't stay on the same file for too long. Be patient with very large files, but it shouldn't sit on any single file indefinitely. I indexed 230 GB worth of stuff overnight.
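    The periodic check can be scripted with a small helper; this is only a sketch (the `same_files` name and the snapshot-comparison approach are mine, not from the thread): take two lsof snapshots a minute or so apart and print the paths btsync had open in both. If a huge file keeps showing up check after check, it's probably stuck.

    ```shell
    # Print the paths the btsync process has open in BOTH snapshots.
    # Each argument is the raw output of one `lsof <mount>` run.
    same_files() {
        t1=$(mktemp); t2=$(mktemp)
        printf '%s\n' "$1" | awk '$1 == "btsync" {print $NF}' | sort -u > "$t1"
        printf '%s\n' "$2" | awk '$1 == "btsync" {print $NF}' | sort -u > "$t2"
        comm -12 "$t1" "$t2"          # lines common to both sorted lists
        rm -f "$t1" "$t2"
    }

    # Usage (mount point is the example from above):
    # snap1=$(lsof /media/B20AA9E80AB9AA2D); sleep 60
    # snap2=$(lsof /media/B20AA9E80AB9AA2D)
    # same_files "$snap1" "$snap2"    # empty output => btsync has moved on
    ```

    If your lsof build supports the `-o` flag, `lsof -o <file>` shows the file offset, which should keep advancing while btsync is actually reading the file rather than being stuck.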
