With the latest beta, Everything eats up more than 80% CPU while updating its database after a massive file deletion. It actually made the system unresponsive and I had to force-quit it in Task Manager. My database is rather big, though (53 MB).
It would be better if there were some way to slow down index updating when there are big changes to the MFT while monitoring hard drives.
High cpu usage while deleting many files at once
Re: High cpu usage while deleting many files at once
> massive file deletion
How many is massive?
In any case, I would gather it's probably expected, though unwanted.
Related (and I see I used the word "mass" deletion there): Delay Update if User Not Currently Logged In.
Re: High cpu usage while deleting many files at once
First off, I'm using an i5-3570K. It occurred while uninstalling a single program of about 100 GB with many small files. I guess Everything got stuck updating its database. I had the same problem before with similar tasks; IIRC, I ended up doing such jobs (like massive file deletions) only after exiting Everything manually because of that.
If Everything can detect how many files it is processing at once, what about limiting how many it handles at a time? E.g., if there are more than 100 files to process at the same time, pause the monitoring operation temporarily.
At the least, it would be appreciated if the CPU usage could be suppressed to some extent, in whatever way you think proper.
Re: High cpu usage while deleting many files at once
> It actually made the system unresponsive and I had to force-quit it in Task Manager.
The high CPU usage is expected when deleting 100+ folders. Deleting files shouldn't use much CPU at all.
I hope to improve the performance in a future release of Everything by batching updates together, so I do not have to build a temporary folder tree each time a folder is deleted.
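As a rough illustration of that batching idea (not Everything's actual internals; all names below are hypothetical), delete notifications could be collected and removed from the index in a single pass per update cycle, along these lines:
Code:
/* Hypothetical sketch: instead of handling each folder-delete
   notification immediately (rebuilding a temporary folder tree every
   time), remember the paths and remove them from the index in one
   pass per update cycle. */
#include <stdio.h>

#define MAX_BATCH 1024

static const char *pending[MAX_BATCH];  /* paths waiting to be removed */
static int pending_count;

/* Called for each delete notification: just remember the path. */
static void on_folder_deleted(const char *path)
{
    if (pending_count < MAX_BATCH)
        pending[pending_count++] = path;
}

/* Called once per update cycle: one pass over the index for the
   whole batch, rather than one subtree rebuild per deleted folder. */
static void flush_deletions(void)
{
    for (int i = 0; i < pending_count; i++)
        printf("removing from index: %s\n", pending[i]);
    pending_count = 0;
}

int main(void)
{
    on_folder_deleted("C:\\Program Files\\OldApp");
    on_folder_deleted("C:\\Program Files\\OldApp\\plugins");
    flush_deletions();  /* single batched index update */
    return 0;
}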
Re: High cpu usage while deleting many files at once
I suggest an index-updating policy as below:
Suppose that during normal system usage, the deletion rate is below 30 files/directories per second.
Step 1: When an index update is needed, first check whether the system is busy (e.g., the current deletion rate is above 30 files/directories per second). If it is, add the update job to a queue and delay for 2 seconds.
Step 2: After 2 seconds, check again. If the system is no longer busy, take the update job from the queue and process it; otherwise delay another 2 seconds, repeating until the system is not busy.
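A minimal sketch of this deferral policy in C might look like the following; the 30/sec threshold and 2-second delay come from the steps above, while the rate counter is a simulated stand-in (a real implementation would count recent NTFS change notifications):
Code:
#include <stdio.h>
#include <windows.h>

#define BUSY_THRESHOLD 30     /* deletions/sec above which the system counts as busy */
#define RETRY_DELAY_MS 2000   /* re-check every 2 seconds while busy */

/* Simulated deletion rate: starts high (a mass delete in progress) and
   winds down on each check; purely a stand-in for a real event counter. */
static int simulated_rate = 120;

static int deletes_per_second(void)
{
    int rate = simulated_rate;
    if (simulated_rate > 0)
        simulated_rate -= 40;
    return rate;
}

static void process_index_update(const char *job)
{
    printf("processing index update: %s\n", job);
}

/* Steps 1 and 2: hold the queued job and keep delaying in 2-second
   steps until the deletion rate drops below the threshold. */
static void schedule_index_update(const char *job)
{
    while (deletes_per_second() > BUSY_THRESHOLD) {
        printf("system busy, delaying %d ms...\n", RETRY_DELAY_MS);
        Sleep(RETRY_DELAY_MS);
    }
    process_index_update(job);
}

int main(void)
{
    schedule_index_update("mass-delete batch");
    return 0;
}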
Re: High cpu usage while deleting many files at once
> I suggest an index-updating policy as below:
> Suppose that during normal system usage, the deletion rate is below 30 files/directories per second.
It's a good idea, thanks.
I'll consider adding something like this. I do have plans to rewrite the algorithm for deleting/renaming files in Everything 1.5 to make it much more efficient (it will require some additional RAM usage).
I doubt much will change for the Everything 1.4 release, as it would be a big change to the database code.
There is an ini option find_subfolders_and_files_max_threads in 852b or later to reduce the CPU usage when deleting and renaming folders.
By default, find_subfolders_and_files_max_threads is set to 0, which uses half the available logical CPUs, so you shouldn't see more than 50% CPU usage when deleting and renaming folders in Everything 852b. There is no noticeable loss in performance when using half the available logical CPUs compared to using them all.
Ideally, you will want to set find_subfolders_and_files_max_threads to the number of memory channels you have.
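For example, on a typical dual-channel desktop system, that would mean setting the following in Everything.ini (the option name and its 0 default come from the post above; the value 2 is just an illustration):
Code:
find_subfolders_and_files_max_threads=2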