Hello
I would like to know: are there any limits when using --args-file? For example, could I use a file with 1 million rows of /home locations?
Thank you
Hello
I have about five hundred users, each with a /home/user/public_html directory. Instead of executing
savscan /home/*/public_html
I wrote a PHP script which creates a file, e.g. /tmp/myfile, listing the files modified in the last 24 hours under /home/*/public_html,
and I want to scan that list, with the -f and -archive options applied to each file in it:
savscan --args-file=/tmp/myfile
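As an aside, the same list could also be generated without PHP; a minimal sketch using find (assuming a GNU/POSIX find and the /tmp/myfile path from above):

# list regular files modified within the last 24 hours, one path per line
find /home/*/public_html -type f -mtime -1 > /tmp/myfile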
Since various users run webmail or other dynamic scripts which constantly add new files, the
generated file /tmp/myfile is huge.
I tested this approach with only a subset of these users, and I noticed that as the number of files listed grows,
savscan execution seems to get slower. Is that so? Does savscan --args-file=myfile run more slowly the larger the file is?
It seems to become very, very slow above 25,000 files, so I am considering using the Linux split command, as sketched below.
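A minimal sketch of that split-and-loop idea, assuming a chunk size of 5000 lines (an arbitrary figure, not a tested threshold) and reusing the -f and -archive options from above:

# break the big list into 5000-line chunks named /tmp/myfile.part.aa, .ab, ...
split -l 5000 /tmp/myfile /tmp/myfile.part.
# scan each chunk separately, then clean up
for part in /tmp/myfile.part.*; do
    savscan -f -archive --args-file="$part"
done
rm -f /tmp/myfile.part.*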
Thank you