
Best Practice for High-Volume File Server Scanning?

Hello, I installed the standalone antivirus version 10 on a Windows 2008 file server that holds over 2 TB of data, accessed by hundreds of people each day during business hours.

With prior products I set up an evening or weekend scan so the CPU would not take a hit during the day. I do not see a way to schedule off-hours scans of all files, and I do not feel comfortable relying on on-access scanning for a file server with such a high volume of user activity.

I searched the forum for scheduled scans, but the results all seemed geared toward clients rather than servers. My question is: what are the best practices for antivirus protection of large amounts of data with high-volume access, without a negative impact on the CPU?

thanks

Drew 



This thread was automatically locked due to age.
  • Hello Drew,

    congratulations on your post - generating three replies is not bad :smileywink:. RougeViper and Jak have already given some immediate answers; let me add a few remarks.

    As to performance: it's not primarily the amount of data in bytes, or the number of users accessing it, that affects on-access performance. What matters more is the number of distinct files being accessed, how often they change, and their content type. Once a file opened for reading has been scanned, it is usually not rescanned on subsequent opens (whether by the same or a different user). Also, while a quick structural scan is performed on every file to determine whether it is an "executable", the scan only goes deeper when it is.
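    To illustrate the caching behaviour described above, here is a minimal sketch in Python - a hypothetical model, not Sophos code: scan results are cached per file and invalidated when the file changes, and only files that look like executables get the expensive deep scan.

```python
import os

class OnAccessCache:
    """Illustrative model of an on-access scanner's result cache.

    A file is rescanned only if its modification time or size changed
    since the last scan; non-executables get only a cheap structure check.
    """

    def __init__(self, deep_scan):
        self._deep_scan = deep_scan          # callable(path) -> bool (clean?)
        self._cache = {}                     # path -> ((mtime, size), verdict)
        self.deep_scans = 0                  # counter, for demonstration only

    def _looks_executable(self, path):
        # Cheap "structure" check; real scanners inspect file contents,
        # here we just look at the extension for illustration.
        return path.lower().endswith((".exe", ".dll", ".com", ".scr"))

    def on_open(self, path):
        st = os.stat(path)
        key = (st.st_mtime_ns, st.st_size)
        cached = self._cache.get(path)
        if cached and cached[0] == key:
            return cached[1]                 # unchanged: reuse earlier verdict
        if self._looks_executable(path):
            self.deep_scans += 1
            verdict = self._deep_scan(path)  # expensive scan only here
        else:
            verdict = True                   # non-executable: quick pass only
        self._cache[path] = (key, verdict)
        return verdict
```

    Opening the same unchanged file a second time, even as a different user, then costs no deep scan - which is why file churn matters more than raw data volume.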

    Accessibility: naturally it makes a difference whether the server is a "simple" data store, houses the users' home directories, or stores the roaming profiles. Under certain circumstances, having files blocked on the server can lead to nasty errors in Windows.

    Protection: as Jak has suggested, you should at least enable on-access scanning on the server. It also depends on the level of protection on the clients: if all (repeat: all) your clients are up to date and protected with the recommended settings, on-access scanning on the server is redundant. "Less advisable" is having no on-access scanning on the server while also excluding remote files from scanning on the clients. You should also consider the amount of sharing (e.g. you could exclude those areas where a user has exclusive access while scanning all shared space).
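    The sharing-based exclusion idea can be sketched as a small policy check - all paths and rules below are hypothetical examples, not product settings: shared areas are always scanned on the server, while single-user areas can be left to the clients' own on-access scanners.

```python
def server_should_scan(path, exclusive_prefixes):
    """Decide whether the server-side scanner should handle this path.

    exclusive_prefixes: directories where exactly one user has access,
    so that user's (fully protected) client can be trusted to scan instead.
    """
    normalized = path.replace("\\", "/").lower()
    for prefix in exclusive_prefixes:
        if normalized.startswith(prefix.replace("\\", "/").lower()):
            return False          # exclusive area: leave it to the client
    return True                   # shared space: always scan on the server

# Hypothetical layout: home directories are exclusive, shares are not.
exclusive = [r"D:\Homes\alice", r"D:\Homes\bob"]
```

    The important caveat from the paragraph above applies: this trade-off is only safe if every client touching the excluded areas really is protected.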

    Scheduled/On-demand scans: IMO their main purpose is to clean up what was missed by on-access scanning (either due to lax settings - which might nevertheless be justified - or because protection or cleanup was not yet available when the threat was encountered). They are not a replacement for real-time protection. It's a good idea to run a scheduled scan before backup.
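    The scan-before-backup suggestion amounts to a simple gate, sketched here in Python with hypothetical scan and backup steps (in practice these would wrap your scanner's and backup tool's own command lines): the backup only starts when the preceding scan reports a clean result.

```python
def backup_with_pre_scan(scan, backup):
    """Run `scan` first; start `backup` only if the scan came back clean.

    `scan` and `backup` are callables returning True on success - stand-ins
    for whatever scanner and backup commands your environment uses.
    """
    if not scan():
        return "backup skipped: scan reported a problem"
    backup()
    return "backup completed after clean scan"
```

    Gating the backup this way keeps an infected file from being archived and silently restored later.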

    Christian
