
Excluding a Website from On-Access Scanning

Hi All

We have an internal website (a CRM system) that is having issues with Sophos. We believe the issue occurs because Sophos attempts to scan all js/html and image files as they are sent from the web application. Unfortunately the web app isn't exactly efficient and can send 100 or more individual files to render a single page!

We would like to basically stop Sophos on-access scanning js/html and possibly image files. However, because we don't really want to do this for *all* websites, I'm wondering if it's possible to put individual websites on a "safe" list to get around our performance issue in a more granular way?

I hope that makes sense

Many thanks all!

S



  • Hello S,

    this is not a solution, I'm just thinking out loud in writing ...

    I've admittedly never given any thought to how often a downloaded file - say a JS file - is actually scanned before it is presented to the application, and to what extent. If download scanning is active then the LSP (Layered Service Provider) will request a scan. On-access can be set to Scan On Write - thus I assume the file will again be subject to scanning. The subsequent open for read will likely result in a quick inspection only, as the file is "known" from the preceding on-write scan. I assume that the download scan is not a replacement for the on-disk scan (i.e. not as "deep" but rather concentrating on certain properties), but I don't know.

    AFAIK you can't exempt sites from download scanning - it's all or nothing (website authorization applies only to the blocking of malicious sites). Do you have download scanning enabled and, if so, does turning it off reduce the issues?

    As to scanning on disk: there's no indicator of where a file came from, so excluding these files from on-access scanning would only be possible if the application uses a "private" temporary location or cache.

    Putting it all together, I don't see a simple general solution to the performance problem that wouldn't introduce some additional risk. This is, of course, under the assumption that you don't do web scanning on a gateway - in that case you could at least disable download scanning without affecting protection.
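
    If you want to check whether the browser actually keeps the CRM's files in such a dedicated cache folder, a rough sketch like the one below would show the usual candidate locations (the paths are only the common Internet Explorer defaults - adjust for your browser and Windows version):

    ```python
    # Rough sketch: list the usual browser cache folders on a Windows client so
    # you can see whether the CRM's js/html files end up in a dedicated location
    # that could be excluded from on-access scanning. The paths below are only
    # the common Internet Explorer defaults - your browser/Windows version may differ.
    import os

    candidate_caches = [
        # Vista / Windows 7 default IE cache
        r"%LOCALAPPDATA%\Microsoft\Windows\Temporary Internet Files",
        # Windows XP default IE cache
        r"%USERPROFILE%\Local Settings\Temporary Internet Files",
    ]

    for raw in candidate_caches:
        path = os.path.expandvars(raw)
        status = "exists" if os.path.isdir(path) else "not found"
        print(f"{path}: {status}")
    ```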

    Christian

  • Hi Christian,

    Thanks very much for your reply. I've been having a bit of a play about, and I think I'm with you on the fact that it's not possible to exclude individual URLs from the effects of on-access scanning. Using Process Monitor I've tried as many options as I can think of short of turning scanning off completely, and the only one I've found that works is to either fully exclude the browser's Temporary Files folder or to exclude named file extensions completely.
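
    In case it helps anyone else, this is roughly how I summarised the Process Monitor output to see which folders and extensions the browser was hitting hardest. It's only a rough sketch - it assumes a CSV export with the default ProcMon column names ("Process Name", "Operation", "Path") and a browser process name you supply; the file name is just an example:

    ```python
    # Rough sketch: tally file activity from a Process Monitor CSV export to see
    # which folders and file extensions generate the most reads/writes (and hence
    # the most on-access scans). Assumes the default ProcMon export columns
    # ("Process Name", "Operation", "Path").
    import csv
    import os
    from collections import Counter

    def summarise(csv_path, process_name="iexplore.exe"):
        by_extension = Counter()
        by_folder = Counter()
        with open(csv_path, newline="", encoding="utf-8-sig") as f:
            for row in csv.DictReader(f):
                # Only count file operations made by the browser process
                if row.get("Process Name", "").lower() != process_name:
                    continue
                if row.get("Operation") not in ("CreateFile", "ReadFile", "WriteFile"):
                    continue
                path = row.get("Path", "")
                extension = os.path.splitext(path)[1].lower() or "<none>"
                by_extension[extension] += 1
                by_folder[os.path.dirname(path)] += 1
        print("Busiest extensions:", by_extension.most_common(5))
        print("Busiest folders:", by_folder.most_common(5))

    # Example usage ("procmon_export.csv" is a hypothetical export file name)
    summarise("procmon_export.csv")
    ```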

    I guess we'll have to weigh whether the performance hit is bearable against disabling scanning for all js/html files :-(

    Many thanks

    Simon
