
Excluding a Website from On-Access Scanning

Hi All

We have an internal website we use (a CRM system) that is having issues with Sophos. We believe the issue occurs because Sophos attempts to scan all JS/HTML and image files as they are sent from the web application. Unfortunately the web app isn't exactly efficient and can send 100+ individual files to render a single page!

We would basically like to stop Sophos on-access scanning JS/HTML and possibly image files. However, because we don't really want to do this for *all* websites, I'm wondering if it's possible to put individual websites on a "safe" list to get around our performance issue in a more granular way?

I hope that makes sense

Many thanks all!

S



  • Hello S,

    this is not a solution, I'm just thinking in writing ...

    I've admittedly never given any thought to how often a downloaded file - say a JS file - is actually scanned before it is presented to the application, and to what extent. If download scanning is active, the LSP will request a scan. On-access scanning can be set to Scan on Write, so I assume the file will be scanned again when it is written to disk. The subsequent open for read will then likely result in only a quick inspection, as the file is "known" from the preceding on-write scan. I assume the download scan is not a replacement for the on-disk scan (i.e. not as "deep", concentrating rather on certain properties), but I don't know.

    AFAIK you can't exempt sites from download scanning - it's all or nothing (website authorization applies only to the blocking of malicious sites). Do you have download scanning enabled, and if so, does turning it off reduce the issues?

    As to scanning on disk: there's no indicator of where a file came from, so excluding these files from on-access scanning would only be possible if the application uses a "private" temporary location or cache that could itself be excluded.
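
    If the CRM is viewed in Internet Explorer, the downloaded JS/HTML and image files most likely end up in the per-user Temporary Internet Files (WinINet) cache, which would be the candidate path for such an exclusion. Here is a minimal sketch for locating that folder on a client - assuming a Windows endpoint and that the browser really does use the standard WinINet cache (both assumptions, not something I've verified for your CRM):

        # Locate the WinINet cache folder ("Temporary Internet Files") on Windows.
        # Assumption: the CRM is viewed in Internet Explorer and its downloads land
        # in this per-user cache; other browsers keep their own cache locations.
        import ctypes
        from ctypes import wintypes

        CSIDL_INTERNET_CACHE = 0x0020  # shell folder ID for the WinINet cache

        def internet_cache_path():
            buf = ctypes.create_unicode_buffer(wintypes.MAX_PATH)
            # SHGetFolderPathW(hwndOwner, nFolder, hToken, dwFlags, pszPath)
            hr = ctypes.windll.shell32.SHGetFolderPathW(None, CSIDL_INTERNET_CACHE, None, 0, buf)
            if hr != 0:
                raise OSError("SHGetFolderPathW failed, HRESULT %#x" % (hr & 0xFFFFFFFF))
            return buf.value

        if __name__ == "__main__":
            # Print the folder you would then consider for an on-access exclusion
            # (it is per user, so the exclusion may need a wildcard on the profile part).
            print(internet_cache_path())

    Whether excluding that folder is acceptable is of course a risk decision - it is exactly where drive-by downloads land as well.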

    Putting it all together, I don't see a simple, general solution to the performance problem that would not introduce some additional risk. This is of course under the assumption that you don't do web scanning on a gateway - in that case you could at least disable download scanning without affecting protection.

    Christian
