Hi All
We have an internal website (a CRM system) that is having performance issues with Sophos. We believe the problem occurs because Sophos attempts to scan every JS, HTML and image file as it is served by the web application. Unfortunately the web app isn't exactly efficient and can send 100+ individual files to render a single page!
We would basically like to stop Sophos on-access scanning of JS/HTML and possibly image files. However, because we don't really want to do this for *all* websites, I'm wondering if it's possible to put individual websites on a "safe" list, so we can work around our performance issue in a more granular way?
I hope that makes sense.
Many thanks all!
S