Greetings,
I was wondering if there's a way to set up a regular-expression exception for a site, but still be able to blacklist certain portions of that site?
For example:
I have www.corporatecloud.com and I want to disable HTTPS scanning, certificate trust checking, and authentication for it, so I create an exception for it.
But, the site has specific content on it I don't want a user to view.
So, I make a blacklist for www.corporatecloud.com/users/dirty_pictures, but it gets ignored due to the previous exception.
If I turn off my exception, the blacklist works fine. If I turn the exception on, it ignores the blacklist.
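To illustrate the overlap (a minimal sketch, not the appliance's actual matching logic; the patterns here are hypothetical stand-ins for whatever the exception and blacklist rules expand to), a whole-site exception regex also matches any URL under the blacklisted path, so if the exception is evaluated first the blacklist never gets a chance to fire:

```python
import re

# Hypothetical patterns -- the real rules depend on how the appliance stores them.
exception = re.compile(r"^https?://www\.corporatecloud\.com/")  # whole-site exception
blacklist = re.compile(r"^https?://www\.corporatecloud\.com/users/dirty_pictures")

url = "https://www.corporatecloud.com/users/dirty_pictures/photo1.jpg"

# Both rules match the same URL; the outcome depends on which one wins.
print(bool(exception.search(url)))  # True
print(bool(blacklist.search(url)))  # True
print(bool(blacklist.search("https://www.corporatecloud.com/home")))  # False
```

In other words, the two rules aren't in conflict on paper; the question is purely which one the filter applies first.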
If anyone has some advice on how to handle this, please let me know. Right now, I'm assuming I am missing something simple!
UPDATE:
I've narrowed it down to the 'SSL scanning' option. If I leave that unchecked in the exception, both the exception and the blacklist work as expected. I can see why the content-filter options would cause this, but now it seems more like a bug, since SSL scanning shouldn't have anything to do with whether a URL can be blacklisted, right?