Is it possible to put a blanket exception on a website (e.g. http://*.google.com or http://*.microsoft.com) so that pages from any links within that site which map to its own servers/sub-domains do not pass through the content filters?
Currently, even though google.com/microsoft.com are on the exceptions list, sites like maps.google.com, kh0.google.com, mt3.google.com, and update.microsoft.com need to be listed separately in the 'Exceptions'.
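For reference, here is a minimal sketch of what such a blanket exception could look like, assuming the filter accepts regular-expression exception patterns (the exact syntax varies by product, so treat the pattern below as illustrative only). The small Python script just demonstrates that one pattern covers all the sub-domains mentioned above while not accidentally matching look-alike hosts:

    import re

    # Hypothetical blanket-exception pattern: matches http:// or https://
    # followed by any chain of sub-domain labels, ending in google.com
    # or microsoft.com. Adjust to your filter's own pattern syntax.
    PATTERN = re.compile(
        r'^https?://([a-z0-9-]+\.)*(google|microsoft)\.com(/|$)',
        re.IGNORECASE,
    )

    urls = [
        "http://maps.google.com/",
        "http://kh0.google.com/",
        "http://mt3.google.com/",
        "http://update.microsoft.com/",
        "http://evilgoogle.com/",  # look-alike host: should NOT match
    ]

    for url in urls:
        print(url, "->", "exempt" if PATTERN.match(url) else "filtered")

The trailing `(/|$)` is deliberate: it anchors the match at the end of the hostname so that a domain merely containing "google.com" somewhere in its name is not exempted.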