This discussion has been locked.
You can no longer post new replies to this discussion. If you have a question, you can start a new discussion.

Allow Web Traffic URL based

Hello,

In a project I am using Sophos XG (a virtual appliance in Azure) to achieve the following:

In our new Azure network environment we neither want to nor can use a proxy. Still, we do not want to allow all traffic on TCP 80/443, so we need some kind of URL-based filtering.

I'd like to sketch a quick scenario:

  1. We have two virtual machines (addresses 10.0.0.4 and 10.0.0.5) in our environment.
  2. We have two groups of URLs that need to be allowed:
    a. *.pkg.github.com, godaddy.github.io
    b. helm.gravitee.io
  3. Group a is assigned to both machines; group b only to the first machine.

Right now we are using FQDN-host groups as destinations in firewall rules to achieve this.

This approach has two main problems:

  1. Sometimes I have no access to URLs like docker.pkg.github.com. The issue is resolved after I use the policy test.
    I think the cause is the following: docker.pkg.github.com has multiple public IPs assigned (140.82.118.33 and 140.82.118.34).
    If the virtual machine tries to reach docker.pkg.github.com and has 140.82.118.33 stored for this domain in its local DNS cache, but the Sophos has only 140.82.118.34 stored as the IP for this domain, then the traffic is apparently blocked.
    When I then run the policy test, the Sophos starts DNS resolution for this URL and adds the other IP to the list of IPs for this host.
  2. On machine two (which should only be allowed to reach *.pkg.github.com and godaddy.github.io) I am sometimes able to reach helm.gravitee.io.

    I think this is because helm.gravitee.io is an alias for gravitee-io.github.io, and godaddy.github.io and gravitee-io.github.io are both served by the same set of load balancers (185.199.108.153, 185.199.109.153, 185.199.110.153, 185.199.111.153).
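Both problems above come down to the same root cause: an FQDN-host rule is effectively reduced to whatever set of IPs the firewall has resolved, while the client may have cached a different A record, and unrelated hostnames may share the same frontend IPs. A minimal sketch of that mismatch (the IPs are the ones from this thread; the helper names are hypothetical, not anything from the Sophos product):

```python
import socket

def resolve_all(hostname, port=443):
    """Return every IPv4 address the name resolves to right now."""
    infos = socket.getaddrinfo(hostname, port, family=socket.AF_INET,
                               type=socket.SOCK_STREAM)
    return {info[4][0] for info in infos}

def fqdn_rule_allows(client_dst_ip, firewall_resolved_ips):
    """An FQDN-host rule only matches IPs the firewall itself has resolved."""
    return client_dst_ip in firewall_resolved_ips

# Problem 1: client and firewall cached different A records of the same name.
firewall_ips = {"140.82.118.34"}   # what the Sophos has stored
client_ip = "140.82.118.33"        # what the VM's local DNS cache returned
print(fqdn_rule_allows(client_ip, firewall_ips))  # False -> traffic blocked

# Problem 2: two different hostnames resolve to the same frontend IPs, so an
# IP-level allow for one implicitly allows the other.
godaddy = {"185.199.108.153", "185.199.109.153",
           "185.199.110.153", "185.199.111.153"}   # godaddy.github.io
gravitee = {"185.199.108.153", "185.199.109.153",
            "185.199.110.153", "185.199.111.153"}  # gravitee-io.github.io
print(bool(godaddy & gravitee))  # True -> helm.gravitee.io slips through
```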

Does anyone know a way to overcome these issues without using URL groups and web filter policies?

I hope someone understands my problem and is able to help me. :)

If there are any questions or you need clarification, I will be happy to help.

Kind regards

Jonas



  • Does anyone know a way to overcome these issues without using URL groups and web filter policies?

    There's no way to do this correctly without URL groups and web filter policies, since Sophos XG is an L4 stateful-inspection firewall with L7 policies glued on top.

    Just to make sure I understand, what you want to do is:

    Allow access to "*.pkg.github.com" and "godaddy.github.io" for both machines, plus "helm.gravitee.io" for machine B only, and nothing more?

  • Thanks for your answer!!

    That sounds like I have to find a way to properly use URL groups and web filter policies and still manage to fulfill all documentation requirements. But that is another thing to fix.

    Prism said:

    Just to make sure I understand, what you want to do is:

    Allow access to "*.pkg.github.com" and "godaddy.github.io" for both machines, plus "helm.gravitee.io" for machine B only, and nothing more?

    In general, yes. There are of course some more URLs to be allowed, but they are pretty much the same.

    Again, thanks a lot!

  • You will face a lot of problems with this.

    There are two main problems. First, when creating a rule to allow a website on TCP/80 and TCP/443, the firewall opens those ports for everything and only then applies web filtering. There's no way to enforce protocols, e.g. to allow only HTTP/S on those ports.

    Second, any application that isn't identifiable by the engine will bypass all your policies; an example of this is running a WireGuard VPN over UDP/53 (the DNS port).

    The best thing you can do if you don't want to use a proxy is to use the new DPI engine, which neither proxies nor breaks the connection, and try to enforce the URL filtering with web policies.

    Also, there's no regex support except in exceptions, and the wildcard support is barely functional.
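The "port opens for everything, web filtering is applied afterwards" behavior described above can be sketched as a toy decision model. All names and the simplified logic here are hypothetical illustrations of the described behavior, not Sophos internals:

```python
# Hypothetical rule data for illustration only.
ALLOWED_PORTS = {80, 443}
ALLOWED_HOSTS = {"godaddy.github.io", "helm.gravitee.io"}

def l4_decision(dport):
    """Stateful L4 rule: the port is either open or closed, nothing more."""
    return "allow" if dport in ALLOWED_PORTS else "drop"

def l7_decision(dport, sni):
    """L7 policy layered on top: it only matches identifiable web traffic."""
    if l4_decision(dport) == "drop":
        return "drop"
    if sni is None:
        # Unidentifiable traffic (e.g. a VPN tunneled over an open port) is
        # never matched against the web policy at all.
        return "allow"
    return "allow" if sni in ALLOWED_HOSTS else "block"

print(l4_decision(443))                      # allow: the port is open for everything
print(l7_decision(443, "evil.example.com"))  # block: policy catches identified web traffic
print(l7_decision(443, None))                # allow: unidentified traffic bypasses the policy
```

This is why the advice above is to push enforcement into the DPI engine's web policies: only a layer that can identify the hostname has anything to match against.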
