
UTM best practice guide for strict web filtering

Note for any users stumbling upon this post and looking for advice (just a small list; any moderator is welcome to edit it or make a better list here):

* The Sophos Sales Team DACH published a best practice guide by mail on 26 Feb 2016.
Some people have republished the newsletter, so I will link it here. It also includes some categories to block as well as general recommendations and hints:
http://web.sophos.com/res/sophos/92aaef928aee3e5ff8216622c999157c.pdf
http://utm-shop.de/information/news-und-co/sofortmassnahmen-gegen-krypta-trojaner-wie-wie-cryptowall-teslacrypt-oder-locky (Note: I could not find an English version. If you need one, maybe Sophos has a translation ;) )

* Scroll down to the 2nd post here, and the later posts, for blocking/logging/categorizing of unknown sites

* For ad blocking, have a look at
-- DouglasFoster's 2nd post
-- here: https://community.sophos.com/products/unified-threat-management/f/55/t/46207
-- and/or here: https://drashna.net/blog/2015/03/an-exercise-in-frustration-fine-tuning-the-web-filter-in-sophos-utm/#comments

* Keep in mind:
-- Sophos's filtering of Flash, ActiveX and Java is not secure! (Malware gets through!) => Therefore block the risky ad networks!

Sophos support itself says that the filtering only adds an extra layer of security and is not security in itself. Do not rely on Sophos UTM alone to protect you; it is just another layer, which may protect you.

Also keep in mind, regarding the Flash filtering, that Sophos does not seem to mention when something does not work the way you expect (compare, for example, this thread: https://community.sophos.com/products/unified-threat-management/f/55/t/74173).

 

=========== Original post ============

Hello everyone,

we are using a fully licensed Sophos UTM 9.3 here. The computers (Win7) are running Sophos Endpoint Cloud, but without Web Control enabled.
The Win7 computer was fully patched, and the user surfed with MSIE 11 (latest version) and Adobe Flash (latest version).

While surfing a more or less popular German website, the user got infected with perfectly localized, language-specific ransomware, probably via malicious ads.

Our security guidelines for the computer were probably too lax. Specific mistakes were:
- using MSIE instead of Firefox,
- using no ad-blocker
- using Adobe Flash

In any case, we had hoped that the Web Filter would give us an additional layer, concretely:
- filtering all HTTP and HTTPS traffic
- using the reputation limit at its lowest setting (i.e. blacklist)
- Anti-Virus filter with Avira
- Blocking ActiveX/Flash/Java

So now we are wondering what the right settings for the UTM probably are:
- Anti-Virus is not helpful, as it does not detect new threats
- Blocking ActiveX/Flash is not helpful, as it does not automatically strip ActiveX and Flash - i.e. several sites with Flash still work

Therefore, the only logical consequence is whitelisting.
- We have now set the reputation limit to whitelist (i.e. neutral)

How does whitelisting work and is it safe enough?

Currently I have the feeling the Web Filtering is not enough...
- Anti-Virus is not 100% secure
- Blocking ActiveX/Flash is not 100% secure
- How secure is the whitelisting?

(Edit 14:30 UTC) We have now also done the following:
- Blocked the "Suspicious" category (in the German UI you have to scroll down the list, and I had not seen it)
- Followed the guide over there to block the ads: https://drashna.net/blog/2015/03/an-exercise-in-frustration-fine-tuning-the-web-filter-in-sophos-utm/#comments
- Checked the boxes in Options for strict HTTP and blocking of unscannable downloads to get this one right: https://community.sophos.com/products/unified-threat-management/f/55/t/74173

Any best practice hints for making a good tradeoff between surfing and safety?
With the current whitelisting some pictures are blocked; even community.sophos.com is blocked...



  • Based on the previous framework, below is the query to identify all sites that were blocked because of SSL Certificate problems:

    I export the query results to Excel, then fill in the blank contact column with an email address (obtained from WHOIS).
    I often use an SSL Checker to determine why the site failed, and put that reason into the blank "Reason" column.
    Once the Excel file is populated, I use Word mail-merge to send the emails.   I average about 10-15 problem sites per day.

    SELECT Max(T.EventTime) AS LastSeen, 

    "" AS Reason,
    "" AS Contact,
    ParseURL([T].[ItmUrl],"D") AS FQDN,
    T.ItmDstip,
    T.ItmReferer,
    Count(T.EventTime) AS Events

    FROM ParsedFile AS T
    WHERE (((T.ItmError)="") AND ((T.ItmId)="0002"))
    GROUP BY ParseURL([T].[ItmUrl],"D"), T.ItmDstip, T.ItmReferer
    ORDER BY ParseURL([T].[ItmUrl],"D"), T.ItmReferer;

  • I made this Access database as per your instructions and it works wonderfully for pulling logs apart. Thanks for taking the time to share it with us. I noticed that the forum is replacing [ W ] with an icon of a withered rose. To get around that, click "Reply" on the post with the icon in it and it shows the original text.

  • Advanced Topics for using Microsoft Access for Log Parsing

    HANDLING MULTIPLE DAYS

    My example does everything based on parsing of a text file.  UTM generates one log file per day.   If you need to merge multiple days, you can combine the text files using the command prompt copy command:

    copy file1.txt+file2.txt+file3.txt  combinedfile.txt

    or

    copy file*.txt combinedfile.txt

    However, Access only supports text files of 2GB or less.   My log files are running about 1GB per weekday, so combining has limited usefulness.  

    LOADING THE DATA INTO A SQL DATABASE

    One option is to load the parsed data into a SQL database, so that you can do queries across any time period.   I have found that loading the data takes a lot of time, and I worry about the rapid file growth that would be involved.   As a result, I have a SQL database available but I am not actively populating it.   Omitting references to a SQL database also makes my examples easier to get started with.
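
    If you do go that route, one low-effort approach would be to link the SQL Server table into Access via ODBC and push the parsed rows across with an append query. A minimal sketch, assuming a hypothetical linked table named dbo_ParsedFileArchive whose columns match ParsedFile:

    INSERT INTO dbo_ParsedFileArchive ( EventTime, ItmId, ItmSrcip, ItmUrl, ItmUser, ItmAd_domain, ItmStatuscode, ItmError, ItmCategory, ItmReputation, ItmApplication )
    SELECT PF.EventTime, PF.ItmId, PF.ItmSrcip, PF.ItmUrl, PF.ItmUser, PF.ItmAd_domain, PF.ItmStatuscode, PF.ItmError, PF.ItmCategory, PF.ItmReputation, PF.ItmApplication
    FROM ParsedFile AS PF;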

    FILES OVER 2GB

    In large environments, your log file might be over 2GB for a single day.  If that happens, you will have to split the log file into sections of 2GB or less.   I recently had a problem like this and opted to split the file by putting alternate lines into two different files, so I am posting it as a starter.   You will probably want to keep all log entries in time order, so you would probably want to use a counter to control when the file changes, rather than alternating, and you might even need to extend the logic to split into more than two files, depending on the size of your logs.   This script was created as a text file with a .vbs extension, and then executed with cscript filename.vbs.

    Const ForReading = 1
    Const ForAppending = 8
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    ' Open the input log and the two output halves (create them if they do not exist)
    Set InFile = objFSO.OpenTextFile("infile.txt", ForReading, True)
    Set OutFile1 = objFSO.OpenTextFile("outfile1.txt", ForAppending, True)
    Set OutFile2 = objFSO.OpenTextFile("outfile2.txt", ForAppending, True)
    cntr = 0
    ' Send alternate lines to alternate output files
    While Not InFile.AtEndOfStream
      intext = InFile.ReadLine
      If cntr = 0 Then
        OutFile1.WriteLine intext
        cntr = 1
      Else
        OutFile2.WriteLine intext
        cntr = 0
      End If
    Wend
    OutFile1.Close
    OutFile2.Close
    InFile.Close

    WARNED AND PROCEEDED

    I decided that I don't care if a user is warned about a website but does not click the proceed button.  However, if they click through, then I want to ensure that the event is reviewed and that any uncategorized sites are submitted for categorization.   This was a little bit complicated because the PROCEEDED event does not have any user information included.   My conceptual logic sequence is:

    (a) find all of the "PROCEEDED" events, and save them to a temporary table
    (b) find the "WARN" event which preceded it and the ALLOW event which followed it.
    (c) combine all three events into one report line.

    The Access query sequence follows; I have assembled the queries into a macro.

    1) Clean out the temporary table

    DELETE TblProceeded.* FROM TblProceeded;

    (The primary key is ItmUrl, ItmSrcip, EventTime, ItmId.)
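
    For reference, a table along these lines can be created with Access DDL as sketched below.  The field sizes are assumptions (adjust them to your data), and since Access limits an index to 255 characters in total, you may need to shorten ItmUrl or define the key differently:

    CREATE TABLE TblProceeded (
        EventTime DATETIME,
        ItmId TEXT(4),
        ItmSrcip TEXT(45),
        ItmUrl TEXT(255),
        ItmUser TEXT(100),
        ItmAd_domain TEXT(100),
        ItmStatuscode TEXT(10),
        ItmError TEXT(255),
        ItmCategory TEXT(50),
        ItmReputation TEXT(50),
        ItmApplication TEXT(255),
        CONSTRAINT PK_TblProceeded PRIMARY KEY (ItmUrl, ItmSrcip, EventTime, ItmId)
    );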

    2) Add the Proceeded entries into the temporary table:

    INSERT INTO TblProceeded ( ItmId, ItmUrl, ItmSrcip, EventTime )
    SELECT PF.ItmId, PF.ItmUrl, PF.ItmSrcip, PF.EventTime
    FROM ParsedFile AS PF
    WHERE (((PF.ItmId)="0072"));

    3) Get the related records to determine "Who did it"

    INSERT INTO TblProceeded ( EventTime, ItmId, ItmSrcip, ItmUrl, ItmUser, ItmAd_domain, ItmStatuscode, ItmError, ItmCategory, ItmReputation, ItmApplication )
    SELECT PF.EventTime, PF.ItmId, PF.ItmSrcip, PF.ItmUrl, PF.ItmUser, PF.ItmAd_domain, PF.ItmStatuscode, PF.ItmError, PF.ItmCategory, PF.ItmReputation, PF.ItmApplication
    FROM TblProceeded INNER JOIN ParsedFile AS PF ON (TblProceeded.ItmUrl = PF.ItmUrl) AND (TblProceeded.ItmSrcip = PF.ItmSrcip)
    WHERE (((PF.ItmId)<>"0072"));

    4) Query to find the best pair between "01" (allow) and "72" (proceeded).   This is not run separately; it is nested in a later query.

    Query name: QryTblProceeded_0172

    SELECT T72.ItmUrl, T72.ItmSrcip, T72.ItmId AS ID72, T01.ItmId AS ID01, T72.EventTime, Max(T01.EventTime) AS T01Time, T01.ItmUser, T01.ItmAd_domain, T01.ItmStatuscode, T01.ItmError, T01.ItmCategory, T01.ItmReputation, T01.ItmApplication, HttpErrorCodes.HttpErrorText
    FROM (TblProceeded AS T72 INNER JOIN TblProceeded AS T01 ON (T72.ItmUrl = T01.ItmUrl) AND (T72.ItmSrcip = T01.ItmSrcip)) LEFT JOIN HttpErrorCodes ON T01.ItmStatuscode = HttpErrorCodes.ItmStatuscode
    GROUP BY T72.ItmUrl, T72.ItmSrcip, T72.ItmId, T01.ItmId, T72.EventTime, T01.ItmUser, T01.ItmAd_domain, T01.ItmStatuscode, T01.ItmError, T01.ItmCategory, T01.ItmReputation, T01.ItmApplication, HttpErrorCodes.HttpErrorText
    HAVING (((T72.ItmId)="0072") AND ((T01.ItmId)="0001") AND ((Max(T01.EventTime))>=[T72].[EventTime]))
    ORDER BY T72.ItmUrl;

    5) Query to find the best pair between "71"  (warned) and "72" (proceeded).   This is also referenced by query nesting, as shown in the next step

    Query name: QryTblProceeded_7172

    SELECT T72.ItmUrl, T72.ItmSrcip, T72.ItmId AS ID72, T71.ItmId AS ID71, T72.EventTime, Max(T71.EventTime) AS T71Time, T71.ItmUser, T71.ItmAd_domain, T71.ItmStatuscode, T71.ItmError, T71.ItmCategory, T71.ItmReputation, T71.ItmApplication
    FROM TblProceeded AS T72 INNER JOIN TblProceeded AS T71 ON (T72.ItmUrl = T71.ItmUrl) AND (T72.ItmSrcip = T71.ItmSrcip)
    GROUP BY T72.ItmUrl, T72.ItmSrcip, T72.ItmId, T71.ItmId, T72.EventTime, T71.ItmUser, T71.ItmAd_domain, T71.ItmStatuscode, T71.ItmError, T71.ItmCategory, T71.ItmReputation, T71.ItmApplication
    HAVING (((T72.ItmId)="0072") AND ((T71.ItmId)="0071") AND ((Max(T71.EventTime))<=[T72].[EventTime]))
    ORDER BY T72.ItmUrl;

    6) Combine everything together

    SELECT T01.ItmUrl, T01.ItmSrcip, Format$(T01.T01Time,"yyyy-mm-dd hh:nn:ss am/pm ddd") AS PassTime, Format$(T01.EventTime,"yyyy-mm-dd hh:nn:ss am/pm ddd") AS ProceedTime, Format$(T71.T71Time,"yyyy-mm-dd hh:nn:ss am/pm ddd") AS WarnTime, T01.ItmUser, T01.ItmAd_domain, T01.ItmStatuscode, T01.ItmError, T01.ItmCategory, T01.ItmReputation, T01.ItmApplication, TblCategoryIDs.ItmCategoryname
    FROM (QryTblProceeded_0172 AS T01 INNER JOIN QryTblProceeded_7172 AS T71 ON (T01.ItmUrl = T71.ItmUrl) AND (T01.ItmSrcip = T71.ItmSrcip) AND (T01.EventTime = T71.EventTime)) LEFT JOIN TblCategoryIDs ON T01.ItmCategory = TblCategoryIDs.ItmCategory;

    Sorry about the wilted rose.  Thanks for explaining the workaround.

  • UPDATE:  I have begun finding discrepancies between the UTM and TrustedSource classifications.

    Sophos support now says that I need to use this URL:   https://www.sophos.com/en-us/threat-center/reassessment-request.asp instead of (or in addition to) www.trustedsource.org.   (The related Sophos knowledgebase entry is Article ID: 119440)   

    The Sophos site is a big disappointment, because it is much less efficient to use for multiple submissions:   only one URL at a time, no go-back button, no memory of you from one submission to the next.

    I also suggested that Sophos reconsider the references to www.trustedsource.org in section 9.1 of the UTM admin guide.   Technically, the reference is to reputation issues rather than categorization, so maybe they use trustedsource for reputation only.

  • I am finding quite a few sites that have port 443 open, even though they do not intend to be an HTTPS site, and have not installed a certificate.   These systems are hosted on Akamai and have a default Akamai certificate installed.   Since the port is open, the Google web crawler will find it and index the site using https.   When our users do a Google search, they will be blocked when they follow the link.

    The site owner generally does not care when I point out the problem.  I suspect that the search result will be slow to disappear from Google even if it is fixed, which can only increase the reluctance of the site owner to make the change.

  • Thanks for the links to the best practice guides. Highly appreciated.