Latest antivirus benchmark comments

  • #1

    Hi, I discovered your latest anti-virus performance benchmark on the Avast forums.

    Pretty informative test.
    You describe the tests in detail - that makes it very trustworthy.

    There are some nice and thorough tests, including a boot time test using the Windows SDK (that is a cool approach, I bet no one has used it before), an installed size test, etc.

    At the same time there are some slips:
    - It seems that you did not install the latest Windows updates (which is an unusual situation nowadays).
    - For file scans, you average over 4 runs (because the first one is slower most of the time). That approach is incorrect: normally an AV scans a file only once, not 4 times in a row. You should have measured only the first scan time (a rough timing sketch follows after this list).
    - The average is not correctly calculated in some graphs: there is a clear outlier in many tests that is something like 1000% slower than the rest. That outlier should have been excluded from the average. The scale of some graphs is also thrown off by that outlier, which makes them difficult to compare.
    - You count the number of registry keys created, but the number of registry keys does not affect performance in any way.
    - There were no "virus detection" tests... so these tests are only about performance, right?
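
    To illustrate, here is a minimal timing sketch. The command-line scanner path and the /scan flag are placeholders for illustration, not any real product's interface:

    import subprocess
    import time

    # Hypothetical command-line scanner; path and flag are placeholders.
    SCANNER = r"C:\Program Files\ExampleAV\scanner.exe"
    TARGET = r"C:\testset"

    def time_one_scan():
        """Time a single on-demand scan of the target directory."""
        start = time.perf_counter()
        subprocess.run([SCANNER, "/scan", TARGET], check=True)
        return time.perf_counter() - start

    # The first run is the "cold" scan a user actually experiences;
    # later runs mostly hit the product's result cache.
    print(f"first (cold) scan: {time_one_scan():.1f} s")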

    Besides that, a very good and useful report!
    Last edited by andru123; Feb-19-2011, 12:57 PM.

  • #2
    I agree some things should be changed. For one, all products need to be retested for each update to the benchmark, using the latest non-beta build of each product. Windows 7 must have all updates applied beforehand.

    As far as the file scan testing goes, I see nothing really wrong with the 5-run average. I would guess that in most cases the same ranking would result if only the first run counted. I can see a problem in cases where a given product's default is to always scan all files on a right-click scan. In those cases the product would perform poorly compared to other products, even though a right-click scan should scan everything possible every time IMO. Because of this I give little weight to this test. Without knowing how each program is set up to scan on a right-click, the results do not really reflect scan performance; they only reflect perceived scan performance. A possible fix would be to show both the time for the first scan and the 5-run average in the results (a small sketch follows below). Then people could draw their own conclusions.
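
    For example, with made-up timings (purely illustrative numbers), the report could show both figures side by side:

    # Made-up scan times in seconds; the first (cold) run is slowest.
    runs = [412.0, 95.2, 93.8, 96.1, 94.5]

    first_scan = runs[0]
    five_run_avg = sum(runs) / len(runs)

    # Report both, so readers can weigh cold-scan vs. cached performance.
    print(f"first scan:    {first_scan:.1f} s")
    print(f"5-run average: {five_run_avg:.1f} s")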

    I agree 100% about the average being messed up in tests where one or more products are so dog-slow they sway the average to such a degree that it is of little use. Dropping the best and worst results from the average would help out tremendously IMO.

    I disagree with your viewpoint on the registry key test. The bigger the registry is, the worse a given system will perform: more RAM is used to hold the registry, and it takes more time to access a given key. How much difference it makes depends on many factors. A slower PC that is lean and mean will see a bigger drop in performance when a registry hog is installed than a higher-performance PC that has a monstrous registry to begin with.
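
    Counting the keys a product creates is also easy to reproduce; here is a minimal sketch using Python's standard winreg module, where the "ExampleAV" vendor key is a placeholder, not any real product's:

    import winreg

    def count_keys(root, path):
        """Recursively count registry keys under root\\path."""
        total = 1  # count this key itself
        with winreg.OpenKey(root, path) as key:
            num_subkeys, _, _ = winreg.QueryInfoKey(key)
            for i in range(num_subkeys):
                subkey = winreg.EnumKey(key, i)
                total += count_keys(root, path + "\\" + subkey)
        return total

    # "ExampleAV" is a placeholder vendor key for illustration.
    print(count_keys(winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\ExampleAV"))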

    P.S. I know Symantec is paying for these tests, so they have some control over the products tested and the methods used for testing. I have been waiting for you to include Avast IS nearly forever. Is Symantec scared or what? Several other IS programs have been added since the first test, but not Avast IS.

    Bill

    Last edited by wonderwrench; Feb-19-2011, 05:10 PM.



    • #3
      it seems that you did not install the latest Windows updates
      The tests were done over a 6-month period, so we can't just state that the latest Windows patches were used. The system needs to remain the same for all products, and Windows 7 Service Pack 1 was not available at the start of testing.

      normally an AV scans a file only once, not 4 times in a row. You should have measured only the first scan time
      There are 2 issues. One is that doing just a single run leads to more variance in the results; several runs with an average smooth this out. The second is that a lot of AV products do in fact perform periodic scans, scanning the same files multiple times.

      the average is not correctly calculated in some graphs:
      We are not aware of any problem. Note that there is a difference between saying the average was calculated incorrectly and saying that you didn't like some results and think they should have been excluded from the average. I assume you are referring to the latter, and are saying that averaging should become a subjective process, where someone needs to decide which results are "worthy" of being included?

      Dropping the best and worst might help a bit. A better solution might be to take the median rather than the mean; a quick comparison sketch follows below.

      I agree it does make a couple of the graphs harder to read, however.
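
      For example, with made-up times and one extreme outlier (purely illustrative numbers):

      import statistics

      # Made-up boot times in seconds, with one extreme outlier.
      times = [31.2, 29.8, 33.5, 30.1, 300.0]

      plain_mean = statistics.mean(times)
      trimmed_mean = statistics.mean(sorted(times)[1:-1])  # drop best and worst
      median = statistics.median(times)

      print(f"mean:         {plain_mean:.1f} s")    # dragged up by the outlier
      print(f"trimmed mean: {trimmed_mean:.1f} s")
      print(f"median:       {median:.1f} s")        # robust to the outlier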

      the number of registry keys does not affect performance in any way
      It is a general indication of bloat, like the install size. You are correct that adding just a few keys doesn't measurably impact performance; you need to add/delete a lot of keys before it becomes significant.

      there were no "virus detection" tests
      Correct. The report was just about performance. Good AV detection testing is very hard to do.
