Hi, I came across your latest antivirus performance benchmark on the Avast forums.
It is a pretty informative test.
You describe the tests in detail, which makes the results very trustworthy.
There are some nice and thorough tests, including a boot time test using the Windows SDK (a cool approach; I bet no one has used it before), an installed-size test, etc.
At the same time, there are a few slips:
- it seems you did not install the latest Windows updates, which is an unusual situation nowadays;
- for file scans you average over 4 runs (because the first one is slower most of the time). That approach is incorrect: in normal use an AV scans a file only once, not four times in a row, so you should have measured only the first scan time (see the first sketch after this list);
- the average is not calculated correctly in some graphs: on many tests there is a clear outlier that is roughly 1000% slower than the rest. That outlier should have been excluded from the average (see the second sketch after this list). It also throws off the scale of those graphs, which makes the remaining products hard to compare;
- you count the number of registry keys created, but the number of registry keys does not affect performance in any way;
- there were no virus-detection tests, so these tests are only about performance, right?
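To illustrate the scan-timing point, here is a minimal Python sketch that measures only the first, cold-cache scan. The scanner executable and flags are placeholders, since every product has its own command line:

    import subprocess
    import time

    # Placeholder command line for the AV's on-demand scanner;
    # the real executable and flags differ per product.
    SCAN_CMD = ["avscan.exe", "/scan", r"C:\testset"]

    # Time only the first (cold) scan: that is what a user actually
    # experiences, since files are rarely rescanned four times in a row.
    start = time.perf_counter()
    subprocess.run(SCAN_CMD, check=True)
    print(f"first scan: {time.perf_counter() - start:.1f} s")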
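And to illustrate the outlier point, with made-up numbers (not taken from your report): one extreme result dominates a plain mean, while dropping it, or simply reporting the median, gives a representative figure:

    # Made-up scan times: four similar products plus one extreme outlier.
    times = [42.0, 45.0, 48.0, 51.0, 480.0]

    plain_mean = sum(times) / len(times)     # 133.2 - dominated by the outlier

    # Rule of thumb: drop results more than 3x the median, average the rest
    # (or simply report the median instead of the mean).
    median = sorted(times)[len(times) // 2]  # 48.0
    kept = [t for t in times if t <= 3 * median]
    trimmed_mean = sum(kept) / len(kept)     # 46.5 - representative

    print(plain_mean, median, trimmed_mean)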
Besides that, very good and useful report!