It appears the (mean?) averages displayed on the website can sometimes be skewed when a few users post significantly worse results than the rest; this is especially the case when there are only tens of results.
For example, I was looking at the results for the A9-9420e and found only 15 entries in PTv10; most score in the 1200 - 1300 range, while one user submitted a score of 700.
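To illustrate the point, here's a minimal sketch with made-up scores matching the numbers above (14 results between 1200 and 1300 plus one outlier at 700), showing how a single bad submission drags the mean below the median:

```python
import statistics

# Hypothetical scores loosely matching the A9-9420e example:
# 14 results in the 1200 - 1300 range plus one 700 outlier.
scores = [1200, 1210, 1220, 1230, 1240, 1250, 1255, 1260,
          1270, 1275, 1280, 1285, 1290, 1300, 700]

mean = statistics.mean(scores)      # pulled down by the single 700
median = statistics.median(scores)  # barely affected by the outlier

print(f"mean:   {mean:.1f}")
print(f"median: {median}")
```

With only tens of submissions, one overloaded machine shifts the mean noticeably, while the median stays representative of a typical result.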
Looking at the downloaded baseline, it appears your software also stores which background processes are running (not sure why?). Perhaps your software doesn't refuse to run a benchmark even when the CPU is overloaded with background tasks (whether open browsers or background Windows tasks), and worse still, doesn't inform the user and lets them submit a skewed result?
I came across a video of a user who was a little dumbfounded when he ran UserBenchmark and his machine scored significantly worse than other similarly spec'd machines, unaware that background tasks likely produced the wrong figures; the website even stated that there was significant background CPU usage!