With your tests I have examined several older systems before and (in some cases) after upgrading hardware. It was most curious to me that the default scores achieved on the CPU tests fell well short of the averages reported by others with the same processors, at least with the Windows 10 Pro 64-bit setups here. For instance, an Intel quad-core Q6600 averages 2985 according to your other testers, yet my first run produced a measly 2048 [itself a suspiciously exact 2^11, which makes me somewhat wary] under ordinary system conditions. Guessing that SpeedStep - which Windows Power Options had set to float the processor between 5% and 100% - might be the issue, I reran the test with the processor fixed at 100% min / 100% max and the score rose to 2418. Lastly, closing half a dozen background programs such as Dropbox, Evernote, and TeamViewer let the CPU reach 2607, approaching but still short of the published average.
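For reference, here is a minimal sketch of how I could toggle those settings from a script instead of the Power Options GUI, assuming Windows, Python, and an elevated prompt; it simply drives powercfg's documented SCHEME_CURRENT / SUB_PROCESSOR / PROCTHROTTLEMIN / PROCTHROTTLEMAX aliases, and the benchmark run itself is only a placeholder comment:

# Minimal sketch: pin the active power plan's minimum and maximum
# processor state (on AC power) before a benchmark run, then restore
# the usual 5%-100% float afterwards. Requires an elevated prompt.
import subprocess

def set_processor_state(min_pct, max_pct):
    for setting, value in (("PROCTHROTTLEMIN", min_pct), ("PROCTHROTTLEMAX", max_pct)):
        subprocess.run(
            ["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
             "SUB_PROCESSOR", setting, str(value)],
            check=True,
        )
    # Re-apply the current scheme so the new values take effect immediately.
    subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)

set_processor_state(100, 100)
# ... run the CPU benchmark here ...
set_processor_state(5, 100)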
CPU-testing other venerable processors such as an E3400, a T4200, and a T5670 with Power Options set to float the CPU between 5% min and 100% max yielded results that similarly stank compared to the published averages, but which rose into the average or even above-average range once the processors were set to run flat out at 100%. In fact, the dual-cores seemed even more affected by the power setting, with float scores at around 50-60% of the published averages and fixed scores at 80-110% of them.
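For concreteness, the percentages I quote are simply the measured score divided by the published average; using the Q6600 numbers from above as an example:

# Measured score as a fraction of the published average (Q6600 numbers above).
published_avg = 2985
runs = {
    "5%-100% float, normal background load": 2048,
    "100%/100% fixed": 2418,
    "100%/100% fixed, background programs closed": 2607,
}
for label, score in runs.items():
    print(f"{label}: {score / published_avg:.0%} of the published average")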
So my question is whether the published results reflect random user settings, or whether they should be read in light of implicit recommendations and common sense about power settings and background programs. I could certainly understand loading a barebones Windows install just to run the tests and obtain the "best" results, but the practical relevance to me would be quite limited.
Or, a reasonable question would be: is Microsoft, the board manufacturer, or the chip manufacturer unintentionally limiting the performance of a SpeedStep CPU by creating too much lag between power states? Or is the aged technology on my boards/chips creating sub-optimal conditions for the tests? Or could the tests themselves offer an option to briefly change the power settings for the duration of a run?
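If it helps anyone look into the lag question, here is a rough sketch of how one might watch the reported clock ramp up once a load starts; it assumes psutil is installed and that psutil.cpu_freq() reflects the live clock on the platform, which I cannot guarantee on every Windows build:

# Rough sketch: start a CPU-bound loop and sample the reported clock for a
# couple of seconds to see how quickly the frequency ramps up from idle.
import threading
import time
import psutil

def busy_loop(stop):
    while not stop.is_set():
        pass  # burn cycles so the power governor has a reason to ramp up

stop = threading.Event()
worker = threading.Thread(target=busy_loop, args=(stop,))
start = time.perf_counter()
worker.start()
for _ in range(20):                         # ~2 seconds of samples
    freq = psutil.cpu_freq()
    print(f"{time.perf_counter() - start:5.2f}s  {freq.current:.0f} MHz")
    time.sleep(0.1)
stop.set()
worker.join()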
In any case, I await any comments, wisdom, or admonition.
Thanks. ITG
p.s. Yes, I know that running a new OS on an old system is asking for degraded performance, but I have been pleasantly surprised that the right combination of upgraded CPU, memory, and an HDD-to-SSD swap lets a system as old as 8-10 years run quite well for everyday desktop or laptop use - and at less than $100 per computer, including new batteries for the laptops. Were I a gamer, 'twould be highly unlikely that the old mule computers could be whipped into shape to run today's games.
p.p.s. Thanks very much for the testing programs and forums. You fill a vital need.