I have been trying to set up a test-lab VMware Workstation based app server so I can run some WIN7 and WIN8 based "appliances", and I have been doing some benchmarking to see where the bottlenecks/chokepoints might be and how to optimize. For some reason I assumed that running Server 2012 R2 as the host would be better than using WIN8.1 as a host, but that appears to be wrong. For lack of anything better I am using Passmark as a performance measuring tool, both for the host and for the clients (I LIKE easy-to-compare numbers!).
Server is an i7-2600, Z77 chipset, 32GB of RAM, GTX 750 Ti, 256GB ADATA SSD as the boot drive. The client drive is a 128GB ADATA SSD running on a motherboard SATA3 port or a SATA3 PCIe add-in card. Using Workstation 10.3 and Passmark 8. Server 2012 R2 and WIN8.1, all updates applied, drivers current.
Interestingly, right from the start Passmark host scores seem to be marginally but consistently higher on WIN8.1 than on Server 2012 R2 - CPU, 3D graphics, memory, and disk are within 3%, which may be within benchmark tolerance, BUT 2D graphics is 23% higher on WIN8.1 than on Server 2012 R2. Not sure why that is.
On the client side I ran the WIN8.1 client on both host OSes, with the client SSD attached via the second motherboard SATA3 port as well as via a Vantec PCIe SATA3 dual-port card.
As expected, the choice of SATA port only affected the client's disk score, but interestingly enough the client performance on the WIN8.1 host was significantly better across the board - CPU +10%, 2D +80%, 3D +20%, memory +25%, disk +30%. I would have expected the client performance to be much closer, since the underlying architecture is supposed to be similar - if anything I would have expected the results to be skewed in favor of Server 2012 R2 over WIN8.1, because fewer services are running...
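For reference, here is roughly how I am turning the raw Passmark subscores into those percentages - just a minimal sketch with placeholder numbers (NOT my actual results) to show the arithmetic:

# Rough sketch of the host-vs-host delta calculation.
# The scores below are placeholders, not my real Passmark numbers.
serv2012_guest = {"CPU": 8000, "2D": 300, "3D": 900, "Memory": 1600, "Disk": 2500}
win81_guest    = {"CPU": 8800, "2D": 540, "3D": 1080, "Memory": 2000, "Disk": 3250}

for test, base in serv2012_guest.items():
    delta = (win81_guest[test] - base) / base * 100
    print(f"{test}: {delta:+.0f}% on the WIN8.1 host vs Server 2012 R2")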
This seems counterintuitive - should I be running my appliances on a WIN8.1 host? It just doesn't seem right! Alternatively, should I be using something else as my host OS? I don't really need super-high 3D performance inside the clients (no gaming involved), but snappy desktop response is definitely a priority.
I was wondering if others have seen similar results or achieved significantly better numbers. I can't be the only one who has tried to run generic benchmarks inside a VM in an effort to determine an optimal generic host configuration.
There seems to be a distinct lack of current Workstation performance tuning or testing info on the net - anyone have any suggestions or hints? Any comments or opinions?
Ragnar