Hello Passmark,
I am currently in the market for a new graphics card and tend to come here to work out the best bang for my buck performance-wise. AMD usually comes out ahead on that measure in the CPU market, and sometimes with graphics cards as well. However, I have noticed that the new line of AMD graphics cards appears to pale in comparison to the Nvidia cards in the same price bracket, to the point where your chart suggests they shouldn't really be considered (the Nvidia cards end up with better performance per cost). For example, comparing the AMD R9 Fury to the GTX 980 Ti, your benchmark indicates the 980 Ti is 40% faster for a similar price. Yet in the benchmarks I have looked at, the two cards appear far more neck and neck, with Nvidia usually holding a modest edge.
For example here:
http://techreport.com/review/28685/geforce-gtx-980-ti-cards-compared/5
The majority of the tests there do not show anything close to a 40% performance advantage. That said, I did read that the 300 series is a rebrand of the 200 series and that your software has a hard time telling them apart, but I can't imagine the results from your benchmark program and from game testing wouldn't be reflective of each other.
Could you please explain what is causing the sizeable discrepancy between your benchmark and the other benchmarks we have seen? Is there some kind of weighting applied to the performance numbers that may not be reflective of real-world tests?
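To make the weighting question concrete, here is a minimal sketch of what I mean. All of the numbers and weights below are made up for illustration only; they are not your actual subtests, scores, or methodology. It just shows how a composite score that weights certain subtests heavily could report a ~40% gap even when a plain average of per-game frame rates puts the two cards within a few percent of each other:

# Hypothetical illustration: a weighted composite of synthetic subtests
# versus a simple average of per-game FPS. Every number here is invented.

subtests = ["DX9", "DX10", "DX11", "Compute"]
card_a = {"DX9": 220, "DX10": 180, "DX11": 160, "Compute": 90}   # Fury-like (hypothetical)
card_b = {"DX9": 240, "DX10": 200, "DX11": 210, "Compute": 190}  # 980 Ti-like (hypothetical)

# A composite that leans heavily on the DX11 and compute subtests
weights = {"DX9": 0.10, "DX10": 0.20, "DX11": 0.35, "Compute": 0.35}

def composite(scores):
    return sum(weights[t] * scores[t] for t in subtests)

score_a, score_b = composite(card_a), composite(card_b)
print(f"Composite gap: {(score_b / score_a - 1) * 100:.0f}%")   # about 40%

# A plain average over game-like FPS numbers where the cards are close
games_a = [62, 55, 71, 48]
games_b = [66, 58, 74, 53]
avg_a, avg_b = sum(games_a) / len(games_a), sum(games_b) / len(games_b)
print(f"Average FPS gap: {(avg_b / avg_a - 1) * 100:.0f}%")     # about 6%

If something along those lines is happening (heavier weight on subtests where one architecture does particularly well), that would explain the difference I am seeing between your chart and the game reviews.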
Thank you,
FrozenIceman