Passmark and SLI #'s


  • Passmark and SLI #'s

    So how well does PassMark deal with SLI configs? I recently added a second 8800GT to my newly built system and my 3D score went DOWN. With one card I was getting ~1066-1020; with 2 cards I generally get ~920-950. What's the deal?

    Overall my PT score is usually about 1290, but I'm thinking I should be able to pull over 1300 w/ 2 cards.

    The breakdown I generally see is this:

    3D simple: With one card I am averaging ~2400-2500 FPS; with 2 cards it drops to ~1900-2200.

    3D medium: There are times where I've been over 800 FPS, yet times where it won't go over 720-750. On average I'm getting ~750, but it fluctuates A LOT! Not too sure what's going on here.

    3D complex: I've gotten over 70 FPS, but now I'm generally sitting at ~62-67, occasionally 60-61 (rare). Also, why is it I can get over 100 FPS in COD4 @ 1400x1050 w/ 4x AA and all details on high, yet @ 800x600 in this test I struggle to hit 70?

    The rest of my system is:

    Vista x64
    Q6600 (stock speed)
    4gb of ddr2-800 (4-4-4-12)
    Nvidia Nforce 680i chipset
    2 x EVGA 8800 GT KO edition (675mhz / 1950 mhz / 1688 shader)

    What I've tried:

    -Overclocking the video card (700 / 2000 / 1750) and got NO performance increase.

    -Different drivers - 169.13, 169.21, 169.25 (which went WHQL today btw) -- no difference.

    -Added PT.exe to the nvidia control panel and changed the SLI performance mode to AFR 1/2 (as opposed to single GPU). BAD IDEA! 3D simple went to ~500 FPS, ouch!

    Any ideas?
    Intel Q9450
    EVGA 790i Ultra Motherboard
    4gb OCZ DDR3 1600 @ 7-7-7-24
    2 x EVGA 8800GT's KO Edition
    74gb Raptor HDD
    Vista Ultimate x64

  • #2
    My guess would be PerformanceTest does not handle SLI correctly.
    Many games see a performance hit with SLI enabled, so it's a common problem.
    As far as the 3D complex test goes, I would guess the video drivers are not optimized for low resolutions like the 800x600 that PerformanceTest uses.
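A toy throughput model (illustrative numbers only, not measured from any driver) shows why alternate-frame rendering helps least at very high frame rates: splitting frames across two GPUs roughly halves the render time per frame, but any fixed per-frame synchronization cost is paid either way, and at 2000+ FPS that cost dominates a ~0.4 ms frame time:

```python
def afr_fps(single_gpu_fps, num_gpus=2, sync_overhead_ms=0.25):
    """Toy model of alternate-frame rendering (AFR) throughput.

    Rendering is split evenly across GPUs, but a fixed per-frame
    synchronization cost is added. All numbers are made up for
    illustration, not measured.
    """
    frame_ms = 1000.0 / single_gpu_fps               # single-GPU frame time
    afr_frame_ms = frame_ms / num_gpus + sync_overhead_ms
    return 1000.0 / afr_frame_ms

# At ~2400 FPS (3D simple test) the fixed overhead outweighs the split:
print(round(afr_fps(2400)))   # ~2182 -- slower than one card
# At ~60 FPS (3D complex) the split dominates and AFR would help:
print(round(afr_fps(60)))     # ~117
```

With these made-up costs the model reproduces the pattern in the first post: the ultra-high-FPS simple test loses ground under SLI while a low-FPS test would in principle gain.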
    Main Box*AMD Ryzen 7 5800X*ASUS ROG STRIX B550-F GAMING*G.SKILL 32GB 2X16 D4 3600 TRZ RGB*Geforce GTX 1070Ti*Samsung 980 Pro 1 TB*Samsung 860 EVO 1 TB*Samsung 860 EVO 2 TB*Asus DRW-24B3LT*LG HL-DT-ST BD-RE WH14NS40*Windows 10 Pro 21H2



    • #3
      When SLI was introduced the idea was that it would automatically speed up your 3D applications. At least this was how it was marketed and what people believed as a result. But it wasn't actually true. SLI was buggy and only worked effectively with a few popular games and the video card device drivers needed to be optimized for each new game.

      As I wrote back in 2005, "NVidia has provided zero information on SLI for developers. So there is no way to write software that is sure to work with SLI. Inside the NVidia device driver is a list of about 80 software applications that work with SLI. If your software is not in this NVidia list then the device driver will not attempt to use SLI for that application. (Only 80 applications out of 100,000's of applications available for Windows!)."
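That whitelist behavior amounts to a lookup keyed on the executable name, with a control-panel override on top. A minimal sketch (the profile entries and mode names here are hypothetical, for illustration only):

```python
# Toy sketch of a driver's per-application SLI profile lookup.
# Entries and mode names are hypothetical, not real driver data.
SLI_PROFILES = {
    "hl2.exe": "AFR",
    "doom3.exe": "SFR",
    # ... the real driver of that era reportedly shipped ~80 such entries
}

def sli_mode_for(executable, user_override=None):
    """A control-panel override (like forcing AFR for PT.exe) wins;
    any executable not in the list falls back to single-GPU rendering."""
    if user_override is not None:
        return user_override
    return SLI_PROFILES.get(executable.lower(), "single-GPU")

print(sli_mode_for("PT.exe"))          # single-GPU: not whitelisted
print(sli_mode_for("PT.exe", "AFR"))   # AFR: forced, and possibly slower
```

This is why forcing AFR for an unprofiled application, as the original poster tried, can make things worse: the override engages SLI without any of the per-title tuning the whitelisted games get.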

      The situation has improved slightly since then, and there is now a DLL that we could include with our software to programmatically request that the device driver switch behaviors. But in our tests SLI seemed to lead to display corruption or reduced performance, and didn't seem to work for windowed applications or across multiple monitors. In short, it didn't work as advertised. And NVidia didn't want to provide any support unless you had a top-ten game.

      Unfortunately this is also somewhat like going back to the bad old days of software development. The whole purpose of device drivers is to remove the need to write different code for each hardware device. (Any older gamer will remember sound card problems due to each game needing code for each sound card.) But this is now the situation NVidia is trying to force upon developers: different code for different video cards.

      Even using this DLL doesn't ensure better performance, however. We would also need to rewrite our code for NVidia's cards to ensure better performance (but it would be better performance only on their cards, not necessarily on ATI or Intel video cards, and is that fair if we are doing benchmarks?).

      Getting a frame rate of 100fps in COD4, or any other game, doesn't mean you should score more than 100fps in PerformanceTest. There is a deliberately high workload in the complex test.

      Having said all that, we might have another look at SLI (and Crossfire) for the next major release of PerformanceTest.
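The resolution question from the first post fits a simple frame-time decomposition: a frame costs geometry/vertex work (roughly resolution-independent) plus per-pixel shading work (which scales with resolution), so a deliberately geometry-heavy benchmark scene barely speeds up at 800x600, while a game with lighter geometry flies even at higher resolutions. A sketch with made-up costs:

```python
def fps(geometry_ms, shading_ms_per_mpix, width, height, other_ms=1.0):
    """Toy frame-time model: fixed geometry cost + per-megapixel
    shading cost + fixed other overhead. All costs are invented
    to illustrate the shape of the tradeoff, not measured."""
    mpix = width * height / 1e6
    frame_ms = geometry_ms + shading_ms_per_mpix * mpix + other_ms
    return 1000.0 / frame_ms

# Geometry-heavy benchmark scene at 800x600:
print(round(fps(12.0, 2.0, 800, 600)))     # ~72 FPS despite the low res
# Geometry-light game scene at 1400x1050:
print(round(fps(2.0, 2.0, 1400, 1050)))    # ~168 FPS despite the high res
```

Under these assumptions, dropping the resolution of the heavy scene buys almost nothing because shading is a small fraction of its frame time, which matches the "deliberately high workload" explanation above.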



      • #4
        Originally posted by passmark

        Getting a frame rate of 100fps in COD4, or any other game, doesn't mean you should score more than 100fps in PerformanceTest. There is a deliberately high workload in the complex test.

        Having said all that, we might have another look at SLI (and Crossfire) for the next major release of PerformanceTest.
        I completely understand that; I just found it odd that the FPS dropped. Ah well, guess I have to wait until PT 6.2, huh?



        • #5
          This is what happens when a company thinks they're top dog. Their cards get more expensive and their drivers suck horribly, because they don't seem to give a rat's ass so long as they're speed king on paper, or in the select game benches they optimize for to make their cards look good. I have an 8800 GTS, and a GeForce 6800 Ultra and an ATI X800 XL PCIe are kicking my video card's ass in both the 3D and 2D tests with this software and in my games. How fn sad is that?



          • #6
            Originally posted by SniprKlr
            This is what happens when a company thinks they're top dog. Their cards get more expensive and their drivers suck horribly, because they don't seem to give a rat's ass so long as they're speed king on paper, or in the select game benches they optimize for to make their cards look good. I have an 8800 GTS, and a GeForce 6800 Ultra and an ATI X800 XL PCIe are kicking my video card's ass in both the 3D and 2D tests with this software and in my games. How fn sad is that?
            Thing is, I kick my friend's ass, who has 2 X1950s, in 3DMark 2006; it's just in here that I'm getting "low" scores.

            Ahh I've given up lol
