
Good 2D performance in some resolutions only


  • Good 2D performance in some resolutions only

    I'm working for a smaller company designing computers using COM Express modules. We recently updated an older platform from the Intel i7-4650UE (4th gen) to the i7-1185GRE (11th gen).

    I've used PerformanceTest to measure the 2D performance, and there are two tests that are worse on the new platform compared to the old one. Going deeper into the numbers shows that it is the Image Filters and Image Rendering scores that are much lower; every other measurement is really good.

    I've gone through some posts on the forum related to 2D performance.

    I've understood that the tests are divided into two parts, and it looks like it is the DX11 part that does not perform as it should.

    I've moved away from our platform a little and done lots of testing on a commercial platform, a Dell Latitude 5520 with a similar CPU, the i7-1185G7 3.0GHz. It shows similarly bad scores on these tests, which looks a bit troublesome. But when I run the measurement at different resolutions I get some strange numbers at two of them, 1680x1050 and 1400x1050: here the numbers are indeed very good. This led me to testing with other CPUs/graphics cards to see whether the problem has something to do with the PassMark testing program, or whether other graphics cards simply don't have a problem with these resolutions.

    To compare with something other than Intel, I grabbed an older AMD graphics card, an OEM HD8670. That card and driver show a linear performance score across different resolutions; I guess that is the performance penalty that PassMark applies for using lower resolutions. That suggests it is not the PerformanceTest program doing a bad job.

    When performing the 2D DirectX 11 Image Filter test, one can see that the rotating image is slow when the score is low.

    After many hours of work on this problem, I checked the GPU usage with TechPowerUp's GPU-Z tool. It shows that the score follows the percentage of GPU used in the DX11 test: the 1680x1050 and 1400x1050 resolutions manage to use about 93-94% of the GPU, while other resolutions only use some 30-54%, naturally giving the lower score.
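    If the score really does track GPU utilization roughly linearly (my assumption based on the GPU-Z observation above, not anything documented by PassMark), the utilization gap should explain most of the score gap. A quick sanity-check calculation:

```python
# Rough sanity check: assumes the 2D test score scales roughly linearly
# with GPU utilization (an assumption, not a documented PassMark property).

def predicted_score_ratio(util_good: float, util_bad: float) -> float:
    """Ratio of scores expected if score is proportional to GPU utilization."""
    return util_good / util_bad

# Utilization figures observed with GPU-Z:
good = 0.935   # ~93-94% at 1680x1050 / 1400x1050
bad = 0.42     # midpoint of the ~30-54% seen at other resolutions

print(f"expected score ratio: {predicted_score_ratio(good, bad):.2f}x")
```

    A roughly 2x ratio is in the same ballpark as the score differences in the graphs below, which is consistent with under-utilization being the cause rather than the test itself.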

    So, the problem is easy to replicate:
    * Select resolution in Windows
    * Run the PerformanceTest 2D tab and only the Image Rendering test. It takes about 5s.
    * Verify the score for different resolutions - some have a performance drop because the GPU is not fully used.

    I've reached out to Intel to take a look at this problem, as it seems to have something to do with what resources the graphics driver uses during the test. But is there some info from the PassMark team that would be important here? What does the program do when the resolution changes (beyond applying the penalty)? Is the test data adjusted? The picture looks different at lower resolutions, so I guess something is done to match the resolution.

    I'll add two graphs of the 2D Image Rendering score and the 2D Image Filtering score showing the abnormally good numbers for the two resolutions.

    Have you, as a reader, seen this on your platform/computer? Any insights? For this special case, deciding what resolution to use during the test really affects the total score, as 2D has a high weight for this test. Changing the 'Graphics 2D - Image Rendering' score from 10 to 20 bumps the 2D mark from 186p to 258p, according to calculations from
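    For illustration only: if the 2D mark responded linearly to this one sub-score (my assumption; PassMark's actual aggregation formula is not given in this thread), the quoted jump implies each Image Rendering point is worth about 7.2 marks:

```python
# Illustrative only: assumes the 2D mark responds linearly to a single
# sub-score (PassMark's real aggregation formula is not shown here).
def marks_per_point(mark_before, mark_after, score_before, score_after):
    """Implied 2D-mark contribution of one point of a sub-score."""
    return (mark_after - mark_before) / (score_after - score_before)

# Figures quoted above: 2D mark 186 -> 258 when Image Rendering goes 10 -> 20.
print(marks_per_point(186, 258, 10, 20))  # -> 7.2
```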

    Graphics on these systems are the Intel HD 5000 and the Intel Iris Xe; the OS is Windows 10 x64 (LTSC). Driver (and lots of older ones too). I've used PT 10.2 b1010 for most of the measurements, but also 10.1 b1007 just to see if there was any difference.

    The X scale in the graphs is the number of pixels in the resolution:
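    For reference, the pixel counts (the X values in the graphs) for the resolutions discussed in this thread can be tabulated with a couple of lines:

```python
# Pixel counts for the resolutions discussed in this thread
# (the "good" 1400x1050 and 1680x1050, plus the Dell's native 1920x1200).
resolutions = [(1400, 1050), (1680, 1050), (1920, 1200)]
for w, h in resolutions:
    print(f"{w}x{h}: {w * h:,} pixels")
```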

    [Image: image.png]
    [Image: Passmark 2D Image Rendering in different resolutions.png]
    [Image: Passmark 2D Image Filtering in different resolutions.png]

  • #2
    I am wondering if when you change the resolution, you are indirectly changing other settings as well. e.g. Refresh rate or DPI.
    (Refresh rate might be forced to change as typically you can run at a higher Refresh rate at lower resolutions).

    Also, what was the monitor's native resolution?

    I did a test here and forcing a resolution of 1680x1050 didn't have the effect you are describing. But this was with a different configuration.
    (Config: PerformanceTest V10.2, Win10.0.19043, nVidia 3060 video card, driver 512.15, CPU i7-5820K, native resolution 2560x1440@75Hz, 96 DPI).
    The results actually went slightly down for me after forcing a lower 1050 resolution - the opposite of your effect. So maybe it is an Intel-only thing.

    There were also some nasty grid scaling artifacts at the lower (non native) resolution as well.
    [Image: Artifacts.jpg]

    I should also note I got slightly different results in V10.2 vs V10.1. We are having a deeper look at that aspect.
    Update: 10.1 vs 10.2 performance difference in 2D could not be reproduced after carefully doing several more runs and testing a few different machines. Likely there was some background task running during the first run.


    • #3
      The native resolution of the Dell is 1920x1200. I've tested with an AMD card too, and that one does not have this kind of issue. Running the GPU-Z tool showed that it also does not use 100% of the GPU, but much less. Maybe the driver designers decided that 2D just does not need to use 100% once it hits some kind of "good enough".

      I think the refresh rate stays the same on this monitor, 60Hz, but I'll check that. Technically, the signal over the cable always uses the native resolution and the scaling is done in the graphics card, so it will still send 1920x1200@60Hz but with scaled screen size. But that is also a thing that can be selected in the driver, so I'll check that too.

      I'm trying to report this issue to Intel but am having some problems with it being flagged as spam. We'll see what they say once it gets through that filter.

      I see the same scaling artifacts too.

      Thanks for the feedback so far.


      • #4
        Technically, the signal over the cable always uses the native resolution and the scaling is done in the graphics card, so it will still send 1920x1200@60Hz
        There is an option for this (at least on some video cards) but I don't think it is the default. Pretty sure the default is display scaling.

        There is a block of data called the Extended Display Identification Data (EDID) stored in a monitor. This data is sent to the PC via the Dynamic Data Channel (DDC). The video card then uses this data to work out what timing and resolution to use.
        In this data block there is a list of resolutions. All this would be massive overkill if the video card always used the same settings and GPU scaling.
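        To make the EDID point concrete: the "standard timing" entries in that resolution list are packed two bytes each. A minimal decoder sketch, based on the public EDID 1.4 byte layout (not anything specific to this thread):

```python
# Decode one EDID "standard timing" entry (2 bytes), per the EDID 1.4 layout.
# Byte 0: horizontal active pixels = (value + 31) * 8.
# Byte 1: bits 7-6 = aspect ratio, bits 5-0 = refresh rate - 60.
ASPECTS = {0b00: (16, 10), 0b01: (4, 3), 0b10: (5, 4), 0b11: (16, 9)}

def decode_standard_timing(b0: int, b1: int):
    if b0 == 0x01 and b1 == 0x01:
        return None                        # 0x0101 marks an unused entry
    h = (b0 + 31) * 8                      # horizontal active pixels
    ax, ay = ASPECTS[(b1 >> 6) & 0b11]     # aspect ratio from the top two bits
    v = h * ay // ax                       # vertical pixels from the aspect
    refresh = (b1 & 0x3F) + 60             # bits 5..0 store refresh - 60
    return h, v, refresh

# Example: bytes 0xB3, 0x00 decode to 1680x1050 @ 60 Hz,
# one of the 16:10 modes discussed earlier in the thread.
print(decode_standard_timing(0xB3, 0x00))
```

        So the monitor really can advertise 1680x1050 as a supported mode, and the video card may then choose display scaling or GPU scaling for it.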

        But it is certainly yet another display setting (along with power settings) that one could imagine having an effect on system performance.