Amaterasu イタチ (Community Founders), posted August 1, 2023

The best graphics cards don't always have to run full throttle. Sometimes we leave our systems idling or use them for everyday tasks, like watching YouTube videos, and many of us never give the graphics card's power consumption a second thought. Still, German publication ComputerBase has inadvertently unearthed a simple option that helps reduce idle power consumption on AMD Radeon graphics cards if you're running single or dual high-resolution, high-refresh-rate monitors.

ComputerBase recently upgraded its labs with Powenetics V2 power-measurement hardware and software to improve its power-metric logging, along with a new 4K display sporting a 144 Hz refresh rate and Adaptive Sync support. As a result, the outlet had to retest all of its graphics cards and stumbled upon a discovery: enabling Variable Refresh Rate (VRR) can substantially lower a graphics card's idle power draw. The VRR option is available in Windows and in AMD's drivers; however, the latter cannot enable it automatically, so you have to flip the toggle yourself (a sketch for checking the Windows-side toggle appears at the end of this post).

VRR is a technology that automatically adjusts your monitor's refresh rate to match the frame rate output of your games or content, so that the two stay in sync (a minimal sketch of this behavior also appears at the end of this post). Depending on the manufacturer, you may find VRR under a different name, such as Adaptive Sync, AMD FreeSync, or Nvidia G-Sync.

AMD's RDNA 3 graphics cards, specifically the Radeon RX 7900 XTX, showed the most significant benefit. According to ComputerBase's results, the Radeon RX 7900 XTX drew 81% and 71% less idle power in single- and dual-monitor configurations, respectively. Even in a more realistic test involving window movement, the graphics card saw 36% lower power consumption. Watching YouTube in SDR at 60 FPS didn't yield significant savings; with HDR enabled, however, the graphics card consumed 31% less power.

The phenomenon isn't limited to AMD's RDNA 3 graphics cards, either. ComputerBase saw similar behavior on last-generation RDNA 2 SKUs, such as the Radeon RX 6800 XT and Radeon RX 6700 XT, although to a lesser degree. For example, enabling VRR decreased the Radeon RX 6800 XT's idle power consumption by 79% and its draw in the window-movement test by 17%. The option didn't substantially impact the Radeon RX 6800 XT in the other scenarios.

VRR enablement had the opposite effect on Intel and Nvidia graphics cards. ComputerBase recorded up to 11% higher idle power consumption on the Arc A770, although the Arc A770 sipped 4% less power in the YouTube HDR test at 60 FPS. The same option negatively impacted Nvidia's latest GeForce RTX 40-series graphics cards, such as the GeForce RTX 4080: ComputerBase observed 25% higher idle power on a single monitor and up to 12% higher on a dual-monitor layout. The VRR option also increased power consumption by 4% to 5% when watching YouTube videos.

People have been critical of RDNA 3 graphics cards' idle power consumption on high-resolution displays, and with good reason: a graphics card shouldn't pull over 100 W when the system is idling, or even when playing a YouTube video. The VRR option is a game-changer for RDNA 3, as it brings the cards' idle power consumption down to Ada Lovelace levels. Ada Lovelace is still more power efficient overall, though; even with VRR enabled, the Radeon RX 7900 XTX's power consumption remained higher than the GeForce RTX 4080's in the window-movement and YouTube tests.
ComputerBase has contacted AMD to inquire about the technical details of why and how VRR makes such a big difference to idle power consumption on RDNA 3 graphics cards. The chipmaker hasn't gotten back to the German publication yet, but we'll keep our eyes peeled and update this article with AMD's insight once it does.

Source: https://www.tomshardware.com/news/amd-rdna-2-rdna-3-gpu-idle-power-draw-reduced-by-enabling-vrr
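To make the VRR mechanism described above concrete, here is a minimal Python sketch of how a VRR panel tracks content frame rate: the effective refresh rate follows the frame rate as long as it sits inside the panel's supported VRR window, instead of staying pinned at the panel's maximum. The 48 to 144 Hz window and the function name are illustrative assumptions, not figures from ComputerBase's testing, and real panels typically use frame doubling (LFC) below the window, which this sketch simplifies away.

```python
# Minimal sketch of VRR behavior: the panel's effective refresh rate
# tracks the content frame rate within the panel's supported VRR window,
# instead of staying pinned at the maximum (e.g. 144 Hz).
# The 48-144 Hz window below is a hypothetical example, not a real spec.

VRR_MIN_HZ = 48.0   # assumed lower bound of the panel's VRR window
VRR_MAX_HZ = 144.0  # assumed upper bound (panel's maximum refresh rate)

def effective_refresh_hz(content_fps: float) -> float:
    """Return the refresh rate a VRR panel would run at for a given frame rate.

    Simplification: real panels use frame doubling (LFC) below the VRR
    window; here we just clamp into the window.
    """
    return max(VRR_MIN_HZ, min(content_fps, VRR_MAX_HZ))

if __name__ == "__main__":
    # A mostly idle desktop or a 60 FPS YouTube video doesn't force the
    # display pipeline to run at the panel's full 144 Hz.
    for fps in (24.0, 60.0, 90.0, 200.0):
        print(f"{fps:>5.0f} FPS content -> {effective_refresh_hz(fps):.0f} Hz refresh")
```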
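Since the article notes that the option must be flipped manually, here is a small, read-only Python sketch (Windows only, standard-library winreg) that checks the state of the Windows "Variable refresh rate" toggle found under Settings > System > Display > Graphics. The registry path and value name below are the commonly reported storage location for this toggle, not something ComputerBase or AMD confirms, so treat them as assumptions and verify on your own system.

```python
# Read-only check of the Windows "Variable refresh rate" toggle.
# The registry path and value name are the widely reported (unofficial)
# location of this per-user setting; treat them as assumptions.
import winreg

KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"  # assumed location
VALUE_NAME = "DirectXUserGlobalSettings"                     # assumed value name

def vrr_toggle_state() -> str:
    """Return 'enabled', 'disabled', or a note if the toggle was never set."""
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
            settings, _ = winreg.QueryValueEx(key, VALUE_NAME)
    except FileNotFoundError:
        return "toggle never set (key or value missing)"
    # The value is reportedly a semicolon-separated string, e.g. "VRROptimizeEnable=1;"
    if "VRROptimizeEnable=1" in settings:
        return "enabled"
    if "VRROptimizeEnable=0" in settings:
        return "disabled"
    return f"unrecognized settings string: {settings!r}"

if __name__ == "__main__":
    print("Windows VRR toggle:", vrr_toggle_state())
```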
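Finally, for readers who want to translate ComputerBase's percentages into watt figures, the arithmetic is a plain percentage reduction. Only the percentages below come from ComputerBase's results; the baseline wattages are hypothetical placeholders, since the article only states that RDNA 3 idle draw could exceed 100 W on high-resolution, high-refresh monitors.

```python
# Percentage reduction applied to a baseline: after = before * (1 - pct/100).
# The 'before' wattages are hypothetical placeholders; only the percentage
# reductions are from ComputerBase's published results.

def power_after_reduction(before_watts: float, reduction_pct: float) -> float:
    """Apply a percentage reduction to a baseline power figure."""
    return before_watts * (1 - reduction_pct / 100)

scenarios = {
    "RX 7900 XTX, single-monitor idle": (100.0, 81),  # 81% lower (ComputerBase)
    "RX 7900 XTX, dual-monitor idle":   (110.0, 71),  # 71% lower
    "RX 7900 XTX, window movement":     (90.0, 36),   # 36% lower
}

for name, (before, pct) in scenarios.items():
    after = power_after_reduction(before, pct)
    print(f"{name}: {before:.0f} W (hypothetical) -> {after:.1f} W ({pct}% lower)")
```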