I ran this test multiple times and kept getting the same result. Chrome uses more of the onboard integrated GPU than Brave, creating a better experience: the video is smoother and the laptop's fan doesn't spin up. I wonder if this can be changed in Brave's code? Brave is using far too much CPU and not leveraging the integrated graphics card. (I disabled hardware acceleration for battery life, and I don't want to hear the fan running or deal with the heat.)
When HWA is enabled, do you get the same results?
With hardware acceleration enabled it uses much less CPU, but it splits the load between the Nvidia 1050 and the integrated GPU. I think the allocation just needs to be changed to favor the integrated GPU.
This can be done in Windows: you can tell it to run Brave on the Nvidia GPU instead of the Intel iGPU (or vice versa). I do the same thing on my Ryzen laptop (Nvidia + Vega graphics).
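For reference, Windows stores this per-app GPU preference (Settings > System > Display > Graphics settings) in the registry, so the switch can also be scripted. A minimal sketch run from an elevated Command Prompt, assuming a default Brave install path (adjust the path to your own install):

```shell
:: Pin Brave to the power-saving (integrated) GPU via the per-app preference key.
:: GpuPreference=0 lets Windows decide, 1 = power saving, 2 = high performance.
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" ^
  /v "C:\Program Files\BraveSoftware\Brave-Browser\Application\brave.exe" ^
  /t REG_SZ /d "GpuPreference=1;" /f
```

Restart Brave afterwards and check `brave://gpu` to confirm which adapter is active.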
I did change it to use the integrated GPU instead of the Nvidia GPU, and it still uses too much CPU relative to GPU.
Yeah, some GPUs decode higher-resolution video better than others. All things being equal (same GPU/CPU, hardware acceleration on or off in both), how video is offloaded to the GPU is no different between the Chromium browsers (Brave, Chrome, etc.). I would choose the best-performing GPU for your browser if that's the important issue here.
Unrelated, but from when I was tuning GPU usage: on my laptop, my Vega iGPU uses much less power, so changing the default away from Nvidia makes more sense for battery life. Nothing about Brave being an issue, just how the hardware performs. The Nvidia GPU decodes video much better, but it also draws much more power.
Apparently it is different: I played the same video and ran the test multiple times, and got the same result each time.
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.