I did some additional tests:
- I connected a "powerful" PC (i7-880, GeForce GTX 1060) to the 4K HDR TV: everything is OK. Very low CPU use, medium GPU use.
- I connected an HD screen (1080p) to the HTPC and played 4K HDR video with madVR: everything is OK, but low GPU use and high CPU use.
Conclusion: the codec does not use the GT1030 to decode. On the HD screen, video is smooth because the G4560 is powerful enough for HD, but for the 4K HDR TV it is too weak.
So my problem is that the codec uses the internal iGPU of the G4560 instead of the GT1030's hardware decoder. Yet I completely disabled the iGPU in the BIOS and configured LAV to use the GT1030. It still uses the iGPU, even though the TV is connected to the GT1030. It's strange.
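One way to check which GPUs Windows actually exposes to applications, and in which order, is to enumerate the DXGI adapters; if the GT1030 is missing or listed after the iGPU, that could explain what the decoder picks. Here is a minimal sketch using the standard DXGI API (nothing LAV-specific, just plain Win32/C++, compiled with MSVC):

```cpp
#include <dxgi.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    // Create a DXGI factory to query the adapters (GPUs) Windows exposes.
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    // Adapter 0 is normally the one driving the primary display;
    // decoders that don't honor an explicit GPU choice tend to end up there.
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %s (dedicated VRAM: %zu MB)\n",
                i, desc.Description,
                desc.DedicatedVideoMemory / (1024 * 1024));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

Note also that, as far as I know, LAV's DXVA2 (native) mode decodes on whichever adapter the renderer runs on, so the GPU selection in the LAV settings only applies to copy-back modes (DXVA2 copy-back, NVIDIA CUVID); that might be why the dropdown seems to be ignored.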
Has anyone solved this problem?
I tried a complete fresh Windows install: same problem.
I tried re-enabling the iGPU while still selecting the GT1030 in the LAV codec settings: same problem.
Still searching...