Forums
MediaPortal 1
MediaPortal 1 Talk
madVR Settings thread
[QUOTE="LoDeNo, post: 1262379, member: 88777"]
I did some additional tests:
- I connected a "powerful PC" (i7 880, GeForce GTX 1060) to the 4K HDR TV: everything is OK. Very low CPU use, medium GPU use.
- I connected an HD screen (1080p) to the HTPC and played 4K HDR video with madVR: everything is OK, but low GPU use, high CPU use.

Conclusion: the codec does not use the GT 1030 to decode. On the HD screen, video is smooth because the G4560 is powerful enough for HD. But for the 4K HDR TV, it is too weak.
So my problem is that the codec uses the internal iGPU of the G4560 instead of the GT 1030's hardware decoder. However, I completely disabled the iGPU in the BIOS and configured LAV to use the GT 1030. It still uses the iGPU even though the TV is connected to the GT 1030. It's strange.

Has anyone solved this problem?
I tried a completely fresh Windows install: same problem.
I tried re-enabling the iGPU while still selecting the GT 1030 in the LAV codec: same problem.

Still searching...
[/QUOTE]
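One way to confirm whether the GT 1030's NVDEC engine is actually doing the decoding (rather than the G4560's iGPU) is to watch the card's decoder utilization while the 4K file plays. This is a minimal diagnostic sketch, assuming the NVIDIA driver's `nvidia-smi` tool is on the PATH (it ships with the driver, typically in `C:\Windows\System32` or the driver's install folder):

```shell
# Sample per-engine GPU utilization once per second while the video plays.
# Columns include sm (3D/compute), mem, enc (NVENC) and dec (NVDEC).
# If "dec" stays at 0% while CPU usage is high, LAV is not decoding on the GT 1030.
nvidia-smi dmon -s u

# Alternatively, a one-shot query of the decoder utilization:
nvidia-smi -q -d UTILIZATION
```

If `dec` shows 0% during playback, double-check that LAV Video Decoder is set to NVIDIA CUVID or D3D11 with the GT 1030 explicitly selected as the decoding device, and that the player's status info (e.g. LAV's tray icon) reports hardware decoding as active.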