madVR Settings thread

KlausWirn

Portal Pro
November 18, 2010
New question:
Is it correct that toggling through the profiles only works for the first and second profile?
I tested it with 5 profiles and tried to switch between them with just a shortcut, but could not get it to work. It only switches between the first two profiles. So I guess that is working as designed?
 

joecrow

Test Group • Team MediaPortal
August 9, 2012
Home Country: Germany
    Is it correct that toggling through the profiles only works for the first and second profile?

I'm really not sure on this one. I tried with just 2 profiles but could never reliably toggle between them; from memory it would switch from the first to the second but never back again. I ended up using a different key code for each profile and that worked OK, but that is hardly practical for 5 profiles. Hopefully someone else can add some advice here.
     

    KlausWirn

    Portal Pro
    November 18, 2010
I'm really not sure on this one. I tried with just 2 profiles but could never reliably toggle between them; from memory it would switch from the first to the second but never back again. I ended up using a different key code for each profile and that worked OK, but that is hardly practical for 5 profiles. Hopefully someone else can add some advice here.
    Me too.
I ended up with many shortcuts, which makes it a lot more difficult to switch through all the profiles I added.
    So hopefully someone can help.
     

    CuraeL

New Member
July 14, 2019
Home Country: Denmark
Thanks for the reply.
Unfortunately that's not my main question.
I would like to know whether it matters if the "Windows screen resolution" matches the movie's properties.
Does it cost the same CPU/GPU performance, or is there a big difference when madVR needs to scale the movie (up or down)?

    What would be the best way:

Hopefully these examples explain it better :)

Example 1:
Set Windows to the same resolution and refresh rate as the movie.

Movie is 2160p/23.976 HDR
Windows is set to 2160p/24 Hz
madVR doesn't need to scale anything -> more free CPU/GPU capacity

Same with another resolution:
Movie is 1080p/23.976
Windows is set to 1080p/24 Hz
madVR doesn't need to scale anything -> more free CPU/GPU capacity

Example 2:
Movie is 1080p/23.976
Windows is 2160p/24 Hz
madVR has to scale up to UltraHD -> less free CPU/GPU capacity

or
Movie is 2160p/23.976
Windows is 1080p/24 Hz
madVR has to scale down to FullHD -> less free CPU/GPU capacity
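(For a rough sense of the workload difference between these cases, compare the output pixel counts per frame; the real cost also depends on which scaling algorithms are selected:)

    1080p frame: 1920 x 1080 = ~2.07 million pixels
    2160p frame: 3840 x 2160 = ~8.29 million pixels
    -> a 1080p movie upscaled to a 2160p desktop means madVR renders roughly 4x as many
       output pixels per frame as it would on a 1080p desktop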

What would be the best way (apart from the loss of usability from the display visibly switching resolution and frequency)?


And a last question:
How do I get the native look from a 2160p HDR movie, without any picture "improvements", just like the art director wanted to show it to the audience?


Under the display settings in the madVR control panel you want to configure the manual refresh rate switching. It's a line of comma-separated resolutions + frame rates that madVR is allowed to change to. If you use madVR correctly, these should always be the same resolution as your native display resolution.

The whole point of madVR is to leverage the extra horsepower of your graphics card to upscale your content to the native display resolution. This means that using madVR to downscale from 4K to 1080p is a waste; madVR will then use Catmull-Rom downscaling to process the 4K into 1080p, so you're basically forcing the graphics card to spend resources on getting a worse image. The correct way to use madVR with 4K content on a native 4K screen is to set up a profile that disables all upscaling and image processing when watching native 4K content. That way madVR just renders the 4K file without spending extra resources on image scaling. Chroma upscaling on a 4K image can be really heavy, and doing image enhancement on an image that is not being upscaled because it is already 4K hits the GPU hard for no reason - it's basically a waste.

Basically you want everything below 4K to be upscaled by madVR and leave 4K untouched.
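One way to set that up is with a profile group plus an auto-select rule in the madVR settings. This is only a rough sketch - the exact parameter names and syntax are documented in the help text next to madVR's rules box, and "UHD native" / "upscale to UHD" are just example profile names:

    if (srcWidth > 1920) or (srcHeight > 1080) "UHD native"
    else "upscale to UHD"

The "UHD native" profile would then have cheap chroma upscaling and all image enhancements switched off, while "upscale to UHD" carries your normal scaling settings.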

Bottom line: using madVR in a way where it downscales to anything but the native display resolution is a waste and not worth using madVR for in the first place.

You should only let madVR switch to your native resolution, just with different refresh rates, so that madVR is limited to 23, 24, 30, 59 and 60 Hz in 2160p.
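In madVR's display modes list that would be entered as something like the following (assuming a 3840x2160 display; adjust it to the refresh rates your TV actually accepts):

    2160p23, 2160p24, 2160p30, 2160p59, 2160p60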

It is a hardware upscaler (THE hardware upscaler), after all. It's supposed to upscale your SD and HD content into higher resolutions; if you're not doing that, you shouldn't be using madVR in the first place.

As for the 4K HDR question, madVR has an option to pass the HDR metadata through to the display, under the display settings in the madVR control panel. That way madVR will not touch the HDR signal on its way to the TV. You can also have madVR do its own HDR processing, which is often better or more correct than what your TV does out of the box. But that is entirely up to you.

Just know that "what the director intended" is hardly achieved anyway, since displays clip or roll off highlights according to their own brightness ramp up to roughly 1000-2000 nits (LCD panels, of course; OLED is lower), whereas the Blu-ray you're watching can be mastered all the way up to 4000 nits. The display has to map those nits onto a ramp it can handle so it keeps a decent gradation of brightness. Every TV does this with different settings depending on panel and software, so madVR might be able to do something better with the HDR than whatever your LG, Samsung, Sony or other display does out of the box. So: let madVR pass the HDR metadata through, or try having madVR tone map it, and see what you like the most. It's up to you.

    See ya!
     

    LoDeNo

MP Donator • Premium Supporter
February 19, 2009
Home Country: France
    Hi,

I used MP1 with an HD TV (1080p). Everything was fine.
My config was a Pentium G4560 with the integrated Intel HD graphics. I used the LAV codecs and the Enhanced Video Renderer in MP.

I switched to a 4K HDR TV (Samsung QLED).
I upgraded my HTPC with a GeForce GT1030 because it has H265 hardware decoding capability (right?).
Everything works fine (smooth video), even 4K video, but the colors are drab/dull.
I changed the video renderer in MP to madVR. I installed madVR in Windows and configured it (using the GT1030, copy-back).
Colors are bright now, but video is not smooth anymore (very jerky).
I read that madVR does not need a big CPU if you decode on the GPU, is that true? Windows monitoring shows me very high CPU use (75%) when I use madVR.

My questions are:
- Is my PC (Pentium G4560 and GeForce GT1030) powerful enough?
- If not: where is the problem? GPU or CPU?
- If yes: I would be happy with any help configuring it. I tried a lot of configurations in madVR, with no result.

    Thanks !
     

    LoDeNo

MP Donator • Premium Supporter
February 19, 2009
Home Country: France
I did some additional tests:
- I connected a "powerful PC" (i7 880, GeForce GTX1060) to the 4K HDR TV: everything is OK. Very low CPU use, medium GPU use.
- I connected an HD screen (1080p) to the HTPC and played 4K HDR video with madVR: everything is OK, but low GPU use, high CPU use.

Conclusion: the codec does not use the GT1030 to decode. On the HD screen, video is smooth because the G4560 is powerful enough for HD, but on the 4K HDR TV it's too weak.
So my problem is that the codec uses the integrated iGPU of the G4560 instead of the GT1030's hardware decoder. However, I completely disabled the iGPU in the BIOS and configured LAV to use the GT1030. It still uses the iGPU even though the TV is connected to the GT1030. It's strange.

Has anyone solved this problem?
I tried a complete fresh Windows install: same problem.
I tried enabling the iGPU but still selecting the GT1030 in the LAV codec: same problem.

Still searching ....
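A quick way to check which GPU is actually doing the decoding (assuming Windows 10 and a current NVIDIA driver): the Task Manager Performance tab shows a "Video Decode" graph for each GPU, and the nvidia-smi tool that ships with the NVIDIA driver can report decoder utilisation from a command prompt, for example:

    nvidia-smi -q -d UTILIZATION
    (watch the "Decoder" percentage while the video plays;
     "nvidia-smi dmon" shows the same thing as a live "dec" column)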
     

    joecrow

Test Group • Team MediaPortal
August 9, 2012
Home Country: Germany
Conclusion: the codec does not use the GT1030 to decode. On the HD screen, video is smooth because the G4560 is powerful enough for HD, but on the 4K HDR TV it's too weak.
So my problem is that the codec uses the integrated iGPU of the G4560 instead of the GT1030's hardware decoder. However, I completely disabled the iGPU in the BIOS and configured LAV to use the GT1030. It still uses the iGPU even though the TV is connected to the GT1030. It's strange.

Did you try setting LAV to "DXVA2 (native)"? That is normally recommended for Nvidia GPUs, though it does not explain why the GT1030 is not being used. I am not familiar with the GT 1030, but judging from the price it may not be powerful enough to use the higher-end features of madVR, although it is H265 capable.
The attached screenshot is of my LAV Video settings with an HEVC (H265) vid playing; note it lists the "Active Hardware Accelerator", in this case my AMD RX 460. CPU use during playback is less than 10%.
Hope that helps
[Attachment: lav1.jpg]
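For reference (from memory, option names can differ slightly between LAV versions), the hardware acceleration dropdown in LAV Video Decoder offers roughly these modes; on an Nvidia card the relevant ones are the CUVID, DXVA2 and D3D11 entries:

    None
    NVIDIA CUVID
    Intel QuickSync
    DXVA2 (copy-back)
    DXVA2 (native)
    D3D11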
     

    LoDeNo

MP Donator • Premium Supporter
February 19, 2009
Home Country: France
Thanks.
I tried native too, same problem. I'll try again ...

Look at my screenshot (from the "powerful PC", the one that works fine):

[Attachment: 19100912574023451816451356.png]

You can see the HEVC video running in the background, in MP1.
But the active decoder still shows <inactive>!
How can I display the active decoder like you do?
     

    joecrow

Test Group • Team MediaPortal
August 9, 2012
Home Country: Germany
I would untick "Enable Adaptive HW Deinterlacing", but other than that the settings look OK. Was your screenshot of an HD or a 4K vid?
Are you sure you have LAV set for everything under MP Config / Codecs and Renderer / Video Codecs? If yes, just double-check by reselecting each LAV codec and clicking OK when all done to close the MP Configuration program.
     

    LoDeNo

MP Donator • Premium Supporter
February 19, 2009
Home Country: France
The video is a 4K HDR film, and the screen is an HD monitor.
I unticked "Enable Adaptive HW Deinterlacing", but I think it's not used because it's greyed out.
My config in MP Configuration is full LAV:
[Attachment: mini_19100902140323451816451490.png]

Details for HEVC:
[Attachment: mini_19100902140423451816451491.png]

Renderer config:
[Attachment: mini_19100902140223451816451489.png]

In your screenshot, the title of the window is "LAV Video Decoder Properties".
In my screenshot, the title of the window is just "Properties".
How did you open that window?
I did it this way:
[Attachment: mini_19100902223423451816451495.png]
     
