LAV Filters now support DXVA2

Spooky

MP Donator • Premium Supporter • Austria
    So, for me (an NVIDIA user), what is better to use: CUVID or DXVA2?

    CUDA.

    Why? Isn't it pretty much the same (if you get DXVA running, which I understand can be a bit tricky)?

    I only ask to learn...:p
    In principle you should prefer DXVA. When running CUDA applications on your GPU, the GPU will be in its maximum performance power state, meaning that it will consume more power and produce more heat. When using DXVA, only specialized components of the GPU are used (NVIDIA PureVideo, for instance) in a lower power state, consuming less power and producing less heat.

    But in its current implementation, it's probably better to use LAV CUVID instead of LAV DXVA2.
     

    tourettes

Retired Team Member • Premium Supporter

    Why? Isn't it pretty much the same (if you get DXVA running, which I understand can be a bit tricky)?

    I only ask to learn...:p
    In principle you should prefer DXVA. When running CUDA applications on your GPU, the GPU will be in its maximum performance power state, meaning that it will consume more power and produce more heat. When using DXVA, only specialized components of the GPU are used (NVIDIA PureVideo, for instance) in a lower power state, consuming less power and producing less heat.

    But in its current implementation, it's probably better to use LAV CUVID instead of LAV DXVA2.

    Actually, CUDA uses the same decoder part of the GPU as DXVA does.

    Video Decoder utilizing the NVIDIA hardware decoder engine through the CUDA Video Decoding API ("CUVID").
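
    As a rough illustration of what that API looks like, here is a minimal sketch of creating a hardware decoder through the public nvcuvid headers. The codec, resolution and surface counts are assumptions chosen for the example, not LAV's actual settings.

    Code:
    #include <cuda.h>
    #include <nvcuvid.h>

    /* Sketch: create an H.264 hardware decoder through the CUVID API.
     * Error handling and the bitstream parser are omitted; all
     * parameter values here are illustrative assumptions. */
    CUvideodecoder create_decoder(CUvideoctxlock lock)
    {
        CUVIDDECODECREATEINFO ci = {0};
        ci.CodecType           = cudaVideoCodec_H264;        /* assumed codec */
        ci.ChromaFormat        = cudaVideoChromaFormat_420;
        ci.OutputFormat        = cudaVideoSurfaceFormat_NV12;
        ci.DeinterlaceMode     = cudaVideoDeinterlaceMode_Adaptive;
        ci.ulWidth             = 1920;
        ci.ulHeight            = 1088;                       /* coded height */
        ci.ulTargetWidth       = 1920;
        ci.ulTargetHeight      = 1080;
        ci.ulNumDecodeSurfaces = 20;                         /* assumption */
        ci.ulNumOutputSurfaces = 2;
        ci.ulCreationFlags     = cudaVideoCreate_PreferCUVID; /* use the dedicated
                                                                 decoder engine */
        ci.vidLock             = lock;

        CUvideodecoder dec = NULL;
        cuvidCreateDecoder(&dec, &ci);   /* returns CUDA_SUCCESS on success */
        return dec;
    }

    The cudaVideoCreate_PreferCUVID flag is what routes the work to the dedicated decoder hardware rather than to CUDA shader kernels, which is the point being made above.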
     

    Spooky

MP Donator • Premium Supporter • Austria
    Why? Isn't it pretty much the same (if you get DXVA running, which I understand can be a bit tricky)?

    I only ask to learn...:p
    In principle you should prefer DXVA. When running CUDA applications on your GPU, the GPU will be in its maximum performance power state, meaning that it will consume more power and produce more heat. When using DXVA, only specialized components of the GPU are used (NVIDIA PureVideo, for instance) in a lower power state, consuming less power and producing less heat.

    But in its current implementation, it's probably better to use LAV CUVID instead of LAV DXVA2.

    Actually, CUDA uses the same decoder part of the GPU as DXVA does.

    Video Decoder utilizing the NVIDIA hardware decoder engine through the CUDA Video Decoding API ("CUVID").
    Hm, but some users have shown that the GPU is in a higher power state and is indeed consuming more power when using LAV CUVID instead of DXVA. I guess the PureVideo components of the GPU are accessed via the CUVID API; however, since it is still a CUDA application, the GPU remains in the maximum performance power state.
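
    One way to check this yourself is to poll the performance state while playback runs. This is a minimal sketch assuming the NVML library that ships with the NVIDIA driver; P0 is the maximum performance state, while an idle desktop usually sits in a lower state such as P8 or P12.

    Code:
    #include <stdio.h>
    #include <nvml.h>

    /* Sketch: read the current GPU performance state (P-state) and power
     * draw via NVML while playback is running. P0 = maximum performance;
     * higher numbers = lower power states. Link against the NVML library. */
    int main(void)
    {
        nvmlDevice_t dev;
        nvmlPstates_t pstate;
        unsigned int milliwatts;

        nvmlInit();
        nvmlDeviceGetHandleByIndex(0, &dev);        /* first GPU */
        nvmlDeviceGetPerformanceState(dev, &pstate);
        nvmlDeviceGetPowerUsage(dev, &milliwatts);  /* not supported on all boards */
        printf("P-state: P%d, power: %u mW\n", (int)pstate, milliwatts);
        nvmlShutdown();
        return 0;
    }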
     

    logifuse

Portal Pro • Sydney
    I'm finding that my ION2 client can only use the MS decoder for 1080i MPEG-2, as it doesn't have enough CUDA cores for CUVID, and copy-back DXVA2 struggles with the PCIe x1 limitation. I'm looking forward to full DXVA2 mode to see if that will work.
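
    To see why the PCIe x1 link hurts copy-back, here is a back-of-the-envelope estimate (illustrative numbers, not measurements):

    Code:
    #include <stdio.h>

    /* Back-of-the-envelope: bus bandwidth needed to copy decoded NV12
     * frames back to system memory. Numbers are illustrative. */
    int main(void)
    {
        const double width = 1920, height = 1080;
        const double bytes_per_pixel = 1.5;   /* NV12: Y plane + half-res UV */
        const double fps = 50.0;              /* 1080i50 deinterlaced to 50p */

        double mb_per_sec = width * height * bytes_per_pixel * fps
                            / (1024 * 1024);
        printf("copy-back traffic: ~%.0f MB/s\n", mb_per_sec);  /* ~148 MB/s */

        /* A PCIe 1.x x1 link is 250 MB/s theoretical, and usable throughput
         * is lower still; with copy-back the frame is typically uploaded
         * again for rendering, roughly doubling the bus traffic. */
        return 0;
    }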
     

    tourettes

Retired Team Member • Premium Supporter
    Actually, CUDA uses the same decoder part of the GPU as DXVA does.

    Video Decoder utilizing the NVIDIA hardware decoder engine through the CUDA Video Decoding API ("CUVID").

    Then why did you recommend CUDA on the previous page of this thread?

    In my own experience, the CUDA implementation in LAV is more error-resistant.

    I'm finding that my ION2 client can only use the MS decoder for 1080i MPEG-2, as it doesn't have enough CUDA cores for CUVID, and copy-back DXVA2 struggles with the PCIe x1 limitation. I'm looking forward to full DXVA2 mode to see if that will work.

    If I remember correctly, CUDA does a similar copy-back for the texture as well.
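
    That copy-back step is visible in the CUVID API itself: a decoded frame lives in GPU memory and must be mapped and copied to system memory before anything on the CPU side can use it. A minimal sketch follows (error handling omitted; the frame size and the progressive flag are assumptions for the example):

    Code:
    #include <cuda.h>
    #include <nvcuvid.h>

    /* Sketch: map a decoded CUVID frame and copy it back to system memory.
     * This is the "copy-back" being discussed; sizes are illustrative. */
    void copy_back_frame(CUvideodecoder dec, int pic_idx,
                         unsigned char *host_nv12)
    {
        CUVIDPROCPARAMS vpp = {0};
        vpp.progressive_frame = 1;           /* assumption for this sketch */

        unsigned long long dev_ptr = 0;      /* 64-bit map variant */
        unsigned int pitch = 0;
        cuvidMapVideoFrame64(dec, pic_idx, &dev_ptr, &pitch, &vpp);

        /* NV12: 1080 luma rows plus 540 interleaved chroma rows. */
        CUDA_MEMCPY2D cpy = {0};
        cpy.srcMemoryType = CU_MEMORYTYPE_DEVICE;
        cpy.srcDevice     = (CUdeviceptr)dev_ptr;
        cpy.srcPitch      = pitch;
        cpy.dstMemoryType = CU_MEMORYTYPE_HOST;
        cpy.dstHost       = host_nv12;
        cpy.dstPitch      = 1920;
        cpy.WidthInBytes  = 1920;
        cpy.Height        = 1080 * 3 / 2;
        cuMemcpy2D(&cpy);                    /* the PCIe transfer happens here */

        cuvidUnmapVideoFrame64(dec, dev_ptr);
    }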
     

    michael_t

Portal Pro • Germany
    What codec (MS or LAV) would you recommend for live TV (720p) and HD videos with an ATI HD 4200 onboard GPU?

    Michael
     

    logifuse

Portal Pro • Sydney
    MS is good for an ATI card. You'll probably need to manually adjust your CCC settings and set deinterlacing to something lower than VA.

    I think I read LAV 0.47 has full DXVA support, so it might be worth a try.
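
    For context, "full" (native) DXVA2 means the decoder and the renderer share one Direct3D device, so nothing has to be copied back to system memory. That sharing goes through the Direct3D device manager; here is a minimal sketch using the standard Windows API (the D3D9 device creation is omitted, and this is not LAV's internal code):

    Code:
    #define COBJMACROS
    #include <d3d9.h>
    #include <dxva2api.h>

    /* Sketch: set up the Direct3D device manager that a native DXVA2
     * decoder and the video renderer share. pD3DDevice must be an
     * already-created IDirect3DDevice9; error handling is trimmed. */
    IDirect3DDeviceManager9 *make_device_manager(IDirect3DDevice9 *pD3DDevice)
    {
        UINT resetToken = 0;
        IDirect3DDeviceManager9 *pManager = NULL;

        if (FAILED(DXVA2CreateDirect3DDeviceManager9(&resetToken, &pManager)))
            return NULL;

        /* Hand the shared device to the manager; a decoder later opens
         * it via IDirect3DDeviceManager9::OpenDeviceHandle. */
        if (FAILED(IDirect3DDeviceManager9_ResetDevice(pManager, pD3DDevice,
                                                       resetToken))) {
            IDirect3DDeviceManager9_Release(pManager);
            return NULL;
        }
        return pManager;
    }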
     

    michael_t

Portal Pro • Germany
    I just tried it out and saw that I get lots of frame drops (about 50%) with both decoders on my single-seat system when playing live TV or TV recordings (HD and SD) at a screen resolution of 1920x1080 (HDMI). In the past I had hardly any frame drops with the MS decoder and a 1360x768 screen resolution (VGA). The frame drops are also much rarer (but not zero) when playing the same TV recordings in high resolution as a video.
    The latter suggests that the source of the problem is the streaming process from tvservice to tvclient. But the same HD recordings perform much better at a lower screen resolution, which points instead to a decoder problem. How can I find out what is really wrong, or is this the best I can get out of my ATI 4200 onboard graphics? And what hardware configuration is really suited for HD TV on a large screen?

    Michael
     
