A bug in the Nvidia PureVideo Decoder?

Taipan

Retired Team Member
  • Premium Supporter
  • February 23, 2005
    2,075
    44
    Melbourne
    Home Country
    Australia
    MediaPortal Version: 0.2.3.0 RC2 + latest SVN
    MediaPortal Skin: BlueTwo wide
    Windows Version: Windows XP Prof. SP2
    CPU Type: AMD Athlon XP 2700+
    HDD: Seagate 200GB ATA100
    Memory: 768MB - DDR 333 (PC3200)
    Motherboard: Gigabyte 7VM400M-RZ
    Motherboard Chipset: VIA KM400
    Motherboard Bios:
    Video Card: Gigabyte - Radeon 9550
    Video Card Driver: ATI Radeon v6.4
    Sound Card: On board VIA VT1617
    Sound Card AC3: using 5.1 analog outputs
    Sound Card Driver: VIA Vinyl Audio v6.50a
    1. TV Card: Dvico FusionHDTV DVB-T
    1. TV Card Type: DVB-T
    1. TV Card Driver: v 3.50.02
    2. TV Card: DNTVLive! LP DVB-T
    2. TV Card Type: DVB-T
    2. TV Card Driver: v 2.0.0.4
    MPEG2 Video Codec: Nvidia PureVideo
    MPEG2 Audio Codec: MPA
    HTPC Case: custom built
    Cooling: custom built - super quiet
    Power Supply: custom built - 200W
    Remote: MCE - Australian version
    TV: Mitsubishi HC1100 DLP projector
    TV - HTPC Connection: D-sub (15 pin)



    I can receive the following video resolutions via DVB-T

    576i
    576p
    720p
    1080i

    I have been doing extensive testing recently to find a video decoder that gives good performance at all of those video resolutions and can also process .dvr-ms recorded files. The best all-rounder I have found (out of MPV, DScaler, CyberLink, InterVideo, the Dvico DxVA decoder, and Nvidia PureVideo) is the Nvidia PureVideo decoder.

    However, using the Nvidia PureVideo decoder gives some odd results:-

    Using the Nvidia PureVideo decoder and setting the “De-interlace control” to “Smart” (the default setting) gives jerky video and 40% CPU on the progressive channels (576p, 720p), and perfect video with 20% CPU on the interlaced channels (576i, 1080i).

    Changing the “De-interlace control” setting to “Film” gives perfect video and 20% CPU on progressive channels, but poor de-interlacing on interlaced channels.

    And changing the “De-interlace control” setting to “Video” gives perfect video and 20% CPU on interlaced channels, but jerky video and 40% CPU on the progressive channels.​

    So, it looks like the “Smart” setting should determine whether the video stream being decoded is interlaced (and thus use the “Video” setting and de-interlace the video), or progressive (and thus use the “Film” setting and not try to de-interlace the video).
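    The decision that “Smart” ought to be making is simple enough that it can be sketched in a few lines. Here is a rough Python illustration (my own sketch, not the decoder's actual code or API - the mode names are just labels): the MPEG-2 sequence extension header carries a progressive_sequence flag, and an "automatic" mode would read it to pick film or video handling.

```python
# Sketch of the decision "Smart"/"Automatic" should make: read the
# progressive_sequence flag from the MPEG-2 sequence extension and
# pick "film" (no de-interlacing) or "video" (de-interlace).
# Mode names are illustrative only, not the decoder's real API.

def find_sequence_extension(data: bytes):
    """Byte offset just past the first sequence-extension start code
    (00 00 01 B5 with extension id '0001'), or None if not found."""
    i = 0
    while True:
        i = data.find(b"\x00\x00\x01\xb5", i)
        if i == -1 or i + 4 >= len(data):
            return None
        if (data[i + 4] >> 4) == 0x1:   # '0001' = Sequence Extension
            return i + 4
        i += 1

def choose_deinterlace_mode(data: bytes) -> str:
    off = find_sequence_extension(data)
    if off is None or off + 2 > len(data):
        return "video"                  # unknown: safer to de-interlace
    # After the 4-bit extension id and the 8-bit
    # profile_and_level_indication, progressive_sequence is
    # bit 3 (mask 0x08) of the second byte after the start code.
    progressive = bool(data[off + 1] & 0x08)
    return "film" if progressive else "video"
```

    If even that rough check were being done, a 720p channel would land in "film" mode and a 1080i channel in "video" mode.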

    But, the “Smart” setting is not doing that. The “De-interlace control” setting of "Smart" seems to be permanently stuck on “Video”, and will not change to "Film" for progressive channels.

    I have also tried the "Automatic" setting under the “De-interlace control”, but it seems to be identical to the "Smart" setting, as I get the same results.

    Now, I don’t know if this is a bug in the Nvidia PureVideo decoder (which hasn’t been updated by Nvidia since June last year) or if it is a bug in MediaPortal such that a pin connection between the filters is incorrect or missing?

    Can anyone else comment on this, or suggest what else I might try? ..... :confused:

    UPDATE:-

    I have just re-read the NVIDIA PureVideo Decoder User's Guide for version 1.02-223 and under the section "Feature Change" it says:-

    Removed the Smart De-interlacing mode
    Use the NVIDIA Control Panel to control driver cadence

    I am not sure what they mean by "driver cadence", but it implies that this "feature" is only available with Nvidia video cards? (I have an ATI video card)

    However, the De-Interlace control of the Nvidia PureVideo decoder has the option "Automatic", which according to the User manual:-

    "..... reads the source type and automatically selects video or film mode, depending on the source."

    However, I cannot get this "Automatic" mode to do what the manual describes - for me the de-interlacing is always in video mode irrespective of the source video ..... :(

    .
     

    Taipan

    Retired Team Member
  • Premium Supporter
    I am probably talking to myself in this thread .... ;)

    But maybe this information will help someone else.

    After many hours of testing (trying all combinations of video drivers, video cards - both ATI and Nvidia, and Windows Media Player versions) I have concluded that the fault (bug?) lies within the Nvidia PureVideo Decoder.

    It doesn't matter what I do, I have not been able to get the Nvidia PureVideo Decoder, when the “De-interlace control” is set to “Automatic”, to detect a progressive video stream and disable de-interlacing. The Nvidia PureVideo Decoder is permanently in "Video" mode for de-interlacing, irrespective of whether the video stream is interlaced or progressive.

    What the Nvidia PureVideo Decoder should do, when the “De-interlace control” is set to “Automatic”, is use the "Video" mode for an interlaced stream and switch to the "Movie" mode for a progressive stream.

    This is quite easy to test with Windows Media Player:-

    1. make sure that the Nvidia PureVideo Decoder is set as the preferred decoder for WMP (use Windows XP Video Decoder Checkup Utility to do this)

    2. set the “De-interlace control” to “Automatic” in the Nvidia PureVideo Decoder control panel. You have to play a video for the Nvidia PureVideo Decoder control panel to be visible in the System Tray

    3. play an mpeg video that is progressive (720p, for example) and note the CPU% in the Task Manager

    4. while the video is playing, change the “De-interlace control” to "Movie" and notice that the CPU% drops by about half, and the video plays smoother.

    5. while the video is playing, change the “De-interlace control” to "Video" and notice that the CPU% roughly doubles, and the video becomes choppy.

    6. while the video is playing, change the “De-interlace control” back to "Automatic" and notice that the CPU% doesn't change, and the video is still choppy. If the Nvidia PureVideo Decoder was working correctly, it should automatically switch to "Movie" mode, the CPU% should drop by about half, and the video should play more smoothly.
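    For what it's worth, the roughly 2:1 CPU difference in steps 4-6 is consistent with a simple model (my assumption, not anything from the Nvidia docs): a bob-style "Video" de-interlace turns every field into an output frame, while "Movie"/weave outputs one frame per field pair, so forcing "Video" on progressive material roughly doubles the frames the renderer has to push.

```python
# Rough model (my assumption) of why "Video" mode costs about twice
# the CPU of "Movie" mode: bob de-interlacing emits one output frame
# per field, weave emits one output frame per pair of fields.

PAL_FIELDS_PER_SECOND = 50   # PAL timing, as used for DVB-T here

def output_fps(mode: str, fields_per_second: int = PAL_FIELDS_PER_SECOND) -> int:
    """Frames per second the renderer must process in each mode."""
    if mode == "video":              # bob: every field becomes a frame
        return fields_per_second
    if mode in ("film", "movie"):    # weave: two fields -> one frame
        return fields_per_second // 2
    raise ValueError(f"unknown mode: {mode}")

# Wrongly forcing "video" on a progressive PAL stream means the
# renderer handles 50 fps instead of 25 fps - about double the work.
```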

    I would love to hear from anyone else who has noticed this bug, or who can replicate my test and prove me wrong!

    .
     

    Paranoid Delusion

    Moderation Manager
  • Premium Supporter
  • June 13, 2005
    13,062
    2,978
    Cheshire
    Home Country
    United Kingdom
    I am probably talking to myself in this thread ....

    Nope, actually making some good reading.

    As far as I was aware (until you pointed it out), "automatic" has been broken since way back (the 9xxx drivers) and the advice has been to use the "smart" option.

    If that is now not working as well, it may explain why I am actually getting fewer problems using the PowerDVD 7 codec for playback of HD content than with the Nvidia one.

    Does sound like it needs investigating further to find out what the option settings really should be to get the best out of PureVideo in regard to HD.

    Searching the net doesn't actually answer much in regard to these video playback changes, and the latest Nvidia driver help file (version 163.67) has no reference to "driver cadence" - you'd have thought it would at least be a term in the Glossary.

    Thanks for the research. I have reinstalled the PureVideo codec and will try to duplicate your findings during the next week, and get some real comparison with PowerDVD and how it handles interlacing and progressive scanning.

    Cheers
     

    Paranoid Delusion

    Moderation Manager
  • Premium Supporter
    Hi Taipan

    Do you have a similar problem with HD movie files? My limitation is that I have no DVB HD :(

    If you can point to any downloadable HD content (M$ in mind) that anyone can use to check these characteristics, that would be great - at least it would level the playing field for everyone who may want to test this scenario.

    Ta
     

    Taipan

    Retired Team Member
  • Premium Supporter
    Do you have a similar problem with HD movie files?

    Unfortunately, the only HD movie files I have are .wmv or divx - neither of which require the Nvidia PureVideo Decoder .... :(

    My only source of HD (progressive) mpg files is from DVB-T transmissions.


    If you can point to any downloadable HD content (M$ in mind) that anyone can use to check these characteristics, that would be great - at least it would level the playing field for everyone who may want to test this scenario.

    OK, let me do some hunting around - what I need to find is a downloadable mpg movie file that is progressive - either 480p, 576p or 720p.


    As a further update, I have reported this bug to Nvidia, and received this response today:-

    Your case is being escalated to our Level 2 Tech Support group. The Level 2 agents will review the case notes and may attempt to recreate your issue or find a workaround solution if possible. As this process may take some time we ask that you be patient and a Level 2 tech will contact you as soon they can to help resolve your issue.

    Best Regards,
    NVIDIA Customer Care​
     

    Dexx

    Portal Member
    May 11, 2007
    47
    1
    I also have issues when using PureVideo. I'm using MP 0.2.3.0 RC3 with an Nvidia 8600GT card. Using the MPV decoder the picture is fine, but using PureVideo on a 1080i channel it doesn't de-interlace properly regardless of what setting I choose. The picture is all broken up with horizontal lines, as if the de-interlacing is broken.
     

    deebo

    Portal Pro
    April 19, 2006
    233
    3
    In my experience none of the MPEG2 decoders work well with interlaced content.

    I just don't de-interlace, but add ffdshow to the graph for de-interlacing.

    Much better results, for me at least.

    I.e. movies looked fine, but cartoons looked awful - now both look fine with ffdshow.
     

    seco

    Retired Team Member
  • Premium Supporter
  • August 7, 2007
    1,575
    1,239
    Home Country
    Finland
    I'm experiencing the same stuttering and choppy playback issues when watching live TV / recorded TV (SD). Today I found out that nVidia PureVideo is causing it. I tried installing DScaler5 and selected it as the MPEG2 decoder, and after that the stuttering and choppy picture was gone. Side effects: bad picture quality, and sound/picture is way out of sync.

    https://forum.team-mediaportal.com/problems_running_mp_secondary_display-t29722.html?t=29722

    The problem is actually only on my secondary display, which is 42" Mirai LCD TV @ 1080p. Playback on my 20" TFT is fine.

    This is really annoying because all my .mkv movies and videos @1080p are playing fine and looking great, the only problem is MPEG2 playback and I've tried Cyberlink, DScaler and MPV codecs too.

    I'll try to change the deinterlacing and mode settings and see if it helps for me.
     

    Spragleknas

    Moderator
  • Team MediaPortal
  • December 21, 2005
    9,474
    1,822
    Located
    Home Country
    Norway
    I have a similar experience and found that it was caused by DXVA. Disabling DXVA "fixed" this. The reason for DXVA messing up seems to be high latency.

    This seems to be a known problem.
    Topic: https://forum.team-mediaportal.com/...codec-t24895.html?t=24895&highlight=WebKnight

    EDIT: When using DXVA the FPS is around 35 - it should be 25 (PAL). When DXVA is not enabled the FPS is better (but not perfect).

    Is there no way to make the FPS "correct"?
     

    Telstar

    Portal Pro
    October 18, 2007
    151
    1
    Home Country
    Italy
    In my experience none of the MPEG2 decoders work well with interlaced content.

    I just don't de-interlace, but add ffdshow to the graph for de-interlacing.

    Much better results, for me at least.

    I.e. movies looked fine, but cartoons looked awful - now both look fine with ffdshow.

    I use the Elecard MPEG2 decoder for streaming video. It gives better PQ and less jerkiness than any other codec, including PureVideo. Speaking of cartoons, I watched Shark Tale a couple of days ago and it looked great.

    BTW, since ffdshow is not hardware accelerated, doesn't it use too much CPU power?
     
