Best Quality for Interlaced Displays

brainbone

Portal Member
August 14, 2006
34
1
Setup:
I have an MP PC connected to a standard NTSC 32" TV via S-Video.
TV Capture: ATI Theater 550 Pro (Driver v6.14.10.186)
Graphics Card: GeForce 6200 128MB AGP (Driver v91.47)
Chipset: nForce2 (Driver v5.10)
MP: 0.2.0.0 Release
CPU: Old Athlon XP Mobile (Barton core?) 2.0 GHz
RAM: 512MB DDR 400
OS: Windows XP Home SP2

Situation:
I'd like to have video displayed on a standard NTSC interlaced 4:3 display, without underscanning and without de-interlacing artifacts.

The best quality I've achieved, which appears to pass the video through with little to no noticeable quality loss, is:
- Set the display resolution to 720x480
- In the Nvidia control panel, under TV settings/positioning, expand the size to the largest possible (removing the underscan) and turn off the flicker filter
- Set TV in MP to use the MPV/MPA decoders
- Set the MPV decoder to use "BOB" deinterlacing
- Enable MP's "exclusive directx mode"

Video looks great. No deinterlacing artifacts.

Problems:
With "exclusive directx mode" there is intermittent stuttering in the video that makes it almost impossible to enjoy. The stuttering appears to lessen when time-shifting back even just a few seconds, but MP then becomes much less responsive. Bringing up the EPG, for example, can take up to 10 seconds from the time the button is pressed. The lagginess is also intermittent.

Without "exclusive directx mode", but maintaining MPV with "BOB" deinterlacing, MP responsiveness seems fine -- no noticeable "stuttering", etc., but there is a great amount of video tearing, as would be expected.

Currently, I'm using Nvidia's PureVideo decoder ("Smart mode" / "WMP Default"), but there are noticeable de-interlacing artifacts. Setting it to "Video" / "Vertical Stretch" (which should double the frame rate to 60Hz, just like MPV's "BOB"?) gives the appearance that one of the interlaced fields was simply thrown away, so that won't work.

Question:
How can I get interlaced video (from the ATI Theater 550) to display on a standard 32" NTSC TV (via the S-Video port on a GeForce 6200 128MB AGP) without having to bother with deinterlacing (or at least have the de-interlace method keep the appearance of a non-de-interlaced source, as "BOB" does with the MPV decoder, but without the performance penalty)?

--Thanks
 

bigj

Portal Pro
January 10, 2005
245
1
I use VGA->SCART with the Nvidia codec. Set BOB for all the options you can find.

For the syncing I use ReClock, and I'm using it without exclusive DirectX. It works.

If you don't want to use ReClock and need exclusive DirectX: your multiple-frames latency issue (the lag on input) sounds like a driver bug; the DX driver is not supposed to fill more than, say, 2 frames in its command buffer. Try new drivers.
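For illustration, a rough sketch of the kind of throttling bigj describes: an application can cap how many frames the driver queues by issuing a D3D9 event query after each Present and blocking until the query from two presents back completes. This is a generic XP-era technique, not MediaPortal's actual render loop, and the names here are made up.

Code:
#include <windows.h>
#include <d3d9.h>

const int MAX_QUEUED_FRAMES = 2;  // cap on command-buffer depth
IDirect3DQuery9* frameQuery[MAX_QUEUED_FRAMES] = { 0 };
int frame = 0;

void PresentThrottled(IDirect3DDevice9* device)
{
    int slot = frame % MAX_QUEUED_FRAMES;

    if (frameQuery[slot])
    {
        // Wait until the frame issued two presents ago has actually
        // been consumed by the GPU before queuing another one.
        while (frameQuery[slot]->GetData(NULL, 0, D3DGETDATA_FLUSH) == S_FALSE)
            Sleep(1);
    }
    else
    {
        device->CreateQuery(D3DQUERYTYPE_EVENT, &frameQuery[slot]);
    }

    device->Present(NULL, NULL, NULL, NULL);
    frameQuery[slot]->Issue(D3DISSUE_END);  // signaled when the GPU reaches this point
    ++frame;
}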
 

brainbone

Portal Member
August 14, 2006
34
1
1. Tried ReClock. No difference. Tearing still exists when using it while MP's "Exclusive Direct X" mode is off.

2. Tried older graphics drivers (v91.45 and v91.37). No difference.

3. Clarification of problem:

When playing TV with:
- MPV decoder using BOB deinterlacing
- MP's "Exclusive Direct X" mode off.
CPU utilization is ~60%. Video plays fine, but there is excessive tearing.

When playing TV with:
- MPV decoder using BOB deinterlacing
- MP's "Exclusive Direct X" mode on.
CPU utilization jumps to 100%. No tearing, but excessive "pausing" and/or stuttering.

Why does CPU utilization jump to 100% with "Exclusive Direct X" mode on, and only when using the MPV decoder with "BOB" deinterlacing? "Exclusive Direct X" does not seem to increase the CPU load at all (it stays at ~60%) with MPV in any other deinterlace mode. CPU load is only high when the MPV decoder is set to "BOB" deinterlacing.
 

Taipan

Retired Team Member
Premium Supporter
February 23, 2005
2,075
44
Melbourne, Australia
If your display is an interlaced display (which it sounds like it is) then you should have "de-interlacing" disabled. It doesn't make sense to de-interlace the MPEG2 video and then have the video card turn around and create an interlaced version for your TV... :confused:
     

brainbone

Portal Member
August 14, 2006
34
1
If your display is an interlaced display (which it sounds like it is) then you should have "de-interlacing" disabled.

In a perfect world, yes, but for the Nvidia PureVideo decoder, or the MPV decoder, there is no "disable deinterlacing" option in MP. Under the TV settings in MP there is an option for "none" in deinterlacing, but this does not appear to have any effect.

The "BOB" deinterlacing in the MPV decoder, however, appears to separate the fields of 30fps interlaced material into 60fps "progressive". When pushing this to an interlaced display, it has the same effect as not having any deinterlacing.
     

knutinh

Portal Pro
September 4, 2005
558
2
If MP is doing any kind of scaling, it would probably be wise to use deinterlacing anyway. I find myself using 14:9 modes, zoom, etc. a lot.

regards
Knut
     

brainbone

Portal Member
August 14, 2006
34
1
For TV I do no scaling. I have 720x480 in from the ATI tuner, and 720x480 out to the TV. Any de-interlacing (other than "BOB" with MPV) or scaling will result in a loss of image quality when using an interlaced display.
     
