Best picture quality in Vista with Gigabyte GA-MA78GM-S2H

kiwijunglist

Super Moderator
  • Team MediaPortal
  • June 10, 2008
    6,746
    1,751
    New Zealand
    Home Country
    New Zealand
    CoreAVC uses CUDA, which is NVIDIA only. I see no documentation that the GPU is actually performing the deinterlacing; I think it is just GPU scaling. If the hardware was doing the deinterlacing you would expect CPU to be around 0-20%. My CPU is about 10% or so, depending on how much I've got running in the background.
     

    kszabo

    MP Donator
  • Premium Supporter
  • December 6, 2007
    796
    86
    Germany, Bayern
    Home Country
    Hungary
    You can CPU-decode a video stream, leave it interlaced and let the GPU do the deinterlacing. This does not have much to do with DXVA (which is video decoding only). CoreAVC and newer ffdshow builds can do this. The codec must set the interlaced flag on the stream so the GFX knows it must be deinterlaced.

    I am skeptical that 1080i adaptive deinterlacing by CPU is possible without HT 3.0, even with a very fast X2. It would need the same fast data bus to RAM as in the case of DXVA. The best will remain Bob, I guess.

    Just take a look at the CoreAVC settings:
    - hardware deinterlacing is possible
    - best software deinterlacing is only Bob
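    To make the "set the interlaced flag" part concrete, here is a minimal sketch of what that means on the DirectShow side, assuming the decoder exposes a VIDEOINFOHEADER2 output type (MarkOutputAsInterlaced is just an illustrative helper name, not something taken from CoreAVC or ffdshow):

        #include <dshow.h>
        #include <dvdmedia.h>  // VIDEOINFOHEADER2 and the AMINTERLACE_* flags

        // Illustrative helper (not taken from any real codec): flag the decoder's
        // output media type as interlaced so the downstream renderer/GPU knows
        // the frames still need deinterlacing.
        void MarkOutputAsInterlaced(AM_MEDIA_TYPE* pmt)
        {
            if (pmt->formattype == FORMAT_VideoInfo2 &&
                pmt->cbFormat >= sizeof(VIDEOINFOHEADER2))
            {
                VIDEOINFOHEADER2* vih2 =
                    reinterpret_cast<VIDEOINFOHEADER2*>(pmt->pbFormat);
                vih2->dwInterlaceFlags = AMINTERLACE_IsInterlaced |
                                         AMINTERLACE_DisplayModeBobOrWeave;
            }
        }

    The renderer (VMR9/EVR) picks up that flag and asks the driver for a deinterlacing mode, which is roughly where Bob versus vector adaptive gets decided.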
     

    Attachments

    • coreavc_settings_2.png (77.5 KB)

    jo16v

    Portal Pro
    August 23, 2008
    263
    8
    Home Country
    United Kingdom
    I am skeptical that 1080i adaptive deinterlacing is possible without HT 3.0, even with a very fast X2. It would need the same fast data bus to RAM as in the case of DXVA. The best will remain Bob, I guess.
    It is definitely possible with a 5000+ at 50-70% CPU and "skip deblocking when safe" selected in the ffdshow H.264 codec options. I definitely get vector adaptive: I play an MPEG2 channel, switch to an ffdshow H.264 1080i channel and the flag does not change, whereas with PDVD, ArcSoft or DivX it changes to Bob (jaggy lines on the flag loop).
     

    jo16v

    Portal Pro
    August 23, 2008
    263
    8
    Home Country
    United Kingdom
    CoreAVC uses CUDA, which is NVIDIA only. I see no documentation that the GPU is actually performing the deinterlacing; I think it is just GPU scaling. If the hardware was doing the deinterlacing you would expect CPU to be around 0-20%. My CPU is about 10% or so, depending on how much I've got running in the background.
    CPU usage is still high even if you have hardware deinterlacing, as decoding H.264 HD still requires a lot of processing power. Hardware deinterlacing is perfectly possible in CoreAVC (see the hardware deinterlacing option in kszabo's screenshot), and in ffdshow it is also possible with the NV12 colorspace and "set interlace flag" selected in the output settings.

    Just take a look at the CoreAVC settings:
    - hardware deinterlacing is possible
    - best software deinterlacing is only Bob
    You are using an old version of CoreAVC (1.7.0). You have to use NV12 output for vector adaptive hardware deinterlacing; this was added in CoreAVC 1.8.5, and the latest version is 1.9.5.
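    For the curious, here is a rough sketch of what sits underneath the NV12 + hardware deinterlacing path: an application can ask the driver which DXVA2 deinterlacers it exposes for a 1080i NV12 stream. This is only an illustration (the IDirect3DDevice9 setup is omitted, ListDeinterlacers is a made-up name, and the 50 fields/s value simply assumes a PAL broadcast):

        #include <windows.h>
        #include <d3d9.h>
        #include <dxva2api.h>
        #include <stdio.h>
        #pragma comment(lib, "dxva2.lib")  // MSVC: link the DXVA2 runtime

        // Sketch only: enumerate the DXVA2 video processor (deinterlacer) devices
        // the driver offers for a 1920x1080 interlaced NV12 stream. Anything
        // beyond the generic Bob/progressive/software GUIDs is a vendor-specific
        // mode (e.g. vector adaptive).
        void ListDeinterlacers(IDirect3DDevice9* pDevice)  // device creation omitted
        {
            IDirectXVideoProcessorService* pService = NULL;
            if (FAILED(DXVA2CreateVideoService(pDevice,
                    __uuidof(IDirectXVideoProcessorService), (void**)&pService)))
                return;

            DXVA2_VideoDesc desc = {};
            desc.SampleWidth  = 1920;
            desc.SampleHeight = 1080;
            desc.Format       = (D3DFORMAT)MAKEFOURCC('N', 'V', '1', '2');
            desc.SampleFormat.SampleFormat = DXVA2_SampleFieldInterleavedEvenFirst; // 1080i
            desc.InputSampleFreq.Numerator   = 50;  // assumption: 25i PAL = 50 fields/s
            desc.InputSampleFreq.Denominator = 1;
            desc.OutputSampleFreq = desc.InputSampleFreq;

            UINT count = 0;
            GUID* pGuids = NULL;
            if (SUCCEEDED(pService->GetVideoProcessorDeviceGuids(&desc, &count, &pGuids)))
            {
                printf("Driver reports %u video processor device(s)\n", count);
                CoTaskMemFree(pGuids);  // caller owns the returned GUID array
            }
            pService->Release();
        }

    Whether one of those vendor modes actually gets picked during playback is then up to the renderer and the driver.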
     

    mdsflyer

    Portal Pro
    November 14, 2005
    283
    6
    Home Country
    New Zealand
    Mmmm, 50-70% CPU, that's not a good option in my opinion; very little headroom left for anything else.
     

    jo16v

    Portal Pro
    August 23, 2008
    263
    8
    Home Country
    United Kingdom
    Mmmm, 50-70% CPU, that's not a good option in my opinion; very little headroom left for anything else.
    Yes, that is with ffdshow; CoreAVC may be more efficient, but I don't have it to test. I usually just use PDVD8/9 with DXVA and Bob, as a large proportion of 1080i is not truly interlaced, i.e. content filmed in 720p/1080p: if you turn off deinterlacing you will not see any interlace blinds, the framerate will just halve. There can be an issue, though, with some channel logos that shimmer with just Bob, even if the actual content was shot in 720p/1080p. ffdshow is also useful for a weakened signal, e.g. in heavy rain, as its error resilience is better than using DXVA.
     

    kiwijunglist

    Super Moderator
  • Team MediaPortal
  • June 10, 2008
    6,746
    1,751
    New Zealand
    Home Country
    New Zealand
    This has gone off topic. I just wanted to say that if you have an HD3200 with an X2 CPU (HT 1000 MHz) and use hardware decoding & hardware deinterlacing of H.264 1080i in MediaPortal, you may notice an fps increase and smoother video if you increase the HT frequency. This won't help/is not relevant if you are using CoreAVC/ffdshow or if you have a Kuma core/Phenom.
     

    Spragleknas

    Moderator
  • Team MediaPortal
  • December 21, 2005
    9,474
    1,822
    Located
    Home Country
    Norway
    Reminder:

    Topic is:
    Best picture quality in Vista with Gigabyte GA-MA78GM-S2H

    So far so good, IMO :)
     
