Best picture quality in Vista with Gigabyte GA-MA78GM-S2H

jo16v

Portal Pro
August 23, 2008
263
8
Home Country
United Kingdom
The Sapphire HD 4650 was coupled with an Intel Core i7 940 CPU, the onboard HD 3300 with a weaker AMD Phenom 9600 Black Edition. This Core i7 is ca. 30-40% faster than this Phenom, which makes the comparison a bit worthless. I guess the onboard and dedicated GFX unload the CPU by roughly the same amount during 1080 DXVA playback (or at least not significantly differently).
Hmmm, I guess you could be right about that :oops:. You'd think they would use a processor of equal performance for each gfx review, but I suppose in that case it was more of a chipset/motherboard test, so maybe not the best example. Pity they don't have figures for the 4350/4550, but from what you say they would presumably be similar.
Could it be that, in the case of picture-in-picture, at least one video stream will be purely CPU-decoded, and at most one decoded through DXVA by the GFX chip?
Yes, as flokel says, it looks like the HD 3200/HD 3300 cannot run two DXVA videos simultaneously.
 

kszabo

MP Donator
  • Premium Supporter
  • December 6, 2007
    796
    86
    Germany, Bayern
    Home Country
    Hungary
    it looks like the HD 3200/HD 3300 cannot run two DXVA videos simultaneously.

    Thanx. This problem is theoretical anyway, as MP does not support p-i-p (yet). And if it ever does, we can still decode one stream in software (one more argument against a slow, castrated X2e CPU).
     

    kiwijunglist

    Super Moderator
  • Team MediaPortal
  • June 10, 2008
    6,746
    1,751
    New Zealand
    Home Country
    New Zealand
    This post is only relevant to people who have an HT frequency of 1000MHz (e.g. a dual-core AMD X2 CPU) and want to increase hardware acceleration (DXVA) performance for H.264/VC-1.

    Think of the HT frequency as the "road" between the onboard graphics and the RAM.
    Since the onboard video shares RAM with the system, the "road" (HT frequency) determines how fast the onboard graphics can do hardware acceleration.
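
    To put rough numbers on the "road" analogy (these are illustrative figures of mine, not measurements): a 1080 NV12 frame is about 3 MB, and a 1000MHz 16-bit HT 2.0 link moves roughly 4 GB/s in each direction, shared with all of the CPU's normal memory traffic. A quick Python sketch, with the passes-per-frame figure assumed purely for illustration:

    # Back-of-envelope traffic estimate for deinterlaced 1080i over the HT link.
    frame_mb = 1920 * 1080 * 1.5 / 1e6   # NV12 = 1.5 bytes per pixel
    fps_out = 50                          # 1080i50 deinterlaced to 50fps
    passes = 4                            # assumed reads + writes per output frame
    traffic_gb = frame_mb * fps_out * passes / 1e3
    print(f"~{frame_mb:.1f} MB per frame, ~{traffic_gb:.2f} GB/s of video traffic")
    # vs roughly 4 GB/s per direction on the HT link, which also carries
    # everything else the CPU reads and writes.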

    If you have an AMD X2 CPU, then the HT frequency = 1000MHz.
    If you have a Phenom, the HT frequency is a lot faster (either 1800 or 2600MHz, I forget which).
    Anyway, that's why if you have a Phenom (faster HT frequency) you get to do adaptive/vector deinterlacing etc.

    You can speed things up by overclocking the HT frequency (you can do this by increasing the FSB speed). Increasing the FSB speed also increases your CPU speed. This may be undesirable, so you should probably decrease the CPU multiplier to compensate and keep the CPU speed constant. To enable the additional overclocking options, press CTRL+F1 on the main screen after loading the BIOS.

    E.g. I have an AMD X2 4200+ CPU with DDR2-800 RAM.

    My BIOS default settings:

    FSB = 200MHz
    CPU Multiplier = x11
    CPU Frequency = 200 x 11 = 2200MHz (which is the default CPU frequency for an X2 4200+ CPU)
    VGA Core = 500MHz

    --> HT = FSB x 5 = 1000MHz

    Suggested overclock:

    FSB = 220
    CPU Multiplier = x10
    CPU Frequency = 220 x 10 = 2200MHz (i.e. this is unchanged)
    VGA Core = 600MHz

    --> HT = FSB x 5 = 1100MHz (a 10% increase in the speed of the "road" between the RAM and the HD3200)
    --> The HD3200 core is also running 20% faster (500 --> 600MHz)
    --> Note this also overclocks my RAM by 10%, as it now runs at 440MHz (x2 = 880MHz) instead of 400MHz (x2 = 800MHz)
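
    If you want to sanity-check this arithmetic for your own CPU, here is a minimal Python sketch. It assumes the x5 HT multiplier this board uses at stock; the RAM clock also rises with the FSB via a BIOS divider, but dividers vary, so it is not modelled here.

    # Frequency arithmetic from this post: CPU = FSB x multiplier, HT = FSB x 5.
    def derived_clocks(fsb_mhz, cpu_multiplier):
        """Return (CPU clock, HT link clock) in MHz."""
        return fsb_mhz * cpu_multiplier, fsb_mhz * 5

    for label, fsb, mult in [("default", 200, 11), ("suggested", 220, 10)]:
        cpu, ht = derived_clocks(fsb, mult)
        print(f"{label}: FSB {fsb} x{mult} -> CPU {cpu} MHz, HT {ht} MHz")
    # default:   FSB 200 x11 -> CPU 2200 MHz, HT 1000 MHz
    # suggested: FSB 220 x10 -> CPU 2200 MHz, HT 1100 MHz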

    Your settings may be different depending on what speed your X2 CPU is.

    Do this at your own risk. Read up about overclocking before you try this. Don't do this if you don't understand what you are doing, as you can damage your computer. Overclocking may also void your warranty. IF YOU MAKE A MISTAKE YOU COULD FRY YOUR CPU / RAM / MOTHERBOARD.

    That said, a lot of people can overclock the VGA core to 900MHz or more, and most people can get it to at least 700MHz. I don't suggest overclocking the VGA core, as it doesn't make a great deal of difference; the main difference you will see comes from increasing the HT frequency.

    I am currently using the following bios settings

    HD3200 UMA (video memory size) = 512MB (this setting is under Integrated Peripherals, I think)
    VGA Core = 600
    FSB = 275
    CPU Multiplier = x8
    I leave everything else on auto, including RAM timings and voltage.

    This gives me an HT frequency of 1375MHz (CPU = 2200MHz and RAM = 440MHz x2 = 880MHz). This makes MediaPortal run much better for 1080i live TV. It doesn't give you any additional deinterlacing options, but it makes things less stuttery when the hardware acceleration is under strain (e.g. deinterlacing 1080i and running MP menus at the same time).
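
    For what it's worth, the derived_clocks() sketch from my earlier example gives the same figures for these settings:

    cpu, ht = derived_clocks(fsb_mhz=275, cpu_multiplier=8)
    print(f"CPU {cpu} MHz, HT {ht} MHz")  # -> CPU 2200 MHz, HT 1375 MHz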

    I should also add that I have a system fan blowing air over the northbridge (HD3200) heatsink. The HD3200 gets very hot by default, especially if you have a rev 1.0 board, which has a tiny heatsink.

    I didn't make the table below but it makes interesting reading:
    [attached image: 780G/HD3200 overclocking results table]
     

    kszabo

    MP Donator
  • Premium Supporter
  • December 6, 2007
    796
    86
    Germany, Bayern
    Home Country
    Hungary
    Very nice post! The big question is still: what happens if you overclock your HT to a reasonable amount AND THEN hack the Catalyst registry to activate adaptive deinterlacing? Could it be fast enough then? What I don't understand is why you get a much better 1080i picture with the same deinterlacing settings (bob). If HT at 1000MHz is already enough for bob, why should you profit from 1300MHz?

    So if 1080i is a concern, the cheapest X2 with a Kuma core (= Phenom architecture with HT 3.0) or a dedicated GFX card for ca. €40 is still a good solution.
     

    kiwijunglist

    Super Moderator
  • Team MediaPortal
  • June 10, 2008
    6,746
    1,751
    New Zealand
    Home Country
    New Zealand
    Deinterlacing of 1080i is essentially the same, but I notice that the frame rate is better and sits more consistently at 50fps. At 1000MHz I find the fps is variable and runs around 45-50.5fps for 1080i, with slight jerkiness in panning shots: it's about 50fps when the camera is not moving, but if there is a lot of panning it dips down to 45fps. With my overclock, the fps sits at 49.5-50.5fps in MediaPortal (SHIFT+! to bring up the fps info) for 1080i, with no jerking in panning shots.

    Yeah, the Kuma core (HT3 = 1800MHz) is a good solution. I haven't tried the registry hack to enable the other options; it might work, but 1375MHz might not be fast enough. I am happy with bob deinterlacing of 1080i: I think the picture quality is good, and now it is nice and smooth as well. This overclock may also help with the stuttering people experience with 1080i + MP 1.0.2 + HD3200 ("apparently MP 1.0.2 needs slightly more GPU power when playing 1080i").
     

    kszabo

    MP Donator
  • Premium Supporter
  • December 6, 2007
    796
    86
    Germany, Bayern
    Home Country
    Hungary
    So it looks like HT 2.0 at 1000MHz is not even enough for bob deinterlacing of 1080i, only when overclocked. Nice finding!!!
     

    kiwijunglist

    Super Moderator
  • Team MediaPortal
  • June 10, 2008
    6,746
    1,751
    New Zealand
    Home Country
    New Zealand
    This is only my experience. I would say that HT 2.0 at 1000MHz is enough for 1080i bob deinterlacing, but only just, and with any additional strain on the GPU it doesn't hold up well in panning shots.
     

    jo16v

    Portal Pro
    August 23, 2008
    263
    8
    Home Country
    United Kingdom
    So it looks like HT 2.0 at 1000MHz is not even enough for bob deinterlacing of 1080i, only when overclocked. Nice finding!!!
    This is not accurate at all. I have no problem with framerate drops or jerkiness on 1080i; I just have the telltale signs of bob deinterlacing: shimmering or jaggies on thin lines. Maybe my 5000+ CPU makes a difference vs kiwi's 4200+? Just a reminder that vector adaptive deinterlacing of 1080i is available on dual-core (non-HT3) processors if you do the decoding via the CPU and the deinterlacing via the GPU using ffdshow or CoreAVC.
     

    kiwijunglist

    Super Moderator
  • Team MediaPortal
  • June 10, 2008
    6,746
    1,751
    New Zealand
    Home Country
    New Zealand
    Just a reminder that vector adaptive deinterlacing of 1080i is available on dual-core (non-HT3) processors if you do the decoding via the CPU and the deinterlacing via the GPU using ffdshow or CoreAVC.

    ffdshow is a software decoder; it doesn't use the GPU.
    CoreAVC is mostly a software decoder, not a hardware decoder.
    I am referring to 100% hardware decoding of 1080i.
    You can do any form of deinterlacing with software decoding if the CPU is powerful enough.
     

    jo16v

    Portal Pro
    August 23, 2008
    263
    8
    Home Country
    United Kingdom United Kingdom
    ffdshow is a software decoder; it doesn't use the GPU.
    CoreAVC is mostly a software decoder, not a hardware decoder.
    I am referring to 100% hardware decoding of 1080i.
    You can do any form of deinterlacing with software decoding if the CPU is powerful enough.
    You are correct that the ffdshow and CoreAVC decoders use the CPU, but they can both pass NV12 colorspace output to the GPU for hardware deinterlacing. This allows 1080i vector adaptive deinterlacing on a dual core, but CPU usage is high (50-70% depending on bitrate, on a 64 X2 5000+ with skip deblocking selected). ffdshow also has much better error resilience if you have a weak DVB signal at certain times or in certain weather conditions.
     
