Best picture quality in Vista with Gigabyte GA-MA78GM-S2H

mdsflyer

Portal Pro
November 14, 2005
283
6
Home Country
New Zealand New Zealand
From the release of the 780G boards I've always liked the idea of integrated graphics; it seems like "the way ahead".

If I had to buy a graphics card it would have cost me more; as it was, I was able to sell my old CPU and only had to pay a very small amount to make up the difference. Buying a graphics card would have been an outright purchase and significantly more expensive.

Regarding power use, I don't know my facts here, but undoubtedly I'm using a CPU that potentially uses more power and generates more heat than my last one. Then again, how much additional heat and power would a dedicated graphics card produce? I wouldn't be surprised to learn that it is the same if not more. The CPU is rarely running above 30-40% anyway, the same as with my previous CPU.

I think an HT3 CPU gives you the best possible deinterlacing available from ATI, and therefore there would be no performance increase in upgrading to a graphics card for HD and 1080i, though I'm sure a graphics card may improve gaming. That is not a requirement of mine.

So overall I opted for the CPU upgrade due to cost and "neatness". Even the wife notices the picture quality improvement, and my Vista experience score increased from 3.6 to 4.0.

I'm not saying that this option is for everyone, but I'm very definitely happy with the result.
 

iancalderban

Portal Pro
December 12, 2008
140
7
Milton Keynes
Home Country
United Kingdom United Kingdom
I currently have a 5050e and I can't decide whether to get the 7750 now or wait for the 45 nm Athlons coming soon. What do you guys reckon?

I went through the same thing 2 months ago; I have a 4850e X2. Depending on your HTPC's cooling design and location, check what it does to your TDP.

The only -e HT3 processors I could find were the high-end (X4) ones, so they were very expensive, complete overkill for the HTPC, and still 65 W TDP instead of the 45 W of my current X2. So I put in a passive low-profile 4550 graphics card with a 20 W TDP and kept my 4850e. It cost me £50, vs ~£200 for the HT3 X4, and still only +20 W TDP. The non -e, cheaper HT3 processors were all 95 or 90 W, I think.
If there were a low-cost -e HT3 processor closer to the 45 W mark of the current X2s, I'd possibly have thought differently. Certainly if I didn't have the X2 already (or had another use for that CPU), I'd think seriously about an HT3 at 45 W (if it exists) instead of the graphics card.

To be sure the extra 20 W TDP doesn't cause any trouble, I've also sawn a hole in the back of my AV cabinet (which doesn't have a lot of natural airflow) and put in two 140 mm Yate Loons at low speed extracting from the cabinet, so my temps are still low and silence is golden...

The only reason to do this, by the way (from a pure HTPC perspective), is 1080i de-interlacing. If you are happy with bob, or don't watch 1080i, don't bother.

ta
Ian
 

kszabo

MP Donator
Premium Supporter
December 6, 2007
796
86
Germany, Bayern
Home Country
Hungary Hungary
IMHO TDP has very, very limited informative value about the real power consumption of a CPU. What matters is the architecture and the manufacturing process (45 nm vs 65 nm). TDP is important for power supply and motherboard power design, but has little to do with the average power consumption of the CPU.

The CPU tests show that newer (65 nm) AMD Athlons have quite good idle consumption (ca. 10 W), Phenom Is are worse (ca. 20 W), and Phenom IIs (45 nm) are very good again (<10 W).
Look at this for idle power consumption: http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=54&Itemid=42&limit=1&limitstart=4

[winidle.gif: idle power consumption chart]


Power consumption under load looks different: Phenom IIs need the least energy for the same computing task, followed by Phenom Is; the worst are the Athlon X2s. That means under the same load your 45 W TDP X2xxxxe will need more energy than a "non-e" 65, 95 or 125 W TDP Phenom II.
For power efficiency check this out (2nd graph): http://www.lostcircuits.com/mambo//index.php?option=com_content&task=view&id=54&Itemid=42&limit=1&limitstart=6

For the same computing task a Phenom II (125 W TDP!) needed less energy (4625 J) than the best X2 (5530 J; a 65 nm Brisbane X2 with 45 W TDP!).

[caljoule.gif: energy per task (joules) comparison chart]
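The power-vs-energy distinction can be sketched with rough numbers. The wattages and durations below are illustrative assumptions, not figures from the linked test:

```python
# Energy for a fixed task = average power x time to finish it, so a CPU
# that draws more watts can still use fewer joules if it finishes sooner.
# All figures below are made-up illustrations, not benchmark results.

def task_energy_joules(avg_power_w: float, duration_s: float) -> float:
    """Total energy consumed completing a task, in joules."""
    return avg_power_w * duration_s

# Hypothetical: a faster Phenom II under load vs a slower Athlon X2.
phenom_ii = task_energy_joules(avg_power_w=65.0, duration_s=60.0)
athlon_x2 = task_energy_joules(avg_power_w=40.0, duration_s=130.0)

print(phenom_ii, athlon_x2)   # 3900.0 5200.0
print(phenom_ii < athlon_x2)  # True: the higher-TDP chip uses less total energy
```

This is the "race to idle" effect: finishing the work quickly and dropping back to a low idle state can beat grinding along at a lower wattage.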


So in real life the best choice is a 45 nm Phenom II; second place is difficult to decide (if your HTPC is always on and idles a lot, an X2 is better; if you only power it up when needed and 1080i matters to you without a graphics card, a Phenom I is better). So whoever can should wait for the cheap mainstream 45 W Phenom IIs (X2, X3 or X4).

But I don't really understand the problem. Do you want to save 10 USD a year on your electricity bill, or what? All of these CPUs are efficient enough to be used in an HTPC! My Phenom I X3 8450 is passively cooled in an Antec case (2x 12 cm fans), absolutely quiet and cool (CPU temps <40 °C), and I don't care about the 20 W idle consumption.
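The "10 USD a year" remark is easy to sanity-check with back-of-envelope arithmetic. This is a minimal sketch; the tariff of 0.20 per kWh and the usage hours are assumptions, not anyone's real bill:

```python
def annual_cost(extra_watts: float, hours_per_day: float,
                price_per_kwh: float = 0.20) -> float:
    """Yearly cost of an extra power draw at the given tariff (assumed)."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * price_per_kwh

# 20 W of extra idle draw on an HTPC running 6 h/day:
print(round(annual_cost(20, 6), 2))   # 8.76 -- same ballpark as "10 USD a year"
# Even an always-on box with 20 W extra stays modest:
print(round(annual_cost(20, 24), 2))  # 35.04
```

So unless the HTPC runs 24/7, a 10-20 W difference really is single-digit money per year, which supports the point that any of these CPUs is efficient enough.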
     

jo16v

Portal Pro
August 23, 2008
263
8
Home Country
United Kingdom United Kingdom
If I had to buy a graphics card it would have cost me more; as it was, I was able to sell my old CPU and only had to pay a very small amount to make up the difference. Buying a graphics card would have been an outright purchase and significantly more expensive.
OK, I didn't think of that one :). I wonder how much I can get for a 5000+...
I think an HT3 CPU gives you the best possible deinterlacing available from ATI, and therefore there would be no performance increase in upgrading to a graphics card for HD and 1080i, though I'm sure a graphics card may improve gaming. That is not a requirement of mine.
I presume you are not saying that you cannot get 1080i vector adaptive without an HT3 CPU, because that does not seem to be true given the answers from Ian. Also, a better graphics card can reduce CPU usage for video decoding quite considerably from the tests I have seen: on (I think) a 4350, CPU was 15% with a 1080p Blu-ray, but with a 4650 it was only 5%.

So in real life the best choice is a 45 nm Phenom II; second place is difficult to decide (if your HTPC is always on and idles a lot, an X2 is better; if you only power it up when needed and 1080i matters to you without a graphics card, a Phenom I is better). So whoever can should wait for the cheap mainstream 45 W Phenom IIs (X2, X3 or X4).

But I don't really understand the problem. Do you want to save 10 USD a year on your electricity bill, or what? All of these CPUs are efficient enough to be used in an HTPC! My Phenom I X3 8450 is passively cooled in an Antec case (2x 12 cm fans), absolutely quiet and cool (CPU temps <40 °C), and I don't care about the 20 W idle consumption.
Thanks for the graphs. I am getting a separate graphics card (probably a 4650/4670 or 9500) as I want better gaming, but I think I'll stick with my 5000+ until the new X2s come out, or maybe until Phenom II prices come down :D.
     

mdsflyer

Portal Pro
November 14, 2005
283
6
Home Country
New Zealand New Zealand
I presume you are not saying that you cannot get 1080i vector adaptive without an HT3 CPU, because that does not seem to be true given the answers from Ian. Also, a better graphics card can reduce CPU usage for video decoding quite considerably from the tests I have seen: on (I think) a 4350, CPU was 15% with a 1080p Blu-ray, but with a 4650 it was only 5%.

Hi again. I believe that vector adaptive is not available for 1080i H.264 with a 780G unless an HT3 CPU is installed. The only official options with HT2 are weave or bob. You can muck around with the registry or DXVA Checker to get it to attempt VA, but in my experience there was massive frame loss and the picture was unwatchable. I'm all about smoothness!

Whilst a graphics card may further reduce your CPU load, it seems rather academic to lower it from 15% to 5%; in neither case is the CPU taxed. My feeling is that if I've got a CPU installed, which we all do, then I might as well use it. But again, this may only be pertinent to my own situation.

The original title of this thread is Best picture quality in Vista with Gigabyte GA-MA78GM-S2H. My take on that is: how do we get the best quality solely from the board? And that is with an HT3 processor for 1080i sources. Of course we could install a powerful dedicated graphics card and get good results with that too, but that really applies to any motherboard. One of the interesting things about the 780G is its powerful(ish) onboard graphics and whether they can compete with a dedicated graphics card. I think that for movies/TV/DVDs/Blu-ray, with the correct CPU, RAM etc., the answer is yes.

Enjoying the discussion! :)
     

kszabo

MP Donator
Premium Supporter
December 6, 2007
796
86
Germany, Bayern
Home Country
Hungary Hungary
You can muck around with the registry or DXVA Checker to get it to attempt VA, but in my experience there was massive frame loss and the picture was unwatchable.

Of course; ATI deactivated 1080i VA for HT2.0 CPUs because the RAM bandwidth (=HT2.0) cannot cope with it. You can activate it with a hack, but it won't work. Bob only doubles the frame rate, which is not a big deal. With vector adaptive the picture has to be analysed, and according to an algorithm certain parts of the picture are deinterlaced differently from the rest. This means much more video RAM traffic (=shared memory through the HT memory controller of the CPU). HT2.0 is enough for bob, but not for this.
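A rough way to see why vector adaptive is hungrier than bob is to estimate the field-read traffic per output frame. Everything here is an illustrative assumption (NV12 surfaces, four field reads for VA), not ATI's actual implementation, and real deinterlacers also write output and intermediate data on top of this:

```python
# Back-of-envelope memory-read traffic for deinterlacing 1080i60.
# Bob reads roughly one field per output frame; vector adaptive must
# also read neighbouring fields to analyse motion (4 reads assumed here).

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 1.5       # NV12 (4:2:0) layout, assumed
FIELDS_PER_SECOND = 60      # 1080i60: 60 fields -> 60 output frames

FIELD_BYTES = WIDTH * (HEIGHT // 2) * BYTES_PER_PIXEL

def read_bandwidth_gb_s(field_reads_per_frame: int) -> float:
    """Approximate read traffic in GB/s for the deinterlacer alone."""
    return FIELD_BYTES * field_reads_per_frame * FIELDS_PER_SECOND / 1e9

print(round(read_bandwidth_gb_s(1), 3))  # bob: one field read per output frame
print(round(read_bandwidth_gb_s(4), 3))  # vector adaptive (assumed 4 reads)
```

The point is the multiple: whatever the absolute numbers, VA multiplies the traffic through the same shared-memory path that decoding, rendering and the rest of the system are already using.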

Whilst a graphics card may further reduce your CPU load, it seems rather academic to lower it from 15% to 5%.
Agreed totally. The only duty a graphics card can take over from the CPU is the load on the memory controller integrated into the CPU (the part of the load caused by the shared memory of the HD3200). The onboard HD3200 can unload the CPU by the same amount as a dedicated graphics card for video decoding.

Enjoying the discussion also :)
     

jo16v

Portal Pro
August 23, 2008
263
8
Home Country
United Kingdom United Kingdom
The original title of this thread is Best picture quality in Vista with Gigabyte GA-MA78GM-S2H. My take on that is: how do we get the best quality solely from the board? And that is with an HT3 processor for 1080i sources. Of course we could install a powerful dedicated graphics card and get good results with that too, but that really applies to any motherboard. One of the interesting things about the 780G is its powerful(ish) onboard graphics and whether they can compete with a dedicated graphics card. I think that for movies/TV/DVDs/Blu-ray, with the correct CPU, RAM etc., the answer is yes.
Enjoying the discussion! :)
Basically I think it all comes down to gaming: if you want to do that at any decent resolution, you need a separate graphics card. If you aren't bothered about gaming but want the best deinterlacing for 1080i, go with a 780G plus a 7750 or a Phenom.
Agreed totally. The only duty a graphics card can take over from the CPU is the load on the memory controller integrated into the CPU (the part of the load caused by the shared memory of the HD3200). The onboard HD3200 can unload the CPU by the same amount as a dedicated graphics card for video decoding.
Enjoying the discussion also :)
Take a look at the Blu-ray (Casino Royale) CPU load comparisons below:
790GX HD 3300 integrated graphics: CPU is 15-20%, 47-55% for picture-in-picture
Sapphire HD 4650: CPU is 5-8%, 8-15% for picture-in-picture
Sapphire HD 4670 Ultimate: CPU is 5-7%, 8-15% for picture-in-picture
From that it seems dedicated graphics can unload more CPU than integrated?
     

midasxxl

Portal Pro
March 22, 2006
53
2
43
Home Country
England England
Just a short question about this motherboard. I have a BenQ HD2400 monitor, which I connect via DVI. Does it make any sense to connect my normal non-TV LCD screen via HDMI to play 24p on this monitor? Is it possible at all?
Any experience?

Love this board, it's a dream; it works best for me with an ATI HD4450 and a SkyStar DVB-S tuner card.
With MP 1.0 everything works perfectly for the first time; MP has already taken several hundred hours of my life :)
     

kszabo

MP Donator
Premium Supporter
December 6, 2007
796
86
Germany, Bayern
Home Country
Hungary Hungary
Take a look at the Blu-ray (Casino Royale) CPU load comparisons below:
790GX HD 3300 integrated graphics: CPU is 15-20%, 47-55% for picture-in-picture
Sapphire HD 4650: CPU is 5-8%, 8-15% for picture-in-picture
Sapphire HD 4670 Ultimate: CPU is 5-7%, 8-15% for picture-in-picture
From that it seems dedicated graphics can unload more CPU than integrated?

The Sapphire HD 4650 was paired with an Intel Core i7 940 CPU, the onboard HD3300 with a weaker AMD Phenom 9600 Black Edition. This Core i7 is ca. 30-40% faster than this Phenom, which makes the comparison a bit worthless. I guess the onboard and dedicated graphics unload the CPU by about the same amount during 1080p DXVA playback (or not significantly differently).

Could it be that in the case of picture-in-picture at least one video stream is purely CPU-decoded, and at most one goes through DXVA on the graphics chip?
     

flokel

Portal Developer
Premium Supporter
October 11, 2005
1,601
168
Unterfranken
Home Country
Germany Germany
Could it be that in the case of picture-in-picture at least one video stream is purely CPU-decoded, and at most one goes through DXVA on the graphics chip?

Yep, the tested dedicated cards ship with at least UVD 2.0, which supports dual video stream decoding.
     
