How does HTPC picture quality compare to a dedicated satellite receiver?

spiderwheels

Portal Pro
October 28, 2009
Home Country: United Kingdom
Hello.

I'm trying to work out whether using a PC for my satellite TV viewing is degrading the picture quality compared to a dedicated satellite receiver box (think Sky HD here in the UK).

My theory is that it would be better with a Sky box, because that passes the interlaced signal straight through to the TV. The TV then deinterlaces the signal - along with other video processing (e.g. noise reduction) - before display.

When I use my PC, by contrast, the signal is deinterlaced in the graphics card and output to the TV as full progressive frames. The TV then just displays those frames.

Now, any TV (being dedicated to the task) is probably going to do a much better job of deinterlacing than my ATI 4200HD.
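To make that concrete, here is a rough sketch of the two simplest strategies a deinterlacer can use (weave and bob) - purely illustrative, with made-up field data, and not what either the TV or the ATI chip literally runs. Better deinterlacers (motion/vector adaptive) essentially blend these two per pixel depending on detected motion, which is where the quality differences come from:

```python
import numpy as np

# Two consecutive fields of a 1080i picture (top = even lines, bottom = odd).
# The content here is random noise - purely a stand-in for real video.
top_field = np.random.rand(540, 1920)
bottom_field = np.random.rand(540, 1920)

def weave(top, bottom):
    """Interleave the two fields into one 1080p frame.
    Perfect for static scenes, but moving objects show 'combing'."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]))
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def bob(field):
    """Build a full frame from a single field by doubling each line.
    No combing, but only half the vertical resolution."""
    return np.repeat(field, 2, axis=0)

print(weave(top_field, bottom_field).shape)  # (1080, 1920)
print(bob(top_field).shape)                  # (1080, 1920)
```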

I was thinking I might get a new TV in an effort to improve picture quality but I wonder if the PC is the real problem.

:D
 

mbuzina

Retired Team Member
Premium Supporter
April 11, 2005
Home Country: Germany
Why should the dedicated deinterlacing in your TV (if it is any good, it is vector adaptive deinterlacing) be better at it than the vector adaptive algorithm running on your dedicated GPU hardware?

If you want to check which one looks better, set your PC output to 1080i resolution so the TV does the deinterlacing. Remember to disable deinterlacing on the PC side, then just have a look at which is better.
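If eyeballing the two isn't conclusive, a crude way to put a number on combing artefacts is to compare how much adjacent lines of a captured frame disagree versus lines two apart - badly deinterlaced motion makes neighbouring lines diverge far more than alternate lines. This is just a back-of-the-envelope metric on whatever frame you manage to grab (the frame capture itself is left out), not anything MediaPortal or the drivers actually compute:

```python
import numpy as np

def combing_score(luma: np.ndarray) -> float:
    """Ratio of adjacent-line differences to alternate-line differences.

    On clean progressive material the two are similar; visible combing
    pushes the ratio well above 1.
    """
    adjacent = np.abs(luma[1:, :] - luma[:-1, :]).mean()
    alternate = np.abs(luma[2:, :] - luma[:-2, :]).mean()
    return float(adjacent / (alternate + 1e-6))

# Synthetic demo: a smooth frame vs. the same frame with its odd lines
# shifted sideways, faking motion between the two fields.
smooth = np.tile(np.linspace(0.0, 1.0, 1920), (1080, 1))
combed = smooth.copy()
combed[1::2] = np.roll(combed[1::2], 8, axis=1)

print(combing_score(smooth))  # close to 0 on this artificial frame
print(combing_score(combed))  # very large - the combing dominates
```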
     

miroslav22

Development Group Member
Premium Supporter
September 4, 2009
Warwick
Home Country: United Kingdom
It really depends on your TV, but I've found most do a very good job at de-interlacing.

My ATI 5870 does de-interlace the picture ever so slightly better than my TV does (the difference is small, though). This is comparing it to the output of my retired Sky HD box.

Does your graphics card support full bitstream DXVA mode for H.264? If not, you might find a better graphics card will give a better de-interlaced picture.
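One quick way to sanity-check whether the card's decoder is actually being used for H.264 is to ask ffmpeg to decode a short recording with DXVA2 acceleration and watch what it reports. This assumes ffmpeg is installed and on the PATH, and "recording.ts" is a placeholder for one of your own captures - it says nothing about how MediaPortal itself sets up DXVA, it just shows whether the hardware decode path works at all:

```python
import subprocess

SAMPLE = "recording.ts"  # placeholder: any captured H.264 transport stream

def try_decode(hwaccel=None):
    """Decode a few seconds of the sample, optionally via DXVA2.

    Note: if hardware acceleration cannot be initialised, ffmpeg warns
    on stderr and falls back to software, so read the messages as well
    as the exit code.
    """
    cmd = ["ffmpeg", "-v", "warning"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]   # must appear before the input
    cmd += ["-i", SAMPLE, "-t", "5", "-f", "null", "-"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(f"--- hwaccel={hwaccel}, exit code {result.returncode} ---")
    print(result.stderr.strip() or "(no warnings)")

try_decode()         # software decode as a baseline
try_decode("dxva2")  # DXVA2 hardware decode (Windows only)
```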
     

spiderwheels

Portal Pro
October 28, 2009
Home Country: United Kingdom
    Thanks

I'm suspicious that switching the PC to output 1080i would just cause it to generate full frames in the memory buffer (therefore running the deinterlacing anyway) and then simply slice them up again for output.

I do know that my integrated ATI chip doesn't perform vector adaptive deinterlacing and that a switch to an ATI 5670 would be required.

Anyway, deinterlacing aside, is a TV able to do anything further to enhance the picture coming from a PC? The excessive noise and banding in dark background colours (even on the BBC HD channel) really bug me. My TV is quite old (an original Sony Bravia)... can newer TVs enhance the (progressive) signal from the PC any more?
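On the banding point: it mostly comes from squeezing a smooth, dark gradient into a handful of 8-bit steps, and the reason newer sets (and better renderers) can look cleaner is that they dither or smooth those steps before the panel. A tiny sketch of the effect, with made-up numbers, just to show why so few quantisation levels turn into visible bands and why dithering hides them:

```python
import numpy as np

# A smooth dark gradient, roughly the kind of shadowy background where
# banding shows up on BBC HD. Values are linear luma in 0..1.
gradient = np.linspace(0.02, 0.06, 1920)

# Straight 8-bit quantisation collapses it into a few flat steps (bands).
banded = np.round(gradient * 255) / 255

# Adding sub-step noise before quantising (dithering) trades the hard
# band edges for fine grain, which the eye averages away.
rng = np.random.default_rng(0)
dithered = np.round(gradient * 255 + rng.uniform(-0.5, 0.5, gradient.size)) / 255

def perceived_error(signal, k=15):
    """Mimic the eye's local averaging with a box blur, then measure how
    far the result sits from the original gradient."""
    blurred = np.convolve(signal, np.ones(k) / k, mode="same")
    return np.abs(blurred - gradient)[k:-k].mean()  # ignore blur edge effects

print("distinct levels in the banded version:", len(np.unique(banded)))
print("perceived error, plain 8-bit:", perceived_error(banded))
print("perceived error, dithered:   ", perceived_error(dithered))
```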
     
