Curious about picture quality

Alexh

Portal Member
October 30, 2016
27
0
60
Home Country
United States of America
Hi,

This is not a complaint per se. I'm running MediaPortal on a Sandy Bridge CPU, connected over HDMI to a 4K TV. The TV reports a 1080p input for OTA signals received with Hauppauge PCI tuners.

My wife has a Comcast cable box, also 1080p.
I have noticed that the cable box picture quality is significantly better in terms of both sharpness and color saturation, even when I compare identical channels where the OTA version is a 1080i signal. I can switch the HDMI input between the cable box and MediaPortal to compare.

I have worked in video before, so I know there are lots of variables, but I thought a 1080i OTA signal was about as good as it gets. Many OTA channels are 480i or 720p, and in those cases I assume Comcast is probably receiving a 1080i version, so that's no surprise. Playing Netflix natively on the TV (I don't have 4K Netflix) looks even better than the cable box for some shows.

We recently switched to the 4K TV, but the picture quality from the HTPC actually seemed better on the previous 1080p TV (I was using Media Center at the time).

When a micro-ATX motherboard with HDMI 2.0 becomes available I'll build a new 4K HTPC, but I'm just wondering why there's such a big difference.

Thanks
 

mm1352000

Retired Team Member
Premium Supporter
September 1, 2008
21,577
8,224
Home Country
New Zealand
Hello again

Alexh said:
My wife has a Comcast cable box, also 1080p.
I have noticed that the cable box picture quality is significantly better in terms of both sharpness and color saturation, even when I compare identical channels where the OTA version is a 1080i signal. I can switch the HDMI input between the cable box and MediaPortal to compare.

Alexh said:
...but I'm just wondering why there's such a big difference.

Because you're not actually comparing identical channels.

The source content (e.g. channel WABC) and resolution (1080i) are the same... but that says nothing about bit-rates/bandwidth or encoding (i.e. MPEG-2 vs. H.264).
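
If you want to see what each source is actually delivering, something like this works (a rough sketch, assuming ffprobe from ffmpeg is on the PATH and that you have a .ts recording of the channel - the file name is just a placeholder):

Code:
import json
import subprocess

def probe(path):
    # Ask ffprobe for the container and stream details as JSON.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_format", "-show_streams",
         "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    info = json.loads(out)
    video = next(s for s in info["streams"] if s["codec_type"] == "video")
    # Per-stream bit_rate is often missing in transport streams,
    # so fall back to the overall (format) bit rate.
    rate = video.get("bit_rate") or info["format"].get("bit_rate")
    print(path, video["codec_name"],
          f"{video['width']}x{video['height']}",
          video.get("field_order", "?"),  # interlaced vs. progressive
          f"{int(rate) / 1e6:.1f} Mbit/s" if rate else "bit rate n/a")

probe("ota_recording.ts")  # hypothetical recording file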

Alexh said:
I have worked in video before, so I know there are lots of variables...

Yes, there are. Bit-rate, encoding, deinterlacing algorithm, RGB levels, colour space/conversion... and the list goes on.
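
To illustrate just one of those variables: video is usually encoded with limited/TV range (16-235 in 8-bit), while PCs often output full range (0-255). If one link in the chain expands the range and another doesn't, you get washed-out or crushed blacks, which reads as a "saturation" difference between two HDMI inputs. A minimal sketch of the expansion (the function name is mine):

Code:
def limited_to_full(v):
    """Expand an 8-bit limited-range value (16-235) to full range (0-255)."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(limited_to_full(16))   # 0:   video black becomes PC black
print(limited_to_full(235))  # 255: video white becomes PC white
print(limited_to_full(128))  # 130: mid grey shifts slightly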

If you're interested in getting the best possible picture quality from MediaPortal, my advice would be to take a look at your:
• GPU settings
• video codec settings
• refresh rate changing settings

Perhaps worth a read:
Guide: Perfect playback & Display calibration

P.S. I've assumed that your TV settings (dynamic contrast etc.) are already optimal. That's a huge assumption. There's a good chance it's not valid. However, since you're comparing 2 HDMI inputs on the same TV...
     

Alexh

Portal Member
October 30, 2016
27
0
60
Home Country
United States of America
Thanks! I was looking for a MediaPortal-specific guide.

Dynamic refresh rate changing was not set, and DXVA2 was not set either. Enabling them seemed to make a difference.

I'm using an HDMI receiver for audio - even in this case, should LAV Audio bitstreaming be disabled?
     

Owlsroost

Retired Team Member
Premium Supporter
October 28, 2008
5,540
5,038
Cambridge
Home Country
United Kingdom
Alexh said:
DXVA2 was not set either. Enabling them seemed to make a difference.

...which means (assuming you are using LAV Video Decoder) you have just switched from a software decoder to the Intel hardware decoder.

When I switched from a lowish-end nVidia GPU to a lowish-end Intel GPU a while back, there was a noticeable lift in picture quality, particularly for MPEG-2. I've been quite impressed by Intel's video decoding and de-interlacing performance. You might want to explore turning off the various Intel picture 'enhancements' in the Intel control panel - I think it looks better without them.
     

mm1352000

Retired Team Member
Premium Supporter
September 1, 2008
21,577
8,224
Home Country
New Zealand
Alexh said:
I'm using an HDMI receiver for audio - even in this case, should LAV Audio bitstreaming be disabled?

It depends on whether you want to prioritise video (i.e. avoid dropped frames) or audio.
If you want to prioritise audio, you'd enable bitstreaming.
     

Chananain

New Member
September 8, 2021
1
0
Home Country
United States of America
Yeah, the problem was the HDMI cable version. My six-year-old full-HD TV shows a picture just like a 4K TV when using an old HDMI cable - the cable's throughput was not enough to carry 4K video correctly. Thankfully, nowadays cables can handle even 8K60, even though less than 1% of people have such a TV. However, I haven't spotted this problem with 4K still images. Even though all my images are processed with an image sharpener, the data rate should be the same - the image just loads more slowly. With video, that slow loading would cause a "slow-motion" effect.
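
For reference, the uncompressed data rates involved look roughly like this (a back-of-the-envelope sketch for 8-bit RGB counting active pixels only; real HDMI links also carry blanking intervals and audio, which add some overhead):

Code:
def gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * fps * bits_per_pixel / 1e9

for name, w, h, fps in [("1080p60", 1920, 1080, 60),
                        ("4K60",    3840, 2160, 60),
                        ("8K60",    7680, 4320, 60)]:
    print(f"{name}: ~{gbps(w, h, fps):.1f} Gbit/s uncompressed")

# Prints roughly 3.0, 11.9 and 47.8 Gbit/s. Nominal link limits are
# ~10.2 Gbit/s (HDMI 1.4), ~18 (HDMI 2.0) and ~48 (HDMI 2.1), so 4K60
# 8-bit RGB needs an HDMI 2.0 class port and cable, and 8K60 needs
# HDMI 2.1 plus compression or chroma subsampling on top.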
     
