My guide to eliminating juddering/stuttering playback (plus an upscaling guide!)

Scythe42

Retired Team Member
  • Premium Supporter
  • June 20, 2009
    2,065
    113
    46
    Berlin


Your guide is OK in general, but your ffdshow profiles for SD quality are giving you a horrible result. :cry:

The deblocking settings are way too high. This blurs the image and you lose a ton of detail. Just turn it on and off during playback and you'll see it right away. Choose a scene where someone has a three-day beard.

Deblocking is basically a trade-off for highly compressed streams: either actually see the blocks, or blur them away and lose any detail that might be left.

The Noise Reduction is also way too aggressive. It basically does nothing for you unless you're dealing with 70s-style analog reception (think snow storm). It's a waste of processing power that makes no visible difference to the final picture. In fact, with your deblocking there can't be any noise left, since any detail is already blurred away.

Turn it off as well and go for full-screen-resolution scaling instead of your 1.5× multiplier. That will give you a better result without losing all the detail. Noise is not an issue these days. Your setting just destroys film grain and makes blocky SD material (especially TV) even more blocky. But as said, you've already blurred everything away with the postprocessing, so it does nothing for you anymore.

On the contrary, you want to add some noise for SD (not DVDs!), especially for Xvids, to get rid of the blocky look. This is the better choice.

I recommend just scaling for anything lower than 1080p, plus a bit of sharpening. Not too much, or you lose detail and create artifacts.

Use the same settings for 720p as for SD. Don't do more for SD than you would for 720p! Higher settings make things worse, not better: you won't get more out of the picture, you'll just degrade it. If you do go for separate SD sharpening, don't go too far; just a notch more. It may even be better not to sharpen 720p at all; the scaling should take care of it, so sharpening won't really improve the quality.

For material below normal SD resolutions (Xvids and the like), I recommend experimenting with adding noise. So basically use your HD settings for SD too, and for everything below SD (read: Xvids) try adding noise to make the artifacts a little less visible. But don't overdo it, or you ruin the picture again.
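The "add noise" idea can be sketched in a few lines. This is an illustrative toy example of the principle, not ffdshow's actual noise filter: low-amplitude random noise, clipped to the valid 8-bit range, breaks up the perfectly flat areas that make over-compressed video look blocky.

```python
import random

def add_grain(pixels, strength=3, seed=0):
    """Add low-amplitude uniform noise to 8-bit luma values, clipped to 0-255.
    A tiny strength breaks up flat, blocky areas; a large one just looks noisy."""
    rng = random.Random(seed)
    return [min(255, max(0, p + rng.randint(-strength, strength))) for p in pixels]

flat_block = [128] * 16   # a perfectly flat 4x4 block, typical of over-compressed SD
print(add_grain(flat_block))   # same block, no longer uniformly flat
```

The point of the clipping is that noise near black (0) or white (255) must not wrap or overflow; real dithering filters do the same.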

    But don't expect wonders. You can't polish a turd. :D

In fact, scaling plus a bit of sharpening is all you should do in general. Try whether Luma or Chroma sharpening works better for you.

It's a good idea to do the sharpening during scaling, not afterwards.

Also play with the scaling algorithms a bit for lower resolutions to see what works well on your CPU. You chose Spline/Bicubic; that's a decent choice and shouldn't stress most CPUs. You will get some problems with title sequences on interlaced material, though; that's a different story. The titles might jump up a few scanlines on the first frame they appear, before settling into the correct position.
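For the curious, "bicubic" scaling boils down to a cubic weighting kernel applied around each target position. Here is a minimal 1-D sketch using the common Catmull-Rom variant; this is my own illustration of the math, not ffdshow's implementation.

```python
def catmull_rom(x: float) -> float:
    """Catmull-Rom cubic kernel (a = -0.5), one common flavour of 'bicubic'."""
    x = abs(x)
    if x < 1.0:
        return 1.5 * x**3 - 2.5 * x**2 + 1.0
    if x < 2.0:
        return -0.5 * x**3 + 2.5 * x**2 - 4.0 * x + 2.0
    return 0.0

def resample_1d(samples, pos):
    """Interpolate a 1-D signal at fractional position pos (edges clamped).
    2-D bicubic applies this same kernel horizontally and vertically."""
    base = int(pos)
    total = 0.0
    for i in range(base - 1, base + 3):          # 4 neighbouring taps
        j = min(max(i, 0), len(samples) - 1)     # clamp at the borders
        total += samples[j] * catmull_rom(pos - i)
    return total

print(resample_1d([10, 20, 30, 40], 1.0))   # hits the original sample exactly: 20.0
print(resample_1d([10, 20, 30, 40], 1.5))   # halfway between samples: 25.0
```

The negative lobes of the kernel (weights below zero just outside ±1) are what give bicubic its mild built-in sharpening compared to bilinear.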

You'll never get SD content near 720p quality, and all those Xvids will never look DVD-like.

On the other hand: calibrating your TV/projector to the correct color space will be a revelation!

It feels like a 100% picture improvement. Turn off all the fancy picture-enhancement features and calibrate it instead, or have it calibrated by someone (it's not possible by eye). I do this for all my friends. They love me for it.

If you update your ffdshow settings, I'll take a look at them if you like.

If you ask for mine: I just scale everything to screen resolution. Bicubic in my case, as that's what my CPU likes most. Nothing else. It provides some improvement over what my onboard Nvidia chip does. :barefoot:
     

    FireAza

    Portal Pro
    June 30, 2011
    50
    8
    33
Are the Preset autoload conditions correct for the HD (Upscale) profile? It seems to be set to load for 1280x720->2048x2040, which overlaps with the Full HD profile, which covers 1920x1080->2048x2048.
Yes, it's correct. I know it looks weird, but this was the advice I was originally given, and I made sure to test that they were loading correctly. But you're right, it is strange, so I've corrected it ;) Naturally, I've tested all three profiles to make sure they load when they should, and not when they shouldn't.
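The overlap being discussed can be illustrated with a small sketch. The matcher and the exact ranges below are hypothetical (this is not ffdshow's actual autoload logic), but they show why overlapping resolution conditions make profile selection ambiguous:

```python
# Hypothetical sketch: profiles whose autoload resolution ranges overlap are
# ambiguous, because more than one can match the same video.
def matching_profiles(width, height, profiles):
    """Return the names of all profiles whose (min, max) resolution range
    contains the given video dimensions."""
    return [name for name, ((w0, h0), (w1, h1)) in profiles.items()
            if w0 <= width <= w1 and h0 <= height <= h1]

# Original-style ranges: a 1920x1080 file matches BOTH profiles.
before = {
    "HD (Upscale)": ((1280, 720), (2048, 2048)),
    "Full HD":      ((1920, 1080), (2048, 2048)),
}
# Capping the upscale profile just below Full HD removes the overlap.
after = {
    "HD (Upscale)": ((1280, 720), (1919, 1079)),
    "Full HD":      ((1920, 1080), (2048, 2048)),
}
print(matching_profiles(1920, 1080, before))   # ['HD (Upscale)', 'Full HD']
print(matching_profiles(1920, 1080, after))    # ['Full HD']
```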

I also don't get why your Dynamic Refresh Rate settings should be better, as CINEMA - 23.976 is obviously closer to 24 than 23, NTSC - 59.94 is closer to 60 than 59, and NTSCFILM - 29.97 is closer to 30 than 29, so I'd think using the lower refresh rates would cause more sync problems.
I was originally using exact matches for my refresh rates, but I found that some of my videos were rapidly dropping frames, and it turned out that using non-whole figures was the cause. The numbers I've advised are the same ones I was told to use, and I haven't had any issues with dropped frames since.
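For what it's worth, the cadence of those dropped or repeated frames follows from simple arithmetic (my own example, not from the thread): when the display refresh and the video frame rate differ slightly, playback drifts by one full frame every 1/|difference| seconds.

```python
# Sketch: how often a frame must be dropped or repeated when the display
# refresh doesn't exactly match the video frame rate.
def drift_seconds(video_fps: float, display_hz: float) -> float:
    """Seconds until playback is one full frame out of step."""
    return 1.0 / abs(display_hz - video_fps)

film = 24000 / 1001                          # ~23.976 fps (NTSC film cadence)
print(round(drift_seconds(film, 24.0), 1))   # one dropped/repeated frame every ~41.7 s
```

So even a "close" rate like a flat 24 Hz mode still hiccups regularly with 23.976 fps material, which is why an exact (or deliberately chosen) refresh rate matters more than simple rounding.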

Thanks for the advice! I'll be sure to try them out! I've already calibrated my TV, but I'd like to one day get one of those puck things and calibrate it better!

But, but, I like my SD profile :( I'd always had difficulty getting SD to look right (too blocky, too blurry, too much rainbowing, etc.), but these settings seemed to hit the sweet spot. I've got some DVD rips of Pinky and the Brain and they look fantastic on my TV; they look like 720p videos that have been blurred slightly :D

Something you mentioned that interested me was DVDs vs. Xvid videos. I always go for the highest-quality DVD rips I can find for my SD content. Are there settings in my SD profile that are useful for crappy Xvid rips, but degrade high-quality DVD rips?
     

    Scythe42

You really need one of these "pucks" for a proper calibration. Believe me, it will make a HUGE difference. But you need a decent-quality one for a good result. Read some reviews before deciding.

For HQ DVD rips and remuxed DVDs/ISOs, do the following:
- Turn off postprocessing/deblocking (it blurs the picture way too much!)
- Move the noise reduction/blur before the scaling to save CPU power. Here you can soften the picture in a controlled way instead of with the brutal deblocking.
- Set scaling to "resize to screen resolution" instead of the 1.5× multiplier

This will give you a really nicely scaled picture that preserves most of the detail (rather than blurring it all away with deblocking). You can apply a finer blur with the Noise Reduction settings.
If playback gets choppy, turn off Noise Reduction (it's very CPU-intensive). In my opinion, it's better to spend the CPU power on "resize to screen resolution".

For lower-quality stuff: turn off the Noise Reduction and keep the deblocking. As said, deblocking blurs the picture so much that there can't be any noise left. And also go for "resize to screen resolution".
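The reason moving noise reduction before the scaler saves CPU is simple arithmetic (my own back-of-envelope numbers, not from the thread): a per-pixel filter costs roughly in proportion to the number of pixels it touches.

```python
# Rough cost comparison: running noise reduction at the source resolution
# versus after scaling to 1080p.
sd = 720 * 576            # PAL DVD frame, pixels per frame
fullhd = 1920 * 1080      # 1080p frame, pixels per frame
print(fullhd / sd)        # 5.0 -> NR after scaling touches 5x as many pixels
```

The same reasoning is why doing sharpening during scaling (one resampling pass) beats a separate pass afterwards.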
     

    Scythe42

Just play with it a bit. Always try to preserve detail. And give the "add noise" strategy a try instead of deblocking for lower-quality material. It makes the artifacts less visible to the eye and preserves the detail, but of course the picture is grainier. You need to choose which you prefer.

If it's really bad (like a heavily compressed TV stream where you see a ton of artifacts on camera pans), there's no choice but deblocking. In other words: if it looks like Lego, deblock ;) (= blur the hell out of it!).

Also make sure you set your Nvidia card to 16-235/YCbCr output. Otherwise you might see a washed-out picture on non-h264 files (read: Xvid). The nice thing is that the GPU always converts the color space to TV specs if it receives RGB or anything else, so codecs can't screw it up too badly (defaults should be fine). This really only matters for Xvid and MPEG2, to avoid black levels turning grey: Xvid/MPEG2 don't really declare a color range and always assume 16-235.

Depending on the GPU and driver, it might be a good strategy to force the output color levels in the ffdshow postprocessing and let the driver just send RGB. That way the signal won't contain anything outside the SD color range, and HD material is unaffected. But as said, the driver shouldn't do 16-235 conversions for HD material when connected to an HD display over HDMI. Just try it and see how it works in your environment.

It can't hurt to set the correct color spaces in the postprocessing, though! Usually codecs have the correct settings out of the box and you don't need to make them do anything. The best approach is to do color corrections in a postprocessing step; doing it in the codecs can really screw things up (I learnt that the hard way). The only exception is when a codec genuinely gets the color space wrong. For example, CoreAVC had some issues when I tried it and required a 16-235 setting, or it washed everything out...

So it's best to leave codecs alone with their default color space handling (shit in -> shit out).

Or, if you are unsure, force 16-235 for SD and RGB levels (BT.709) for HD. Just check whether it makes a difference. On my onboard Nvidia this was never necessary; the 16-235 setting fixed my SD problems and didn't affect HD at all.

Also, there should be no need to create an Nvidia setting that "forces VSync for MP"; at least not when using EVR, as the EVR renderer under Win7 requests VSync timings from the GPU automatically when creating the D3D device. But it can't hurt to create a profile for MP in case you use other things on your machine, like games.

And in the postprocessing for HD, make sure BT.709 is selected as the color space. This is the standard HD color space for your TV. Any h264 material should carry the correct color space anyway, but it can't hurt in case you have some very bad rips (better to delete those and get one that complies with scene rules). In general, though, "Auto" works fine here, as the GPU does the right thing when connected over HDMI. So only change the output space for HD if you run into issues after setting 16-235.

You will not get a better picture by using RGB levels (0-255), as that information simply isn't in SD material. It only gives you wrong black levels and washed-out colors. SD always works in the 16-235 range.
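The whole 16-235 vs. 0-255 business comes down to a simple linear mapping. Here is a sketch of the standard expansion formula, simplified to luma only:

```python
def limited_to_full(y: int) -> int:
    """Expand studio/limited-range luma (16-235) to full-range (0-255)."""
    return round((y - 16) * 255 / 219)

# If limited-range video is shown untouched on a full-range display, black
# stays at code 16 (dark grey) and white at 235 (never reaches peak white) --
# the classic "washed out" look. After expansion:
print(limited_to_full(16), limited_to_full(235))   # 0 255
```

This mapping (or its inverse) has to happen exactly once in the chain, whether in the codec, ffdshow, the driver, or the display; doing it twice or zero times is what produces grey blacks or crushed shadows.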

And don't worry about 10-bit color. Even if your TV can handle it, to my knowledge there is NO source so far that has ever used 10-bit color (besides some demo discs). And of course, turn off as much postprocessing on your TV as possible that tries to make the picture more "brilliant"; it screws up colors and causes loss of detail, or even introduces artifacts. Anything that enhances black levels (by controlling the LED backlight on a recent TV) is generally a good choice.

And when calibrating, start from the preset profile that looks most like movies; there's usually one named something along those lines. It gives you a decent starting point. Just measure with a "puck" which one requires the least tweaking. And you can't calibrate a color space by eye; it's simply not possible. Only with a puck can you get it right. There are good guides out there on how to do this yourself. You won't succeed on your first attempt; you first need to get a feel for how the settings affect each other. Usually by the third try from defaults you get very close to optimal. Always wait some time between attempts so you can really see the difference.

When I started calibrating four years ago, it was a steep learning curve. The first attempt took me hours and I screwed it up. Reset to defaults. Tried again the next week: a better result, and faster. A month later I tried again and got it nearly perfect (the TV couldn't do better). Now I get any TV/projector as close to optimal as possible in about 60-90 minutes. For some sets it's really nasty dealing with settings in the factory menu if there's no basic Color Management System, so read up first on what the factory menu settings do.

    Anyway, have fun creating a better picture! :)

And keep us posted on your progress so that others in the community can benefit from your work as well.

Looking forward to trying out some tuned ffdshow settings. As said, I just scale, and never found the time to really try other things.
     

    FireAza

    Thanks for the advice! Lots of information to go over here!

Where would I find the 16-235/YCbCr output option in the Nvidia Control Panel? I didn't see it in there. I did find the "RGB: 16-235" output levels option in ffdshow raw, though. Should I enable it there and ignore Nvidia's setting, like you suggested? Or would there be some advantage to finding the setting in the Nvidia Control Panel?

I'm a little confused here: you say I should make sure both my HD profiles use the BT.709 color space, but then you say "Auto" should be fine? Do you mean I should set 16-235 output levels for my HD profiles, but leave the YCbCr specification set to "Auto" unless the picture gets screwed up (incorrect colors, I'm assuming?) after doing this?

It sounds like I should definitely set the output levels in ffdshow raw to 16-235 for my SD profile! There's nothing I should watch out for if I do this, I'm assuming?

    Of course, I've already got all those "enhancement" settings on my TV turned off :D

How do you know which of these "pucks" are good? I seem to remember a brand called "Spyder" or something; are they any good?


So far, I've implemented your suggestions of turning off postprocessing, moving the blur earlier in the chain, and setting resize to "resize to screen resolution". I agree that my previous settings seemed geared towards patching up crappy Xvid videos, which would only harm the majority of my videos, which are high-quality rips.
     

    Scythe42

    For Nvidia with LAV do the following:

    Setting the Output Color Format:
    1. Open Nvidia Control Panel
    2. Go to "Display" -> "Adjust Desktop Color Settings"
    3. Under "Apply the following Enhancements" set "Digital Color Format" to "YCbCr444"
    4. Under "Apply the following Enhancements" set "Content Type Reported to the Display" to "Full-screen videos"
Just check whether YCbCr444 looks better on your display. RGB can result in washed-out colors if the display can't handle RGB properly. You'll usually see a difference right away. If there is no difference on your display, stay on RGB.

    Setting the Video Color:
    1. Open Nvidia Control Panel
    2. Go to "Video" -> "Adjust Video Color Settings"
3. Under "How do you make color adjustments" select "With the Nvidia settings"
4. Click on the "Advanced" tab. Set "Dynamic Range" to "Limited (16-235)"
Just check whether "Limited (16-235)" looks better on your display. If not, set it to "Full (0-255)". Especially look for crushed blacks if you don't spot a difference right away. If you can't make out any difference at all, keep "Full (0-255)".

Always try different files to be on the safe side. Usually MPEG2 is affected (Xvid to some extent).

If you still have washed-out colors or crushed blacks, the problem is related to a codec setting. That shouldn't happen with LAV, though.

    Setting the Deinterlacing Mode:
1. Open Nvidia Control Panel
2. Go to "Video" -> "Adjust Video Image Settings"
3. Under "Deinterlacing" check "Use inverse telecine"
LAV Codec: see the attached screenshot (lav.png).


    Base ffdshow Profiles:
    • 1080p = for 1080p content
    • 720p = for 720p content
• DVD = for DVDs or SD MPEG2 content (for TV you might want to create another profile based on this, with a different container in the autoload settings).
    • x264 = for newer SD rips
    • DivX = for all the old eye cancer crap, to make them at least somewhat watchable
    Notes on the Base Profiles:
• All profiles enable deinterlacing in case the source is interlaced. If it is already progressive, nothing will happen.
• Color output is forced to RGB32 (if you have CPU issues, set this to Auto).
• All profiles except 1080p scale to 1080p.
• 1080p also does a bit of sharpening.
• DVD does not use "Blur&NR", to avoid loss of detail.
• x264 adds "Blur&NR" for newer SD TV rips.
• DivX adds postprocessing (deblocking) to soften the picture.
    No special settings for TV, as I don't use MP for watching TV.

    Adjust LAV and Profiles as needed!

Using Avisynth for scaling/sharpening can improve things a bit more than plain ffdshow with the right "scripts", and it gives you more flexibility, but it's hard on the CPU. I didn't notice a real difference when I tried it some years ago.

    PS: Postprocessing in MP will only work if you do not use an automatic codec setup!

    PPS: I use the Shark007 Win7Codec Pack without any issues because I am too lazy to upgrade codecs.
     
