Forums
MediaPortal 1
MediaPortal 1 Talk
My guide to eliminating juddering/stuttering playback (plus an upscale guide!)
<blockquote data-quote="Scythe42" data-source="post: 875995" data-attributes="member: 95833"><p>Just play with it a bit. Always try to preserve detail. And give the "add noise" strategy a try instead of deblocking for lower-quality stuff. It makes the artifacts less visible to the eye while preserving detail, though of course the picture is grainier. You need to choose which you prefer.</p><p> </p><p>If it is really bad (like a heavily compressed TV stream where you see a ton of artifacts on camera pans) there is no choice but deblocking. Or in other words: if it looks like Lego, deblock <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="smilie smilie--sprite smilie--sprite2" alt=";)" title="Wink ;)" loading="lazy" data-shortname=";)" /> (= blur the hell out of it!).</p><p> </p><p>Also make sure you set your NVIDIA card to 16-235/YCbCr output. You might otherwise see a washed-out picture on non-h264 files (read: Xvid). The nice thing is that the GPU always converts the color space to TV specs if it receives RGB or anything else, so codecs can't screw it up too badly (defaults should be fine). This only matters for Xvid and MPEG-2, to avoid black levels becoming grey: Xvid/MPEG-2 don't really carry color-space information and always assume 16-235.</p><p> </p><p>Depending on GPU and driver it might be a good strategy to force the output color levels in the ffdshow postprocessing and let the driver just send RGB. That way the signal contains nothing outside the SD color range, and HD material is unaffected. But as said, the driver should not do 16-235 conversions for HD material when connected to an HD display over HDMI. Just try it and see how it works in your environment.</p><p> </p><p>It can't hurt to set the correct color spaces in the postprocessing though! Usually codecs come with correct settings and you don't need them to do anything extra; color corrections are best done in a postprocessing step. Doing it in the codec can really screw things up (learnt that the hard way). The only exception is a codec that genuinely mishandles the color space itself. For example, CoreAVC had issues when I tried it and required a 16-235 setting or it washed everything out...</p><p> </p><p>So it is best to leave codecs on their default color-space handling (shit in -> shit out).</p><p> </p><p>Or, if you are unsure, force 16-235 for SD and BT.709 for HD material. Just check whether it makes a difference. On my onboard NVIDIA this was never necessary: the 16-235 setting fixed my SD problems and didn't affect HD at all.</p><p> </p><p>There should also be no need to create an NVIDIA profile that "forces VSync for MP". At least not when using EVR, because under Win7 the EVR renderer requests VSync timings from the GPU automatically when it creates the D3D device. But it can't hurt to create a profile for MP in case you run other things on the machine, like games.</p><p> </p><p>And in the postprocessing for HD, make sure ITU-R BT.709 is selected as the color space. This is the standard HD color space for your TV. Any h264 material should carry the correct color space anyway, but it can't hurt in case you have some very bad rips (better to delete them and get one that complies with scene rules). In general "Auto" works fine here, as the GPU does the right thing when connected over HDMI. So only change the output space for HD if you run into issues after setting 16-235.</p><p> </p><p>You will not get a better picture by using RGB levels (0-255), because that information simply isn't in SD material. It only gives you wrong black levels and washed-out colors. SD always works in the 16-235 space.</p><p> </p><p>And don't worry about 10-bit color. Even if your TV can handle it, to my knowledge no source has ever used 10-bit color so far (besides some demo discs). And of course turn off as much postprocessing on your TV as possible that tries to make the picture more "brilliant": it screws up colors and causes a loss of detail, or even introduces artifacts. Anything that enhances black levels (by controlling the LED backlight on a recent TV) is in general a good choice.</p><p> </p><p>When calibrating, start from whichever preset looks most like film; TVs usually have one named along those lines. That gives you a decent starting point. Measure with a "puck" (colorimeter) which preset requires the least tweaking. You can't calibrate color by eye - not possible; only with a puck can you get it right. There are good guides out there on how to do this yourself. You won't succeed on your first attempt: you first need to get a feeling for how the settings affect each other. Usually by the third try from defaults you get very close to optimum. Always wait some time between attempts so you really see the difference.</p><p> </p><p>When I started calibrating four years ago it was a steep learning curve. The first attempt took me hours and I screwed up; reset to defaults. Tried again the next week: better result, and faster. A month later I tried again and got it nearly perfect (the TV can't do better). Now I get any TV/projector as close to optimum as possible in about 60-90 minutes. On some sets it is really nasty dealing with the factory menu if there is no basic Color Management System, so read up first on what the factory-menu settings do.</p><p> </p><p>Anyway, have fun creating a better picture! <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="smilie smilie--sprite smilie--sprite1" alt=":)" title="Smile :)" loading="lazy" data-shortname=":)" /></p><p> </p><p>And keep us posted on your progress so that others in the community can benefit from your work as well.</p><p> </p><p>Looking forward to trying out some tuned ffdshow settings. As said, I just scale and never found time to really try out other stuff.</p></blockquote><p></p>
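The 16-235 vs 0-255 point in the post is easiest to see as plain level arithmetic. A minimal Python sketch (the function names are mine for illustration; this is not ffdshow or driver code) of why a limited-range signal looks washed out on a full-range display, and why the chain must expand it exactly once:

```python
# Sketch: video "limited range" (16-235) vs PC "full range" (0-255) levels.
# Function names are illustrative, not from any real codec or driver API.

def limited_to_full(y):
    """Expand a limited-range (16-235) value to full range (0-255)."""
    return round((y - 16) * 255 / 219)

def full_to_limited(y):
    """Compress a full-range (0-255) value into limited range (16-235)."""
    return round(16 + y * 219 / 255)

# Video black (16) and white (235) hit display black/white only if the
# expansion happens exactly once somewhere in the chain:
assert limited_to_full(16) == 0      # black stays black
assert limited_to_full(235) == 255   # white stays white

# If nothing expands the signal, black is shown as RGB 16 (dark grey) and
# white as 235: the "washed out" picture the post describes. If it is
# expanded twice (codec AND GPU both converting), blacks crush and
# whites clip instead.

# Related fact: SD (BT.601) and HD (BT.709) use different luma
# coefficients, which is why decoding with the wrong matrix gives a
# subtle color shift even when the levels are right:
BT601 = (0.299, 0.587, 0.114)       # Kr, Kg, Kb for SD
BT709 = (0.2126, 0.7152, 0.0722)    # Kr, Kg, Kb for HD
```

This is why the post recommends picking one place (the ffdshow postprocessor or the driver, not both) to own the levels conversion, and leaving codecs at their defaults.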