Forums
HTPC Projects
Hardware
Video-Cards
Upgrading my Video Card
<blockquote data-quote="edterbak" data-source="post: 1117394" data-attributes="member: 69192"><p>Well, late reply. You probably got an answer already by buying the hardware you liked. <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="smilie smilie--sprite smilie--sprite1" alt=":)" title="Smile :)" loading="lazy" data-shortname=":)" /> </p><p></p><p>For the purpose you describe, 1080p, the Radeon should be more than sufficient. Coming from an AMD card, the benefit seems pretty small. </p><p>The delay you describe will not be solved by replacing the graphics card. </p><p>The only thing a new GPU will do better is de-interlacing certain interlaced media sources; your TV signal, for example, is most likely interlaced. If you want to improve that image, a mid-range HD 5xxx or up is advised so you can use vector-adaptive de-interlacing. Normal media files are generally not interlaced, so your graphics card does not have to do much to produce the image, and the benefit there is close to zero. </p><p>There are some settings in the ATI CCC quality section to improve picture quality (flesh tone, white balance, sharpening, etc.), but a digital source is often not improved much (read: at all) by those tricks. I always disable them, as they're useless IMO. </p><p></p><p>The washed-out colors you describe could have something to do with: </p><p>1 - The RGB pixel format setting in AMD CCC. You have 4:4:4, 4:2:2, etc. If your TV doesn't support the full color range of 0-255 (which is likely), use RGB limited. Try a different setting.</p><p>2 - LAV codec settings. Go to MP configuration > Codecs, then click the small icon behind a LAV video codec. In the lower-left area there is an option for outputting video to a PC monitor (0-255) or a TV screen (16-235), or leaving it as is. You can try playing with those settings as well.</p><p></p><p>Just remember to write down what the original settings were before you change them <img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7" class="smilie smilie--sprite smilie--sprite1" alt=":)" title="Smile :)" loading="lazy" data-shortname=":)" /></p><p></p><p>Good luck.</p></blockquote><p></p>
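To see why a range mismatch produces washed-out colors, here is a minimal sketch of the level scaling involved. It assumes the standard 8-bit video convention (black at 16, white at 235 for TV/limited range, per Rec. 601/709); the helper names are hypothetical, not from any driver or codec API.

```python
# Sketch of full-range (PC, 0-255) vs limited-range (TV, 16-235) levels.
# Assumption: standard 8-bit video levels; helper names are illustrative.

def full_to_limited(value: int) -> int:
    """Compress a full-range (0-255) level into limited range (16-235)."""
    return round(16 + value * (235 - 16) / 255)

def limited_to_full(value: int) -> int:
    """Expand a limited-range (16-235) level back to full range (0-255)."""
    return round((value - 16) * 255 / (235 - 16))

# If the player outputs limited range but the display treats the signal
# as full range, black arrives as level 16 (dark grey) and white as 235
# (dim grey) -- exactly the "washed out" look described above.
print(full_to_limited(0))    # black becomes 16
print(full_to_limited(255))  # white becomes 235
```

This is why the CCC pixel format setting and the LAV output setting have to agree with what the display expects: each stage must either pass the levels through untouched or perform exactly one of these conversions.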