Forums › HTPC Projects › Software › Codecs › State of HW acceleration
<blockquote data-quote="NvIon" data-source="post: 505440" data-attributes="member: 97588"><p>ATI sucks for H.264 DXVA decoding: it can only do L4.1, with a maximum of 4 reference frames. Go over those limits and everything goes to hell: green screens, macroblocking errors, massive corruption, or no hardware acceleration at all. Not to mention that ATI randomly breaks DXVA support and post-processing features with each driver release.</p><p></p><p>Nvidia has supported L5.1 decoding with 16 reference frames from the 178.24 WHQL drivers through the latest 190.62 WHQL drivers. Even out-of-spec encodes are decoded properly with hardware acceleration on Nvidia GPUs that have a VP2/VP3 unit. G80 GPUs (8800 Ultra/GTX, 8800 GTS 320/640) do not have the VP2 unit, only VP1, which is the same as the GeForce 7 series, so they cannot do full decoding on the GPU, only partial decoding acceleration, and the CPU usage reduction isn't as good as on GPUs with VP2/VP3. MPC-HC's decoder only supports the full hardware acceleration mode, bitstream decoding / variable-length decode (VLD), so G80 users are out of luck.</p><p></p><p>So basically, if you want the most compatible H.264 decoding hardware, you should get an Nvidia graphics card. The upcoming Nvidia GeForce GT 220 supports full hardware decoding for H.264, VC-1 and MPEG-2, as well as 8-channel LPCM support.</p><p></p><p><img src="https://forum.team-mediaportal.com/attachment.php?attachmentid=46290&stc=1&d=1252143540" alt="" class="fr-fic fr-dii fr-draggable " style="" /></p></blockquote><p></p>
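The level and reference-frame limits quoted in the post above follow directly from the H.264 spec's decoded picture buffer (DPB) size table (Annex A, Table A-1). A minimal Python sketch, assuming the MaxDpbMbs values from that table, shows why a decoder capped at L4.1 tops out at 4 reference frames for 1080p content, while L5.1 allows the full 16:

```python
# Sketch: derive the max reference-frame count for an H.264 level at a given
# resolution, per the DPB limit formula in Annex A of the H.264/AVC spec.
# MaxDpbMbs values are taken from Table A-1 of that spec.
MAX_DPB_MBS = {"4.0": 32768, "4.1": 32768, "5.0": 110400, "5.1": 184320}

def max_ref_frames(level: str, width: int, height: int) -> int:
    # Frame size in macroblocks (16x16 blocks, dimensions rounded up).
    mbs_per_frame = ((width + 15) // 16) * ((height + 15) // 16)
    # DPB frame count is MaxDpbMbs / frame size, capped at 16 by the spec.
    return min(MAX_DPB_MBS[level] // mbs_per_frame, 16)

print(max_ref_frames("4.1", 1920, 1080))  # 4  -> the L4.1 ceiling hit by ATI's DXVA decoder
print(max_ref_frames("5.1", 1920, 1080))  # 16 -> within reach of Nvidia's L5.1 support
```

An encode with more reference frames than this formula allows for the decoder's maximum level is out of spec for that level, which is exactly the case where the post reports green screens and corruption on hardware limited to L4.1.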