I'm going to pass along a post from someone who wasn't very well liked, but I thought the info he gave regarding GPU encoding was very interesting. If open source developers could just get their hands on the SDK for the latest and greatest graphics cards (maybe I should say: if HW vendors would just open up the source code to their cards, then maybe we'd see some fantastic work) and on what the GPU is supposed to be able to handle (besides giving a great display in today's world).
All of that said to introduce this post. The thing I found interesting was the link he gave in his post. I think it is great when hardware can be manipulated by s/w developers!
What had me thinking back on his post was the reference to an ATI decoder. The article he supplied actually covers an encoder, but encoding takes more CPU than decoding, so if a GPU can handle encoding, decoding should be well within reach. I was thinking it would be totally cool when these GPU chips can run s/w algorithms to enhance and filter the video streams!
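Just to make the idea concrete, here is a rough sketch (purely hypothetical, written in CUDA, so it assumes an NVIDIA-style GPU rather than the ATI card the article talks about) of the kind of per-pixel filter a GPU could run over a decoded frame - in this case a simple brightness boost on one 8-bit plane:

// Hypothetical sketch: brighten one 8-bit plane of a decoded frame on the GPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void brighten(unsigned char* plane, int n, int delta)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = plane[i] + delta;        // raise the pixel value
        plane[i] = v > 255 ? 255 : v;    // clamp to the 8-bit range
    }
}

int main()
{
    const int width = 720, height = 480;          // assumed SD frame size
    const int n = width * height;                 // one 8-bit luma plane
    unsigned char* frame;
    cudaMallocManaged(&frame, n);                 // unified memory keeps the sketch short
    for (int i = 0; i < n; ++i) frame[i] = 100;   // stand-in for a decoded gray frame

    brighten<<<(n + 255) / 256, 256>>>(frame, n, 20);   // one thread per pixel
    cudaDeviceSynchronize();

    printf("first pixel after filtering: %d\n", frame[0]);   // expect 120
    cudaFree(frame);
    return 0;
}

Nothing fancy, but every pixel gets processed in parallel, which is exactly the kind of work these chips are built for.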
Mike