I guess it is possible to have the graphics card do the deinterlacing even without DXVA. Is that true? Experts, please comment.
According to my research so far:
- basically possible
- can depend on the video mixer (VMR, EVR) and therefore on the OS (XP/Vista)
- can depend on your graphics card + driver
- can depend on the DirectShow codec
This matters because it could make playing 1080i HDTV broadcasts possible with low-end graphics cards (like my X1250, which has no H.264 hardware-acceleration support): we decode the stream in software (CoreAVC) without deinterlacing and hand that duty over to the graphics card.
I guess that's why I could not get 1080i to work on XP but very easily on Vista. Deinterlacing can cost up to 30% CPU even on dual cores. Just check out the first two test results: 8 Mbps 1920x1080 H.264, interlaced vs. progressive with CoreAVC: 52% vs. 26% CPU load!
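A quick back-of-the-envelope check on those numbers (just a sketch; the 52%/26% figures are the test results quoted above, everything else is arithmetic):

```python
# Rough arithmetic on the quoted test results: the same 8 Mbps 1920x1080
# H.264 stream decoded with CoreAVC costs 52% CPU interlaced but only
# 26% progressive, so software deinterlacing roughly doubles the load.

def deinterlace_overhead(interlaced_load: float, progressive_load: float) -> float:
    """CPU share attributable to software deinterlacing, in percentage points."""
    return interlaced_load - progressive_load

overhead = deinterlace_overhead(52.0, 26.0)
print(f"software deinterlacing costs {overhead:.0f} percentage points of CPU")
print(f"that is {overhead / 52.0:.0%} of the total interlaced load")
```

So offloading just the deinterlacing step to the GPU would, on these figures, free up about half of the decode-side CPU load, which is why it can make the difference on a low-end machine.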
AMD/ATI and NVIDIA Graphics Cards in Video Decoding Tasks: June 2007