MediaPortal Version: 0.2.3.0 RC2 + latest SVN
MediaPortal Skin: BlueTwo wide
Windows Version: Windows XP Prof. SP2
CPU Type: AMD Athlon XP 2700+
HDD: Seagate 200GB ATA100
Memory: 768MB - DDR 333 (PC3200)
Motherboard: Gigabyte 7VM400M-RZ
Motherboard Chipset: VIA KM400
Motherboard Bios:
Video Card: Gigabyte - Radeon 9550
Video Card Driver: ATI Radeon v6.4
Sound Card: On board VIA VT1617
Sound Card AC3: using 5.1 analog outputs
Sound Card Driver: VIA Vinyl Audio v6.50a
1. TV Card: Dvico FusionHDTV DVB-T
1. TV Card Type: DVB-T
1. TV Card Driver: v 3.50.02
2. TV Card: DNTVLive! LP DVB-T
2. TV Card Type: DVB-T
2. TV Card Driver: v 2.0.0.4
MPEG2 Video Codec: Nvidia PureVideo
MPEG2 Audio Codec: MPA
HTPC Case: custom built
Cooling: custom built - super quiet
Power Supply: custom built - 200W
Remote: MCE - Australian version
TV: Mitsubishi HC1100 DLP projector
TV - HTPC Connection: D-sub (15 pin)
I can receive the following video resolutions via DVB-T:
576i
576p
720p
1080i
I have been doing extensive testing recently to find a video decoder that performs well at all of those resolutions and can also play back recorded .dvr-ms files. Of the decoders I tried (MPV, DScaler, CyberLink, InterVideo, the Dvico DxVA decoder, and Nvidia PureVideo), the best all-rounder is the Nvidia PureVideo decoder.
However, using the Nvidia PureVideo decoder gives some odd results:-
Using the Nvidia PureVideo decoder and setting the “De-interlace control” to “Smart” (the default setting) gives jerky video and 40% CPU on the progressive channels (576p, 720p), and perfect video with 20% CPU on the interlaced channels (576i, 1080i).
Changing the “De-interlace control” setting to “Film” gives perfect video and 20% CPU on progressive channels, but poor de-interlacing on interlaced channels.
And changing the “De-interlace control” setting to “Video” gives perfect video and 20% CPU on interlaced channels, but jerky video and 40% CPU on the progressive channels.
So it looks like the “Smart” setting should detect whether the video stream being decoded is interlaced: if it is, use the “Video” setting and de-interlace it; if it is progressive, use the “Film” setting and leave it alone.
But the “Smart” setting is not doing that. With “De-interlace control” set to “Smart”, the decoder seems to be permanently stuck in “Video” mode and will not switch to “Film” for the progressive channels.
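To make the expected behaviour concrete, here is a minimal sketch (in C++, with made-up names - this is not the actual PureVideo logic) of the per-stream decision I would expect “Smart” to make:

```cpp
// Illustrative only: the mode selection the "Smart" setting appears
// to promise, expressed as code. All names are invented for this sketch.
enum class DeinterlaceMode { Film, Video };

DeinterlaceMode SmartSelect(bool streamIsInterlaced)
{
    // Interlaced source (576i, 1080i): de-interlace -> "Video" mode.
    // Progressive source (576p, 720p): pass through -> "Film" mode.
    return streamIsInterlaced ? DeinterlaceMode::Video
                              : DeinterlaceMode::Film;
}
```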
I have also tried the “Automatic” setting under “De-interlace control”, but it seems to be identical to the “Smart” setting, as I get the same results.
Now, I don’t know whether this is a bug in the Nvidia PureVideo decoder (which hasn’t been updated by Nvidia since June last year) or a bug in MediaPortal, such as an incorrect or missing pin connection between the filters in the graph.
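One thing that might narrow it down: in a DirectShow graph (which MediaPortal builds), the interlace information the decoder sees comes from the media type negotiated on its input pin. Here is a rough sketch of how that could be inspected; ReportInterlaceFlags is a hypothetical helper, not MediaPortal or PureVideo code:

```cpp
#include <dshow.h>
#include <dvdmedia.h>   // VIDEOINFOHEADER2 and the AMINTERLACE_* flags
#include <cstdio>

// Hypothetical helper: given the video decoder's connected input pin,
// report what the upstream (demux) pin claims about interlacing.
void ReportInterlaceFlags(IPin* pPin)
{
    AM_MEDIA_TYPE mt = {};
    if (FAILED(pPin->ConnectionMediaType(&mt)))
        return;

    if (mt.formattype == FORMAT_VideoInfo2 &&
        mt.cbFormat >= sizeof(VIDEOINFOHEADER2))
    {
        const VIDEOINFOHEADER2* vih2 =
            reinterpret_cast<const VIDEOINFOHEADER2*>(mt.pbFormat);
        if (vih2->dwInterlaceFlags & AMINTERLACE_IsInterlaced)
            std::printf("interlaced stream, flags 0x%08lx\n",
                        vih2->dwInterlaceFlags);
        else
            std::printf("progressive stream\n");
    }
    else
    {
        // Plain VIDEOINFOHEADER carries no interlace information at all.
        std::printf("media type carries no interlace information\n");
    }

    // Free the media type manually (FreeMediaType lives in the
    // DirectShow base classes, which this sketch does not assume).
    if (mt.pbFormat) CoTaskMemFree(mt.pbFormat);
    if (mt.pUnk)     mt.pUnk->Release();
}
```

If the negotiated type has no interlace flags at all, or AMINTERLACE_IsInterlaced is set even on the progressive channels, that would explain the decoder always staying in video mode.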
Can anyone else comment on this, or suggest what else I might try? .....
UPDATE:-
I have just re-read the NVIDIA PureVideo Decoder User's Guide for version 1.02-223 and under the section "Feature Change" it says:-
Removed the Smart De-interlacing mode
Use the NVIDIA Control Panel to control driver cadence
I am not sure what they mean by “driver cadence”, but it implies that this “feature” is only available with Nvidia video cards (I have an ATI video card).
However, the De-Interlace control of the Nvidia PureVideo decoder has the option "Automatic", which according to the User manual:-
"..... reads the source type and automatically selects video or film mode, depending on the source."
But I cannot get this “Automatic” mode to do what the manual describes - for me the de-interlacing is always stuck in video mode, irrespective of the source video .....