Hello again Cal
Ahhh, that would explain why the frequencies are so different then.
Excellent.
Well this isn't entirely true. The quality measure is usually derived from the BER (bit error rate) of the stream, which in turn is determined by error checks that the tuner driver performs. Each packet in the transport stream is transmitted with extra bytes (16 for DVB, 20 for ATSC) that help the tuner to determine whether the stream is corrupted or okay - in some cases the tuner can even correct errors.
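To give you a feel for how that works: those extra bytes are Reed-Solomon parity, and DVB's RS(204,188) code can correct up to 8 byte errors per packet (ATSC's RS(207,187) up to 10), so quality collapses quickly once the raw BER exceeds what the decoder can repair. Here's a rough sketch of the kind of BER-to-quality mapping a driver might use - the function name and thresholds are just my assumptions for illustration, since the actual mapping is driver-specific:

#include <algorithm>
#include <cmath>

// Map a pre-Reed-Solomon bit error rate onto a 0..100 quality scale.
// Hypothetical illustration only - each driver uses its own formula.
int QualityFromBer(double ber)
{
  if (ber <= 0.0)
    return 100;                       // no detected errors
  // Log scale: BER 1e-9 (excellent) -> 100, BER 1e-3 (unwatchable) -> 0.
  double score = (-std::log10(ber) - 3.0) / 6.0 * 100.0;
  return static_cast<int>(std::clamp(score, 0.0, 100.0));
}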
So in summary, the measure is definitely useful even for digital, and as you say there appears to be a significant difference in the quality of the signal received by the two tuners. Interestingly, the result of your test with only tuner 2 tells us that only the second tuner is affected - its signal quality is still significantly lower than the first tuner's, even though the first tuner is not operating at the time.
This is strange. How did you perform this test - in the manual control section of TV Server configuration? If so, did you have a particular tuner selected when you clicked start timeshifting?
This is indeed a surprising result. I have a suspicion that the logging of continuity/discontinuity errors may have been disabled in MP 1.1.x, and that the actual action of logging the errors [in MP 1.2.3] may be causing more disruption than the stream corruptions themselves. This should be relatively easy to test - I'll provide a TsWriter.ax with the logging disabled as soon as I can find some time (at work at the mo, and it may not be this evening as I'm quite busy this week).
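To illustrate what I mean by the continuity check: every 188-byte transport packet starts with a 0x47 sync byte and carries a 4-bit continuity counter that must increment (mod 16) for each payload-carrying packet on a PID. Here's a rough sketch of that check with the logging made compile-time optional, which is the sort of change the test build would make - the constants come from the MPEG-TS spec, but the function, logger and ENABLE_CC_LOGGING names are placeholders I've made up, not the actual TsWriter code:

#include <cstdarg>
#include <cstdint>
#include <cstdio>

static const uint8_t TS_SYNC_BYTE = 0x47;   // first byte of every TS packet

// Hypothetical stand-in for TsWriter's logger.
static void LogDebug(const char* fmt, ...)
{
  va_list args;
  va_start(args, fmt);
  vfprintf(stderr, fmt, args);
  va_end(args);
  fputc('\n', stderr);
}

// Returns true if the packet's continuity counter follows on from the
// previous counter seen for this PID.
bool CheckContinuity(const uint8_t* packet, uint8_t& prevCounter)
{
  if (packet[0] != TS_SYNC_BYTE)
    return false;                               // lost sync - corrupt packet
  bool hasPayload = (packet[3] & 0x10) != 0;    // adaptation_field_control
  uint8_t counter = packet[3] & 0x0f;           // continuity_counter
  bool ok = !hasPayload || counter == ((prevCounter + 1) & 0x0f);
#ifdef ENABLE_CC_LOGGING
  if (!ok)
    LogDebug("continuity error: expected %d got %d",
             (prevCounter + 1) & 0x0f, counter);  // the I/O I suspect hurts
#endif
  prevCounter = counter;
  return ok;
}

If the log call inside that hot path is the real culprit, a build compiled without ENABLE_CC_LOGGING should behave like MP 1.1.x did.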
No, I'm game to keep looking - just a little time-constrained with other things at the moment. Let's see how we go...
mm