Hi mm,
First, something I forgot about:
SiliconDust doesn't show my provider (a local, small-time group) -- only Comcast and the satellite services.
Yes, indeed. Thanks. I did 'mostly' understand the situation with broadcast/cable frequencies and multiple channels, and it's even clearer now with your (repeat) explanation. I never thought about MP extracting multiple channels from one tuned frequency, since as far as I knew/know, MP doesn't do that (instead allocating a separate tuner to every channel, without regard to the channel's frequency).
In other words -- I wasn't understanding that, but now I do.
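(Just to check that I finally have the picture right, here's a minimal sketch -- plain illustrative Python, not anything from MP -- of how I now understand it: one tuned frequency is a mux whose transport stream carries several programs, so channels sharing that frequency could in principle come off a single tuner. The channel names and frequencies below are made up.)

```python
# Illustrative only, not MediaPortal code. Channel names/frequencies are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Channel:
    name: str
    frequency_khz: int   # the RF carrier the channel rides on (the "mux")
    program_number: int  # which program inside that mux's transport stream

channels = [
    Channel("Channel A", frequency_khz=579_000, program_number=1),
    Channel("Channel B", frequency_khz=579_000, program_number=2),  # same mux as A
    Channel("Channel C", frequency_khz=585_000, program_number=1),  # different mux
]

# One tuner locks one frequency; the transport stream on that frequency carries
# every program of the mux, so A and B could be demultiplexed from one tuner.
def channels_sharing_a_tuner(chs):
    by_freq = {}
    for ch in chs:
        by_freq.setdefault(ch.frequency_khz, []).append(ch.name)
    return by_freq

print(channels_sharing_a_tuner(channels))
# {579000: ['Channel A', 'Channel B'], 585000: ['Channel C']}
```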
So on to the tuner tests ---
tv-new-dll-1sttuner has the test results for tuner 1: signal quality 100. tv-new-dll-2ndtuner has the results for tuner 2: signal quality << 100.
I seem to remember from somewhere that quality isn't really a useful measure for digital, since the signal is either there or it isn't. Either way, there is a difference in the tuners.
tv-new-dll-2tuner has results with both tuners enabled: It looks to me like TVserver took the free tuner (#3) for the second channel, even though the busy tuner (#2) was already tuned to the correct frequency. Again, tuner 2 signal quality << 100.
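(To illustrate what I think I'm seeing in that log, here's another hypothetical sketch -- not the real TV Server scheduler, just the difference between "grab any free tuner" and "reuse the tuner already locked on that frequency". All names and numbers are invented.)

```python
# Hypothetical sketch of two tuner-allocation strategies; this is NOT the
# actual TV Server scheduler, only an illustration of the behaviour described.
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class Tuner:
    tuner_id: int
    tuned_khz: Optional[int] = None  # None = idle

def pick_any_free(tuners: List[Tuner], freq_khz: int) -> Optional[Tuner]:
    """What the log seems to show: take the first idle tuner, even if a busy
    tuner is already locked on the requested frequency."""
    return next((t for t in tuners if t.tuned_khz is None), None)

def pick_reuse_first(tuners: List[Tuner], freq_khz: int) -> Optional[Tuner]:
    """Mux-aware variant: prefer a tuner already on the requested frequency,
    falling back to an idle tuner only if none matches."""
    already_there = next((t for t in tuners if t.tuned_khz == freq_khz), None)
    return already_there or pick_any_free(tuners, freq_khz)

tuners = [Tuner(1, tuned_khz=579_000),   # tuner 1 busy on another mux
          Tuner(2, tuned_khz=585_000),   # tuner 2 busy on the mux we want
          Tuner(3)]                      # tuner 3 idle

print(pick_any_free(tuners, 585_000).tuner_id)     # -> 3 (free tuner wins)
print(pick_reuse_first(tuners, 585_000).tuner_id)  # -> 2 (reuse the locked mux)
```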
Seeing the persistent lower quality on tuner 2, but knowing that two QAM channels record fine on 1.1.0.0, I simulated the test on 1.1.0.0 by starting two recordings from MP, since that TV server does not permit two simultaneous manual tunes.
tv-1100 does indeed show lower quality on tuner 2, but that doesn't show up in the actual recording?! I watched the tuner 2 program for a numbing 20 minutes (a cooking show) and saw at most two minor flickers. Nothing at all like the abominations on tuner 2 with 1.2.3.
Are there 2 problems here? Does 1.1.0.0 handle errors better?
Again, many thanks for your help. I won't be surprised if you want to leave this for a while ---
Cal