Hello everyone,
So I am trying to understand the advantages and disadvantages of the quality control settings on the TV Server configuration tab. I see that there are three bitrate modes: constant, variable rate average, and variable rate peak. Can someone tell me how these compare in terms of performance and file size? Does constant give the largest files but the best picture? Does variable rate peak give the smallest files but sacrifice image quality?
Then there are the bitrate settings for playback and recording. How do these factor into the performance trade-offs above? I am encoding an analog cable signal.
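For what it's worth, here is the back-of-the-envelope math I've been using when comparing the modes (just a rough sketch with made-up example numbers, not anything specific to the TV Server encoder): with a constant bitrate, file size is simply bitrate times duration; with a variable bitrate, the *average* bitrate plays that role, while the peak only bounds short-term spikes.

```python
def estimated_size_mb(bitrate_mbps: float, duration_s: float) -> float:
    """Rough recording size in megabytes.

    bitrate_mbps: constant bitrate, or the average bitrate for VBR modes.
    Divide by 8 to convert megabits to megabytes.
    """
    return bitrate_mbps * duration_s / 8


# Example: a one-hour recording at 4 Mbps
print(estimated_size_mb(4, 3600))  # 1800.0 MB, i.e. ~1.8 GB
```

If that math is right, then for the same average bitrate the VBR modes should produce roughly the same file size as CBR, just with the bits distributed more toward complex scenes; is that how it actually works here?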
If this is answered elsewhere and I missed it in my search, please let me know.
Thank you very much.
Steve