I'm trying to achieve pixel-perfect output on an HDTV that does not have any sort of "direct" (1:1) HDMI mode. We certainly don't want the computer to scale one way and the TV to rescale the other way.
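(As a rough way of checking this, here is a quick throwaway sketch of mine, not anything from AMD or the TV vendor, that writes a single-pixel checkerboard test image. If every stage of the chain really passes 1920x1080 through 1:1, the pattern shows up as a uniform fine grid on the panel; any rescaling anywhere turns it into blur or moiré.)

    # Throwaway check (my own sketch, nothing official): write a 1920x1080
    # single-pixel checkerboard as a PGM file. Displayed full-screen with no
    # scaling anywhere, it should look like a uniform fine grid; rescaling
    # in the chain turns it into blur or moire patterns.
    W, H = 1920, 1080
    with open("checkerboard_1080p.pgm", "wb") as f:
        f.write(b"P5 %d %d 255\n" % (W, H))
        for y in range(H):
            f.write(bytes(255 if (x + y) % 2 == 0 else 0 for x in range(W)))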
My assumption is that a graphics system always outputs 1920x1080 pixels if that is what is selected.
But I don't understand what the system does to achieve over- or underscan. I hope it doesn't send more or fewer pixels, or render an over- or underscanned image and then rescale it back to 1920x1080; that would make the overscan setting in Radeon's Catalyst Control Center (CCC) "impossible" to set correctly.
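To make concrete what I'm afraid of: if the driver implemented underscan by rescaling, I'd expect something like the following (a rough sketch in Python; the percentage and function name are made up for illustration, not taken from CCC). The card would still send 1920x1080 pixels, but the desktop would be shrunk into a centred rectangle with black borders around it:

    # Hypothetical illustration of underscan-by-rescaling (my guess, not
    # AMD documentation): the output signal stays 1920x1080, but the
    # desktop is scaled down by the slider percentage and padded with
    # black borders -- exactly the rescaling I hope is NOT happening.
    def underscanned_layout(width=1920, height=1080, underscan_pct=5.0):
        scale = 1.0 - underscan_pct / 100.0
        scaled_w, scaled_h = round(width * scale), round(height * scale)
        pad_x = (width - scaled_w) // 2   # black border left/right
        pad_y = (height - scaled_h) // 2  # black border top/bottom
        return (scaled_w, scaled_h), (pad_x, pad_y)

    # e.g. a 5% underscan would shrink the desktop to about 1824x1026
    # and centre it in the 1920x1080 frame with 48/27-pixel borders.
    print(underscanned_layout(underscan_pct=5.0))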
I also assume an HD (1080p) LCD TV panel has exactly 1920x1080 physical pixels.
Does anybody know how all this works, and what is right or wrong in the above?
Currently, I have CCC's overscan slider set to an intermediate point where I guessed 0 would be if the scale were marked, and the TV's width and height adjusted to match.