I used to run my videos on my HTPC at 1080p/60 Hz. For the most part I could play everything with few problems; a handful of really high-bitrate movies would slow down or lag now and then, but otherwise it was fine. Now that I have a TV capable of 1080p/24, I wanted to see if videos looked better, but I noticed something odd. I have an ATI 4550 video card and set the display to 1080p/24. I use ffdshow to decode, and it usually loads both CPU cores about equally. Now it looks like one core gets maxed out and overall CPU usage goes up. Is this normal, and would a newer version of ffdshow maybe fix this? Thanks
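If it helps, here is a minimal sketch (not from my original setup, just an assumption of how you could check) for confirming what I'm seeing: it logs per-core CPU usage while a clip plays, so you can tell whether one core is really pinned while the other sits mostly idle. It assumes the third-party psutil package is installed (pip install psutil); the sample length and interval are arbitrary.

```python
import time
import psutil

SAMPLE_SECONDS = 30   # how long to watch during playback
INTERVAL = 1.0        # seconds between samples

for _ in range(int(SAMPLE_SECONDS / INTERVAL)):
    # percpu=True returns one usage percentage per logical core
    per_core = psutil.cpu_percent(interval=INTERVAL, percpu=True)
    print(" | ".join(f"core {i}: {p:5.1f}%" for i, p in enumerate(per_core)))
```

Running that during a 1080p/24 clip versus the same clip at 1080p/60 should show whether the single-core saturation only happens at 24 Hz.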
Why would system resources be used more when displaying at 24 Hz instead of 60 Hz? Would ReClock fix this as well?