[ATI] Why does CPU usage increase when setting display to 1080p/24Hz?

Mces97

New Member
December 22, 2010
1
0
Home Country
United States of America
I used to run my videos on my HTPC at 1080p/60Hz. For the most part I could play everything with few problems. A few really high bitrate movies would slow down or lag now and then, but mostly that wasn't an issue. Now I wanted to see if videos look better, since I have a TV that can display 1080p/24, but I noticed something odd. I have an ATI 4550 video card and set the display to 1080p/24. I use ffdshow to decode, and it usually loads both CPU cores equally. Now one of the cores gets maxed out and CPU usage goes up. Is this normal, and would a newer version of ffdshow fix it? Thanks
Why would more system resources be used when displaying at 24Hz instead of 60Hz? Would ReClock fix this as well?
 

mm1352000

Retired Team Member
Premium Supporter
September 1, 2008
21,577
8,224
Home Country
New Zealand
Hi there

What frame rate do the source videos have?
My personal opinion is that your display (computer screen or TV) refresh rate should always be a multiple of the source content (TV or video) frame rate. In other words, if it were me I would only set my TV to 24Hz if the video frame rate is 24fps. My reason for that preference is that "something" (often the display) has to do frame rate conversion/interpolation when the display and source frame rates don't match. This can introduce artifacts or result in blurring or loss of detail. Cheap displays/receivers may even speed up or slow down the video because they can't perform frame rate conversion.

In your case I'm guessing the extra CPU usage is from frame rate conversion. I could be wrong though. Hope that all makes sense...
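
To make the "multiple of the frame rate" rule above concrete, here is a minimal Python sketch. It is my own illustration, not MediaPortal or ffdshow code, and the frame rate values in the examples are assumptions about common content:

# Minimal sketch (not from this thread or MediaPortal): checks whether a
# display refresh rate is a near-integer multiple of a video frame rate,
# i.e. whether "something" has to do frame rate conversion.
# The example rates below are common values and are my own assumptions.

def needs_frame_rate_conversion(video_fps, refresh_hz, tolerance=0.002):
    """Return True if refresh_hz is not a near-integer multiple of
    video_fps, so frames must be repeated/blended (e.g. 3:2 pulldown)."""
    ratio = refresh_hz / video_fps
    return abs(ratio - round(ratio)) > tolerance * ratio

if __name__ == "__main__":
    print(needs_frame_rate_conversion(24.0, 24.0))    # False: 1:1 match
    print(needs_frame_rate_conversion(23.976, 24.0))  # False: close enough
    print(needs_frame_rate_conversion(24.0, 60.0))    # True: 3:2 pulldown
    print(needs_frame_rate_conversion(25.0, 24.0))    # True: mismatch

For example, 24fps film on a 60Hz display needs 3:2 pulldown, while 24fps on a 24Hz display maps 1:1 with no conversion.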
     
