[mythtv-users] [ATrpms-users] Confused: why did video performance DECREASE (up to 70%) after upgrading video card by 2 generations??? Please validate my analysis
george_mythusers at mari1938.org
Mon Dec 8 03:55:00 UTC 2008
Jeffrey J. Kosowsky wrote:
> Instead I got the following:
>              GF4600      GeForce 6200
>              ______      ____________________________
> glxgears     4700 fps    1360 fps (decrease by 70%!!!)
> quake3 demo  203 fps     74 fps   (decrease by 64%)
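(Side note: the percentage drops quoted above do check out arithmetically - a quick sketch, using the fps figures from the table:)

```python
# Quick check of the percentage drops quoted above
# (fps figures copied from the benchmark table).
def pct_decrease(old_fps, new_fps):
    """Percentage decrease going from old_fps to new_fps, rounded."""
    return round((old_fps - new_fps) / old_fps * 100)

print(pct_decrease(4700, 1360))  # glxgears: 71 (roughly the "70%" quoted)
print(pct_decrease(203, 74))     # quake3:   64
```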
> In mythtv, the visual performance is about the same even though the
> older 4600 had mythfrontend using 80% of CPU on HD programs vs about
> 40% for the newer 6200.
Uhmmm...isn't this a good thing? Doesn't this mean your 6200 is better
for video than your old card?
> In both cases it seems like the video lags the
> sound with the sound pausing every few seconds to wait for the "slower
> motion" video to catch up giving the sound a jerky appearance.
> Also with the older card when viewing HD programs (but not SD
> programs) I got a lot of "NVP: prebuffering pause" errors in addition
> to the "WriteAudio: buffer underruns" that occur with each sound
> pause. I assume this is consistent with the fact that CPU usage is
> becoming a bottleneck on the 4600 for HD.
> I tried the GeForce 6200 with the 96.xx, 173.xx, and 177.xx drivers
> without any difference.
> The GF4600 only works with the 96.xx drivers.
> Both cards have glx working with direct rendering
> Both are AGP 4x with SBA and Fast Write turned on (same behavior with
> it off too).
> Both have XvMC on (though XvMC seems to only affect cpu usage and does
> not improve or worsen the video itself with the 6200)
Try turning off XvMC with the 6200. You might not need it. Don't go
strictly by CPU usage - go by, "does the video look nice and smooth with
no stuttering?" XvMC isn't meant to change the quality of the video -
just to make it possible for marginal CPUs to decode video they
otherwise could not handle. The problem is it's not perfect - it's a
little harder to live with than plain Xv decoding in areas like OSD
appearance, setup, and fewer de-interlacing options.
> Both are using identical xorg.conf (with UseEvents=True).
> Both have NvAGP=1 (though changing to the kernel agpgart didn't make
> any difference).
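For anyone following along, both of those options live in the Device section of xorg.conf. A minimal sketch, assuming the nvidia binary driver (the Identifier name here is arbitrary):

```
Section "Device"
    Identifier "NvidiaCard"     # arbitrary name
    Driver     "nvidia"
    # Use NVIDIA's internal AGP support instead of the kernel agpgart
    Option     "NvAGP" "1"
    # Sleep waiting for X events instead of busy-polling the CPU
    Option     "UseEvents" "True"
EndSection
```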
> So, it seems to me that either the 6200 is a bum card or I must be
> missing some configuration to unleash its power.
Your 6200 isn't a bum card - it's just that it's recommended a lot on
this list as the least expensive card that is still effective at
accelerated video decoding for SD and HD, has good-quality video
output, and offers different rendering options, such as OpenGL. Nobody
said it was a knock-your-socks-off card for gaming.
The clock speed, memory bandwidth, marketing numbers, etc. of the card
have no bearing on the performance of the video overlay used for video
decoding and acceleration - either for Xv or XvMC. That's my
understanding, anyway. As far as I know, there is no absolute spec of
video decoding performance for any video card, at least as it relates to
our usage for MythTV. The closest thing is the table on the MythTV
wiki under the subject: XvMC.
My advice - if you're looking for better gaming performance, get a
better gaming card. Something in the 7xxx series from Nvidia is still
available for AGP. It will still be "good enough" for displaying video.