[mythtv-users] Fixed!: Nvidia Sync to Vblank pegging cpu in X
adeffs.mythtv at gmail.com
Wed Dec 20 13:53:18 UTC 2006
On 12/19/06, Jim herold <jim.herold at comcast.net> wrote:
> On 12/17/06, J. Miller <mythtv.org at elvenhome.net> wrote:
> > > I've been fighting the scourge of myth causing X to consume 100% of the
> > > CPU when Xv Sync to Vblank is turned on in nvidia-settings. Everything
> > > works just fine with it unchecked but that is the only sync setting that
> > > gives me a perfect picture with no tearing. Myth's native OpenGL sync
> > > still tears a little. The problem turned out to be the modeline I was
> > > using. It was a 1368x768 56Hz modeline which apparently triggered the
> > > issue. I switched to a 1360x768 60Hz modeline and now everything works
> > > peachy. X still starts consuming large amounts of CPU occasionally but
> > > it quickly drops back down to the 2-4% range. I'm not 100% sure that it
> > > was the refresh but a modeline change did the trick. Hopefully posting
> > > this here will help point others that have the same problem in a new
> > > direction.
> > NVIDIA have added an option in (I think) 9626 to poll() instead of
> > busy-waiting for vblank, which fixes the high CPU usage when sync to
> > vblank is switched on. It's called "UseEvents".
> My Modeline was already 1280x720 @ 60Hz. I've tried tweaking it a number of ways but
> never came up with a magic modeline that fixed the issue here.
> Never thought to try the sync to vblank setting (thanks also to Steven Adeff for this
> suggestion). I knew it was recommended to disable that for XvMC, but never connected
> that it'd be a good idea for Xv as well. With the sync to vblank off everything looks great
> on my setup, and the X cpu usage hovers around 10% when watching 720p content.
> I tried the "UseEvents" thing. Interestingly, it works, but it increases the "wa" (I/O wait)
> cpu usage in top, and therefore leaves roughly 20% less idle cpu time compared with just
> disabling the sync to vblank.
Normally the Xv Sync to VBlank setting in nvidia-settings is what causes
most people's high Xorg cpu usage, so disabling it fixes that problem.
My issue (and, from what I gather, some others' as well) was that doing
this caused random tearing during very high-motion camera pans, and a
constant warble-ish tear across the top 10% of the screen (not very
noticeable unless you're looking for it, but still a problem). So I'm
hoping this will fix that, though I don't think I'll get the chance to
try it until next week (vacation).
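For anyone wanting to try it, the "UseEvents" option mentioned above goes
in the Device section of xorg.conf. A minimal sketch, based on the NVIDIA
driver README (the Identifier is just an example; match it to your own
config):

```
Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    # Wait for vblank via events instead of busy-polling, which is what
    # keeps X's cpu usage down with sync-to-vblank enabled.
    Option     "UseEvents" "True"
EndSection
```

Restart X after the change for it to take effect.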
One thing I did notice with the 9xxx drivers was that X would only let
me use a 24Hz 1080p modeline instead of the other built-ins I use with
8776 (1920x1080_30 or 1920x1080_60). That "works" with my tv, it just
causes other output issues since the video is 30Hz... Perhaps using an
explicit modeline will fix this. I'll have to play around and see.
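If you go the explicit-modeline route, a custom mode is declared in the
Monitor section of xorg.conf and referenced from the Screen section. A
sketch using the standard CEA timings for 1920x1080 at 60Hz (the
Identifiers are examples; for other resolutions like 1360x768, generate
timings for your display with a tool such as cvt or gtf rather than
copying these numbers):

```
Section "Monitor"
    Identifier "TV"
    # Standard CEA-861 1080p60 timings: 148.5 MHz pixel clock.
    Modeline "1920x1080_60" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "TV"
    SubSection "Display"
        Modes "1920x1080_60"
    EndSubSection
EndSection
```

Check /var/log/Xorg.0.log after restarting X to see whether the driver
accepted or rejected the mode, and why.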