[mythtv-users] Tearing *with* OpenGL Vsync enabled?!

Scott list-mythtv at bluecamel.eml.cc
Mon Oct 2 23:35:29 UTC 2006


On Oct 2, 2006, at 9:01 AM, Steven Adeff wrote:

> I got my PCIe 6200 on Friday and installed it. After playing around
> with settings, etc., I think I've managed to get 1080p output without
> the tearing my 5200 had.
>
> I actually purchased two cards, an MSI low-profile 6200TC and an
> EVGA 6200TC (both fanless). After using both, I was unable to stop
> the EVGA card from tearing, whereas the MSI seems to do fine as long
> as I disable OpenGL Vsync! Otherwise I get a little bit of tearing
> that covers the top 15% or so of the video output. I've tried
> different deinterlacers, no deinterlacing, and all manner of option
> combinations, and the only one that seems to make a difference is
> the Vsync option...
>
> Has anyone else experienced/noticed this? I find it quite odd since I
> thought it was supposed to do the exact opposite.

I have the fanless Gigabyte 7600GT PCIe card. I've seen the tearing
you mention but don't recall exactly how I replicated it. I did find
that using nvidia-settings makes a difference too. I'm using standard
Xv (not XvMC), with the vsync option for Xv enabled in the
nvidia-settings panel. I then disabled OpenGL Vsync in MythTV and let
it use RTC.
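
In case it helps, the Xv vsync part can also be set from the command
line. This is just a sketch: the attribute names below are standard
nvidia-settings attributes but may vary by driver version, so check
the output of "nvidia-settings -q all" first. The MythTV "OpenGL
vertical sync" toggle itself is a frontend playback setting rather
than a driver option.

  # enable sync-to-vblank for both Xv adaptors
  nvidia-settings -a XVideoTextureSyncToVBlank=1
  nvidia-settings -a XVideoBlitterSyncToVBlank=1

For MythTV's RTC timing to work at full rate, the kernel's RTC
interrupt limit usually has to be raised as well (as root), or it
falls back to less precise sleep-based timing:

  # allow user processes to request 1024Hz RTC interrupts
  echo 1024 > /proc/sys/dev/rtc/max-user-freq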

This combo seems to work fine for me.

--
Scott


