[mythtv-users] XvMC and CVS
Daniel Thor Kristjansson
danielk at mrl.nyu.edu
Fri Sep 10 12:17:50 EDT 2004
On Wed, 8 Sep 2004, dean vanden heuvel wrote:
]Thanks for the reply...
]It is the video output itself that is the problem. When I enable XvMC
]and disable interlace (from the frontend), and use a 1920x1080i output
](which you are correct, my TV can handle), the video looks very
]*blocked*, sort of like each pixel is VERY large. Diagonal lines are
]truly stepped, etc.
That does read like deinterlacing interacting badly with the resampler
in the DLP. I'll make a patch for you when I get a chance; look out for
it on the dev list.
]In the past, I was using XvMC (no deinterlace, as none was possible in
]0.15) driving the 1080i, resulting in a reasonable picture. However,
]it seemed like scaling from the recorded 480i up to 1080i, only to have
]my TV scale back down to 720p (its native display mode) could be
]causing some degradation. I figured that using Myth to scale from 480i
]to 720p, thus doing NO SCALING in my TV might improve things. It may
Having XvMC convert to progressive mode is probably a bad solution if
your projector can handle 1080i; the filters in a projector or TV will
usually do a better job than the point sampling in XvMC (which will
always make diagonal lines jagged at full resolution). It is really
intended for displays that cannot handle such high resolutions and/or
have poor scaling hardware.
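To see why point sampling makes diagonals stepped, here is a minimal
sketch in plain Python (not MythTV code): each source pixel is simply
repeated, so a one-pixel diagonal becomes a staircase of large blocks,
which is roughly the "each pixel is VERY large" effect described above.

```python
# Nearest-neighbor (point-sampling) upscale of a tiny 1-bit image.
# No filtering is applied, so edges stay hard and diagonals step.

def point_sample_upscale(image, factor):
    """Upscale a 2D list of pixels by repeating each pixel `factor` times
    horizontally and each row `factor` times vertically."""
    out = []
    for row in image:
        expanded = [p for p in row for _ in range(factor)]
        out.extend([expanded] * factor)
    return out

# A 4x4 image with a single-pixel diagonal line.
diag = [[1 if x == y else 0 for x in range(4)] for y in range(4)]

# Upscaled 3x, the diagonal turns into 3x3 blocks -- a staircase.
for row in point_sample_upscale(diag, 3):
    print("".join("#" if p else "." for p in row))
```

A filtering scaler (like the one in a good TV or projector) would blend
neighboring pixels instead of repeating them, smoothing that staircase.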
]have (I cannot really tell because it takes too long to change from one
]to the other), but the bob deinterlace (without XvMC) makes the picture
]jumpy and movement in the XvMC mode seems to be slightly jerky, so I
]thought a return to 1080i might be in order...thus my question.
That's probably the processors in your projector or your computer
struggling with the framerate. Are you getting any warnings on the
console with "mythfrontend -v playback"? (There have also been
reports of OpenGL vsync not working well with the 5x.xx series nvidia
drivers.)
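One quick way to watch for such warnings is to filter the verbose
output for likely symptoms; the "mythfrontend -v playback" command is
from above, but the keyword list here is only a guess at common
messages, not an official list.

```shell
# Run the frontend with playback debugging and filter for signs of
# the player falling behind (prebuffering pauses, vsync trouble,
# dropped frames). The keyword pattern is a guess, not exhaustive.
mythfrontend -v playback 2>&1 | grep -iE 'prebuffer|vsync|drop|delay'
```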