[mythtv-users] Nvidia tvout picture quality
thomas at boerkel.de
Wed Sep 15 13:17:15 EDT 2004
After much fiddling, I now have pretty good NVidia TV-out quality (the
card is an ASUS GeForce FX 5200).
I have these things set:
- display depth = 16 (though I don't think this one is strictly needed)
- jitter reduction on
- extra audio buffer on
- kernel deint
- mythfrontend reniced -2, X reniced -12
Compared to the default settings, this looks very good regarding both
interlace artifacts and smoothness.
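For reference, here is roughly what the relevant bits of my setup look
like. The identifiers and option values are from my config and the nvidia
README; your driver version may want different ones:

  # /etc/X11/XF86Config
  Section "Screen"
      Identifier   "Screen0"
      Device       "NVidia0"
      DefaultDepth 16
  EndSection

  Section "Device"
      Identifier "NVidia0"
      Driver     "nvidia"
      Option     "TVStandard"  "PAL-B"   # or "NTSC-M" etc., match your TV
      Option     "TVOutFormat" "SVIDEO"  # or "COMPOSITE"
  EndSection

The renicing is done from a startup script (negative nice values need
root; on older setups the X process may be called XFree86 instead of X):

  renice -2  -p `pidof mythfrontend`
  renice -12 -p `pidof X`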
David Collett wrote:
> I'm interested in other people's opinions on the best settings (deint,
> xvmc, etc.) to use with NVIDIA cards, too.
> Randy, are you using the TV-out of your nvidia card, or is your TV
> VGA-capable? Which TV-out connector are you using (composite/s-video)?
> I used to use TV-out on an MX440, and it also sucked. Then I got a
> secondhand H+ (DXR3) MPEG2 decoder card with TV-out; man, that thing
> looked so much better it's not even funny. It won't work with MythTV to
> my knowledge, though. (I was using VDR at the time.)
> Now I am back to using an MX440 (nForce2 onboard video, to be specific),
> connected via VGA to my new 32" HD CRT, and it is a lot better than the
> TV-out (s-video/composite) of the nvidia card, but not as good for
> smoothness as the old H+.
> A few points I have noticed:
> (FYI: my TV is doing 1280x720 progressive @ 60Hz, fed by the VGA of the
> nForce2 onboard video; the video signal is DVB HDTV (original format
> 1440x1080 interlaced); the MythTV version is CVS from the day before
> 0.16. "deint" below refers to any deint algorithm.)
> - no deint + no xvmc = total crap (visible interlacing artifacts), CPU=?
> - no deint + xvmc = very smooth, no visible interlacing artifacts (xvmc
> doesn't seem to need a deinterlaced signal?), but visual quality is not
> great (colours not as vibrant, appears 'grainier'), CPU=20%
> - deint + xvmc = smooth, but blocky; the deint seems to screw with xvmc
> to the point where it looks better without it. Also, I have had frontend
> crashes with some deint algos + xvmc. CPU=20%
> - deint + no xvmc = excellent visual quality (varies slightly depending
> on the deint algo), but not quite as smooth as xvmc. CPU=60%
> Based on smoothness and CPU usage, xvmc would be perfect, but the image
> quality just is not there. Is this ever likely to change, or is it
> beyond the control of the application?
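As an aside, for anyone else wanting to feed 720p over VGA like David
does: the standard 1280x720 @ 60Hz timing (74.25 MHz pixel clock) as an
XF86Config modeline would be something like this; a given TV may want
slightly tweaked timings:

  # standard 720p60 timing; adjust if your set is picky about it
  Modeline "1280x720" 74.25  1280 1390 1430 1650  720 725 730 750 +hsync +vsync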
> NVIDIA users: What deint/xvmc combinations are you using???
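FWIW, for anyone trying XvMC with the binary nvidia drivers: as far as I
remember, Myth has to be built with XvMC support (./configure
--enable-xvmc) and the XvMC wrapper needs to be pointed at NVIDIA's
library; the exact library name depends on the driver version:

  # /etc/X11/XvMCConfig - one line naming the XvMC implementation
  # (library name varies by driver version)
  libXvMCNVIDIA_dynamic.so.1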
> On Tue, 2004-09-14 at 12:50, randy ferrill wrote:
>>My system:
>>AMD Athlon xp 1800
>>120gb Maxtor hd
>>The issue: the picture quality using my antenna, compared to my normal
>>TV signal, is not as bright, nor are the whites really white. If I
>>adjust the brightness to get the colors brighter, then in brighter
>>scenes it is overly bright and washed out. Currently this is OK on my
>>27" TV, but I have a projector in the works (2nd revision) and it is
>>capable of a picture over 8' wide indoors (I used it outside and got a
>>great 25'x16' on the side of my house). So I want to drive it with my
>>MythTV box, and this is my stumbling point.
>>Also, I am assuming that the flicker during high-speed panning is the
>>video card not being fast enough to display smoothly. Can anyone
>>recommend an inexpensive Nvidia replacement? How does the 5200 rate?
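On the brightness question: the Xv overlay has its own picture controls,
independent of the TV's. They can be poked with the xvattr tool; run it
with no arguments first to see which attributes your driver exposes and
what their value ranges are (the values below are only examples):

  xvattr                           # list available Xv attributes and ranges
  xvattr -a XV_BRIGHTNESS -v 150   # example value; range is driver-specific
  xvattr -a XV_CONTRAST   -v 120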