[mythtv-users] How I got great quality TV-out on my nVidia MX4000

Joe Votour joevph at yahoo.com
Sun Mar 13 09:17:56 UTC 2005


Yeah, all those free-running components screwing
things up.  :)  Then again, my Commodore 64 can't do
MPEG-2, or else I'd have written a MythTV client in
assembly for it already.  :D

Just to add my (so far) success story to this thread,
I decided to take some of the suggestions and see what
would happen.

The first step was rebuilding the MythTV RPMs (from
ATrpms) with OpenGL support.  After about five
rebuilds and much searching of the atrpms-devel
mailing list (thanks to Jarod Wilson for posting his
macros!), I got it working.

XvMC is a bust; it doesn't seem to work on my machine
(Athlon64 running in 32-bit mode, nVidia driver on a
5200FX), and I just get a frozen image or massive
stuttering.  OpenGL support on its own helps out
massively (on my system); the image (say, on CNN)
doesn't jitter as much as it does with either the DRM or
RTC timing methods.
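
For anyone wondering what the "SGI OpenGL" timing method
actually is: as far as I can tell it means MythTV is using the
GLX_SGI_video_sync extension, so it can block until the card's
next vertical retrace instead of guessing with a timer.  A
stripped-down C sketch of that idea (not MythTV's actual code;
it assumes a GLX context is already current):

/* Sketch of GLX_SGI_video_sync usage, not MythTV's code. */
#include <GL/glx.h>

/* Pointers to the GLX_SGI_video_sync entry points. */
typedef int (*GetVideoSyncSGIFn)(unsigned int *count);
typedef int (*WaitVideoSyncSGIFn)(int divisor, int remainder,
                                  unsigned int *count);

static GetVideoSyncSGIFn  get_video_sync;
static WaitVideoSyncSGIFn wait_video_sync;

/* Block until the next vertical retrace.  Returns -1 if the
 * extension isn't available. */
int wait_for_retrace(void)
{
    unsigned int count;

    if (!get_video_sync) {
        get_video_sync = (GetVideoSyncSGIFn)
            glXGetProcAddressARB((const GLubyte *) "glXGetVideoSyncSGI");
        wait_video_sync = (WaitVideoSyncSGIFn)
            glXGetProcAddressARB((const GLubyte *) "glXWaitVideoSyncSGI");
        if (!get_video_sync || !wait_video_sync)
            return -1;
    }

    get_video_sync(&count);
    /* Wait for the retrace counter to tick past its current value. */
    wait_video_sync(2, (count + 1) % 2, &count);
    return 0;
}

If the extension isn't available, MythTV falls back to the other
timing methods (DRM, RTC, plain sleeping), as far as I can tell,
which is why that "Video timing method" line in the log is worth
checking.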

Anyway, to conclude, my method:
- MythTV 0.17 (from RPMs, with the frontend/backend RPMs recompiled)
- Enable OpenGL vsync support (MythTV now shows "Video timing method: SGI OpenGL")
- No XvMC support (doesn't work)
- Use libmpeg2 instead of ffmpeg (text doesn't seem as "fat" when using it)
- Enable Bob deinterlacing
- Turn on "Use Video for Timebase"
- Changes to the nvidia-settings-rc file as described in the thread

At the moment it's looking pretty good; we'll see what
happens once something with jagged lines (like a
cartoon) comes on, to see whether Bob flickers like mad.
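
In case it isn't obvious why Bob might flicker on that kind of
material: Bob just splits each interlaced frame into its two
fields and shows them one after the other, line-doubling each
field to fill the screen.  A thin horizontal edge that only
exists in one of the two fields therefore blinks at the field
rate.  Here's a very rough C sketch of the field-split idea for
a single 8-bit luma plane (a sketch only, not MythTV's
deinterlacer):

#include <string.h>

/* Sketch: pull one field (even or odd lines) out of an interlaced
 * frame and line-double it to full height.  "Bob" shows the two
 * results alternately, one per field period. */
void bob_field(const unsigned char *frame, unsigned char *out,
               int width, int height, int top_field)
{
    int y;

    for (y = 0; y < height; y++) {
        /* Nearest source line belonging to the requested field. */
        int src = (y & ~1) + (top_field ? 0 : 1);
        if (src >= height)
            src = height - 1;
        memcpy(out + y * width, frame + src * width, width);
    }
}

You get a full-rate update out of it (59.94 fields a second for
NTSC), which is why motion looks smooth, but detail that lives on
a single scan line only shows up in every other update, hence the
flicker.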

Geez, I really need to stop watching this late-night
TV...  When there are "judges" called "Extreme Akim" on
the TV, well...  (So, will this combo still seem like it
works when I'm fully awake in the morning?)

-- Joe

--- Cory Papenfuss <papenfuss at juneau.me.vt.edu> wrote:
> > The key to MythTV (or any program, really) being able
> > to render a display without tears or choppiness is
> > really in two things:
> > 1. Being able to know when the vertical sync is, and,
> > 2. Being able to react to the vertical sync event in a
> >    timely manner
> >
> 	There's a 3rd issue here.  Most (all?) Linux video cards run
> with a free-running clock.  If you want to avoid beat frequency
> issues and tearing, you really want to have the MPEG stream itself
> trigger the card to send out a new vertical field.  Otherwise, you
> have the MPEG stream running at a 29.97 Hz field rate, and the video
> card running at a "close," but not phase/frequency locked, rate of,
> say, 29.98 Hz.  That leaves a 0.01 Hz beat frequency which can show
> up as screen tearing that moves slowly.
> 
> 	I guess if you've got the VBI, you can do without this by
> simply using the blanking interval time as a time buffer.  If the
> card runs too fast and the MPEG stream doesn't have another frame
> yet, show the old frame again.  If the card runs too slowly and
> *another* frame is ready before the previous has been shown, drop
> it.  As said before, the fundamental problem is that Linux is not
> realtime (hard or even soft).
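
Right, and doing the math on that: a 0.01 Hz beat means the tear
line takes on the order of 1/0.01 = 100 seconds to crawl through
the picture, which fits the "moves slowly" description.  Just to
check I follow the repeat/drop idea, here's how I read it, in
very rough C.  The frame-queue and vsync calls
(next_decoded_frame(), display_frame(), wait_for_vsync()) are
made-up placeholders, not anybody's actual API:

/* Sketch only: next_decoded_frame(), display_frame() and
 * wait_for_vsync() are hypothetical placeholders. */
#include <stddef.h>

struct frame;                           /* decoded picture, details elided */

extern struct frame *next_decoded_frame(void);   /* NULL if nothing new yet */
extern void display_frame(const struct frame *f);
extern void wait_for_vsync(void);       /* blocks until vertical retrace */

void pacing_loop(void)
{
    const struct frame *shown = NULL;

    for (;;) {
        struct frame *f = next_decoded_frame();

        /* Card running slow / decoder ahead: if more than one new
         * frame is waiting, drop the older ones and keep the newest. */
        while (f) {
            struct frame *newer = next_decoded_frame();
            if (!newer)
                break;
            f = newer;                  /* older frame dropped (freeing elided) */
        }

        if (f)
            shown = f;
        /* Card running fast / decoder behind: f was NULL, so the
         * previous frame just gets shown again. */
        if (shown)
            display_frame(shown);

        wait_for_vsync();               /* pace everything off the retrace */
    }
}

At least that's my reading of it.
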
> 
> -Cory
> 
> *************************************************************************
> * Cory Papenfuss                                                        *
> * Electrical Engineering candidate Ph.D. graduate student               *
> * Virginia Polytechnic Institute and State University                   *
> *************************************************************************


		

