[mythtv-users] Refresh Rate reported for interlaced output (was: New nVidia 169.07 and interlace changes)
Alex Halovanic
halovanic at gmail.com
Thu Dec 27 04:17:33 UTC 2007
My understanding was that for most purposes X doesn't really distinguish
between a progressive and an interlaced display: it outputs a 1920x1080
picture at ~60fps, and the hardware device handles sending the interlaced
picture to the TV, presumably by simply discarding half of the field lines
for each frame.
Most of the time when you just display an interlaced video in Myth, the
fields being displayed for the video don't line up with the fields being sent
to the device. Applying interlacing to a 60fps video gives you effectively
about 30fps for moving objects; applying another round of interlacing,
introduced by the unsynced display, makes it more like 15fps and very juddery
for motion.
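To make that arithmetic concrete, here is a rough back-of-envelope sketch (the rates are idealized round numbers from the description above, not measured values):

```python
# Idealized rates for 1080i content; a rough sketch, not measurements.

FIELD_RATE = 60                  # fields per second in a 1080i stream
frame_rate = FIELD_RATE / 2      # pairing fields into frames: ~30fps motion

# If the display's own interlacing isn't synced to the video's fields,
# roughly every other motion update lands on the wrong field phase,
# halving the effective motion rate again.
unsynced_rate = frame_rate / 2   # ~15fps: visibly juddery

print(frame_rate, unsynced_rate)  # 30.0 15.0
```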
Bob2x deinterlaces the picture by showing all 1080 field lines at 60fps. It
does this by showing just the odd field lines for one frame (a 1920x540
half-picture), then the even lines for the next, stretching each field
vertically to 1920x1080 (making each field line 2 pixels high instead of 1).
This causes some very slight bounciness in thin elements or at the edges of
motion, as some pixels rapidly switch between showing an edge and showing the
space adjacent to it.
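The bob operation described above can be sketched in a few lines (a simplified illustration with plain lists standing in for frame buffers; the function name is made up, not MythTV code):

```python
def bob_deinterlace(frame, parity):
    """Extract one field from an interlaced frame and line-double it.

    frame  -- list of rows (e.g. 1080 rows of pixels)
    parity -- 0 to keep the even rows, 1 to keep the odd rows
    Returns a full-height frame where each kept field line is repeated,
    i.e. made 2 pixels high instead of 1.
    """
    field = frame[parity::2]       # 1080 rows -> 540 field rows
    doubled = []
    for row in field:
        doubled.append(row)
        doubled.append(row)        # stretch vertically by 2x
    return doubled

# A toy 4-row "frame": two interleaved fields A (even rows) and B (odd rows)
frame = ["A0", "B0", "A1", "B1"]
print(bob_deinterlace(frame, 0))   # ['A0', 'A0', 'A1', 'A1']
print(bob_deinterlace(frame, 1))   # ['B0', 'B0', 'B1', 'B1']
```

Alternating the parity every output frame is what makes thin single-line details appear to bounce between adjacent rows.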
Even when displaying only static elements, you necessarily get this
bounciness on a 1080i display (check out some of the menu elements in the
Myth main menus, for example). I think any bounciness introduced by bobbing
is covered up by the inherent bounciness of the 1080i display, even when the
fields are displayed in reverse order: you're now basically just interlacing
a progressive 60fps video instead of re-interlacing an already interlaced
video. The result is a perfect 1080i display, at the expense of a lot of
apparently unnecessary CPU work.
Incidentally, I finally figured out how I got a perfect 1080i picture without
bob. It absolutely required the double refresh rate patch; otherwise the
video drifts in and out of sync every few seconds, even with everything else
identical. I've watched the CBS broadcast of the Buffalo-Cleveland NFL game
for over 30 minutes and 100,000 interlaced frames now and it's never once
lost sync. I'd appreciate it if others could test this, especially with
display devices other than the onboard component out (such as DVI with the
predefined 1920x1080_60 TV modeline).
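For anyone testing over DVI, a standard CEA-861 1080i/60 timing looks like this in xorg.conf (illustrative only; the predefined mode the NVIDIA driver actually uses may differ slightly, and the mode name here is made up):

```
Section "Monitor"
    Identifier "TV"
    # CEA-861 1080i/60 timing: 74.25 MHz pixel clock,
    # 2200 total columns x 1125 total lines = 30 frames (60 fields) per second
    ModeLine "1920x1080_60i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
EndSection
```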
My set-up:
-XV video sync enabled in nvidia-settings
-OpenGL video sync enabled (I require both to stop tearing, ymmv)
-Use Video As Timebase enabled (this is crucial, or else the fps drifts all
over the place)
-No zoom or overscan in Mythfrontend's playback settings, at least vertically
(there's no point trying to sync things if their sizes aren't identical)
-The double refresh rate patch applied to SVN (0.20 will probably do fine as
well)
Obviously you can't be dropping frames due to a bad recording or a loaded CPU
or hard disk, and I'm not sure whether it will work if you're stuck with a
station that broadcasts a mix of interlaced and progressive frames.
When you start playing the recording, there's a good chance it won't be in
sync. If it's not, pause and unpause the video, check any motion for judder,
and keep pausing and unpausing until playback starts in sync. I think there's
about a 50% chance of it being synced whenever playback starts or restarts,
so it shouldn't take more than 5 pause-unpauses to get it right. Now sit back
and watch the perfect 1080i picture, resisting the urge to pause or skip
around and throw things back out of sync ;)
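If each restart really is a coin flip, the odds of needing many pause-unpause cycles fall off quickly. A quick sanity check of the "5 tries" estimate, assuming independent attempts with a 50% success rate each:

```python
# Probability of still being out of sync after n independent
# pause-unpause attempts, each with an assumed 50% success rate.
p_fail = 0.5
for n in range(1, 6):
    print(n, p_fail ** n)
# After 5 attempts, playback would still be unsynced only ~3% of the time.
```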
-Alex