[mythtv] Idea for interlaced playback.

Bjørn Konestabo bjornko at math.uio.no
Sat Nov 19 08:12:36 EST 2005


Bryan Mayland wrote:

> Bjørn Konestabo wrote:
>
>> different tv-out chips I suppose. For instance, my correct overscan
>> setting is "25". Setting it to "0" gives me a tiny picture with
>> large black borders.
>
>    For the record, mine behaves exactly the same way. I'm on NTSC
> using the 640x480 modeline, but I'm following this thread intently,
> hoping you'll find a good solution.

It's nice to know I'm not the only one. Unfortunately, I haven't made
much progress. I recently tried swapping the FX5200 for the 6800GT
from my desktop machine, but its TV-out behaved exactly like the
FX5200's in every directly observable way: I still can't get a proper
25 Hz refresh.
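
For reference, the timing I'm after would be a standard interlaced
PAL modeline along these lines (a sketch only: the numbers are
textbook PAL, the name is just my label, and whether the nvidia TV
encoder honours custom timings at all is exactly the open question):

    ModeLine "720x576_50i" 13.875  720 744 808 888  576 580 585 625  interlace -hsync -vsync

That gives a 15625 Hz line rate across 625 total lines, i.e. 50
fields and 25 frames per second.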

I did, however, find out why only the bobdeint filter produces really
smooth-looking video. Despite my modeline, which gives a perfect
mapping between framebuffer lines and TV-out lines, I still get
blending between the fields. Setting the environment variable NO_XV=1
revealed that Xv is to blame: field separation was perfect after
that. I'm guessing the video overlay window is scaled slightly.
Luckily, I can photograph the phenomenon with my digital camera set
to 1/50th of a second and see directly whether the fields are
blended, so I don't have to rely on my eyes alone.
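
For anyone who wants to repeat the test, I just set the variable when
launching the frontend:

    $ NO_XV=1 mythfrontend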

OpenGL still ensures sync, but not using Xv carries a performance
penalty that leads to unacceptable frame skipping. I need to fix this
video overlay scaling issue before the filter I want to make becomes
useful. I also need to get MythTV to build, but that's another issue.
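
To illustrate why even a slight vertical scale ruins field
separation, here is a rough sketch (not MythTV or Xv code; the names
are made up) of a bilinear vertical scaler. Each output line is a
weighted mix of two adjacent source lines, and in a woven interlaced
frame those two lines come from opposite fields:

    #include <stdint.h>

    /* Bilinear vertical scale of one 8-bit plane. Each output line
     * blends two adjacent source lines; in a woven interlaced frame
     * those belong to opposite fields, so any ratio other than 1:1
     * blends the fields -- which is what I photographed. */
    static void scale_vertical(const uint8_t *src, uint8_t *dst,
                               int width, int src_h, int dst_h)
    {
        for (int y = 0; y < dst_h; y++) {
            double pos = (double)y * src_h / dst_h;
            int    y0  = (int)pos;
            int    y1  = (y0 + 1 < src_h) ? y0 + 1 : y0;
            double f   = pos - y0;   /* stays 0 when dst_h == src_h */

            for (int x = 0; x < width; x++)
                dst[y * width + x] = (uint8_t)
                    ((1.0 - f) * src[y0 * width + x]
                     + f * src[y1 * width + x]);
        }
    }

With dst_h == src_h the weight f is always zero and lines map 1:1;
any other ratio mixes the fields, which is why the overlay scaling
has to go before field-accurate output can work.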

Is the snapshot in SVN more Fedora Core 4 friendly, or should I stay 
with the 0.81 source release?

