[mythtv-users] Stupid question - deinterlacing
Michael T. Dean
mtdean at thirdcontact.com
Mon Feb 23 00:18:13 UTC 2009
On 02/22/2009 10:51 AM, Paul Gardiner wrote:
> Michael T. Dean wrote:
>> So, use a 1920x540 modeline (that X/NVIDIA drivers think is
>> progressive), and it will render properly and send the signal to the
>> TV, which it will see as 1080i--assuming the input allows, as
>> mentioned by Yeechang.
> Oh wow, I didn't pick that up before. Does that really work? So
> presumably you use Bob(x2) to split the interlaces from the 1080i
> source, which will turn the two fields into 1920x1080 frames.
> Then scaling takes this to a pair of 1920x540 frames, and these
> get sent one after another, interpreted by the TV as a single
> 1080i frame. So with this trick you don't even run into the
> synchronisation problem? Oh hang on, yes you do: the interlaces
> could become spatially reversed... no worse than
> other methods though.
> But I thought with an interlaced signal one interlace starts
> horizontally half way across the screen... or is that the
> case only for Standard Definition? Can this trick be used
> for 576i and 480i?
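For reference, the kind of modeline being discussed is a half-height "progressive" mode whose timings match a single field of 1080i. The numbers below are illustrative only (not from this thread) -- they come from halving the vertical timings of the standard 74.25 MHz CEA 1080i mode -- and would need checking against your own TV's EDID:

```
# Illustrative 1920x540 "progressive" modeline, built by halving the
# vertical timings of the standard 1080i mode (74.25 MHz pixel clock,
# 1125 total lines -> ~562 per field). X and the NVIDIA driver treat
# it as progressive; the TV sees the resulting signal as 1080i.
ModeLine "1920x540" 74.25  1920 2008 2052 2200  540 542 547 562 +hsync +vsync
```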
I'm thinking you play the video back without deinterlacing to the
half-height progressive display and all works well. With a full-height
display it would look like it needs deinterlacing, but with the
half-height mode you just send each field in turn and your TV does the
deinterlacing.
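To make the idea concrete, here is a small sketch (not MythTV code, just an illustration of the principle): no software deinterlacing happens at all -- each interlaced frame's two fields are presented one after the other over the 540-line mode, and the TV can reweave them into the original 1080-line interlaced picture.

```python
def split_fields(frame):
    """Split an interlaced frame (a list of scanlines) into its two fields.

    Even-numbered lines form the top field, odd-numbered lines the bottom
    field -- each field is half the height of the full frame.
    """
    top = frame[0::2]     # lines 0, 2, 4, ... (top/even field)
    bottom = frame[1::2]  # lines 1, 3, 5, ... (bottom/odd field)
    return top, bottom

# A toy 1080-line "frame": each scanline is represented by its line number.
frame = list(range(1080))
top, bottom = split_fields(frame)
assert len(top) == 540 and len(bottom) == 540

# Sent one after the other through the 1920x540 mode, the TV can
# reweave the two fields back into the original interlaced frame.
rewoven = [None] * 1080
rewoven[0::2] = top
rewoven[1::2] = bottom
assert rewoven == frame
```

The point of the sketch: the player never has to merge fields, so the spatial-ordering worry above reduces to whether the TV pairs the fields in the right order.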
Lots of people used to do this with on-the-edge processors. I'm
thinking there were a ton of threads talking about the idea when using
Westinghouse 1080p LCD TV's (maybe the 37" ones). And, I'm thinking
Jarod Wilson was one of the ones who did this.