[mythtv-users] VGA->SCART and interlace (2x)

Paul Gardiner lists at glidos.net
Wed Sep 23 16:05:45 UTC 2009


John wrote:
> Paul Gardiner wrote:
>> John wrote:
>>> Thanks for the reply. What I was doing was moving between
>>> autodetect, interlaced and progressive under the menu. The picture
>>> looks the same with Interlaced (2x) and Progressive (no
>>> de-interlacer) set. Same effect when I choose "none" as the
>>> de-interlacer under the TV settings menu.
>>>
>>> Looks like I need another graphics card,
>>
>> I don't think you've quite got the point I was making. 
> You too :-)
> 
> What I see is perfect "still" quality. The problem is interlacing 
> artefacts on movement, just as you have clearly described. What I was 
> trying to say was that there is absolutely no reduction in interlacing 
> artefacts between Interlaced (2x) and none. So doubling the field rate 
> and repeating the fields does not lead to a constant synchronization 
> between the recorded and displayed interlacing. The effect is 
> disconcerting, and is clearly noticeable on all scene changes, as well 
> as on movement.
> 
> VGA to SCART gives excellent quality, and the quality is also 
> acceptable on movement using high-quality de-interlacers, but the 
> whole point is that we shouldn't need to de-interlace to display it :-)

OK, that makes more sense now that I understand what you are seeing. You
need to get "none" working at least intermittently well before using
"Interlaced (2x)": with no synchronization the field alignment is down to
chance, so "none" should look right some of the time, and "Interlaced (2x)"
can't produce an alignment that "none" never shows.
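
It's also worth double-checking that X really is driving the TV with a
true 50Hz interlaced PAL mode. Just as a sketch (the mode name and the
timing numbers below are only an example, and will need adjusting for
your particular card and TV), a PAL interlaced modeline in xorg.conf
looks something like:

    # Example only: 720x576 PAL. 13.875MHz / 888 = 15625Hz line rate;
    # 15625 / 625 = 25 frames/s, i.e. 50 fields/s with Interlace set.
    ModeLine "720x576_50i" 13.875 720 744 808 888 576 580 585 625 -HSync -VSync Interlace

If the mode in use isn't genuinely interlaced at the TV's field rate,
no playback setting will make the fields line up.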

Do you see clear fine-lined combs, or a strange larger-scale sine-wave
type pattern on the edge of moving objects?
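
And, assuming your driver supports RandR, one quick sanity check is to
list which modes X believes are interlaced:

    xrandr --verbose | grep -i interlace

(Just a suggestion: the currently active mode is marked "*current" in
the full output, and if your driver doesn't expose this,
/var/log/Xorg.0.log shows which modelines were accepted.)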

P.


