[mythtv-users] Much improved 1080i display
jyavenard at gmail.com
Thu Apr 2 13:42:38 UTC 2009
2009/4/2 Paul Gardiner <lists at glidos.net>:
> Below is a post from the thread "New deinterlacer for perfect image
> quality when using an interlaced display, mode that matches the
> source". I just thought I'd repost it under a different title, because
> the original title gave the impression that use of the deinterlacer
> was only for interlaced displays. This should give improved
> quality on any TV with good 1080i support.
I haven't had much luck with it.
Video card is an nVidia 9400M (integrated IGP).
TV is a Sony LCD, native 1920x1080.
Set the output to be 1080i @ 50Hz
Created a video profile to use the Interlaced 2X de-interlacer
Started watching a 576i video
And I can't see the Sony doing any deinterlacing whatsoever; it's as if
none were applied.
-v playback output:
2009-04-03 00:35:38.727 VDP: LoadBestPreferences(720x576, 60)
2009-04-03 00:35:38.727 Using 0 CPUs for decoding
2009-04-03 00:35:38.727 AFD: InitVideoCodec() 0xa0fd0ce0
id(MPEG2VIDEO) type (Video).
2009-04-03 00:35:38.727 VideoOutputXv: InputChanged(720,576,1.77778)
2009-04-03 00:35:38.727 VDP: GetFilteredDeint() : xv-blit ->
Initialize Fieldorder Deinterlacer. In-Pixformat = 1 Out-Pixformat=1
2009-04-03 00:35:38.728 Using deinterlace method fieldorderdoubleprocessdeint
2009-04-03 00:35:38.728 VideoOutputXv: DiscardFrames(1)
'video_output' mean = '39998.12', std. dev. = '1424.51', fps = '25.00'
'video_output' mean = '40003.60', std. dev. = '218.25', fps = '25.00'
'video_output' mean = '39995.43', std. dev. = '227.13', fps = '25.00'
So it's definitely playing at 50 Hz interlaced (25 fps).
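For what it's worth, the fps figures in the log follow directly from the mean frame intervals: at roughly 40000 microseconds per frame, 1e6 / 40000 = 25 frames per second, i.e. 50 fields per second when interlaced. A quick illustrative check (the interval values are just the ones pasted from the log above):

```python
# Mean frame intervals in microseconds, taken from the -v playback log
means_us = [39998.12, 40003.60, 39995.43]

for mean_us in means_us:
    fps = 1_000_000 / mean_us   # frames per second
    fields = 2 * fps            # interlaced video carries two fields per frame
    print(f"interval {mean_us:.2f} us -> {fps:.2f} fps ({fields:.0f} fields/s)")
```

Each interval works out to ~25.00 fps (50 fields/s), which matches the 1080i @ 50 Hz output mode.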
Any suggestions?