[mythtv-users] New 50" Plasma, setting Myth for it?
dlknight at sdf.lonestar.org
Wed Oct 21 10:01:48 UTC 2009
On Tue, October 20, 2009 11:17 pm, Jean-Yves Avenard wrote:
> 2009/10/21 David Knight <dlknight at sdf.lonestar.org>:
>> It also has a VGA input which I used in the past and HDMI (currently
>> using). The VGA table has several mode lines (I will just list the
>> 1024x768 upwards):
>> 1024x768 @60Hz, 1024x768 at 70Hz, 1024x768 at 75Hz, 1024x768 at 85Hz
>> 1280x1024 @60Hz, 1366x768 @60Hz.
>> HDMI table supports 720/60p/50p, 1080/60i/50i/60p/50p/24p
>> Back in the specifications table again it says:
>> PC Signals = VGA, SVGA, XGA or SXGA(compressed)
>> So I take it my native resolution is 1024x768? if this is so would I be
> looks like it.
>> best using the VGA input with a 1024x768 modeline rather than the HDMI
> It's a matter of what you care about most...
> Personally, I am more sensitive to judder than to resolution,
> especially at the distance I sit from the TV.
> If your TV only accepts 1024x768 @ 60Hz, then watching a BD rip, or a
> torrented show often encoded at 23.976fps, is going to create judder
> on a display running at 60Hz.
> So for me, I think it is more important to get the refresh rates right
> than the resolution.
> For me the choice was easy: my 46" TV is a 1920x1080 panel, and it
> supports all known refresh rates.
> With your panel you have to compromise on either the resolution
> or the refresh rate.
> If you only watch FTA TV, then 1024x768 @ 60 or 50Hz is going to be
> the best solution, letting Myth do the upscaling/downscaling.
> When trying to watch 1920x1080/24p media, set your TV to that mode...
> Currently, you can't configure MythTV to change to a particular
> resolution based on the refresh rate; the profiles are all based on
> the resolution only.
> Otherwise, one thing you could do when trying to watch content at
> 24/23Hz is automatically switch to 1920x1080 mode at that refresh
> rate, as your TV supports it.
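The manual switching described above can be scripted around xrandr. A
minimal sketch (the output name VGA-0, the available modes, and the
rate_for_fps mapping are all assumptions; check yours with `xrandr -q`):

```shell
# Map a content frame rate to a "mode rate" pair for xrandr.
# The mode/rate choices here are assumptions for a panel like the
# one discussed above; adjust to what `xrandr -q` reports.
rate_for_fps() {
    case "$1" in
        23.976|24) echo "1920x1080 24" ;;   # film-rate material
        25|50)     echo "1920x1080 50" ;;   # PAL-rate material
        *)         echo "1024x768 60"  ;;   # everything else
    esac
}

# Example: a 23.976fps BD rip wants the 24Hz mode.
set -- $(rate_for_fps 23.976)
echo "xrandr --output VGA-0 --mode $1 --rate $2"
# xrandr --output VGA-0 --mode "$1" --rate "$2"   # uncomment to switch
```

This only prints the command; uncomment the last line to actually
switch modes.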
Well, my new 8500 GT card arrived yesterday, so I fitted it in the
evening. It doesn't have an HDMI output, so I set it up using VGA.
1024x768 looks stretched, so I managed to find a 1360x768 modeline (the
NVIDIA driver can't display 1366x768, as the horizontal resolution
always has to be a multiple of 8 unless using EDID). 1366x768 is listed
as the highest VGA resolution in the TV manual.
#Modeline "1366x768" 85.500 1360 1440 1552 1792 768 771 777 795 +hsync +vsync
This looks OK once the picture has been centred using the TV controls.
You also lose a couple of pixels at either side because it's not the
full native resolution, but this is hardly noticeable.
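For reference, that modeline slots into xorg.conf roughly like this
(the identifiers "TV" and "Screen0" are arbitrary names I've made up,
not anything from the manual):

```
Section "Monitor"
    Identifier "TV"
    Modeline "1366x768" 85.500 1360 1440 1552 1792 768 771 777 795 +hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "TV"
    SubSection "Display"
        Modes "1366x768"
    EndSubSection
EndSection
```

The Modes entry has to match the modeline's quoted name for the server
to pick it up.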
The main thing I have noticed is that mythfrontend is so much more
responsive now, running at the lower resolution!
The question is what happens when I play 1080i content (BBC HD @
1440x1080i), which is probably my main viewing source along with some
mkv 720p files. I guess myth/mplayer is downscaling this to fit the
screen resolution? Would it be better to let the PC or the TV do the
scaling?
I've been using the Top Gear Polar Challenge (recorded from BBC HD @
21Mb/s) as a test video.
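On the scaling question: 1440x1080i carries an anamorphic 16:9 picture,
so whichever device scales it has to resample both axes to fit the
panel. A quick sketch of the arithmetic, fitting a 16:9 picture into a
given panel size (pure illustration, not anything MythTV exposes):

```shell
# Fit a 16:9 picture into a panel, preserving aspect ratio.
# Panel dimensions come from the modeline discussed above.
fit_169() {  # args: panel_width panel_height
    awk -v w="$1" -v h="$2" 'BEGIN {
        ih = int(w * 9 / 16)          # height of 16:9 at full width
        if (ih <= h) print w, ih      # fits: letterbox top/bottom
        else         print int(h * 16 / 9), h   # pillarbox sides
    }'
}

fit_169 1360 768   # prints: 1360 765
```

So at 1360x768 a 16:9 picture uses 1360x765, losing only a few lines
to letterboxing, which matches how little difference the scaling makes
in practice.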
Another thing I have noticed is that the credits roll at the end of a
programme is now totally smooth; it juddered at my previous resolution.
Going forward I think the best bet is to use HDMI and then configure
separate profiles in MythTV as you have indicated. Before I swapped the
video card I briefly changed to 720p, and I can't see much difference
between 720p via HDMI and 1366x768 via VGA.