[mythtv-users] 1080/720 *seems* to be working. Is it?

Jim Shank jim.shank at gmail.com
Mon Jul 30 04:59:56 UTC 2007


I finally took the last step by getting a Samsung 46" LCD 1080p (LN-T4665F)
TV last week. I have my Myth setup connected to the TV via a single-link
DVI-to-HDMI cable from my ASUS N6200. I am using the nVidia built-in
modelines for simplicity:

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Samsung"
    ModelName      "LN-T4665F"
    HorizSync       30.0 - 70.0
    VertRefresh     59.0 - 71.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Videocard0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 6200"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Videocard0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option          "UseEvents" "True"
    SubSection     "Display"
        Depth       24
        Modes      "1920x1080_60" "1280x720_60" "720x480_60"
    EndSubSection
EndSection
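For what it's worth, the nvidia driver can be asked to log its full modeline validation, which shows exactly what it negotiated from the TV's EDID (assuming the 9755 driver already supports the ModeDebug option; check the driver README):

```
Section "Device"
    Identifier     "Videocard0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce 6200"
    Option         "ModeDebug" "true"   # dump EDID/modeline validation to Xorg.0.log
EndSection
```

After restarting X, /var/log/Xorg.0.log should list every mode the driver accepted or rejected and why.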

I ran "xvidtune -show" and got:

    "1920x1080"   148.50   1920 2008 2056 2200   1080 1084 1089 1125 +hsync +vsync
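As a sanity check on that modeline, the refresh rate can be derived from the pixel clock divided by the total raster size (htotal × vtotal). Plugging in the numbers xvidtune printed:

```shell
# Refresh = pixel clock / (htotal * vtotal)
# 148.50 MHz clock, 2200 total horizontal pixels, 1125 total lines
awk 'BEGIN { printf "%.2f Hz\n", 148500000 / (2200 * 1125) }'
# prints "60.00 Hz", i.e. a true 1080p/60 raster
```

2200×1125 is exactly the CEA-861 timing for 1080p60, so at least the modeline itself is the real thing.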

My TV reports 1920x1080 for everything coming out of Myth and, from what I
can tell, the picture looks perfect in each format. I am running Myth from
the MD4 package, Library API version 0.20.20060828-4, with the proprietary
9755 nVidia driver.

I have played some 1080p content in Xine (the Pirates 2 trailer) and it looks
fantastic. The local broadcast stations look fine in HD (HDHomeRun OTA), but
how can I really tell whether I am getting the native resolution or whether
Myth is just scaling everything to the desktop size? My only issue/concern,
other than whether it's really working, is that Bob 2x makes everything
jumpy; I went back to one-field and that deinterlaces fine.
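One way to attack the "is it scaling" question at the X level is to confirm what mode the server is actually driving, independent of what the TV's info display claims (a sketch; the exact log strings vary by driver version, so treat the grep patterns as assumptions):

```shell
# Desktop size as X itself sees it -- if this says 1920x1080, then any
# smaller source material is being scaled up by Myth/Xv, not by the TV.
xdpyinfo | grep dimensions

# Which modelines the server actually validated at startup
grep -i modeline /var/log/Xorg.0.log | head
```

If X is running a 1920x1080 desktop, scaling of 720p sources is unavoidable somewhere in the chain; the question is only whether Xv or the TV does it.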

Is it all really working? How do I know? What can I improve?
