[mythtv-users] Nvidia TV Encoder not listing any supported HD modes

Skitals hondacrxsi at gmail.com
Sun Mar 23 12:50:18 UTC 2008




Bonj wrote:
> 
> Justin Nolan wrote:
>> I'm giving up trying to output 1080i via DVI->HDMI without all types of 
>> weird judder, so I've turned to tv-out/component video. I've never seen 
>> any difference between component and DVI on my Sony 1080i RPTV, so it 
>> shouldn't be any loss. However, I've run into a major hurdle: I can't 
>> get any HD resolutions working with TV out. My graphics card is an XFX 
>> 7200GS, and I'm using an nvidia 7-pin HDTV breakout cable plugged into 
>> the TV-out port (the gfx card manual says the port supports 4-, 7-, and 
>> 9-pin cables/adapters).
>> 
>> I'm using the latest driver from nvidia, and my xorg.conf is as follows:
>> 
>>> Section "Monitor"
>>>     Identifier     "Generic Monitor"
>>>     HorizSync       30.0 - 130.0   
>>>     VertRefresh     50.0 - 160.0
>>>     Option         "DPMS"
>>> EndSection
>>>
>>> Section "Device"
>>>     Identifier     "nVidia Corporation G72 [GeForce 7300 SE]"
>>>     Driver         "nvidia"
>>> EndSection
>>>     
>>> Section "Screen"
>>>     Identifier     "Default Screen"
>>>     Device         "nVidia Corporation G72 [GeForce 7300 SE]"
>>>     Monitor        "Generic Monitor"
>>>     DefaultDepth    24
>>>     Option         "UseDisplayDevice" "TV"
>>>     Option         "TVOutFormat" "COMPONENT"
>>>     Option         "TVStandard" "HD1080i"
>>>     SubSection     "Display"
>>>         Depth       24
>>>         Modes      "1920x1080"
>>>     EndSubSection
>>> EndSection
>> 
>> According to the xorg log, it doesn't seem to complain about the HD1080i 
>> mode. But then:
>> 
>>> (WW) NVIDIA(0): No valid modes for "1920x1080"; removing.
>>> (WW) NVIDIA(0):
>>> (WW) NVIDIA(0): Unable to validate any modes; falling back to the 
>>> default mode
>>> (WW) NVIDIA(0): "nvidia-auto-select".
>>> (WW) NVIDIA(0):
>>> (II) NVIDIA(0): Validated modes:
>>> (II) NVIDIA(0): "nvidia-auto-select"
>>> (II) NVIDIA(0): Virtual screen size determined to be 800 x 600
>> 
>> What actually happens on the screen is that for the first few minutes 
>> it's a garbled mess of flickering red lines. After a few minutes the 
>> picture appears in a small 800x600 window in the center of the screen, 
>> surrounded by black. My TV displays grey bars on the left and right on 
>> everything except an HD signal, and there are no bars here, which tells 
>> me it actually is receiving a 1080i signal, just with only an 800x600 
>> image in it.
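
To see exactly why the driver throws "1920x1080" out, it might be worth
turning on the nvidia driver's mode-validation logging. Newer driver
releases document an option for this; a rough, untested sketch of the
Device section with it added:

    Section "Device"
        Identifier "nVidia Corporation G72 [GeForce 7300 SE]"
        Driver     "nvidia"
        # If the driver version supports it, dump per-mode validation
        # details to the X log so the failing check is named
        Option     "ModeDebug" "true"
    EndSection

With that set, the X log should spell out which validation check is
rejecting the 1920x1080 mode for the TV encoder.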
>> 
>> And here's the part that is really making me pull out my hair. The 
>> nvidia tv encoder on the card isn't reporting any supported HD modes:
>> 
>>> (--) NVIDIA(0): Connected display device(s) on GeForce 7300 SE/7200 GS at PCI:1:0:0:
>>> (--) NVIDIA(0): NVIDIA TV Encoder (TV-0)
>>> (--) NVIDIA(0): NVIDIA TV Encoder (TV-0): 400.0 MHz maximum pixel clock
>>> (--) NVIDIA(0): TV encoder: NVIDIA
>>> (II) NVIDIA(0): TV modes supported by this encoder:
>>> (II) NVIDIA(0): 1024x768; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 800x600; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 720x576; Standards: PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 720x480; Standards: NTSC-M, NTSC-J, PAL-M
>>> (II) NVIDIA(0): 640x480; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 640x400; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 400x300; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 320x240; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N, PAL-NC
>>> (II) NVIDIA(0): 320x200; Standards: NTSC-M, NTSC-J, PAL-M, PAL-BDGHI, PAL-N, PAL-NC
>> 
>> What would cause no HDTV modes to be listed? I've been struggling with 
>> this for going on 8 hours, so any input is much appreciated. Thanks!
> 
> I use a 7200GS outputting via a component breakout cable to a 76cm 
> widescreen Sony CRT at 1080i. At 1080i I get combing effects unless 
> deinterlacing is turned on. This is a side effect of using an 
> interlaced mode and cannot be avoided without deinterlacing.
> I have only recently upgraded my box to this hardware, so I'm still 
> playing with the deinterlacers. The current setting is the "one field" 
> deinterlacer, which gets rid of the combing but isn't the best visually 
> IMHO. I didn't have much time to play with it before the box got 
> commandeered for actual TV watching, so I have yet to experiment 
> further, but I can say that XvMC with Bob 2x didn't look real flash 
> either... it must be a problem with either my XvMC setup or the 
> driver/card itself, as it seemed to have excessive buffering pauses.
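
If the XvMC setup is the suspect, one quick sanity check (a sketch based
on how these setups are commonly wired, so adjust for your distro) is
that /etc/X11/XvMCConfig names the NVIDIA backend, since the XvMC
wrapper library picks its backend from that file:

    # /etc/X11/XvMCConfig: a single line naming the XvMC backend
    libXvMCNVIDIA_dynamic.so.1

If that points at the wrong library, XvMC tends not to initialise
cleanly and playback falls back to plain Xv.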
> 
> Anyway, below is my xorg.conf:
> 
> Section "Monitor"
>      Identifier     "Generic Monitor"
>      Option         "DPMS"
> EndSection
> 
> Section "Device"
>      Identifier     "Generic Video Card"
>      Driver         "nvidia"
>      Option         "NvAGP" "1"
>      Option         "DPI" "100x100"
>      Option         "UseEvents" "1"
>      Option         "AddARGBVisuals" "1"
>      Option         "AddARGBGLXVisuals" "1"
>      Option         "NoLogo" "1"
>      Option         "UseDisplayDevice" "TV"
>      Option         "TVOutFormat" "COMPONENT"
>      Option         "TVStandard" "HD1080i"
> EndSection
> 
> Section "Screen"
>      Identifier     "Default Screen"
>      Device         "Generic Video Card"
>      Monitor        "Generic Monitor"
>      DefaultDepth    24
>      SubSection     "Display"
>          Depth       24
>          Modes      "1920x1080" "1280x720" "1024x768" "720x480" "800x600" "640x480"
>      EndSubSection
> EndSection

I just had a thought... my TV seems way more capable of
deinterlacing/scaling than MythTV is. I accidentally set my Xbox 360 to
720p once and left it that way for months without noticing. Over
DVI->HDMI with mode 1920x1080_60i, MythTV can play back 1080i content
without any deinterlacing. It's 480i content that is giving me a
headache; it's the upscaling to 1080i that is multiplying the
interlacing artifacts, I suppose. I'm going to try setting it to switch
resolutions on the fly, because I just realized THAT is what my cable
box does.
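
For the on-the-fly switching: MythTV drives that through XRandR via the
separate GUI/playback video-mode setting (in the appearance settings, if
I remember right), so a mode has to show up in X before MythTV can jump
to it. A rough sketch of checking from a terminal, with mode names that
will differ per setup:

    # list the modes X actually validated for the TV output
    xrandr -q
    # switch to one of them by size, optionally forcing a refresh rate
    xrandr -s 720x480 -r 60
    # and back to the GUI mode
    xrandr -s 1920x1080

If a mode is missing from xrandr -q, it was rejected at X startup and
MythTV won't be able to switch to it either.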

I only came to this realization now because it wasn't until last night
that I got 1080i content looking decent at 1080i output. I had to
DISABLE OpenGL vertical sync (or whatever it's called). That got rid of
almost all the jumpiness. There is a little stutter during commercials
and such, which I'm guessing the OpenGL sync option is supposed to fix.
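
To rule the driver's own vsync in or out (separate from MythTV's OpenGL
sync setting), the NVIDIA driver reads an environment variable for
OpenGL apps; as an experiment only, something like:

    # disable the driver-level sync-to-vblank for this session
    export __GL_SYNC_TO_VBLANK=0
    mythfrontend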


