[mythtv-users] Which video card to buy?
George Mari
george_mythusers at mari1938.org
Sat Oct 6 23:29:15 UTC 2007
mlists wrote:
> On Sat, 2007-10-06 at 07:52 -0500, George Mari wrote:
>> Raphael wrote:
>> [deleted]
>>> Interesting, so a 5200 literally can't do 1080i? I'm slightly surprised;
>>> I would have assumed that between a 5200 and a 6200 there would be very
>>> little difference in mpeg2 decoding speed. But since Norm has a P4
>>> 3.2GHz and it can't play 1080i even though the processor is barely used,
>>> it seems like it must be the graphics card.
>>> I have a 5200 that can play 1080i with XvMC. The only problem is that
>>> XvMC randomly locks up my system, so mythfrontend needs a restart or,
>>> worse, the machine needs a hard reset. I haven't bothered to look into
>>> sorting that out yet.
>>>
>>> Raphael
>> I have a GeForce 4 MX 4000 outputting 1080P over DVI. I'm not even
>> using the latest drivers, as this card is no longer supported by them.
>> If it can do 1080P, I don't see how the 5200 cannot.
>>
>> Has the OP posted the contents of his/her Xorg.0.log? My guess is the
>> 1080 mode lines are failing validation. I had to turn off some of the
>> mode validation checks that the Nvidia driver goes through. Check the
>> Xorg.0.log for why the modelines are failing validation and then check
>> the Nvidia readme file for details on which options turn off what in
>> relation to modeline validation. I couldn't get above 720p before I did
>> this.
>
>
> OK, as the OP I guess I should clarify. :) For whatever reason, when I
> select 1920x1080 (and the _60 and _60_0 variants) it won't display
> properly on the TV -- I see half the image stretched across the full
> screen.
>
> So from the sound of things it's not the video card but something else
> -- I can display 1280x720 with no problem over DVI/HDMI.
>
> Here is my xorg.conf
>
> Section "Monitor"
>
> ### Comment all HorizSync and VertSync values to use DDC:
> Identifier "Monitor0"
> VendorName "Unknown"
> ModelName "DFP-0"
> # HorizSync 30.0 - 91.0
> # VertRefresh 55.0 - 85.0
> # DisplaySize 580 320
> EndSection
>
> Section "Device"
> Identifier "Videocard0"
> Driver "nvidia"
> VendorName "NVIDIA Corporation"
> BoardName "GeForce FX 6200"
> Option "XvmcUsesTextures" "false"
> Option "AddARGBGLXVisuals" "True"
> Option "DisableGLXRootClipping" "True"
> EndSection
>
> Section "Screen"
> Identifier "Screen0"
> Device "Videocard0"
> Monitor "Monitor0"
> DefaultDepth 24
> Option "ExactModeTimingsDVI" "true"
> Option "UseEDID" "true"
> Option "TwinView" "0"
> Option "UseDPI" "100"
> Option "ModeValidation" "DFP-0: NoDFPNativeResolutionCheck,
> NoVertRefreshCheck, NoMaxPClkCheck, NoHorizSyncCheck,NoEdidMaxPClkCheck,
> AllowInterlacedModes, NoMaxSizeCheck"
> SubSection "Display"
> Depth 24
> Modes "1280x720_60_0"
> EndSubSection
> EndSection
>
>
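First, on the mode-validation point from my earlier mail: the NVIDIA
driver can log exactly why each modeline passes or fails validation. A
rough sketch -- the "ModeDebug" option name is from the NVIDIA driver
README, and the log path assumes a stock install:

```shell
# In xorg.conf, add this to the Device section and restart X:
#   Option "ModeDebug" "true"
# Then pull the validation results for the 1080 modes out of the log:
grep -i -A 2 '1920x1080' /var/log/Xorg.0.log
```

The lines following each "Validating Mode" entry should say whether the
mode was accepted or rejected, and why.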
OK - with your "Modes" line as above, I believe 720p is the only
resolution X will allow now, since it's the only resolution listed.

What does your Xorg.0.log show when you allow the other resolutions?
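For example, something like this in the "Display" SubSection should let
X try 1080 again while keeping 720p as a fallback (the mode names follow
the _60/_60_0 naming from your config -- use whichever names your
Xorg.0.log shows passing validation):

```
SubSection "Display"
    Depth 24
    # 1080 modes listed first so X prefers them; 720p remains a fallback
    Modes "1920x1080_60" "1920x1080_60_0" "1280x720_60_0"
EndSubSection
```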