[mythtv-users] Better off with an ATI card?
matt at mossholder.com
Fri Dec 16 09:22:22 EST 2005
On Fri, 2005-12-16 at 10:20 +0100, Marius Schrecker wrote:
> I'm using a Radeon X300 with DVI out. Works fine using latest X-org and
> ATI proprietary drivers. The only caveats I can think of are to do with
> direct access (I've so far only got it to work with xv).
> Others have had problems with overscan, but with my lcdtv this hasn't been
> a problem. It copes fine with a custom modeline of 1368x768.
> The only other problems I've had have to do with the TV not supporting 50Hz
> refresh rate (so I can't use bob deinterlacing), and a module build
> problem using the latest 2.6.14 kernel (compiled fine on 2.6.13).
> There may be issues on X86_64 systems as the proprietary driver has 32 bit
> Which distro are you running?
Thanks for the feedback! In answer to your question, I am using a
diskless Ubuntu frontend in this case. Since my TV only supports 1080i,
deinterlacing isn't much of an issue for me (I tried using 540p, and the
results don't appear as good as the 1080i, for whatever reason). And
since Ubuntu is still using 2.6.12, I shouldn't have problems with
kernel compilation. So it seems that if I do my homework, and get a
card that has HDCP support, I might be better off.
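For anyone else following along, a custom modeline like the 1368x768 one Marius mentions can be generated with the `cvt` utility and dropped into xorg.conf. The timings below are what `cvt 1368 768 60` reports on my box; treat them as a starting point and verify against your display, and note that the "LCDTV" identifier is just a placeholder:

```
# Illustrative xorg.conf fragment -- generated with: cvt 1368 768 60
# Timings may need tweaking for a given panel; check your display's EDID.
Section "Monitor"
    Identifier "LCDTV"
    Modeline "1368x768_60.00"  85.25  1368 1440 1576 1784  768 771 781 798 -hsync +vsync
EndSection
```

You then reference the mode by name ("1368x768_60.00") in the Modes line of the matching Screen section.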
Thanks for the help!