[mythtv] OpenGL developments
mythtv at longhome.co.uk
Wed Feb 21 22:49:28 UTC 2007
The number of shaders, vectors, etc. is all to do with 3D rendering, and it isn't going to help much with 1080p decoding. In fact, most of the value of this card
is in its 3D capabilities, and you'll probably find that most cards supporting
XvMC have similar capabilities. The key is to make sure the card supports
both MC and iDCT, so you can take full advantage of the decode offload, and to check
that it's fully supported by Myth.
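One rough way to sanity-check XvMC support on a box like this is to look for XvMC in the X server log and see which XvMC library MythTV will pick up. This is only a sketch for an NVIDIA setup of this era; the log path and the `/etc/X11/XvMCConfig` location are assumptions that may differ on your distro.

```shell
# Sketch: check whether the X driver initialised XvMC (assumed log path).
grep -i xvmc /var/log/Xorg.0.log || echo "no XvMC mention in Xorg log"

# MythTV reads /etc/X11/XvMCConfig to choose the XvMC library
# (e.g. libXvMCNVIDIA_dynamic.so.1 on NVIDIA); path is an assumption.
cat /etc/X11/XvMCConfig 2>/dev/null || echo "no XvMCConfig found"
```

If neither check turns anything up, XvMC decode offload (MC and iDCT) almost certainly isn't available, whatever the card's 3D specs say.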
From: mythtv-dev-bounces at mythtv.org [mailto:mythtv-dev-bounces at mythtv.org]
On Behalf Of Richard Thornton
Sent: 21 February 2007 21:46
To: mythtv-dev at mythtv.org
Subject: Re: [mythtv] OpenGL developments
OK, so it sounds like this is more than possible. When picking a card,
should I go for the number of shaders or some other variable
(clock speed, maybe) to give me the best chance of offloading 1080p?
I am thinking the Nvidia 8xxx series.
Is there anything the community can do to help, such as donations?
On 21/02/07, Richard Thornton <richie.thornton at gmail.com> wrote:
> I am planning to build a good Myth front-end, the plan is to be able
> to play 1080p video.
> It will include a dual-core processor, 1GB of RAM, and an 8xxx-series
> GeForce card (probably an 8600GT when it comes out).
> I was wondering if the "OpenGL Video Renderer" system will bring GPU
> assisted H.264 decoding to Myth, or am I on the wrong track?
> Thanks for all the great work!