[mythtv-users] Which NVidia card
Rich.West at wesmo.com
Mon Jan 21 19:14:59 UTC 2008
Michael T. Dean wrote:
> On 01/21/2008 09:34 AM, Steve Smith wrote:
>> On 21/01/2008, Cal <cal at graggrag.com> wrote:
>>> Marius Schrecker wrote:
>>>> After a Loooong break I'm trying to breathe new life into my
>>>> MythTV box.
>>>> Why didn't I listen to you all before when you warned about ATI
>>>> cards? I'd have saved myself a lot of grief! Anyway:
>>>> Now I'm going to get a card that does proper XvMc (Nvidia the
>>>> only choice here??)
>>> Forget about XvMC. It's a relic of a past era where our poor
>>> struggling sub-2G single core cpus _needed_ some graphics
>>> processing to be offloaded in order to survive. So, the poor little
>>> graphics card endured constantly cooking up a storm doing XvMC,
>>> and without even a fan to cool its brow. Now, I believe you're
>>> better off letting a nice little dual core do the hard work. It's
>>> better equipped to deal with the heat output too.
>>> Worst of all, XvMC comes with some tedious artifacts. You don't get
>>> anything for free. My suggestion is have the graphics card do no
>>> more than it has to, and provide the necessary grunt via cpu
>>> instead. I have a 6200 and a 7300 (two systems), and I really can't
>>> pick between them. They're both excellent as long as I don't try
>>> to include XvMC in the mix.
>> That's an interesting take on the use of XvMC.
> I completely agree with Cal.
>> My opinion is use a specialized tool for a specialized job.
> But the problem is that the specialized tool was designed only for video
> decoding. It was not designed to be specialized for MythTV. Software
> decoding gives you full benefits of Myth--you can display /anything/ on
> the output--on top of the video, around the video, ... whatever people
> might think of in the future.
> Another way of looking at it is to consider the simpler example of
> audio decoding. If you do "hardware decoding" of AC-3/DTS (i.e.
> passthrough to an external amp/receiver/processor), you lose out on
> things like timestretch, adjusting the audio to compensate for A/V sync
> issues in the video, ... If, instead, you decode the AC-3/DTS in
> software, Myth has full control over the audio output--enabling it to
> speed it up, perform pitch control, ... eventually perhaps volume
> normalization or whatever. Of course, in the case of audio, the
> passthrough currently has the benefit of multi-channel surround output
> and the ability to use the much higher quality D/A circuits in the
> amp/receiver/processor as compared to the D/A in your sound card (though
> software decoding will gain this benefit after #1104).
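A toy sketch (Python, not MythTV's actual code) of the kind of manipulation that software decoding makes possible: once you have decoded PCM samples in hand, a speed-up is just resampling. (MythTV's real timestretch also corrects the pitch, which naive resampling does not.)

```python
# Toy illustration only: naive time-stretch of decoded PCM by linear
# interpolation. Playing the output at the original sample rate plays
# the audio `rate` times faster (and, unlike real timestretch, shifts
# the pitch up along with it).
def resample(samples, rate):
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # Linearly interpolate between neighboring samples.
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += rate
    return out

# One second of 48kHz audio becomes ~0.67s worth of samples at rate 1.5.
clip = [0.0] * 48000
fast = resample(clip, 1.5)
```

None of this is possible with passthrough, since the amp (not Myth) is the one holding the decoded samples.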
What shouldn't be forgotten is that one size (of mythtv) does not fit
all. MythTV can be set up in any number of ways (combined FE/BE,
separate FE & BE, multiple FE's and multiple BE's, etc., etc.), and it
doesn't make a lot of sense to generalize.
Off-loading tasks like XvMC to the video card relieves the stress on
the CPU and thus reduces the hardware requirements. In other words, you
don't *need* a dual-core processor in a measly frontend-only system
if all that system is doing is displaying video.
In my setup, I don't have a need for high-powered frontend systems (I
have multiple FEs connected over Cat5 to a BE system). SD & HD (via XvMC)
display just fine with my Nvidia 6200 (passively cooled) ($30 after
rebate at the time) with an AMD Athlon 64 3000+ (Socket 754) ($50 at the
time) cooled via heat pipe in a shuttle box ($99) with 512MB of RAM ($9.99).
Cost is most definitely a factor, and it tends to be forgotten in these
types of discussions. Building multiple high-powered (multi-core)
systems that are quiet (silent is a whole different arena), capable of
performing the tasks, and don't suck up $50 worth of electricity a
month certainly puts a strain on the wallet, and it makes people
seriously weigh the cost of a DIY mythtv system versus renting
that POS from the cable/sat provider for $10/mo.
Oh, and don't expect video cards to give anything back to the processor
any time in the near future.. nVidia, ATI, and others are in business
for a reason. ;-)
IMHO, I don't think audio is really a simpler example; I think it is
just a different example of the same issue. A receiver will do a heck
of a better job at amplification and decoding in hardware than software
will, because there is little to no overhead (no underlying,
ever-thickening OS that has to juggle all of the other stuff while
dealing with audio) and little to no chance of bugs being introduced
down the line. Anyhow, this particular discussion boils down to the
debate between hardware performance and software performance (it
reminds me of the battle of software raid (EMC) vs hardware raid
(Netapp, DDN, etc), which is nearly a religious debate). :)
...But, then again, this is way off topic from what the OP asked about... :)
With regard to the original poster's question: Nvidia is still the
easiest way to go for a graphics card. XvMC will utilize the GPU(s) on
the video card, reducing the processor requirement and allowing you to
do more (in a FE/BE combo, a dual-core system won't have to do much to
display HD video, so those cores can handle other, more important,
tasks like commercial flagging and multiple recordings). I don't have
much experience with mythtv and the 7000/8000 series cards, but my
6200's with 256MB of onboard RAM are workhorses and do just fine.
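For what it's worth, if you do go the XvMC route with an Nvidia card, the usual setup (a sketch; the file location and library name can vary by distro and driver version) is to point XvMC-aware apps at Nvidia's wrapper library, then pick an XvMC-capable decoder in mythfrontend:

```shell
# Tell XvMC client apps to load NVIDIA's XvMC wrapper library.
# (Path and library name may differ on your distribution.)
echo "libXvMCNVIDIA_dynamic.so.1" > /etc/X11/XvMCConfig

# Then, in mythfrontend, select an XvMC-capable MPEG-2 decoder under
# the TV playback settings and restart the frontend.
```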