[mythtv] H.264 and mythtv

Michael T. Dean mtdean at thirdcontact.com
Wed Aug 31 18:49:40 UTC 2005


Isaac Richards wrote:

>On Wednesday 31 August 2005 01:48 pm, Michael T. Dean wrote:
>  
>
>>MacOS X on top-of-the-line PPC is able to decode H.264 at 1080i at 24-30fps
>>iff it's QuickTime H.264.  Similarly, Windows machines have a hard time
>>with QuickTime H.264, but perform much better with WMV H.264. (Perhaps
>>some optimization or even reverse-optimization for the platform's
>>preferred codec?)
>>
>My 4400+ can play back Apple's 1080p (24fps) trailer for Serenity perfectly in 
>Myth. =)  CPU usage is _very_ close to getting maxed out, though.
>  
>
That's a 4400+ dual-core, right?  I'm guessing that only one core is 
being used for the decoding (leaving the other one free to handle other 
system-related tasks).  You're using MPlayer, right?  Have you run it 
with the -benchmark option to see how often/how many frames get 
dropped?  (Obviously not important if the quality looks good, but it 
would be interesting to know.)

Regardless, the fact that it's possible is bad news for me.  When I 
didn't think it was possible, I didn't have any desire to buy the latest 
and greatest.  Kind of like the 720p TVs that aren't really high-def.  
But now that Samsung has released their HL-R6168W with 1080p support, 
I'm counting the days until I'm home for long enough to place the order 
for the new HDTV.

BTW, I can't believe you found Serenity's HD trailer in QT format.  I've 
only been able to find the WMV (DRM-encumbered) version.  Yes, the 
geniuses actually require you to download a license using WMP just to 
play back their advertisement...  Either they realize that Linux geeks 
are going to want to see Serenity anyway, or they don't want us Linux 
geeks to go to the movie...

>>For H.264 in high def, we really need dedicated silicon to do the
>>decoding--especially if the CPU is doing other things.  The ATI R520
>>will have dedicated silicon to handle H.264 decoding (to be released
>>sometime in October, probably), but whether there will be support for it
>>in the Linux drivers (considering today's ATI drivers don't even have
>>XVMC support) and whether there will be a standard API for using it
>>(don't know of any at this point), is unknown.  NVIDIA cards will not
>>have the H.264 decoding built in.
>>    
>>
>From what I've read, the GF7 series will/does have decode accel as well.
>
NVIDIA's doing it all in the pixel shaders (PS).  They haven't yet 
demonstrated PS-based decoding working with H.264 (although they've done 
MPEG-4 and MPEG-2 successfully), but they're saying driver version 80 
will have support (obviously, that's a Windows driver version).  IMHO, 
it's not true "hardware decoding," just "software decoding" on a 
different chip (with far more efficient software than a general-purpose 
CPU requires), but regardless of semantics, I won't argue the fact that 
the PS approach does provide decode acceleration.

In truth, NVIDIA's solution is probably the better business decision: it 
supports Windows and Linux (via GLSL, now that NVIDIA supports OpenGL 
2.0 on both platforms), and since CPUs will "soon" have the performance 
required for software-based decoding, ATI's dedicated decoding silicon 
on the GPU may end up just costing money.  That's especially true when 
you realize that ATI has already demonstrated H.264 decoding both in 
dedicated hardware and using the PS's on the R520, so they've put in 
roughly double the research and development effort.  The funny part is 
that NVIDIA's better business decision was a reaction to ATI's inclusion 
of H.264 decoding in the R520, not a product of their long-term 
strategy...  ATI's attempt to put NVIDIA in a bad position may have 
caused NVIDIA to make a better decision than ATI made.  ;)

Mike

