[mythtv-users] HDTV performance in 0.18

Brandon Beattie brandon+myth at linuxis.us
Mon Apr 18 22:21:27 UTC 2005


MythTV is often compiled without MMX support or any optimization beyond
Pentium Pro.  If you didn't enable processor-specific optimization, MMX,
and OpenGL when running ./configure, that could explain the poor
performance.  I haven't noticed much, if any, change in CPU usage between
0.17 and 0.18 (granted, I run out of CVS), so I'd check that first.
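
Something along these lines is what I mean.  The flag names below are
from memory and are only assumptions, so run ./configure --help in your
own 0.18 tree first and adjust to whatever it actually offers:

    # If you built before, clear out the old objects so the new
    # configure options actually take effect.
    make distclean

    # See which optimization/OpenGL switches your configure knows about.
    ./configure --help | grep -i -E 'mmx|proc|opengl'

    # Assumed flag names for CPU-specific optimization and OpenGL vsync:
    ./configure --enable-proc-opt --enable-opengl-vsync

    make
    make install

The configure output should also say whether MMX was detected, so it's
worth skimming before you kick off the build.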

--Brandon

On Mon, Apr 18, 2005 at 03:58:31PM -0400, David Wood wrote:
> Does anyone find it strange not to have enough horsepower to render HDTV 
> in 0.18 on an Athlon XP 3000+ with 1GB of RAM? (Caveat: this is rendering 
> to 800x600 24-bit over S-Video on an fx5200.) Xv and OpenGL work, 
> RenderAccel is true, AGP is functional, and, by the way, xine can play 
> HDTV with Xv no problem.
> 
> I'm using libmpeg2 and suid/realtime, and the system can just _barely_ 
> handle it without deinterlacing. It's pegged at 95% CPU and still drops 
> occasional frames. If I turn on deinterlacing, it gets really bad, 
> dropping frames every few seconds.
> 
> As before, toggling libmpeg2, realtime, and OpenGL appears to have zero 
> effect on performance, despite log output from the frontend suggesting 
> they're working.
> 
> I keep finding references to people who are doing HDTV on 2100s and 
> 2400s and so forth. I wonder what the difference is - rendering at a 
> different screen resolution or color depth, perhaps?
> 
> The saving grace is that in 0.18 XvMC finally works, almost. I still get 
> ugly flicker and stutter from the OSD, and so far I'm averaging about one 
> hard lock (reboot required) per day... A major improvement.

