[mythtv-users] Optimizing Livetv for AMD?
Sepp Herrberger
Seppl at dcemail.com
Fri Jun 18 06:41:25 UTC 2010
Hi all,
I have the 890GM Pro3 mainboard (containing an ATI HD 4290):
http://www.asrock.com/mb/overview.asp?Model=890GM%20Pro3&cat=Specifications
As the TV card I'm using a TEVII S470.
I followed this guide:
http://www.linuxquestions.org/questions/linux-desktop-74/amd-hd-series-graphics-guide-optimizing-video-playback-for-mythtv-mplayer-and-others-786335/
But if I use these settings:
* Decoder: leave at Standard
* Video Renderer: set to OpenGL
* OSD Renderer: set to OpenGL2
* Next Screen (Deinterlacers)
o Primary Deinterlacer Set to: Bob(2x,HW)
o Fallback Deinterlacer Set to: Linear Blend(HW)
the CPU load goes up to 80%.
And if I use these settings:
* Decoder: change to VIA XvMC
* Next Screen (Deinterlacers)
o Primary Deinterlacer set to One Field
o Fallback Deinterlacer set to One Field
I always get these messages:
2010-06-13 12:16:14.071 VideoOutputXv: Desired video renderer 'xvmc-blit' not available.
codec 'H264' makes 'xv-blit,xshm,xlib,' available, using 'xv-blit' instead.
2010-06-13 12:16:14.071 VideoOutputXv: Desired video renderer 'xvmc-blit' not available.
codec 'MPEG2' makes 'xv-blit,xshm,xlib,' available, using 'xv-blit' instead.
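In case it helps diagnose this, here is the kind of check I can run to see whether my X server ever loaded an XvMC driver at all (the function name and the default Xorg log path are just my own sketch, not anything MythTV-specific):

```shell
#!/bin/sh
# check_xvmc: report whether a given Xorg log mentions an XvMC driver.
# The function name and the default log path are my own choices;
# adjust the path if your distribution logs elsewhere.
check_xvmc() {
    if grep -qi xvmc "$1" 2>/dev/null; then
        echo "XvMC mentioned in $1"
    else
        echo "no XvMC entries in $1"
    fi
}

check_xvmc /var/log/Xorg.0.log
```

If the log has no XvMC lines, the server never registered an XvMC surface, which would match the 'xvmc-blit not available' message above.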
Can anybody help me with this?
Kind Regards
/Seppl