[mythtv-users] [BUG] OSD fade causes choppiness
Greg Farrell
greg at gregfarrell.org
Wed Jul 19 10:34:20 UTC 2006
Hi,
I've found that the OSD fade effect causes a few seconds of video
choppiness and audio break-up every time it occurs on my VIA SP13000.
This happens on every channel change.
I noticed this in the move from .18.1 to .19-fixes a few months back
and assumed it was to do with the reworking of the recording method and
the removal of the ring-buffer method.
It wasn't until recently that I realised it happens during the OSD fade,
which starts a couple of seconds after a channel change.
While I know I can edit osd.xml to disable fading (from looking through
the source code), it took me a couple of months of assuming it was a bug
in MythTV, and repeated upgrades to the latest fixes, to see if it had
been sorted.
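For anyone else wanting to try the workaround: a hedged sketch of the
osd.xml change, assuming I'm reading the theme parser correctly. The
container name below is just an example from a default-style theme, and
the fadeaway element is my reading of the source, so treat both as
unverified:

```xml
<!-- osd.xml: container shown on channel change (name varies by theme) -->
<container name="program_info">
    <!-- setting fadeaway to "no" should make the OSD simply disappear
         instead of fading out (untested assumption from the source) -->
    <fadeaway>no</fadeaway>
</container>
```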
You could say that it is a configurable option the user can disable.
However, if they cannot find the option, and I really doubt the average
user will, it will keep appearing to them as buggy behaviour.
Should the fade option default to off, or should a configuration option
be added to force OSD fade on/off, with a short explanation of why?
I dare say this only manifests on my system because of the crapness of a
1.3 GHz VIA CPU, but lots of others must be experiencing it too. BTW, it
occurs with both XvMC and normal decoding.
It turns a machine that can happily play back live TV into one that
cannot, all for a minor piece of eye-candy.
Greg