[mythtv-users] video frame data
robert.mcnamara at gmail.com
Fri Mar 6 20:34:02 UTC 2009
On Fri, Mar 6, 2009 at 12:16 PM, Simon Tsai <simon_tsai at yahoo.com> wrote:
> I have a system with an ATI Radeon 3xxx graphics chip, it can decode
Not in Linux it can't. There is no currently working hardware
acceleration for ATI cards in Linux. That may change soon, but not
yet. Even when UVD (ATI's hardware acceleration) becomes available in
Linux, it looks like it will be limited to 4xxx-series cards or better.
> When we use an antenna to receive the TV signal
> and go through an ATSC converter to convert the signal to mpeg2,
> then go through the ATI graphics chip to display, I see frame data corruption because of the antenna signal quality.
Improve your signal quality by getting a better antenna, fixing your
alignment, or amplifying the signal.
> 1. Which functions in mythtv library (cpp files) to receive hardware decoder frame data?
There is no code in MythTV relevant to ATI hardware decode (because
ATI hardware decode does not exist in Linux yet). What's more, for
the hardware offload support that *does* exist (for nVidia's VDPAU),
there is no functionality to read the frame data back out of the GPU.
> Does the RingBuffer class read the frame data?
> 2. Is any way to detect the frame data is incomplete or corrupt, such as from total bytes in the frame?
> Which function can do this?
It's not possible to determine whether a full frame is corrupt
without decoding the signal-- for obvious reasons, that's not a good
idea while trying to capture Digital Television (why give up the
status quo, where capturing DTV takes only a fraction of the CPU?).
libavcodec/libavformat throw errors when they're not able to decode
something during commflag/transcode/playback.
In short, the solution to broken signal is not to throw out data, but
to fix the broken signal.