[mythtv-users] Raspberry Pi now ships with 512MB RAM

Nick Rout nick.rout at gmail.com
Thu Oct 18 06:36:21 UTC 2012


On Thu, Oct 18, 2012 at 5:44 PM, tortise <tortise at paradise.net.nz> wrote:
> On 18/10/2012 11:54 a.m., Nick Rout wrote:
>>
>> On Thu, Oct 18, 2012 at 9:29 AM, tortise <tortise at paradise.net.nz> wrote:
>>>
>>> On 18/10/2012 6:59 a.m., Jay Ashworth wrote:
>
>
>>>>
>>>> Should I assume we have no way to utilize hardware assist to transcode,
>>>> GPUs
>>>> or the like?
>>>>
>>>> Cheers,
>>>> -- jra
>>>
>>>
>>>
>>> Recalling that the R'Pis can also encode H264 makes me wonder if a Pi
>>> might be recruited on a back end (low power?) to encode / transcode?
>>>
>>> If it were, then I expect it would be best to interface via USB,
>>> because in any event the Ethernet interface on the Pi is connected
>>> internally through USB.
>>
>>
>>
>> It's pretty clear from a bit of googling that the software to do h264
>> encoding on the RPi is not particularly mature or stable, but the
>> posts are also pretty confusing because many people are trying to
>> encode from webcams, as opposed to a stream already encoded as
>> something else.
>>
>> I don't think your USB idea is a runner; the RPi is a USB host, not a
>> device, AFAIK. A network socket accepting decoded YUV frames would be
>> better, I would have thought, but the bandwidth would be quite high.
>
>
> My comment was intended to acknowledge that the Ethernet interface is
> hard-wired through USB to the Pi's CPU (according to the Wikipedia
> reference), meaning the USB may be a bottleneck, and/or it might be
> better to talk to the Pi directly over USB rather than through the
> Ethernet intermediary; however, that would not preclude interfacing
> via Ethernet.
>

I am not sure that the USB interface provides the required facilities,
but many protocols work over Ethernet, or more precisely TCP/IP.
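
For what it's worth, a rough back-of-the-envelope sum (assuming a
1080-line source at 25 fps in YUV 4:2:0; the numbers are illustrative,
not measured) shows why shipping raw decoded frames over the Pi's
USB-attached 100 Mbit Ethernet would hurt:

# Rough estimate of the bandwidth needed to push raw decoded YUV 4:2:0
# frames to a Pi. Frame size, frame rate and link speeds are assumptions
# for illustration only.
width, height = 1920, 1080      # assumed 1080-line source
bytes_per_pixel = 1.5           # YUV 4:2:0 is 12 bits per pixel
fps = 25                        # assumed PAL frame rate

frame_bytes = width * height * bytes_per_pixel
mbit_per_sec = frame_bytes * fps * 8 / 1e6
print("raw YUV 4:2:0 at %dx%d, %d fps: about %.0f Mbit/s"
      % (width, height, fps, mbit_per_sec))
# ~622 Mbit/s, far beyond 100 Mbit Ethernet, and beyond USB 2.0's
# theoretical 480 Mbit/s as well.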

>> Or are you thinking of decoding AND encoding on the RPi?
>
>
> No, at least not at the same time.  The context of this discussion has
> not included using the backend for front-end use, and that was not the
> context my comment was intended for.
>

To re-encode something you need to decode it first. You were
suggesting encoding video to h264 using the RPi hardware. I was asking
where the decoded video frames would come from in your scenario:
either from the backend, or from the RPi itself.
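
If the frames came from the backend, the shape of it would be something
like the sketch below: a little TCP listener on the Pi reading
fixed-size raw frames and handing each one to the hardware encoder.
The encode_frame() hook, the frame size and the port are placeholders
of mine for illustration, not an existing RPi API.

# Sketch only: accept raw decoded YUV 4:2:0 frames from the backend over
# a TCP socket and pass each one to the (hypothetical) encode path.
import socket

WIDTH, HEIGHT = 1920, 1080              # assumed; must match the sender
FRAME_BYTES = WIDTH * HEIGHT * 3 // 2   # YUV 4:2:0 is 12 bits per pixel

def encode_frame(raw_frame):
    # Placeholder for whatever hardware H264 encode path the Pi exposes
    # (OpenMAX or otherwise); only here to show the data flow.
    pass

def recv_exact(conn, n):
    # TCP may deliver fewer bytes per recv(), so loop until a whole
    # frame has arrived, or the sender closes the connection.
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            return None
        buf += chunk
    return buf

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", 5000))             # arbitrary port for the sketch
srv.listen(1)
conn, addr = srv.accept()
while True:
    frame = recv_exact(conn, FRAME_BYTES)
    if frame is None:
        break
    encode_frame(frame)
conn.close()
srv.close()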

> However, a Pi might be recruited to do the H264 et al. decoding for a
> backend and effectively be its video card; perhaps not such a silly
> idea given that it works and uses much less power (3.5 W) than an
> nvidia VDPAU card would (50+ W)!
>

In which case it is simply a frontend (whether mythfrontend, XBMC, a
UPnP client, a MythTV Services API client, or whatever).
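
On that last option, "a mythtv services api client" can be very little
code. A minimal sketch, assuming a 0.25+ backend reachable as mybackend
on the default Services API port 6544 (endpoint and field names are
from memory, so check them against the wiki before relying on them):

# Sketch of a trivial MythTV Services API client.
import json
import urllib.request

BACKEND = "http://mybackend:6544"       # assumed backend hostname

def get_json(path):
    # Ask the Services API for JSON instead of the default XML.
    req = urllib.request.Request(BACKEND + path,
                                 headers={"Accept": "application/json"})
    return json.loads(urllib.request.urlopen(req).read().decode("utf-8"))

print(get_json("/Myth/GetHostName"))

for prog in get_json("/Dvr/GetRecordedList")["ProgramList"]["Programs"]:
    print(prog["Title"], prog["StartTime"])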

