[mythtv-users] Video Card Recommendation Needed
raymond at wagnerrp.com
Thu Mar 3 01:55:11 UTC 2011
On 3/2/2011 18:46, Thrash Dude wrote:
> On Wed, 2 Mar 2011 15:38:59 -0800
> Ajay Sharma<ajayrockrock at gmail.com> wrote:
>> On Wed, Mar 2, 2011 at 9:17 AM, Thrash Dude<thrash.dude at gmail.com>
>>> On Tue, 1 Mar 2011 10:58:23 +1100
>>> Jean-Yves Avenard<jyavenard at gmail.com> wrote:
>>>> On 1 March 2011 07:51,<moodyjunk at frontiernet.net> wrote:
>>>>> I'm running Fedora 12 with kernel 220.127.116.11-175 and MythTV 0.24.
>>>>> I need to replace my current video card (9500GT) due to a bad
>>>>> fan. I'm looking at Asus ENGT430
>>>>> vs Asus ENGT240
>>>>> Does anybody have any recommendations of GeForce GT 430 vs 240
>>>>> for this configuration? ie, is GT 430 supported well in kernel
>>>>> 18.104.22.168-175 and MythTV 0.24
>>>> Get a GT430.
>>>> The GT240 is overkill, and doesn't have any of the nice audio
>>>> features found in the GT4xx series.
>>> I'd avoid any of the fermi chips unless you also plan on gaming.
>>> That is until/if Nvidia fixes the mixer code for the fermi chips.
>>> The GT430 is on par with an ION, 8400, GT210.
>>> For VDPAU, a 9500GT or GT220+ is the better and less expensive choice.
>> I'm planning on getting a GT440 pretty soon. Sorry for being dense,
>> but what's wrong with the mixer code for the fermi chips? I read the
>> link that you posted but it didn't describe the problem.
> If you read that thread, it shows VDPAU test results for many different
> cards. The results show the new fermi chips being extremely weak in the
> mixer code (IVTC, etc.). Granted, if all you ever decode is 24p or SD
> material, you'll most likely not see a problem. If you decode HD
> material that needs IVTC, or HD material broadcast at 60/59.94 fps
> (some ABC/FOX shows), the fermi cards will hit a bottleneck.
I really don't think you understand what you are saying. Some basic
terminology: deinterlacing takes an interlaced signal as input and
outputs a progressive signal at half the field rate. Inverse telecine
takes an interlaced signal at 60 fields per second and outputs the
original progressive signal at 24 frames per second.
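To make the distinction concrete, here is a minimal Python sketch (illustrative only, not MythTV or VDPAU code) of how 2:3 pulldown turns 24p frames into 60 fields per second, and how inverse telecine recovers the original frames by collapsing the duplicated fields:

```python
def telecine(frames):
    """Expand progressive frames into fields using the 2:3 cadence.
    Frames are string labels; fields are (label, 'top'/'bottom') pairs.
    Every 4 frames become 10 fields, so 24 fps becomes 60 fields/s."""
    fields = []
    for i, label in enumerate(frames):
        count = 2 if i % 2 == 0 else 3   # alternate 2 fields, 3 fields
        for _ in range(count):
            parity = 'top' if len(fields) % 2 == 0 else 'bottom'
            fields.append((label, parity))
    return fields

def inverse_telecine(fields):
    """Recover the progressive frames: collapse each run of fields that
    came from the same source frame back into a single frame."""
    frames = []
    for label, _ in fields:
        if not frames or frames[-1] != label:
            frames.append(label)
    return frames

src = ['A', 'B', 'C', 'D']       # 4 frames of 24p content
fields = telecine(src)           # 10 fields = 1/6 second of 60i
assert len(fields) == 10
assert inverse_telecine(fields) == src
```

Note that the output of inverse telecine is already progressive, which is why running a deinterlacer and IVTC at the same time makes no sense.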
The program you linked to seems to run what it claims are "TEMPORAL +
IVTC" and "TEMPORAL_SPATIAL + IVTC" tests. If you run a temporal or
temporal-spatial deinterlacer, you output progressive content and
cannot perform IVTC. Similarly, if you run IVTC, you output progressive
content and cannot perform deinterlacing. Yet those two tests yield widely
different results, when in effect they should be doing the exact same
thing, with the exact same performance. The only thing I can surmise is
that the test is not using a telecined source; the driver detects this
and drops out of IVTC, instead running the selected deinterlacer. In
effect, all you're measuring is the overhead of the cadence analysis
compared to running the deinterlace filter directly.
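The surmised fallback can be modeled in a few lines of Python. This is a hypothetical sketch of the behavior described above, not actual driver code: the driver checks whether the field stream follows a telecine cadence, runs IVTC if it does, and otherwise falls back to the selected deinterlacer.

```python
def has_telecine_cadence(field_runs):
    """A telecined 24p source shows alternating runs of 2 and 3
    duplicate fields; field_runs lists the observed run lengths."""
    return (all(r in (2, 3) for r in field_runs)
            and 2 in field_runs and 3 in field_runs)

def process(field_runs, deinterlacer='TEMPORAL'):
    """Return which filter a cadence-detecting driver would apply."""
    if has_telecine_cadence(field_runs):
        return 'IVTC'          # recover the original progressive frames
    return deinterlacer        # not telecined: just deinterlace

assert process([2, 3, 2, 3]) == 'IVTC'
assert process([1, 1, 1, 1]) == 'TEMPORAL'   # genuinely interlaced source
```

Under this model, a non-telecined test source always takes the fallback path, so the "+ IVTC" benchmark measures the deinterlacer plus the cadence check, nothing more.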
Next up, ABC and FOX both broadcast at 60 frames per second, progressive.
You cannot deinterlace progressive content. The engineers at nVidia
understand this, and will not attempt to deinterlace progressive
content. The card cannot possibly bottleneck; it only has to
do basic video scaling.
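The per-frame decision reduces to this (a schematic sketch, assuming a player that tags each frame as progressive or interlaced; not actual driver code):

```python
def present(frame_progressive: bool) -> list:
    """Return the filter chain applied to one frame."""
    chain = []
    if not frame_progressive:
        chain.append('deinterlace')   # only genuinely interlaced input
    chain.append('scale')             # always: basic video scaling
    return chain

assert present(frame_progressive=True) == ['scale']   # 60p ABC/FOX case
assert present(frame_progressive=False) == ['deinterlace', 'scale']
```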