Rick Bilonick wrote:
> Thanks for your comments. I should have mentioned that I'm using the vga
> output port of the 5700 card. It also has a dvi port. Remember, I'm
> dealing with HDTV. TV-out at best is only s-video (which the 5700 has -

Yes, but analogue video, especially HDTV, would react badly to the higher
frequencies being attenuated by a crappy cable or connector: you would see
'smearing' and ghosting / ringing on abrupt changes in intensity. I just
wondered if that might have been what you were reporting as de-interlacing.
I don't mean to tell you that you don't know what it looks like.

> better). Having said that, I had switched cables during testing and so
> it cannot be the cable. I used the same cable for both systems. Only the
> FC4 system has the problem.

Fine then, not the cable.

> I haven't swapped the cards. I don't think that is the problem.

It's still an interesting test, to see whether the problem moves with the
card.

> the FC4 system? Either FC4 is doing the de-interlacing or the nVidia
> card via the xorg.conf file is doing the de-interlacing. But I cannot
> find an option to control the de-interlacing (to turn it off).

Is there nothing in the nvidia usermode applet I seem to recall exists
now? (See the P.P.S. for the sort of xorg.conf options I mean.)

What can conspire to make the difference?

- Setting inherited from video BIOS card init (different video BIOS
  action)
- Different version of xorg, hence different driver behaviour
- Different nVidia driver version (quick ways to compare the two boxes
  are in the P.S. below)
- Different TV-out chip on the cards (e.g. Conexant or NVIDIA)
- Different settings squirreled away on your filesystem somewhere, being
  interpreted by the driver and causing a different configuration
- Infestation by malign demons (perhaps incanted by ATI)

Any more?

-Andy
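
P.S. If you want to chase the "different xorg / different driver version"
theories on that list, the quickest thing is to run the same few commands
on both boxes and diff the output. A rough sketch - paths are the usual
ones but may differ on your releases, so adjust to taste:

    # What the kernel side of the nvidia driver reports:
    cat /proc/driver/nvidia/version

    # X server version:
    X -version 2>&1 | head -n 3

    # What the X driver said at startup, including any TV / mode /
    # interlace chatter:
    grep -iE 'nvidia|tv|interlac' /var/log/Xorg.0.log

    # And the obvious one: copy one machine's config next to the
    # other's and diff them (the /tmp filename here is made up -
    # use wherever you stash the copy):
    diff -u /etc/X11/xorg.conf /tmp/xorg.conf.other-box

If the driver and server versions match on both machines and the configs
diff clean, that pushes the suspicion back toward the card itself or the
video BIOS.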
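P.P.S. On the "option to control the de-interlacing" point: I don't know
of a dedicated deinterlace switch in the nvidia driver, but the driver
does take TV-related options in the Device section of xorg.conf if you
end up testing TV-out. This is from memory of the nvidia README, so treat
the exact option names and values as things to verify against the docs
that ship with your driver version:

    Section "Device"
        Identifier  "Videocard0"
        Driver      "nvidia"
        # Pin the output connector and TV standard rather than
        # letting the driver guess - names/values from memory,
        # double-check against your driver's README:
        Option      "TVOutFormat" "SVIDEO"
        Option      "TVStandard"  "HD1080i"  # or "HD720p", "NTSC-M", ...
    EndSection

If nothing in there changes the behaviour, that points back at the X
server or at settings squirreled away somewhere else on the filesystem.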