Rick Bilonick wrote:
> Is there nothing in the nvidia usermode applet I seem to recall exists now?
>
>> What can conspire to make the difference?
>>
>>  - Setting inherited from video BIOS card init (different video BIOS action)
>>
>>  - Different version of xorg, different driver behaviour
>>
>>  - Different nVidia driver version
>>
>>  - Different TV-out chip on the cards (e.g., Conexant or nVidia)
>
> The demon explanation seems most likely, but otherwise I'm at a loss.
> I'm not sure what you mean by the TV-out chip. I'm only using the 5700's

Last time I looked, an external TV format conversion chip was used: it accepted pixel data and a clock and output the Composite or RGB TV signals with syncs (in fact, I was writing a clean BIOS for the Xbox and had to do as much of this as I could understand). Maybe in the last couple of years they have all integrated the TV encoding into the main silicon. Anyway, these TV-out chips included RAM line buffers and performed the interlacing flicker reduction entirely on-chip, controlled over an I2C bus: the nVidia graphics controller knew nothing about it.

Hm, yes, now I remember poring over the sources for this app (which you might like to try, because it has a buttload of checkboxes and the like for the TV-out section):

http://sourceforge.net/projects/nv-tv-out/

trying to work out what I was doing wrong. I never directly found what I was doing wrong... in the end I dumped the registers of EVERYTHING to do with the video after it was brought up by the MSFT BIOS, and by leaving bits out eventually arrived at the canonical list of important registers needed to bring the video up myself from scratch.

> By the nvidia applet, I guess you are referring to an application with
> the menu name "NVIDIA X Server Settings" under Applications|System
> Tools? It's the only user program I've noticed that was installed when I
> installed the nvidia driver on the FC4 system. I've checked through all
> the options, changed some to see the effect, but none of them appear to
> control the pseudo de-interlacing. (I'm saying it's pseudo
> de-interlacing because xvidtune reports the correct 1920x1080i modeline
> information so I don't think the output is actually de-interlaced.)
> I guess I should contact nvidia.

Maybe ask in the forums; there were some knowledgeable people there last time I looked.

-Andy
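P.S. To illustrate the I2C point above: the TV encoder is a separate device that the driver programs over an I2C bus, so on Linux you can often watch it answer a bus scan. Below is only a minimal sketch of such a scan (not anything from nv-tv-out, and not code I have run against your card); it assumes the graphics driver exposes its I2C/DDC bus through the i2c-dev interface, and the bus number /dev/i2c-0 is just a guess. A response only means "something ACKed a read at this address", not that it is definitely the TV encoder, and as i2cdetect warns, probing by read can upset some devices.

/* Hedged sketch: probe a graphics-card I2C bus for responding devices.
 * Assumptions: the i2c-dev module is loaded and the card's bus shows
 * up as /dev/i2c-0 (check /sys/class/i2c-adapter to be sure).
 */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    int fd = open("/dev/i2c-0", O_RDWR);         /* bus number is a guess */
    if (fd < 0) { perror("open /dev/i2c-0"); return 1; }

    /* Walk the valid 7-bit address range and see who ACKs a 1-byte read. */
    for (int addr = 0x08; addr < 0x78; addr++) {
        unsigned char byte;
        if (ioctl(fd, I2C_SLAVE, addr) < 0)
            continue;                  /* address claimed by a kernel driver */
        if (read(fd, &byte, 1) == 1)
            printf("device responds at 0x%02x\n", addr);
    }

    close(fd);
    return 0;
}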
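P.P.S. And a footnote on the register-dump trick: the idea was simply to snapshot the card's MMIO register space once after the BIOS had set the video up, again after my own init code ran, and diff the two dumps, then keep dropping registers from the init list until the picture broke. The fragment below is only a rough sketch of that kind of dumper, not the tool I actually used; MMIO_BASE, REG_START and REG_LEN are placeholders you would have to replace with the real BAR address and register window from lspci -v for your own card, and it needs root for /dev/mem.

/* Hedged sketch: dump a window of graphics MMIO registers so that two
 * dumps (e.g. before and after an init sequence) can be diff'd.
 * MMIO_BASE/REG_START/REG_LEN are placeholders, not real nVidia values.
 */
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

#define MMIO_BASE 0xFD000000UL   /* placeholder: BAR0 from lspci -v     */
#define REG_START 0x00600000UL   /* placeholder: register block offset  */
#define REG_LEN   0x00002000UL   /* placeholder: window length in bytes */

int main(void)
{
    int fd = open("/dev/mem", O_RDONLY | O_SYNC);
    if (fd < 0) { perror("open /dev/mem"); return 1; }

    volatile uint32_t *regs = mmap(NULL, REG_LEN, PROT_READ, MAP_SHARED,
                                   fd, MMIO_BASE + REG_START);
    if (regs == MAP_FAILED) { perror("mmap"); return 1; }

    /* Print "offset value" pairs; run once after the BIOS init and once
     * after your own init, then diff the two outputs.                   */
    for (unsigned long off = 0; off < REG_LEN; off += 4)
        printf("%08lx %08x\n", REG_START + off, (unsigned)regs[off / 4]);

    munmap((void *)regs, REG_LEN);
    close(fd);
    return 0;
}

Run it twice, redirect the output to two files, and diff them to see what actually changed.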