Just wondering if this is a known issue: At work I got a new LCD monitor that runs 1920x1200 native resolution. The nvidia driver (I've loaded the binaries from livna) insists that the monitor is telling it that the max pixel clock is 135 MHz. If I plug the exact same monitor into an ATI card on a different computer, the EDID info that the ATI driver reports says the max pixel clock is 175 MHz.

Since I need a pixel clock of 154 MHz just to run the monitor at native resolution, I tend to suspect the ATI reading of the EDID info is correct, and nvidia is off its rocker :-). In any case, I discovered the magic handshake to add to xorg.conf that tells nvidia to stop checking the max pixel clock, and the native resolution of 1920x1200 seems to work just fine (no lava coming out the back of the monitor or anything :-). I'm just wondering what the heck put 135 MHz into its head?
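For anyone hitting the same thing, the "magic handshake" is most likely the ModeValidation option documented in the NVIDIA driver README; a sketch of the relevant Device section (the Identifier name here is just an example, yours will differ):

```
Section "Device"
    Identifier  "Videocard0"     # illustrative name
    Driver      "nvidia"
    # Tell the nvidia driver to skip its maximum-pixel-clock check,
    # so the ~154 MHz mode needed for 1920x1200 validates:
    Option      "ModeValidation" "NoMaxPClkCheck"
EndSection
```

After restarting X, the driver should no longer reject modes that exceed the (apparently bogus) 135 MHz limit it read from the EDID.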