a bit of a digression, but is there any sane rationale behind the myriad of acronyms for display resolution?
marketing. XGA was the IBM eXtended Graphics Array, which supported 1024x768, albeit interlaced. for the longest time, all resolutions above that were typically described by horizontal pixels, vertical pixels, and refresh rate. then some marketing droid got a wild hair, and now here we are... LCDs simplify things a lot, since most of them are designed to run at 60 Hz as their native refresh rate. there used to be a serious qualitative difference between a CRT at 1600x1200 at 60 Hz and one at 75 or 80 Hz that doesn't exist when comparing LCD panels in general.
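back-of-the-envelope, the pixel clock a CRT mode needs scales with horizontal total x vertical total x refresh rate, which is why pushing 1600x1200 past 60 Hz demanded noticeably better electronics. a minimal sketch in Python, assuming rough typical blanking overheads rather than exact GTF/CVT timings:

# rough pixel clock for a CRT mode: active pixels plus blanking.
# the 25%/5% overheads are illustrative assumptions, not real timings.
def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                           h_blank=0.25, v_blank=0.05):
    h_total = h_active * (1 + h_blank)  # active line + horizontal blanking
    v_total = v_active * (1 + v_blank)  # active lines + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 75, 80):
    print(f"1600x1200 @ {hz} Hz -> ~{approx_pixel_clock_mhz(1600, 1200, hz):.0f} MHz")

running that gives roughly 151, 189, and 202 MHz, which gives a feel for why the higher-refresh modes separated the good CRTs and video cards from the cheap ones.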
the last two times i went looking to buy a laptop (from dell), i was thoroughly annoyed that they'll list, right up front, that a unit has something like XGA, or SXGA, or XGA+, or UXGA, or whatever, when all i want to know is: what is the freaking resolution in pixels?
is there a standard for these acronyms? and does everyone follow that standard, or do we have vendors just making this stuff up out of thin air? i fully expect to see new laptops offering SDXGA (sooper dooper XGA) as the next available resolution.
is there a list? who do i have to kill to read it?
rday
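there is a rough de facto list: the names come from IBM's original adapters and later VESA-era conventions, though vendors do stretch them (WXGA in particular gets slapped on several different widescreen sizes). a partial lookup table of the commonly cited values, sketched in Python; this is a quick reference, not an official registry:

# common marketing acronyms -> pixel dimensions (commonly cited values;
# WXGA in particular varies by vendor: 1280x768, 1280x800, 1366x768)
RESOLUTIONS = {
    "VGA":   (640, 480),
    "SVGA":  (800, 600),
    "XGA":   (1024, 768),
    "XGA+":  (1152, 864),
    "SXGA":  (1280, 1024),
    "SXGA+": (1400, 1050),
    "UXGA":  (1600, 1200),
    "WXGA":  (1280, 800),
    "WUXGA": (1920, 1200),
    "QXGA":  (2048, 1536),
}

w, h = RESOLUTIONS["SXGA+"]
print(f"SXGA+ is {w}x{h}")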
--
--------------------------------------------------------------------------
Joel Jaeggli                                              Unix Consulting
joelja@xxxxxxxxxxxxxxxxxxxx
GPG Key Fingerprint: 5C6E 0104 BAF0 40B0 5BD3 C38B F000 35AB B67F 56B2