On Thu, 2005-01-13 at 17:33, Markku Kolkka wrote:
> Kam Leo wrote in his message (sent Thursday, 13 January 2005 22:53):
> > RESOLUTION = WIDTH x HEIGHT = 1280 x 1024 = 1310720
> >
> > COLOR_DEPTH (256 colors) = COLOR_CHANNELS x BITS_PER_PIXEL = 3
> > x BITS_PER_PIXEL = 3 x 8 bits_per_pixel = 3 bytes_per_pixel
>
> X actually uses 32 bits for each pixel in 24-bit color mode;
> have a look at /var/log/Xorg.0.log:
>
> (**) RADEON(0): Depth 24, (--) framebuffer bpp 32
> (II) RADEON(0): Pixel depth = 24 bits stored in 4 bytes (32 bpp pixmaps)
>
> > RESOLUTION x COLOR_DEPTH <= VIDEO MEMORY
> >
> > 1280 x 1024 x 3 = 3,932,160 bytes <= 4 MB
>
> 1280 x 1024 x 4 = 5242880 bytes > 4 MB.

I'm actually trying to run at 16-bit depth, but looking at the log I notice that X is only seeing 2 MB instead of 4 MB. I may have been mistaken about this box, but I'm pretty sure I've seen the same thing happen on other machines, even some that I know have 8 MB and where Windows will run at 1600x1280. These are various older Dell boxes with ATI chips on the motherboard. What's the best way to find the real size of the video RAM? Is the built-in probe usually correct now?

-- 
Les Mikesell
 les@xxxxxxxxxxxxxxxx
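For reference, the single-framebuffer arithmetic in the thread can be checked with a few lines of Python. This is only a back-of-the-envelope sketch: it assumes the usual packing (2 bytes per pixel at depth 16, 4 bytes at depth 24 as the Xorg log above shows) and ignores any extra memory the driver reserves for pixmaps or scanline padding.

```python
# Minimum video RAM for a single framebuffer at a given mode.
def framebuffer_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

MB = 1024 * 1024

# 16-bit depth is stored in 2 bytes; 24-bit depth is stored in 4 bytes (32 bpp).
for bpp_bytes in (2, 4):
    need = framebuffer_bytes(1280, 1024, bpp_bytes)
    print(f"{bpp_bytes * 8} bpp: {need} bytes = {need / MB:.2f} MB")
```

This prints 2,621,440 bytes (2.50 MB) for 16 bpp and 5,242,880 bytes (5.00 MB) for 32 bpp, which would explain both failures: a card probed at 2 MB falls just short of 1280x1024 at 16-bit depth, and 4 MB falls short at 24-bit depth.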