Re: setting X server DPI

Tim:
>> No.  The DPI should be set to the values that actually represent the
>> hardware.

Tom Horsley
> Actually, that attitude is the one that is utter nonsense. If you
> want to get slavish about "actual representation", then you need to
> know the distance of the viewer and specify font sizes by the angular
> diameter the viewer will experience with the font :-).

No, you don't.  Certainly not from the standpoint you're advocating, of
giving font sizes some false meaning.

Yes, if designing wall posters, or the like, you'd work out how it'd be
viewed, then pick a font size that's appropriate to the job.  You
wouldn't redefine 12 points to be something else so you could say that's
12 point text up there (when it most definitely is NOT), just because it
looks the same as 12 point text on a sheet of A4 held in your hands.

In the case of posters, you might know that it's going to be read from
five feet away, mostly front on, and usability guides might say that
certain parts of the text should be 3 inches high.  You'd specify the
height, and work out the point size from that - the real point size, not
some artificially made-up thing.
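
(The conversion is trivial, by the way, since a point is defined as
1/72 of an inch:

    3 inches x 72 points/inch = 216 points

"3 inch high text" and "216 point text" are the same statement, made in
different units.)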

> The reason 96 should be the default is that 96 can at least be read
> on every display device I've ever seen, so you'll at least be able
> to see what you are doing while getting things set the way you
> actually want them.

The reason is just cargo cult mentality.

> The "actual representation" of a 9 point font (a perfectly readable
> size on a laser printer) makes about 3 or 4 pixels available to render
> lower case letters on a 46 inch 1920x1080 HD display. Great fun trying
> to navigate to the font settings dialog when all the menu items
> are 4 pixels high.

And that's what happens when you use measurement systems
inappropriately.  DPI has a real meaning, and so do "point" sizes.  When
you misuse one, then the other, you compound the problem.

Point sizes are *absolute* - specific fractions of an inch, if you want
a simple one-phrase explanation.  The computer uses your DPI, combined
with a description of the size of the display medium, to work out how
many dots to use to get text at the point size specified.

I.e., 12 point text is the same size whether printed on 2 inches of
paper or 20 inches of paper.
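
The arithmetic is simple, assuming the renderer has been told the truth
about the hardware:

    pixels = points / 72 * DPI

    12 point text at  96 DPI:  12 / 72 *  96 = 16 pixels
    12 point text at 305 DPI:  12 / 72 * 305 = 51 pixels (rounded)

Different pixel counts, same physical size.  Lie about the DPI, and the
physical size comes out wrong everywhere.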

If you want to scale fonts to be readable at certain distances - the
same *apparent* size, but not actually the same physical size on
different displays - THEN YOU DON'T SPECIFY SIZES IN POINTS!

Specifying fonts in pixel sizes is the wrong way to go about it, for the
same reasons.  You can only use such font sizing schemes when designing
graphics for a fixed size display.

> Or consider Samsung HD displays. The one I just got has EDID info
> that claims it is 160 x 90 millimeters - that gives 305 DPI as the
> "actual representation" which makes the fonts so large the font
> settings dialog won't fit on the screen.

You're attempting to use a broken ruler to support broken facts.
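
(If the EDID lies, the sane fix is to correct the lie, not to abandon
real units.  You can state the physical size yourself in the Monitor
section of xorg.conf; the millimetre figures below are a rough guess at
a 46 inch 16:9 panel, so measure your own with a real ruler:

    Section "Monitor"
        Identifier  "Samsung"
        # Physical panel size in millimetres, width then height.
        DisplaySize 1020 570
    EndSection

With the size right, the server derives the right DPI, and 12 point
text comes out at an actual 12 points.)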

> Then there are projectors. You can't possibly tell what the
> "actual representation" is because it depends on how far away the
> screen is.

That's where you're wrong.  Firstly, you can tell the distance to the
screen (given semi-decent hardware that actually takes note of the focus
setting - focus is a distance-dependent thing).  Knowing the optics and
the distance, you can work out the screen size, and therefore the actual
DPI (this time it's the size that changes, rather than the number of
dots).
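
(To put some made-up numbers on it: if the optics say the image is two
metres wide at the current focus, then for a 1920 pixel wide picture:

    1920 pixels / (2000 mm / 25.4 mm per inch) = about 24 DPI

The DPI is perfectly knowable, it's just small.)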

But still, as I said above, if you're expecting certain point size text
to display at different sizes, then you're doing it wrong.  You've got
two ways of doing it right: don't use point sizes, or don't do something
stupid like specifying 12 point text for a projected display.

> By all means let the actual representation fanatics set the DPI
> to the actual representation if that is what they want, but for
> God's sake don't make it the default setting. Make the default
> something everyone will be able to read.

Make the default settings stop *misusing* printing measurements.  It's
just pandering to the ignorant to call something 12 point text when it's
not 12 point text, simply because people are used to it.  Even misused
in the way most people expect, it doesn't work how you want it to work:
the text comes out a different size on one device than on another,
whether you measure it with a ruler or try playing scaling games for
viewing distance.  It's as bogus as wattage quoted in PMPO.

As it stands, in computer software, when you pick 12 points, or even 12
pixels, for text, the unit is actually meaningless.  It's 12 variable
*somethings*.  You might as well call it centimetres, then claim that we
use different centimetres from everyone else.
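
(You can see the mess for yourself.  Ask the server what it believes:

    [tim@localhost ~]$ xdpyinfo | grep -B1 resolution
      dimensions:    1920x1080 pixels (508x285 millimeters)
      resolution:    96x96 dots per inch

then check whether Xft is overriding it:

    [tim@localhost ~]$ xrdb -query | grep -i dpi
    Xft.dpi: 96

The output shown is illustrative, not from any particular machine.  If
the millimetre figures don't match a ruler held against your screen,
every "point" size on that box is fiction.)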

-- 
[tim@localhost ~]$ uname -r
2.6.27.15-78.2.23.fc9.i686

Don't send private replies to my address, the mailbox is ignored.  I
read messages from the public lists.



