It is a challenge to make ClearType irrelevant

by Michael S. Kaplan, published on 2006/10/25 03:01 -04:00, original URI:

DPI (dots per inch, which I have discussed previously) has two entirely different and somewhat-at-odds uses. It can be used to make everything on screen larger, or it can be used to make everything sharper while staying the same apparent size.

Of course most people only ever see the first use, because if they change the DPI that is what they see. There is no setting to scale down the font sizes in the Shell automatically while increasing the DPI, so you get to watch everything seem to grow.

But as I point out here, it is quite possible to make everything look quite good and have apps be the same apparent size as they were when it was 96 DPI if you scale down the Shell font sizes to match as you jack up the DPI setting.
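The arithmetic behind that trade can be sketched in a few lines (function names are mine, not anything in Windows): a font's pixel height is its point size times DPI over 72, so keeping the pixel height constant as DPI rises means shrinking the point size proportionally.

```python
def font_pixels(points: float, dpi: float) -> float:
    """Pixel height of a font's em square: one point is 1/72 inch."""
    return points * dpi / 72.0

def equivalent_points(points_at_96: float, new_dpi: float) -> float:
    """Point size at new_dpi that renders at the same pixel height
    the original size did at the default 96 DPI."""
    return points_at_96 * 96.0 / new_dpi

# 9 pt at 96 DPI occupies 12 px of em height; to keep that apparent size
# at 120 DPI you need 9 * 96 / 120 = 7.2 pt, close to the 7 pt Segoe UI
# mentioned below.
```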

In a wider sense, the reason ClearType exists is because this scaling does not happen even with new LCD screens where it could happen, and it works by faking a higher DPI. But after hearing Peter Constable talk about using an LCD screen's "natural resolution" (in my case on a Latitude D820, 1680 x 1050) and by combining that with scaling down the Shell font sizes (7 pt Segoe UI instead of 9 pt), apps look about the same size as they always did but sharper.

In theory (or maybe even in practice!) you could take that even further, jack it up to 300 DPI and push the font sizes down even further. You get to the point where everything is still the same size but ClearType becomes irrelevant, like you cannot even tell the difference between when it is on and when it isn't (other than the fact that you don't have to worry about the ClearType problems I talked about in You say it 'looks good on paper?' It must not be using ClearType, of course!).

However, it is quite uncommon for people to muck with the font settings, and honestly it is not very common to even use anything other than 96dpi or 120dpi, due to the difficult UI for both of these settings and the lack of any intuitive/automatic connection between them. To add insult to injury, if you change the system locale, custom DPI settings are lost, a bug that exists in every version of XP and is still not fixed in Vista.

Is it an accident that most spell checkers suggest that Segoe is a misspelling of Segue? What does that mean? :-)

If I were a more paranoid and cynical person than I am and I were outside of Microsoft, I would wonder if this was not a huge NLS/typography/shell/ClearType conspiracy to keep ClearType as a relevant technology even as monitors push the envelope to make it less potentially relevant.

Of course being on the inside I am pretty confident that there is no such cabal, as the interaction of all of these things is mostly accidental. :-)

Maybe a cool DpiPlusSystemLocale applet needs to be put together that coordinates these disparate settings so that they all work together. Now THAT would be a PowerToy!


This post brought to you by (U+0F85, a.k.a. TIBETAN MARK PALUTA)

# Dean Harding on 25 Oct 2006 3:23 AM:

Actually, I blogged about this the other day:

The thing is, if your DPI setting actually matches your display's PHYSICAL DPI, then a 9pt font will always look the same apparent size, no matter what the physical number of pixels is.

The reason you had to decrease your shell font size to 7pt, I think, is probably because you're used to running XP at "96DPI" on a display that had more than 96 actual pixels per inch.
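Dean's point can be sketched with the same point-size arithmetic (a rough illustration, with function names of my own): when the logical DPI setting equals the panel's physical DPI, the on-glass size depends only on the point size, and a denser panel just spends more pixels per glyph.

```python
def apparent_inches(points: float) -> float:
    """Physical height of a font's em square when the logical DPI
    setting matches the panel's actual pixel density."""
    return points / 72.0

def pixel_height(points: float, physical_dpi: float) -> float:
    """A denser panel spends more pixels on the same glyph,
    but the printed-on-glass size stays the same."""
    return points / 72.0 * physical_dpi

# 9 pt is 0.125 inches whether the panel packs 96 or 150 pixels into
# each inch; only the pixel budget changes (12 px vs 18.75 px).
```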

# Michael S. Kaplan on 25 Oct 2006 3:54 AM:

Indeed, you appear to be correct, Mr. Harding. :-)

On the other hand, I like it at that same size but crisper, and since I am sort of advocating pushing it even further, I expect to need to set the fonts to 1pt soon just to keep the same size!

# Chris Becke on 25 Oct 2006 4:06 AM:

You know, I'd like to see some screenshots to see exactly what you mean. Unless you're talking about changing the font size, DPI and resolution all at the same time, I can't see how boosting the (virtual) dots per inch setting will negate the need for ClearType.

Perhaps I'm being stupid and not comprehending the post.

# Michael S. Kaplan on 25 Oct 2006 4:18 AM:

Hi Chris,

I never assume stupidity; stupidity is something that has to be proven. :-)

You can look at this post for some examples of what I mean in relation to ClearType becoming less noticeable (click the screenshots for the non-ClearType versions), though you will also see the outrageous shell font sizes that make the windows bigger.

Basically this post is about avoiding the second issue while getting the benefit of the first....

# Ian Griffiths on 25 Oct 2006 8:50 AM:

Michael, one of the following things must be true, but I can't work out which it is:

1) This blog entry is a joke.

2) You are in possession of a magical LCD screen which, when you change the software-controlled DPI setting in Windows, physically mutates itself so that the size of the pixels on screen actually matches the software-configured setting. (Presumably this screen either changes size, or the number of pixels on your screen changes.)

3) You're not using an LCD - you're using a CRT.

4) You've not understood how flat panel screens work.

LCDs have a fixed number of pixels. You can't change that, for the same reason that you can't throw a switch on a car's dashboard to switch between a 4 cylinder and a 6 cylinder mode. Unlike CRTs, where pixels don't have an exact physical identity (pixels are just overlapping splats projected onto some phosphorescent dots), in LCDs a pixel is a distinct physical thing.

Any LCD panel will have as many pixels today as it had when it was made. (Unless some pixels died. Dropping things on your monitor can reduce pixel count. It tends not to increase it though.)

For best results with an LCD, you want the number of pixels supplied by your graphics card to match the number of pixels on the screen. Anything else comes out blurred. Having got the pixel dimensions of your graphics card to match the pixel dimensions of your LCD panel, then you can use the DPI settings to adjust the physical size of stuff on screen.

So the second option you list here - increasing sharpness while leaving things the same size - is only an option with a flat panel screen if (a) your system is currently misconfigured such that the pixel dimensions of what comes out of the graphics card don't match the screen pixel dimensions, or (b) you are plugging in a new screen that has a higher pixel density than your current screen...

Your chosen DPI settings make no difference to the intrinsic sharpness. A given shape at a given physical size will look its sharpest if and only if your graphics card is set for the right pixel dimensions for the screen.

And this is not the reason ClearType exists by the way. The reason ClearType exists is that *if* you have a colour flat panel screen, each pixel is made up of 3 physical elements (a red one, a green one, and a blue one), so you can monkey with the colours to get an effective tripling of horizontal resolution. E.g. a screen with 150 pixels per inch actually has 450 physical elements per inch horizontally because each pixel is divided into three vertical stripes. By selecting colours carefully you can turn the individual coloured segments on and off independently, thus letting you exploit this full resolution.

This only works by messing with colours. You'll never do it any other way because monitors don't offer any other way to individually control the stripes within a single pixel. And it only works if you can accurately target exactly one physical pixel with one frame buffer pixel, which is why ClearType makes it even more important to have your graphics card pixel dimensions match your display pixel dimensions. And it's why ClearType doesn't work properly on CRTs.

(Some people prefer ClearType over ordinary text on CRTs too. This is actually a different effect: as well as the subpixel stuff, ClearType also enables anti-aliasing, which makes letters appear more like their proper shapes on any monitor. Unfortunately, Windows doesn't let you enable anti-aliasing for small text unless you also turn on ClearType. I've never understood that - anti-aliasing is more important for small text than any other size, but amazingly, Windows font smoothing stops working at exactly the size you most need it...)
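The subpixel trick Ian describes can be sketched minimally (assuming an RGB-striped panel and black text on white; the function is mine, not any real API): render glyph coverage at 3x horizontal resolution, then fold each triple of samples into the R, G, B channels of one pixel.

```python
def subpixel_pack(coverage_3x):
    """Fold a row of 3x-horizontal coverage samples (0.0..1.0, where 1.0
    means fully inked) into per-pixel (R, G, B) values, one sample per
    colour stripe. Real ClearType also filters across neighbouring
    stripes to limit colour fringing; that step is omitted here."""
    pixels = []
    for i in range(0, len(coverage_3x), 3):
        r, g, b = coverage_3x[i:i + 3]
        # each colour channel dims in proportion to the ink covering its stripe
        pixels.append((round(255 * (1 - r)),
                       round(255 * (1 - g)),
                       round(255 * (1 - b))))
    return pixels
```

So a stem edge covering only the right third of a pixel darkens only that pixel's blue stripe, which is exactly the "effective tripling" of horizontal resolution, bought by monkeying with colour.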

# Dean Harding on 25 Oct 2006 10:02 AM:

Chris: See my post for another example. As you increase the number of pixels per inch (or per centimetre if your prefer) the need for anti-aliasing decreases. ClearType is just fancy anti-aliasing.

In my post above, look at the two letter 'a's and imagine they've both been magnified 10x. When viewed at 1/10th the size they're displayed there, the one on the right does not need any anti-aliasing, since any difference in a single pixel will be totally unnoticeable.

So basically, when we're using 17" monitors that can do 10,000x8,000 pixel resolutions, then ClearType will become obsolete.

# David Nesting on 25 Oct 2006 12:14 PM:

A) When will it be appropriate for us to start trying to associate things like DPI and points to real-world dimensions (i.e. the actual size of your monitor), as they were intended?

B) When that time comes, will it be too late?  

12pt should be 12pt = 1/6th of an inch, regardless of the display technology or its capabilities.  If we're asking users today to always assume an arbitrary DPI of 96, why use points at all on the screen?  Why not measure font sizes in pixels?  This has the side benefit of actually meaning something.  :)

# Mihai on 25 Oct 2006 12:36 PM:

The other problem is that most applications totally ignore the font system settings, and go with what Visual Studio hard-coded at design time.

# Mike on 25 Oct 2006 1:35 PM:

Ian, I think what you're missing is that you are (not surprisingly) thinking that DPI in Windows means "dots-per-inch". This is pretty much completely untrue when applied to displays. It's really just a scaling factor. 120 DPI really means 125% the size of whatever things are at 96 DPI. This came about because people noticed that as they increased the DPI setting (without increasing physical DPI) their fonts got bigger, so that's what they used it for. You still see this sort of wrong-headedness in Vista today, where the personalization applet uses the phrase "Adjust Font Size (DPI)".
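Mike's "just a scaling factor" reading reduces to one division against the 96 DPI baseline (a trivial sketch, with a function name of my own):

```python
def dpi_scale_factor(dpi_setting: float, baseline: float = 96.0) -> float:
    """The Windows DPI setting acts as a UI scale factor relative to
    the assumed 96 DPI baseline, regardless of the panel's real density."""
    return dpi_setting / baseline

# 120 "DPI" is just 120 / 96 = 1.25, i.e. the familiar 125% size;
# 144 would be 150%.
```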

# Michael S. Kaplan on 25 Oct 2006 4:13 PM:

I think ClearType is further clouding the issue here too -- and making it less likely to get untangled (either in the UI or in the minds of users) any time soon....

# John Hudson on 25 Oct 2006 11:59 PM:

Yes, ClearType can be understood as just another form of antialiasing*, and at some resolution it will be as unnecessary as antialiasing in print. But that is a pretty high resolution. I've examined ClearType text-size type on 200dpi screens, and to my eye there remains a significant benefit compared to either greyscale antialiasing or b/w bitmaps in terms of typeface design fidelity. And antialiasing is all about typeface design fidelity: b/w bitmaps still perform better than either greyscale or colour subpixel rendering in legibility tests. What happens as the screen resolution increases is that the design fidelity gains of antialiasing begin to merge with an increase in typeform density, as the antialiased 'fringe' of the glyph outline becomes a smaller and smaller field relative to the solid pixels. So the increased design fidelity of antialiased glyphs gains the more legible typeform density of b/w bitmaps and comes closer to the norm of highly readable text: printed letters. Because of the relative sharpness of the LCD pixels and subpixels compared to lower resolution print media, screen text will achieve good clarity and typeform density at lower resolution than most laser printers and typical consumer inkjet printers.

* The most significant difference between sub-pixel rendering and 'oldstyle' greyscale antialiasing is that in the former the technology used to render glyphs in one direction differs from the technology used in the other direction. This is actually a bit of a nightmare for font developers seeking to produce optimised screenfonts for use in sub-pixel rendering environments like ClearType, and much virtual ink and a bit of blood has been spilt on the subject on, for instance, the ATypI member discussion list.

# Frank Richter on 26 Oct 2006 12:50 AM:

DPI in Windows is a "scaling factor"? This is a messed-up, awful perversion of the purpose of the setting. If the DPI value were treated correctly, applications would know how many pixels a centimetre takes (which would be relevant for WYSIWYG), and fonts would always appear at the same perceived height.

AFAIK modern flat panels can report their pixel resolution as well as their physical dimensions (well, at least the one in my laptop can); this would enable the OS to obtain the right display DPI automatically, so higher-resolution displays would lead to better-looking interfaces automatically.
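The computation Frank alludes to is straightforward (a sketch under the assumption that the panel reports its physical extent in millimetres, as EDID does; the function name and example figures are mine):

```python
def physical_dpi(pixels: int, size_mm: float) -> float:
    """True pixel density along one axis, from a panel's reported pixel
    count and physical extent in millimetres (as found in EDID data)."""
    MM_PER_INCH = 25.4
    return pixels * MM_PER_INCH / size_mm

# e.g. a 1680-pixel-wide panel roughly 331 mm across comes out near
# 129 DPI, well above the 96 that Windows assumes by default
```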

Alas, it seems Windows just didn't get that right. If you want to see how the DPI setting is supposed to work, take a look at GNOME. After setting the correct DPI of the screen (yeah, unfortunately no automatic detection is provided there either), fonts and the UI in general scale up nicely. It's just beautiful.

# Michael S. Kaplan on 26 Oct 2006 1:43 AM:

Hi Frank,

I am not going to argue with you on this -- most of the posts from me and others that complain about ClearType, DPI, and such are all due to the bugs, problems, and confusion associated with these settings....

# Michael S. Kaplan on 26 Oct 2006 1:47 AM:

Hi John,

I don't know whether it has a 200dpi or better screen, but to my eye on a Latitude D800 with the DPI on the computer set to 300dpi, I was unable to distinguish between ClearType and non-ClearType. At that point the only problem was the huge shell fonts, which would have to be scaled down to shrink most of the rest of the Windows UI to match....
