
Agreed. I made this in the past:

http://i.imgur.com/4dGbR.png (Ubuntu)

http://i.imgur.com/mfyjw.png (OS X)

http://i.imgur.com/BqOdg.png (Windows 7)



Interesting: in terms of visual quality, I would rank those in order of decreasing preference as OS X, Windows 7, Ubuntu.


Is Windows even using the same font? I think Windows prioritizes making the text look sharp over getting all the character strokes in the right place. OS X and Ubuntu look very similar except for the line spacing, and the lowercase ‘a’, which looks kind of strange/vertically compressed on Ubuntu.


>I think Windows prioritizes making the text look sharp over getting all the character strokes in the right place.

Yup, pretty much. A trio of Spolsky/Atwood posts from 2007 worth reading:

http://www.codinghorror.com/blog/2007/06/whats-wrong-with-ap...

http://www.joelonsoftware.com/items/2007/06/12.html

http://www.codinghorror.com/blog/2007/06/font-rendering-resp...


They should all be using Verdana, per HN's CSS. I have the font installed on each OS, so it should be a fair fight unless I screwed something up.


The font in the Ubuntu image is not Verdana.

Ubuntu does not install Microsoft fonts by default; it falls back to Ubuntu's default font.


I always have msttcorefonts installed, which includes Verdana. It's part of the script that's the first thing I run on a new install...

(Just a half hour ago, I checked that Verdana was installed and then recreated the screenshot from the original post; it's identical as far as I can tell.)


The fonts are similar, but you can see the numeral "3" is different. They're not the same font.

Right-click the text and choose "Inspect element". In the right-hand column, make sure "Computed" is selected at the top, then look to see what font-family is being rendered. (If you click "Rules" at the top, you can see which rules are active.)


In my experience, this doesn't necessarily give the correct font. Some distros define font aliases in fontconfig. I've found that `fc-match` can reliably report the true font being used for a given font name/pattern; e.g. `fc-match monospace` might give DejaVu Sans Mono.
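As a concrete sketch (assuming a fontconfig-based system; the exact output depends on your installed fonts and alias rules):

```shell
# Ask fontconfig which real font a name or alias resolves to.
# fc-match ships with fontconfig; the guard keeps this runnable
# on systems without it.
if command -v fc-match >/dev/null 2>&1; then
  fc-match Verdana                  # the font actually substituted for "Verdana"
  fc-match monospace                # resolves the generic alias
  fc-match -s Verdana | head -n 3   # top candidates, in priority order
else
  echo "fontconfig (fc-match) not installed"
fi
```

If the first line prints something other than Verdana, fontconfig is substituting a different font, which would explain a mismatched screenshot.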


Yeah, it says Verdana: http://i.imgur.com/50Sx6c3.png

To me, the '3' doesn't look any more different than the other characters do, given the rendering differences.

And KDE's font manager's take on Verdana: http://i.imgur.com/4c5cSe3.png

And Firefox on my system, just as a reference: http://i.imgur.com/a2yZ7UQ.png


Thanks for double-checking. The 'a' looks very different on the Windows 7 sample!


You're right, my mistake. I verified with my Ubuntu install; my Verdana looks the same as yours.

There are differences in how the text is rendered, though. Maybe it has to do with kerning and subpixel optimizations.


I find Windows 7 the most appealing (and sharpest).

The others are just way too blurry.


Windows has distorted the font here, hasn't it? If you're rating fonts on sharpness, Windows wins, but by any other criterion (distortion, kerning, adequate antialiasing, legibility) it fails. The Windows rendering almost looks like a monospaced font. You're throwing a lot out in order to get things sharper at small sizes, and for many people what's thrown out matters far more than perceived sharpness.


> legibility

Seriously? I could agree that it may not fully represent the intended font, but the entire point is that it's more legible because of that sacrifice, and I find it hard to believe you find Windows's text less legible than Mac's or Ubuntu's.

i.e. I think Windows sacrifices the fontmaker's intent in order to please the reader, who cares about the actual output rather than the fontmaker's intentions (unless the reader is the fontmaker himself).

Do you really think it's failing in that too?


Kerning, character spacing and character shape are all designed to aid legibility, and frankly I think the Windows approach is a naive attempt to optimise one aspect of legibility at the expense of all the others. So yes, I'm afraid I do disagree that it is more legible; I think it sacrifices too much. As these screenshots make clear, the font is deformed so much it looks like a different font, the spacing is off, and the glyphs are uneven.

This is somewhat subjective, and you might have a different opinion, but there is more than one variable in play when it comes to legibility; it doesn't just come down to how sharp the text is.


I know, I didn't mean to imply sharpness is the only factor. It's just one of several important factors.

But I couldn't care less about the font being deformed, as long as it looks better than it used to. The fact that you care so much about the deformation into a different font suggests to me that your goal is to preserve the original font, rather than to make it look nice on the screen.

If you have the time, would you mind taking the pictures in the comment below and making a smaller bitmap that shows exactly which character(s) you think are not kerned well, and which Linux or Mac kerns better? I'm really curious what I'm not seeing.


Kerning: look at the 'a' in "was", or the 'e' in "difference", or in "OS X". All of these have too much space, and the difference is obvious.

Distortion: IMHO fonts carry meaning and affect legibility; they are not just there to make text look nice on screen. Deforming a font subtracts meaning, and in some cases will make it less legible, particularly an automated deformation like this.


I agree with you. I don't care what the fonts were designed to look like. I only care about readability. The default font settings on Ubuntu hurt my eyes. Thankfully this is easy to fix by switching to full hinting and enabling subpixel rendering.
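For reference, a minimal per-user fontconfig override that enables full hinting and RGB subpixel rendering (the file path and the RGB subpixel order are one common setup; desktop-environment settings dialogs can achieve the same thing):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- ~/.config/fontconfig/fonts.conf : full hinting + RGB subpixel rendering -->
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting"   mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
    <edit name="rgba"      mode="assign"><const>rgb</const></edit>
  </match>
</fontconfig>
```

Log out and back in (or restart applications) for the change to take effect everywhere.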


For anyone wanting Linux-style font rendering on Windows, check out MacType (https://code.google.com/p/mactype/). It works best on high-dpi monitors; on low-dpi it's a matter of taste (a little blurrier, but the fonts are more authentic).



