[cairo] Cairo font size should be in units
spitzak at d2.com
Tue Sep 28 11:11:30 PDT 2004
Sounds like an impressive and useful patch! I would think this should all go
in at once.
On Tuesday 28 September 2004 09:12 am, graydon hoare wrote:
> - moves the font matrix and DPI into a specific "font device size"
> type, calculated from the gstate, making fonts "sizeless" objects
> which can be referenced in a gstate push rather than duplicated.
My only concern here is to avoid some problems seen in other APIs. I am
quite tired of APIs where fonts are measured in a different transform
(usually called "points") than every other graphic. Something about fonts
seems to produce a mental block that keeps people thinking this is a
good idea. But would you suggest that "stroke" scale the path so it is in
points, while "fill" is in units? No? Would anybody even suggest that the
thickness of lines be measured in "points"? I don't think so. Doing
fonts in anything other than units is equally wrong, so please stop doing it!
Please try to make it so that I can send a matrix of [10,0,0,10] to the font
selector and I get something where the line spacing is 10 ***UNITS***. I do
not want to know the "point size". In *EVERY* case I have ever encountered, I
want to make a font that has a size that matches some other graphic on the
screen. I am also quite aware of the current scale and can easily turn a
point size into units myself, so zero functionality is lost.
Also it is important (and I think everybody agrees) that Cairo scales should
not cause the glyph metrics to change, although the glyphs themselves can
change shape a bit due to hinting. Otherwise it is impossible to reproduce a
graphic at different sizes; for instance, a simple right-justify will no
longer produce a straight right edge.
Yes I know that modern font renderers change the shapes of the glyphs
depending on the "size". But this should be completely decoupled from the
actual scale of the glyphs. Selecting a font should be something like:
cairo_select_font(name, 2x2matrix, float point_size)
point_size is used to change the glyph shapes. However the actual glyph sizes
are directly controlled by the 2x2 matrix. If the matrix is [10,0,0,10] then
the line spacing should be 10 *UNITS*, whether the point_size is 0 or 10^8.
The shape of the glyph and the font metrics can vary somewhat depending on
the point size, but will never stray far from 10 units of line spacing.
An alternative that I like is to have no point_size argument at all. Instead
it is calculated by taking a [0,1] vector, transforming it by the font
matrix, taking the length of the result, and multiplying by 72/96; that
gives the point size (notice that only the font matrix, not the Cairo
transform, is used for this).
A third alternative, that I think you are using, is that there is a DPI value
that replaces the /96 in the above equation, but can be changed by a Cairo
API. This DPI value must NOT be changed by Cairo transforms!
If Cairo is used to emulate older interfaces that take font sizes in "points",
I recommend those sizes be converted by a literal, hard-coded 96/72
multiplication. The conversion should never depend on any state of the
graphics device; otherwise the user of the interface is forced to discover
this state and replicate it to get predictable font sizes. This is one thing
Microsoft got right: GDI32 has a hard-coded scale from points to pixels,
independent of any DPI claim of the screen. X, on the other hand, has bitten
me all too often: a new driver selects a different DPI and half my fonts come
out the wrong size. I have NEVER seen a case where I actually wanted the font
sizes in pixels to change.