FMi on text symbol encodings
Tue, 2 Mar 1999 09:00:50 -0500
Frank Mittelbach wrote:
> Thierry Bouche writes:
> > » This isn't a question of particularly current interest however, so we may
> > » well leave things as they are, for now.
> > Well, it is, if you consider the same problem for T1 fonts missing
> > Eng, etc. Should it be T1, should it be something else?
> > Is latex sufficiently robust to manage a T1a exactly as pure T1?
Even "pure" T1 is not always what is optimal (e.g. the question whether
to fake ffl only because it is in T1, for printing a dvi a fake is
necessary, for compiling the source you would not want it to be in the
> it is not (right now) and i wouldn't recommend it within NFSS2 but TS1 is
> it is true that T1 is defective as well (not only in this respect --- ask
> Berthold and he will tell you (and i do agree with most of those arguments))
> but T1 is *mostly* okay and all we have right now without a major redesign,
> i.e. a further font encoding that really exists.
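For reference, the NFSS2 machinery behind any such "further font encoding" is quite small; a minimal sketch (the encoding name T1A is hypothetical, for illustration only; \DeclareFontEncoding and \DeclareFontSubstitution are the standard LaTeX2e declaration commands):

```latex
% Minimal sketch of registering a new encoding with NFSS2.
% "T1A" is a hypothetical name used only for illustration.
\DeclareFontEncoding{T1A}{}{}
% Declare which family/series/shape NFSS falls back to when a
% requested font does not exist in this encoding:
\DeclareFontSubstitution{T1A}{cmr}{m}{n}
```

The point is that declaring the encoding is the easy part; the hard part is having real fonts and consistent glyph coverage behind it.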
When is the scommaaccent/scedilla clash to be resolved, and when will
the nowhere-existing tcedilla be removed?
> but in TS1 half or more of the glyphs are not there in most fonts so that's
> quite a different dimension
I don't think that the non-existing fonts are the problem: one could
easily make an auxiliary symbol font with leaves and the like in Type 1
format to supply the exotic parts for most practical cases. The problem
is that there is no systematic approach to extend the capabilities of
TeX, e.g. to cover Baltic characters.
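As a concrete illustration of how the existing per-glyph approach works: the T1 slots for Eng are wired up via text commands in t1enc.def, so a coverage gap only shows up when the font itself lacks the glyph (slot numbers as I recall them from the distributed t1enc.def; check your installation):

```latex
% From t1enc.def (standard LaTeX2e), as distributed:
\DeclareTextSymbol{\NG}{T1}{141}
\DeclareTextSymbol{\ng}{T1}{173}
% In a document, \ng prints eng if the current T1 font actually has
% the glyph in that slot -- otherwise there is no fallback at all.
```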
My suggestion would rather be to implement a way to switch "codepages"
(i.e. a clean way to change encodings) instead of putting everything
that doesn't fit into the historical scheme into a disordered "symbol
font".
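A sketch of what explicit encoding switching looks like with the standard fontenc/textcomp interface (package and command names are the real LaTeX2e ones; the point is that the switch happens per symbol, behind the scenes, rather than as a systematic codepage change):

```latex
\documentclass{article}
% Load TS1 alongside T1; the last-listed encoding becomes the default:
\usepackage[TS1,T1]{fontenc}
\usepackage{textcomp}  % text companion symbols from the TS1 encoding
\begin{document}
Regular T1 text with the ffl ligature: waffle.
A TS1 symbol, fetched by an encoding switch behind the scenes:
\textcelsius{} % textcomp switches to TS1 just for this one glyph
\end{document}
```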
The real font world is considerably bigger than the huge number of EC/TC
font files ;-)