Pioneers Who Mapped Scripts in the Tech Shadows
Digitization is by no means the only step in reviving a traditional script or supporting an emerging one, but it is a very valuable step. Yet there has never been a well-funded authority dedicated to this process, nor any course of study that would validate the goal or teach the necessary skills.
So in the early days, roughly from 1990 to 2005, a new kind of underpaid (or entirely unpaid) linguistic foot soldier emerged: someone who understood not only languages but typographic design and coding, and, most importantly, was committed to working with specific language communities. Their task was to discern whether and how a script was used, to sort out its often conflicting variants (which might themselves be the manifestation of conflicting groups), and to reach a workable if not ideal consensus that could map every single character, numbers and punctuation and special characters included, into a predetermined little digital box.
In some cases, as I say, this was the work of individuals; elsewhere it was a collaboration between those whose strength was digital font development and those who understood script analysis and language use.
Nowadays we take it for granted that our digital tools will offer us a lot more than Times New Roman, and the number of scripts with downloadable fonts is well over a hundred—which is a great thing, a decentralizing thing, a thing that allows people in a minority culture to text each other without having to use the script of a colonizer or conqueror.
All the more reason to recognize and applaud the pioneers who started this work at a time when it was neither technically easy nor ideologically encouraged.
Here are the names of some of them; by all means contact me to let me know about others I have unintentionally left out:
Plus, of course, Debbie Anderson, who led the Script Encoding Initiative despite a lack of funding that now seems criminal.