How does type work on a computer? Before I answer that question, let me ask you a few of my own. Have you ever had issues with fonts that didn't seem to work right on your computer? Have you ever opened up a document on your computer and seen boxes, question marks, or weird characters where text was supposed to be? Have clients ever called to complain that the type in the PDF files you sent them was missing or incorrect? The answers to these and other type-related questions are generally tied to a single concept called encoding. At a basic level, encoding is simply a method for describing characters, such as the letters of our alphabet.
This isn't anything terribly new. Back in the 1800s, Samuel Morse invented Morse code, a method of using a series of dots and dashes to represent characters. During World War II, the famous work done by British and US intelligence to break Germany's Enigma and Japan's Purple ciphers followed the same concept. So do those fun cryptogram games where messages are hidden using a letter-substitution code. These are all examples of encoding. Now, let's apply this concept to a computer.
When you type a letter on your keyboard, that key press is stored as a numeric code. Let's say 41 (in hexadecimal; that's 65 in decimal). Your computer then uses an encoding chart to look up that number, sees what it represents, and displays the appropriate glyph on your screen. We'll actually talk a lot more about glyphs in the next chapter, but for our discussion here, a glyph is simply a picture. In the English language, we call this shape an uppercase A, for example. But let's take a step back for a moment. Where did your computer get this encoding chart? Who decided that 41 represents an uppercase A? When computers started becoming widespread in the 1960s, an encoding standard called ASCII was created.
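To make that lookup concrete, here's a minimal Python sketch (Python isn't part of this course; it's just a convenient way to peek at the chart your computer uses):

```python
# Every character has a numeric code, and the computer can look it up both ways.
code = ord("A")      # character -> number
print(hex(code))     # 0x41 (that is, 41 in hexadecimal, 65 in decimal)
print(chr(0x41))     # 'A' -- number -> character, the reverse lookup
```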
It supported 128 slots, later expanded to 256, that were assigned to specific characters. Sounds pretty straightforward, right? Well, not everyone used those slots for the same characters. Some countries had their own alphabets or special characters, and some companies, like IBM, added specific characters or codes to support their own hardware. In other words, many different encodings were in use across countries, languages, and computer manufacturers.
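You can see that original 128-slot limit for yourself. Here's a minimal sketch, assuming a Python 3 interpreter:

```python
# Plain English text fits in ASCII's original 128 slots...
print("Hello".encode("ascii"))   # b'Hello'

# ...but an accented character like 'é' has no ASCII slot at all.
try:
    "café".encode("ascii")
except UnicodeEncodeError as err:
    print(err)                   # ... ordinal not in range(128)
```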
So, for example, you might have the code 41 assigned to the letter A on your computer, but when that file is opened on another computer that uses a different encoding, that same number 41 could point to a completely different character altogether, or there may not be any character assigned to that number at all. Originally, this wasn't such a big problem because, for the most part, the basic alphabet is consistent across different encodings. But things got messy with special characters, especially when type designers started shipping fonts with ligatures, em dashes, and other professional type features.
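Here's that failure mode in a short Python sketch. The encoding names below are Python's labels for three historical character sets, chosen purely for illustration:

```python
# One byte value, three encodings, three different characters.
data = bytes([0xE9])
print(data.decode("cp1252"))     # 'é' -- Windows Western European
print(data.decode("mac_roman"))  # 'È' -- classic Mac OS
print(data.decode("cp437"))      # 'Θ' -- the original IBM PC code page

# And in some encodings, a byte may have no character assigned at all.
try:
    bytes([0x81]).decode("cp1252")
except UnicodeDecodeError as err:
    print(err)                   # 0x81 is an empty slot in cp1252
```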
Also, a global economy and the advent of email meant more documents were being transferred around the world. This is the primary reason there were so many compatibility issues between Macs and PCs in the past, and why text problems cropped up in general when sharing documents between systems in different countries. These computers simply all had different character encodings. That's why, in 1991, the Unicode standard was introduced. One encoding, with room for over 100,000 characters, would now be used across all computers, no matter the manufacturer, the region, or the language.
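A quick Python sketch of what that buys you: every character gets exactly one code point, the same on every platform:

```python
# Under Unicode, 'é' is always code point U+00E9, on any system, in any language.
ch = "é"
print(f"U+{ord(ch):04X}")    # U+00E9
print(ch.encode("utf-8"))    # b'\xc3\xa9' -- one standard byte sequence for interchange
```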
Everything would always be encoded the same way, ensuring a consistent method for describing and displaying text. In regard to operating systems, both Windows, as of Windows 95, and all versions of Mac OS X support Unicode. As for applications, Microsoft Office supports Unicode, as does Adobe Illustrator as of version 10. In fact, the main reason Adobe completely revamped the text engine in Illustrator 10 was specifically to add support for Unicode.
All Adobe applications as of CS4 now support Unicode as well. Of course, in order to take full advantage of Unicode, you not only need operating systems and applications that support it; you also need fonts that support it, and that's what our next chapter is all about.