AI Definitions: Tokenization

Tokenization - The process by which an LLM breaks text into small pieces (tokens) and assigns each piece a number; everything the model reads gets translated into numbers. A token is often a whole word, but it can also be a fragment of one. Think of "creat," the root shared by many words, including Create, Creative, Creator, Creating, and Creation. "Creat" could be a single token, with the different endings handled as separate tokens.
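To make the idea concrete, here is a minimal sketch in Python. The toy vocabulary, the `toy_tokenize` function, and the ID numbers are all invented for illustration; real LLM tokenizers learn their vocabularies from large amounts of text and use far bigger ID spaces.

```python
# Toy illustration of tokenization: text is split into known pieces (tokens)
# and each piece is mapped to an integer ID.
# The vocabulary and the ID numbers below are made up for this example only.

TOY_VOCAB = {
    "creat": 101,
    "e": 102,
    "ive": 103,
    "or": 104,
    "ing": 105,
    "ion": 106,
}

def toy_tokenize(text: str) -> list[tuple[str, int]]:
    """Greedy longest-match tokenizer over the toy vocabulary."""
    pieces = []
    i = 0
    text = text.lower()
    while i < len(text):
        # Try the longest possible piece first, shrinking until a match is found.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in TOY_VOCAB:
                pieces.append((piece, TOY_VOCAB[piece]))
                i = j
                break
        else:
            # Unknown character: skip it (real tokenizers handle this differently).
            i += 1
    return pieces

if __name__ == "__main__":
    for word in ["Create", "Creative", "Creator", "Creating", "Creation"]:
        print(word, "->", toy_tokenize(word))
        # e.g. Creation -> [('creat', 101), ('ion', 106)]
```

Real tokenizers (such as the byte-pair-encoding tokenizers used by GPT-style models) work on the same basic principle, but they learn which pieces to keep from the training data instead of using a hand-written list.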

More AI definitions here