AI Definitions: Natural language processing
Natural language processing - This is a type of machine learning that makes human language intelligible to machines. The first step is tokenization: text is divided into units called tokens, which are then transformed into vectors, lists of numbers that represent the words. More than 1,000 numbers can be used to represent a single word; such a high-dimensional word vector captures more nuance. A low-dimensional word vector is a shorter list of numbers: less nuanced, but easier to work with. A deep learning model (typically a transformer model) uses these vectors to understand the meaning of words and determine how they relate to one another. For instance, king would relate to man the way queen relates to woman.
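A minimal sketch of these ideas, with no trained model involved: the short Python example below tokenizes a sentence, maps each token to a tiny hand-made word vector (the 4-dimensional numbers are invented purely for illustration; real models learn vectors with hundreds or thousands of dimensions), and uses cosine similarity to show that king relates to man much as queen relates to woman.

```python
import math

def tokenize(text: str) -> list[str]:
    # Split text into lowercase word tokens. Real tokenizers are subtler and
    # often break words into sub-word pieces.
    return text.lower().replace(".", "").split()

# Toy 4-dimensional word vectors (hypothetical values, not from a trained model).
word_vectors = {
    "king":  [0.9, 0.8, 0.1, 0.7],
    "queen": [0.9, 0.1, 0.8, 0.7],
    "man":   [0.2, 0.9, 0.1, 0.3],
    "woman": [0.2, 0.1, 0.9, 0.3],
}

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Measures how closely two vectors point in the same direction (1.0 = identical).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

tokens = tokenize("The king and the queen.")
print(tokens)  # ['the', 'king', 'and', 'the', 'queen']

# "king" is about as close to "man" as "queen" is to "woman",
# while "king" and "woman" are noticeably further apart.
print(cosine_similarity(word_vectors["king"], word_vectors["man"]))    # ~0.82
print(cosine_similarity(word_vectors["queen"], word_vectors["woman"])) # ~0.82
print(cosine_similarity(word_vectors["king"], word_vectors["woman"]))  # ~0.41
```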
More AI definitions here