AI Definitions: GPT

GPT (Generative Pre-trained Transformer) – G for Generative, because it generates words. P for Pre-trained, because it’s first trained on a vast amount of text. This step is called pre-training because many language models (like the one behind ChatGPT) then go through important additional stages of training, known as fine-tuning, that make them less toxic and easier to interact with. T for Transformer, a relatively recent breakthrough in how neural networks are wired. Transformers were introduced in a 2017 paper by Google researchers and are used in many of the latest AI advancements, from text generation to image creation. So GPT refers to an LLM (large language model): a type of AI that first goes through an unsupervised pre-training phase (no data labeling by humans) followed by a supervised "fine-tuning" phase (some labeling).
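The core operation inside a Transformer is attention: each token builds its representation by taking a weighted average over every token in the sequence. As a rough illustration only (toy hand-picked vectors, no learned weight matrices, function names of my own choosing rather than any library's API), scaled dot-product attention can be sketched in plain Python:

```python
import math

def softmax(xs):
    # subtract the max for numerical stability, then normalize
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    # scaled dot-product attention: each query scores every key,
    # the scores become weights, and the output is a weighted
    # average of the value vectors
    d = len(queries[0])  # embedding dimension, used for scaling
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

# toy "sequence" of 3 tokens with 2-dimensional embeddings;
# in self-attention, queries, keys, and values all come from the same tokens
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
result = attention(tokens, tokens, tokens)
```

In a real Transformer the queries, keys, and values are produced by learned projections, and many attention "heads" run in parallel, but the weighted-average mechanism is the same.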

More AI definitions here