What is GPT (Generative Pre-trained Transformer)?
The AI model family behind ChatGPT, capable of processing and generating human-like text.
GPT (Generative Pre-trained Transformer) is a family of large language models developed by OpenAI that powers ChatGPT and many other language AI applications.
What Does GPT Stand For?
- Generative: Creates new content (text, in this case)
- Pre-trained: Learned from massive amounts of text data before being fine-tuned
- Transformer: The neural network architecture it uses, built around self-attention (introduced by Google researchers in 2017; sketched in the code below)
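To make "Transformer" concrete, here is a minimal NumPy sketch of the causal self-attention step that GPT-style models are built on. The matrix names and sizes are illustrative only, not OpenAI's actual implementation:

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """x: (seq_len, d_model) token embeddings; W*: learned projection matrices."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv                 # project into queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])          # how strongly each token attends to each other
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(mask, -1e9, scores)            # causal mask: no peeking at future tokens
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: attention weights per row
    return weights @ v                               # each token becomes a weighted mix of values

# Toy usage: 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)           # (4, 8)
```

The causal mask is what makes the architecture "generative": each position can only draw on the tokens before it, which is exactly what next-word prediction requires.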
GPT Versions
- GPT-1 (2018): Proof of concept, 117 million parameters
- GPT-2 (2019): 1.5 billion parameters
- GPT-3 (2020): 175 billion parameters
- GPT-3.5 (2022): Powers the free tier of ChatGPT
- GPT-4 (2023): Most capable at release; powers ChatGPT Plus
- GPT-4o (2024): Faster and natively multimodal (text, images, audio); see the API sketch after this list
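In practice, choosing between versions is just a model name in an API call. Here is a minimal sketch using OpenAI's official Python SDK (v1+); it assumes the openai package is installed, an OPENAI_API_KEY environment variable is set, and model names as they were offered in 2024:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Ask each GPT version the same question and compare the answers
for model in ("gpt-3.5-turbo", "gpt-4", "gpt-4o"):
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Explain GPT in one sentence."}],
        max_tokens=60,
    )
    print(model, "->", response.choices[0].message.content)
```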
How GPT Works
GPT predicts the next word (strictly, the next token) in a sequence based on all the words before it, as the code sketch after this list shows. Through training on billions of text samples, it learned:
- Grammar and language structure
- Facts and knowledge
- Reasoning patterns
- Writing styles
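This next-word prediction is easy to observe with the openly available GPT-2 model, an earlier member of the same family. Below is a minimal sketch using Hugging Face's transformers library (assumed installed along with torch); the same principle scales up to the larger GPT versions:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits    # (1, seq_len, vocab_size)

# The last position's logits score every vocabulary token as the next word
next_token_logits = logits[0, -1]
top = torch.topk(next_token_logits, k=5)
for score, token_id in zip(top.values, top.indices):
    print(repr(tokenizer.decode(int(token_id))), f"{float(score):.2f}")
```

Running this prints the five tokens the model considers most likely to follow the prompt; generation simply repeats this step, appending one chosen token at a time.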
Limitations
- Can "hallucinate" (generate false information)
- Knowledge cutoff date
- May produce biased content
- Cannot truly "understand" meaning