Foundation model
In one line: A large general-purpose AI model (like GPT-4 or Claude Sonnet) that's been trained on broad data and can be adapted for many tasks.
A foundation model is a large AI model trained on broad, general-purpose data — designed to be a 'foundation' that can be adapted to many downstream tasks via prompting, fine-tuning, or RAG.
Examples: GPT-4, GPT-4o, Claude Sonnet 4, Gemini 2.5 Pro, Llama 3, DeepSeek R1.
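The "one model, many tasks" idea above can be sketched in a few lines. This is a minimal illustration, not a real API: `foundation_model` is a hypothetical stand-in for a call to a hosted model endpoint, and its canned replies exist only to show that the task is selected by the prompt, not by retraining.

```python
def foundation_model(prompt: str) -> str:
    """Placeholder for a call to a hosted foundation model.

    A real implementation would send `prompt` to a model API and
    return its completion; this stub keys off the instruction so
    the example is self-contained and runnable.
    """
    if prompt.startswith("Translate to French:"):
        return "Bonjour"
    if prompt.startswith("Summarize:"):
        return "A short summary."
    return "A general answer."

# The same general-purpose model handles different downstream tasks
# with no task-specific training -- each task lives in the prompt.
tasks = [
    "Translate to French: Hello",
    "Summarize: Foundation models are large general-purpose models.",
    "Explain what a foundation model is.",
]
for prompt in tasks:
    print(foundation_model(prompt))
```

Fine-tuning and RAG adapt the same base model differently: fine-tuning updates its weights on task data, while RAG leaves the weights untouched and injects retrieved documents into the prompt.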
The term was coined by Stanford researchers in 2021 to describe the shift from task-specific models (one model per task) to general-purpose models (one model for many tasks).