Parameter
In one line: A single learnable number inside a neural network. LLMs have billions of parameters, and a bigger count usually means a more capable model.
A parameter is a single learnable number inside a neural network. Each connection between neurons has a weight; each neuron has a bias; each weight and bias is a parameter.
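To make that concrete, here is a minimal sketch in plain Python (no ML framework; the layer sizes are made up for illustration) that counts the parameters in a tiny fully connected network:

```python
# A minimal sketch of where parameters come from. In a fully connected layer,
# every input-output connection has a weight and every output neuron has a
# bias; each of those numbers is one parameter.

def count_layer_parameters(inputs: int, outputs: int) -> int:
    weights = inputs * outputs  # one weight per connection
    biases = outputs            # one bias per output neuron
    return weights + biases

# A toy 3-layer network, 784 -> 256 -> 128 -> 10 (sizes chosen for illustration)
layer_shapes = [(784, 256), (256, 128), (128, 10)]
total = sum(count_layer_parameters(i, o) for i, o in layer_shapes)
print(f"{total:,} parameters")  # 235,146 -- an LLM has billions of these numbers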
Modern LLMs are huge:
- GPT-3: 175 billion parameters
- GPT-4: estimated ~1.7 trillion (mixture-of-experts)
- Llama 3 70B: 70 billion
- DeepSeek R1: 671 billion (mixture-of-experts, with roughly 37 billion active per token)
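For a feel of what those counts mean in practice, here is a rough back-of-the-envelope sketch, assuming each parameter is stored as a 16-bit (2-byte) number; real deployments also need memory for activations and the KV cache:

```python
# Back-of-the-envelope memory for the weights alone, assuming 2 bytes per
# parameter (16-bit precision). Serving a model needs more than this.

def weight_memory_gb(num_parameters: float, bytes_per_param: int = 2) -> float:
    return num_parameters * bytes_per_param / 1e9

for name, count in [("Llama 3 70B", 70e9), ("GPT-3", 175e9), ("DeepSeek R1", 671e9)]:
    print(f"{name}: ~{weight_memory_gb(count):,.0f} GB of weights")
# Llama 3 70B: ~140 GB, GPT-3: ~350 GB, DeepSeek R1: ~1,342 GB
```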
Parameter count correlates only loosely with capability: more parameters mean more room to store knowledge and represent nuance, but architecture, training data, and training compute matter just as much.
See it in action: ask any AI about parameters on AskAI.free.