
Inference

In one line: Running a trained model to get an answer. Distinct from training, which is teaching the model in the first place.

Inference is the act of using a trained model: sending it your prompt and getting back an answer. It's what happens every time you ask ChatGPT a question.

Inference is distinct from training, the (much more expensive) process of teaching the model in the first place. Training a foundation model can cost $10M-$100M; a single inference call costs a fraction of a cent.

Inference cost is what you pay per token via the API. AskAI.free's pricing is based on monthly inference allowances — Pro gives you ~1M tokens/month, Max gives 5M.
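The per-token arithmetic is simple enough to sketch. Here is a minimal example, using a made-up rate of $2 per million tokens (real API prices vary by provider and model, and the function name is just for illustration):

```python
# Rough cost of a single inference call, using a hypothetical
# per-token API price. Real rates differ by provider and model.
PRICE_PER_1M_TOKENS = 2.00  # USD per 1M tokens (assumed example rate)

def inference_cost(tokens: int, price_per_1m: float = PRICE_PER_1M_TOKENS) -> float:
    """Return the USD cost of processing `tokens` tokens."""
    return tokens / 1_000_000 * price_per_1m

# A typical prompt plus answer might run ~1,000 tokens:
print(f"${inference_cost(1_000):.4f}")  # prints $0.0020, i.e. a fifth of a cent
```

At that assumed rate, a ~1M-token monthly allowance corresponds to only a couple of dollars of raw inference cost, which is why per-call pricing is measured in fractions of a cent.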

See it in action — ask any AI about inference on AskAI.free.

Try it free →