📚 Glossary
Context window
In one line: How much text an AI can 'see' at once. Measured in tokens. Bigger context = can analyse longer documents.
The context window is the maximum number of tokens an LLM can process in a single request. That budget covers your prompt, the model's response, and any previous chat history that gets re-sent with each turn.
Current context windows on AskAI.free:
- ChatGPT 4o: 128K tokens (~96,000 words)
- ChatGPT 4.1: 1M tokens (~750,000 words)
- Claude Sonnet 4: 200K tokens (~150,000 words)
- Gemini 2.5 Pro: 1M tokens (~750,000 words — entire books)
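The word counts above follow the common rule of thumb that one token is roughly 0.75 English words, or about 4 characters. A minimal sketch of that heuristic (`estimate_tokens` is a hypothetical helper, not a real tokenizer — actual counts depend on each model's tokenizer):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # Real token counts vary by model and tokenizer.
    return max(1, len(text) // 4)

def tokens_to_words(tokens: int) -> int:
    # ~0.75 words per token, the ratio used in the list above.
    return int(tokens * 0.75)

print(tokens_to_words(128_000))  # matches the ~96,000-word figure for a 128K window
```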
For documents bigger than the context window, you need RAG or chunking strategies.
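The chunking approach can be sketched as below — a hypothetical `chunk_text` helper that splits a long document into overlapping pieces, using whitespace-separated words as a stand-in for real tokens. Overlap between chunks helps preserve context that would otherwise be cut at a boundary:

```python
def chunk_text(text: str, max_tokens: int = 1000, overlap: int = 100) -> list[str]:
    # Approximates tokens as whitespace-separated words; a production
    # pipeline would use the target model's actual tokenizer.
    words = text.split()
    step = max_tokens - overlap  # each chunk starts `overlap` words before the previous one ends
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_tokens]))
    return chunks

# Each chunk can then be sent to the model separately, or embedded
# and retrieved on demand (the RAG pattern).
```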
See it in action: ask any AI about context windows on AskAI.free.
Try it free →