LLM Context Window Calculator

See how much content fits in a model's context window.

GPT-4o Context Capacity

Tokens:      128,000
Characters:  ~512,000  (about 4 characters per token)
Words:       ~96,000   (about 0.75 words per token)
Pages:       ~384      (about 250 words per page)
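The figures above follow common rule-of-thumb ratios (roughly 4 characters or 0.75 words per token, and 250 words per page). A minimal sketch of the conversion, assuming those heuristic ratios rather than a real tokenizer:

```python
# Heuristic conversions from a token budget to characters, words, and pages.
# The ratios are common rules of thumb (~4 chars/token, ~0.75 words/token,
# ~250 words/page), not exact tokenizer output.
CHARS_PER_TOKEN = 4
WORDS_PER_TOKEN = 0.75
WORDS_PER_PAGE = 250

def capacity(tokens: int) -> dict:
    """Estimate what a token budget holds in characters, words, and pages."""
    words = int(tokens * WORDS_PER_TOKEN)
    return {
        "tokens": tokens,
        "characters": tokens * CHARS_PER_TOKEN,
        "words": words,
        "pages": words // WORDS_PER_PAGE,
    }

print(capacity(128_000))
# {'tokens': 128000, 'characters': 512000, 'words': 96000, 'pages': 384}
```

Plugging in GPT-4o's 128,000-token window reproduces the figures listed above.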

Test Your Content

Paste content to see how much of the context window it uses.
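The same estimate can be made without the interactive tool: count the text's characters, divide by roughly 4 to approximate tokens, and compare against the model's window. A small sketch under that assumption (the 4-characters-per-token ratio is a heuristic; exact counts require the model's tokenizer):

```python
def window_usage(text: str, window_tokens: int = 128_000) -> float:
    """Rough fraction of a context window used by `text`,
    assuming ~4 characters per token (a heuristic, not a tokenizer)."""
    estimated_tokens = len(text) / 4
    return estimated_tokens / window_tokens

# 5,000 characters -> ~1,250 estimated tokens -> ~1% of a 128K window.
sample = "word " * 1000
print(f"{window_usage(sample):.2%}")
```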

Understanding Context Windows

A context window is the maximum amount of text a model can process in a single request, including both your input and the model's response. Larger context windows let you include more reference material, longer conversations, or entire codebases. Gemini 1.5 Pro leads with a 2M-token context (roughly 1.5M words), while GPT-4o offers 128K tokens and Claude models 200K.