AI agents

Context window

How much text/history the agent can see at once.

The context window is the maximum amount of input, measured in tokens (chunks of text roughly a word or word fragment long, about four characters on average), that an LLM can consider when generating a single response. A larger context window means the agent can read more of your data at once, such as your whole portal description or the last 50 messages with a visitor, before answering, which usually means better answers.
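A minimal sketch of what this means in practice: when a conversation grows past the window, older messages have to be dropped. The snippet below uses a crude four-characters-per-token estimate (real tokenizers differ) and the function names are illustrative, not from any particular library.

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

def trim_history(messages: list[str], max_tokens: int) -> list[str]:
    """Keep the most recent messages whose combined token
    estimate still fits inside the context window."""
    kept, used = [], 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg)
        if used + cost > max_tokens:
            break  # window is full; older messages are dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

With a 20-token window and three 40-character messages (about 10 tokens each), only the two most recent messages survive; the oldest falls out of the window.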