Overloading an agent’s context with irrelevant, redundant, or loosely related information dilutes the signal of your core instructions — the agent spends more of its attention parsing what is not important and gives less precise, less focused responses as a result.
More Is Not Always Better
This is one of the most counterintuitive things about AI agents: after a certain point, adding more context makes them worse, not better. Educators who discover that their agent can hold hundreds of thousands of words often respond by loading everything they have ever written about their business — every email, every lesson outline, every FAQ ever asked. The agent does not refuse. It just starts underperforming in ways that are hard to diagnose.
Think of it like giving a new colleague a stack of 300 documents to read before their first meeting and saying “everything you need to know is in there somewhere.” They will show up overwhelmed, unable to quickly retrieve the specific thing that matters in the moment, and defaulting to cautious, general answers because they are not sure which of the 300 documents applies to this particular question.
The Dilution Problem
AI agents use attention mechanisms to determine which parts of the context are most relevant to the current question. When the context is lean and well-organized, relevant instructions stand out clearly. When the context is bloated with low-relevance content, the relevant instructions get diluted — they have to compete with everything else for the agent’s attention, and they do not always win.
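The dilution effect can be seen in a toy model. The sketch below is purely illustrative, not how any production model actually scores context: it assigns a made-up relevance score to each context item, converts the scores to attention weights with a softmax, and shows how the weight on the one relevant instruction shrinks as low-relevance filler is added.

```python
import math

def softmax(scores):
    """Convert raw relevance scores into weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# One highly relevant instruction (score 5.0) competing with
# low-relevance filler items (score 1.0). Scores are invented
# for illustration only.
lean = [5.0, 1.0, 1.0]          # lean context: 2 filler items
bloated = [5.0] + [1.0] * 50    # bloated context: 50 filler items

print(round(softmax(lean)[0], 2))     # weight on the instruction: 0.96
print(round(softmax(bloated)[0], 2))  # weight on the instruction: 0.52
```

The relevant instruction's score never changed; it simply has to split attention with fifty more competitors, which is the dilution the paragraph above describes.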
Specific symptoms of context overload include: answers that are technically accurate but miss the specific nuance of what was asked, instructions that the agent follows inconsistently across similar questions, and responses that blend information from different parts of the context in ways that produce confused or contradictory answers. All of these are signs that the agent is struggling to separate signal from noise.
What This Means for Educators
For coaches building campus agents, the discipline is editing your context down rather than adding to it. For every piece of content you consider including, ask: does the agent need this to do its job right now, or is this background information I am adding to feel thorough? Background information belongs in a knowledge base the agent retrieves on demand — not in the system prompt competing with the instructions that actually govern how the agent behaves.
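A minimal sketch of that separation might look like the following. All names here (`SYSTEM_PROMPT`, `KNOWLEDGE_BASE`, `retrieve`) are hypothetical, and the keyword-overlap retrieval is a deliberately naive stand-in for the embedding-based search a real system would use; the point is only the architecture, where behavioral rules stay in the prompt and background documents are fetched per question.

```python
# Behavioral instructions live in the system prompt, permanently.
SYSTEM_PROMPT = "You are the campus agent. Answer concisely; escalate billing questions."

# Background information lives in a knowledge base (hypothetical example entries).
KNOWLEDGE_BASE = {
    "refund policy": "Refunds are available within 30 days of purchase.",
    "course schedule": "Live sessions run Tuesdays and Thursdays at 7pm.",
}

def retrieve(question, kb, top_k=1):
    """Naive keyword-overlap retrieval; real systems would use embeddings."""
    def overlap(key):
        return len(set(question.lower().split()) & set(key.split()))
    ranked = sorted(kb, key=overlap, reverse=True)
    return [kb[k] for k in ranked[:top_k] if overlap(k) > 0]

def build_context(question):
    """Combine the lean prompt with only the snippets this question needs."""
    snippets = retrieve(question, KNOWLEDGE_BASE)
    return SYSTEM_PROMPT + "\n\n" + "\n".join(snippets)

print(build_context("When is the course schedule for live sessions?"))
```

A question about the schedule pulls in only the schedule entry; the refund policy never enters the context, so it never competes for attention.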
The Simple Rule
If your agent’s context is longer than 1,000 words, audit it ruthlessly. Cut anything that is not a behavioral instruction or essential audience context. Move everything else to a knowledge base. A lean, focused context almost always produces a sharper, more reliable agent than one carrying a heavy load of everything you have ever written.
