Issue Explanation

The API returns an error such as: "This model's maximum context length is xxx tokens. However, your messages resulted in yyy tokens." This occurs when the total number of tokens in your messages (the prompt plus any selected data) exceeds the model's maximum context length.

Solution

  • Reduce Table Size: Select a smaller table (fewer rows or columns) so that fewer tokens are sent to the model; see the sketch after this list for one way to trim the data before sending.
  • Check API Limits: Look up the maximum context length for the model you are using with your AI provider, and make sure the combined size of your prompt, selected data, and expected response stays within that limit.
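
Below is a minimal sketch of a pre-flight token check, assuming an OpenAI-style chat message format and the tiktoken tokenizer; the MODEL_CONTEXT_LIMIT and RESPONSE_BUDGET values and the trim_table_rows helper are illustrative assumptions, not part of any specific product or API.

```python
# Sketch: count tokens before sending and drop trailing table rows until the
# request fits. Limits and helper names are example assumptions.
import tiktoken

MODEL_CONTEXT_LIMIT = 4096   # assumed limit -- check your provider's documentation
RESPONSE_BUDGET = 512        # tokens reserved for the model's reply

encoding = tiktoken.get_encoding("cl100k_base")

def count_tokens(messages):
    """Rough token count for a list of chat messages ({'role', 'content'})."""
    return sum(len(encoding.encode(m["content"])) for m in messages)

def trim_table_rows(rows, base_messages):
    """Drop rows from the end of the table until the request fits the limit."""
    kept = list(rows)
    while kept:
        messages = base_messages + [
            {"role": "user", "content": "\n".join(kept)}
        ]
        if count_tokens(messages) + RESPONSE_BUDGET <= MODEL_CONTEXT_LIMIT:
            return messages
        kept.pop()  # remove the last row and re-check
    return base_messages  # table is too large even with all rows removed
```

In this sketch the count is approximate (it ignores per-message overhead added by the provider), so keeping a response budget as a safety margin helps avoid hitting the limit in practice.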