• After the last update of the plugin, the widget is not working properly. Please help me with the following error in the site widget: This model’s maximum context length is 4097 tokens, however you requested 5850 tokens (5650 in your prompt; 200 for the completion). Please reduce your prompt; or completion length.

    Please note that I get this error no matter what I type, even a short prompt. I don’t have such a limitation set in my AI settings, and there are no limitations in the widget settings either.
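
    For context, this error usually means the widget is sending the accumulated chat history as the prompt, so after a couple of answers the prompt (5650 tokens here) plus the reserved completion (200 tokens) exceeds the model’s 4097-token window. Below is a minimal sketch of the kind of history-trimming the plugin would need; the function and variable names are illustrative assumptions, not the plugin’s actual code, and it assumes the tiktoken library for token counting.

    import tiktoken

    MAX_CONTEXT = 4097        # model's context window (from the error message)
    COMPLETION_TOKENS = 200   # tokens reserved for the answer (from the error message)

    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

    def count_tokens(messages):
        # Rough count: encoded content plus a small per-message overhead.
        # The exact overhead per message varies by model.
        return sum(len(enc.encode(m["content"])) + 4 for m in messages)

    def trim_history(messages):
        # Drop the oldest non-system messages until prompt + completion fits
        # inside the context window. messages[0] is assumed to be the system prompt.
        messages = list(messages)
        while count_tokens(messages) + COMPLETION_TOKENS > MAX_CONTEXT and len(messages) > 1:
            del messages[1]
        return messages

    If the widget applied something like trim_history() before each request, the prompt would never grow past the limit regardless of how long the conversation gets.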

    The page I need help with: [log in to see the link]

Viewing 1 replies (of 1 total)
  • The topic ‘Error tokens after second answer’ is closed to new replies.