• Resolved tararais

    (@tararais)


    Hi guys, first of all, excellent work; it's a great plugin.

    The chatbot works really well except when I add my own fine-tuned model; every time I try to use my model, I get this kind of error:

    “System:Error while calling OpenAI: This model’s maximum context length is 2049 tokens, however you requested 2163 tokens (115 in your prompt; 2048 for the completion). Please reduce your prompt; or completion length.”

    It seems that it is trying to request over 2k tokens, which is nonsense, as I just typed “hello!” and my prompt is less than 70 tokens. Weird.

    I guess there must be something wrong with it. I hope you guys can help! Thanks a lot.
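
    (For reference, the totals in that error come from the completion budget rather than the prompt itself: the request asks for 2048 completion tokens on top of the 115 prompt tokens, so 115 + 2048 = 2163, which exceeds the 2049-token context window reported for the model. A short prompt does not help as long as the requested completion length stays at 2048.)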

  • Plugin Author Jordy Meow

    (@tigroumeow)

    Hi @tararais,

    Do you have a URL where I can test the chatbot?

    Normally, I handle the Max Tokens in a way that this shouldn't happen, so I will have another look at it. That said, it's weird that you are getting such a big request. Do you get this right away, or after using the chatbot for a while?
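
    (For anyone hitting the same error: below is a minimal sketch of the kind of Max Tokens handling being described, not the plugin's actual code. It assumes Python with the tiktoken library purely for illustration; the idea is to cap max_tokens so that prompt tokens plus completion tokens never exceed the model's context window.)

    # Minimal sketch (not the plugin's actual code), assuming Python + tiktoken:
    # cap max_tokens so prompt tokens + completion tokens fit in the context window.
    import tiktoken

    CONTEXT_LIMIT = 2049         # context window reported in the error above
    REQUESTED_COMPLETION = 2048  # completion length the failing request asked for

    def clamp_max_tokens(prompt: str) -> int:
        # "r50k_base" is an assumption; the right encoding depends on the base model.
        enc = tiktoken.get_encoding("r50k_base")
        prompt_tokens = len(enc.encode(prompt))
        # Never request more completion tokens than the window leaves room for.
        return max(1, min(REQUESTED_COMPLETION, CONTEXT_LIMIT - prompt_tokens))

    # With the numbers from the error: 2049 - 115 = 1934 completion tokens fit,
    # instead of the 2048 that triggered the "maximum context length" error.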

    Thread Starter tararais

    (@tararais)

    Right away. Now, after the update, I am getting correct responses as long as I ask about what's in the fine-tuned model; if I ask something else, it simply replies with the same question or something random!

    Plugin Author Jordy Meow

    (@tigroumeow)

    Hi @tararais,

    I actually made a big fix yesterday. Was it working before? It should work much better now. However, yes, if you didn't train your model enough and start asking unrelated questions, the AI will try to reply anyway.

  • The topic ‘Fine-tunned model’ is closed to new replies.