• Resolved chcw

    (@chcw)


    I encountered another error. It said “The prompt is too long! It contains about 5014 tokens (estimation). The model turbo only accepts a maximum of 4096 tokens.”

    I am using 1.3.94, as I rolled back to this version.


  • Plugin Author Jordy Meow

    (@tigroumeow)

    Do you still have this problem? How can I replicate it easily?

    Thread Starter chcw

    (@chcw)

    Hi, Jordy,

    I do not encounter a similar issue any more. I will try to figure out how this happened.

    Hi,

    I have the same issue…

    https://www.onlinemarktplatz.de/chatbot-was-moechten-sie-wissen/

    Thanks for your help!

    Frank

    Plugin Author Jordy Meow

    (@tigroumeow)

    Hi,

    Those issues can be the result of many different factors. The prompt is built from the whole discussion (or limited by the sentences buffer), the context, and also the local context (if embeddings or content-aware features are used, for example). So you need to explain a bit about all this first, so that someone can understand the issue and help better.

    Also, don’t hesitate to come on the Discord, where this kind of issue is often discussed: https://www.ads-software.com/support/topic/meow-apps-discord-channel/.
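To make those factors concrete, here is a minimal sketch of how such a prompt can outgrow a 4k-token window. The names (build_prompt, sentences_buffer, and so on) are hypothetical, not AI Engine's actual code; tiktoken is only used to approximate the "about N tokens (estimation)" figure from the error message.

```python
# Hypothetical sketch: how a chat prompt is typically assembled and measured.
# These names are illustrative, not taken from the AI Engine plugin.
import tiktoken

MODEL_LIMIT = 4096  # context window of the 4k "turbo" model in the error

def estimate_tokens(text: str) -> int:
    # tiktoken gives the kind of estimate the error message reports.
    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
    return len(enc.encode(text))

def build_prompt(context: str, history: list[str], sentences_buffer: int) -> str:
    # Keep only the most recent messages, per a "sentences buffer" setting.
    recent = history[-sentences_buffer:]
    return "\n".join([context, *recent])

history = [f"user: question {i}\nbot: a long answer..." for i in range(50)]
prompt = build_prompt("You are a helpful assistant.", history, sentences_buffer=3)
if estimate_tokens(prompt) > MODEL_LIMIT:
    raise ValueError(f"The prompt is too long! ~{estimate_tokens(prompt)} tokens.")
```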

    OK, thank you very much for the quick reply!

    I’m getting the same error after longer conversations.

    AI Engine: The prompt is too long! It contains about 4144 tokens (estimation). The model turbo only accepts a maximum of 4096 tokens.

    Settings on v1.6.62 are 1024 tokens, 3 sentences, 128 input.

    It looks like the post variables keep filling up and reaching the token limit.
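For context, the budget implied by those settings: reserving 1024 tokens of a 4096-token window for the reply leaves roughly 3072 tokens for the prompt, so the 4144-token prompt from the error overshoots that budget by about 1072 tokens. Plain arithmetic, not plugin code:

```python
# Rough token budget behind the settings reported above (not plugin code).
context_window = 4096    # limit of the 4k "turbo" model
max_reply_tokens = 1024  # the "1024 token" setting, reserved for the answer

prompt_budget = context_window - max_reply_tokens
print(prompt_budget)         # 3072
print(4144 - prompt_budget)  # the reported prompt exceeds it by 1072
```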


    I also get the same error after long chats. We are not using embeddings, just a straight prompt and a long conversation of over 1500 words.

    This is the error: “AI Engine: The prompt is too long! It contains about 4469 tokens (estimation). The model turbo only accepts a maximum of 4096 tokens.”

    Maybe an easy solution is adding a message to the error, for example:

    “Please press the clear button to continue with a new chat.”

    This gives the user a solution to the error.

    Thanks,

    Jason
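Jason's suggestion amounts to appending a recovery hint to the error string; a trivial sketch (hypothetical, not the plugin's actual error handling):

```python
# Hypothetical: surface a recovery hint alongside the token-limit error.
error = ("AI Engine: The prompt is too long! It contains about 4469 tokens "
         "(estimation). The model turbo only accepts a maximum of 4096 tokens.")
hint = "Please press the clear button to continue with a new chat."
print(f"{error} {hint}")  # shown to the user instead of the bare error
```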

    Plugin Author Jordy Meow

    (@tigroumeow)

    Hi guys,

    I am so sorry for that issue, and indeed, I realized what’s wrong. The max tokens calculation was made on the entire conversation, and not after those messages were cleaned out (as it is supposed to be, per the settings).

    I will release a new 1.6.67 version in an hour or two. That should fix it. Let me know how it goes!
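For readers who want to picture the bug described above: the token check has to run after the history is trimmed to the configured buffer, not on the entire conversation. A hedged sketch of the fixed order (hypothetical names, not the plugin's code):

```python
def estimate_tokens(text: str) -> int:
    # Crude heuristic (~4 characters per token), hence "(estimation)".
    return len(text) // 4

def prepare_prompt(history: list[str], buffer: int, limit: int = 4096) -> str:
    # Buggy order: measuring the whole conversation *before* old messages
    # are cleaned out reports an overflow even when the sent prompt fits.
    # Fixed order: trim first, then measure what will actually be sent.
    trimmed = history[-buffer:]
    prompt = "\n".join(trimmed)
    if estimate_tokens(prompt) > limit:
        raise ValueError("The prompt is too long!")
    return prompt
```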

    Hey,

    I am still facing the same issue when long conversations occur. Do you believe that you will be able to fix it? If the user “clears” the conversation, then its memory of what has been said is also erased, so it is not a solution to the problem. Thank you very much for your effort, and I wanted to say that you are doing a great job.

    Plugin Author Jordy Meow

    (@tigroumeow)

    Guys, please join the Discord if you want to talk about this. This will be an issue forever (or for a very, very long time), as there will always be a token limit. The issues related to it are too varied to be discussed in only one thread; it should be handled either by someone who optimizes the process (including the prompt, max messages, model used, etc.) or by creating workarounds. That said, don’t hesitate to share your thoughts, potential solutions, etc. I’ll also develop some options to auto-switch from 4k models to 16k models when it is possible.
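The auto-switch idea mentioned above could look roughly like this; the model names and context sizes are OpenAI's public figures for the 3.5-era models, and the selection logic is a hypothetical sketch, not the plugin's implementation:

```python
# Hypothetical fallback from a 4k model to its 16k sibling when needed.
MODELS = {
    "gpt-3.5-turbo": 4096,       # the "turbo" model named in the error
    "gpt-3.5-turbo-16k": 16384,  # larger-context variant
}

def pick_model(prompt_tokens: int, reserve_for_reply: int = 1024) -> str:
    # Use the smallest model whose window fits prompt + planned reply.
    for name, window in MODELS.items():
        if prompt_tokens + reserve_for_reply <= window:
            return name
    raise ValueError("The prompt is too long for every available model!")

print(pick_model(4144))  # -> "gpt-3.5-turbo-16k"
```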

  • The topic ‘Error “The prompt is too long!”’ is closed to new replies.