• Resolved

    usamaamanra (@usamaamanra)


    Hello,

    I’m using the AI Engine chatbot with the Context feature.

    The problem is that I have about 300 KB of text as context, which I added to the context field in the AI Engine chatbot’s settings.

    The text is basically a list of questions that clients have asked and the answers we provided. I wanted to use it as context so that the chatbot knows how to answer when a similar question comes up.

    After adding the context, though, the chatbot takes forever to reply when I ask a question. When I investigated in the browser’s Network tab, I found a 504 gateway timeout error after about a minute of waiting.

    I could increase the server timeout, but I don’t want customers to wait that long for a reply; I want it to be fast. Currently I’m on a Cloudways server with 2 GB of RAM.

  • Plugin Support Val Meow (@valwa)

    Hello @usamaamanra,

    Just wanted to mention that the turbo model (assuming that’s what you’re using) has a limit of 4,096 tokens, so a really long context like this won’t work: 300 KB of text comes out to tens of thousands of tokens, far over that limit.
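    If you want to check how many tokens your context actually uses, here’s a quick sketch with OpenAI’s tiktoken library; the file path and the model name are just assumptions for illustration.

```python
# Rough token count for a large context file, using OpenAI's tiktoken library.
# The file path and model name ("gpt-3.5-turbo", i.e. the "turbo" model) are
# assumptions for illustration only.
import tiktoken

with open("context.txt", "r", encoding="utf-8") as f:  # your ~300 KB Q&A text
    context = f.read()

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
num_tokens = len(enc.encode(context))

print(f"{len(context)} characters ~ {num_tokens} tokens")
# Anything far above ~4096 tokens won't fit in a single turbo request.
```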

    AI Engine will automatically truncate the context, but you can adjust that in the settings. To provide information to the chatbot for user questions, it might be more effective to use Embeddings or Fine-Tuning, depending on the amount of data you have; there’s a rough sketch of the general embeddings idea below.
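    The sketch below uses the OpenAI Python client and made-up Q&A data, just to illustrate the general technique (it’s not how AI Engine implements it): each Q&A pair is embedded once, and at question time only the few most similar pairs are sent to the model, so every prompt stays small.

```python
# Generic sketch of embeddings-based retrieval; not AI Engine's implementation.
# Assumes the OpenAI Python client; model name and sample data are illustrative.
import numpy as np
from openai import OpenAI

client = OpenAI()  # needs OPENAI_API_KEY in the environment

qa_pairs = [
    "Q: How do I reset my password? A: Use the 'Forgot password' link on the login page.",
    "Q: Do you offer refunds? A: Yes, within 30 days of purchase.",
    # ... the rest of the Q&A list goes here
]

def embed(texts):
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

qa_vectors = embed(qa_pairs)  # computed once and stored, e.g. in a vector database

def top_matches(question, k=3):
    """Pick the k stored Q&A pairs most similar to the user's question."""
    q_vec = embed([question])[0]
    # Cosine similarity between the question and every stored Q&A pair
    sims = qa_vectors @ q_vec / (
        np.linalg.norm(qa_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    return [qa_pairs[i] for i in np.argsort(sims)[::-1][:k]]

# Only these few matches get added to the prompt, instead of the whole 300 KB file.
print(top_matches("Can I get my money back?"))
```

    Because only the retrieved matches go into each request, the prompt stays well under the 4,096-token limit and replies come back much faster.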

    Embeddings are actually a Pro feature, which means I can’t go into more detail about them here.

    Hope this helps!

    • This reply was modified 1 year, 4 months ago by Val Meow.
  • The topic ‘504 Timeouts – Speed up responses’ is closed to new replies.