• Resolved jes88

    (@jam2988)


    Hi Jordy,

    Good day.

    At times, the content generated has unfinished paragraphs. Is there any way to address this?

    I have tried adjusting the maximum tokens; sometimes it hits the 4,000-token limit, so I adjust it accordingly.

    It is just generating a lot of content in each paragraph, and at times it’s getting cut off.

    This happens from time to time, but not every time.

    I think it’s getting cut off when it reaches the model’s token limit.

  • Plugin Support Val Meow

    (@valwa)

    Hey @jam2988!

    Indeed, if the model tries to generate more content than its output limit allows (or more than the maximum you have set), the content simply gets cut off rather than throwing an error.

    What you can do is use a model that can handle a higher number of tokens. Hope this helps!
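
    For anyone running into this, here is a minimal sketch of how the truncation shows up at the API level, assuming the content is generated through OpenAI's chat completions endpoint (the model name, prompt, and 4000-token cap below are placeholders, not the plugin's actual settings):

        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        response = client.chat.completions.create(
            model="gpt-4o",   # placeholder; use whichever model you selected
            max_tokens=4000,  # the cap exposed in the plugin as "maximum tokens"
            messages=[{"role": "user", "content": "Write a detailed article about widgets."}],
        )

        choice = response.choices[0]
        # finish_reason == "length" means generation stopped at max_tokens (or at
        # the model's own output limit), so the last paragraph is likely unfinished.
        if choice.finish_reason == "length":
            print("Truncated: raise max_tokens or use a model with a larger output limit.")
        else:
            print(choice.message.content)

    If finish_reason still comes back as "length" after raising max_tokens, the model itself has a smaller output limit, which is why switching to a model that can handle more tokens is the fix.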

  • The topic ‘Content generated is getting cut off.’ is closed to new replies.