This model’s maximum context length is 2049 tokens
“This model’s maximum context length is 2049 tokens, however you requested 2168 tokens (168 in your prompt; 2000 for the completion). Please reduce your prompt; or completion length.” This is the error I get when I select my custom AI and then try to ask it a question. I’ve tried adjusting the dataset size and still get it no matter what.
I’m trying to train my AI for my use case, but I keep getting the above error no matter what I change.
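For context, the error is simple arithmetic: the prompt tokens plus the requested completion tokens (often a `max_tokens`-style setting in the plugin or API call) must fit inside the model’s context window. A minimal sketch using the numbers from the error message above (the setting name and where to change it depend on your plugin):

```python
# Sketch: the prompt plus the requested completion must fit in the context window.
# Numbers taken from the error message above.
MAX_CONTEXT = 2049    # model's maximum context length
prompt_tokens = 168   # tokens in the prompt
requested = 2000      # completion tokens requested (the setting to lower)

# Total requested exceeds the limit, which triggers the error:
total = prompt_tokens + requested
print(total > MAX_CONTEXT)  # True

# The largest completion you can safely request with this prompt:
max_completion = MAX_CONTEXT - prompt_tokens
print(max_completion)  # 1881
```

So lowering the completion/`max_tokens` setting to 1881 or below (rather than changing the dataset) should make the request fit.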
Viewing 5 replies - 1 through 5 (of 5 total)
- The topic ‘This model’s maximum context length is 2049 tokens’ is closed to new replies.