• Resolved myonlyeye

    (@myonlyeye)


If there are too many words or tokens in an article, it would be useful if the extension automatically split the text into several parts, used GPT-3 to create a summary for each part, and then used these summaries as the context for the content-aware function (see the sketch below). Just an idea… I have too many blog pages with too many tokens for the content-aware function :-/

    Odin
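    A minimal sketch of that chunk-and-summarize idea in Python, using the legacy OpenAI Completion API from the GPT-3 era. The function name, model name, chunk size, and prompt are illustrative assumptions, not the plugin's actual implementation:

    ```python
    import openai

    def summarize_in_chunks(text: str, chunk_words: int = 700) -> str:
        """Split an article into word-based chunks, summarize each chunk
        with GPT-3, and join the partial summaries into one context string."""
        words = text.split()
        chunks = [
            " ".join(words[i:i + chunk_words])
            for i in range(0, len(words), chunk_words)
        ]
        summaries = []
        for chunk in chunks:
            response = openai.Completion.create(
                model="text-davinci-003",  # GPT-3 era completion model
                prompt=f"Summarize the following text:\n\n{chunk}\n\nSummary:",
                max_tokens=150,
                temperature=0.3,
            )
            summaries.append(response.choices[0].text.strip())
        # The joined summaries can then serve as the content-aware context.
        return "\n".join(summaries)
    ```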

  • Plugin Author Jordy Meow

    (@tigroumeow)

    Hi @myonlyeye,

    Are you using the Pro Version? If yes, contact me directly (https://meowapps.com/support). Indeed, I could create a summary and keep it in some cache. But then, when and how should the cache be reset? That is the question.
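    One common way to sidestep the cache-reset question is to key the cache on a hash of the post content: when the post is edited, its hash changes, so a stale summary is never served. A sketch under that assumption, with hypothetical names (`cached_summary`, `CACHE_DIR`) and a file-based cache standing in for whatever the plugin would actually use:

    ```python
    import hashlib
    import json
    import os

    CACHE_DIR = "summary-cache"  # hypothetical cache location

    def cached_summary(post_id: str, content: str, summarize) -> str:
        """Return a cached summary for this exact content, or build and cache one.

        The cache key includes a hash of the content, so editing the post
        automatically invalidates the old entry: no manual reset is needed.
        """
        digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
        path = os.path.join(CACHE_DIR, f"{post_id}-{digest}.json")
        if os.path.exists(path):
            with open(path, encoding="utf-8") as f:
                return json.load(f)["summary"]
        summary = summarize(content)  # e.g. the GPT-3 call sketched above
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "w", encoding="utf-8") as f:
            json.dump({"summary": summary}, f)
        return summary
    ```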

    Meanwhile, another idea would be to limit the content used by the context to a certain length, to avoid issues and avoid using too many tokens. It’s much easier and faster to do, and it would work fine in most cases (except if the user asks a question about a part of the article that is right at the end). What do you think?
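    A minimal sketch of that length-limiting idea, assuming the tiktoken library for token counting (the `p50k_base` encoding matches the GPT-3 davinci models; the function name and default limit are made up for illustration):

    ```python
    import tiktoken

    def truncate_to_tokens(text: str, max_tokens: int = 2000) -> str:
        """Keep only the first max_tokens tokens of the article text.

        Anything past the limit is dropped, which is why questions about
        the very end of a long article would no longer be answerable.
        """
        enc = tiktoken.get_encoding("p50k_base")  # GPT-3 davinci encoding
        tokens = enc.encode(text)
        if len(tokens) <= max_tokens:
            return text
        return enc.decode(tokens[:max_tokens])
    ```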

  • The topic ‘A suggestion – automatic GPT-3 summaries for the content aware function’ is closed to new replies.