Hi!
gpt-2-simple works fine when you need to generate a batch of similar texts, for instance from a pre-trained model with a single prefix.
However, when I try to generate, say, 100 texts from 100 different intros in a loop, the system hangs around the 3rd–5th iteration. I suspect this is caused by memory leaks from repeatedly opening and closing generation sessions.
Is it possible to keep a single generation session open and feed it a series of prefix texts, so that all of the different outputs are produced in one run?
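Something like the sketch below is what I have in mind, assuming a checkpoint fine-tuned under run_name='run1' (the prefix list and the generation parameters are placeholders); I am not sure whether reusing one session like this actually avoids the hang:

```python
# Sketch: reuse a single gpt-2-simple session for many different prefixes,
# rather than opening/closing a session per text.
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name='run1')  # load the model weights once

# Placeholder intros; in practice these would be the 100 different prefixes.
prefixes = ["First intro ...", "Second intro ...", "Third intro ..."]

results = []
for prefix in prefixes:
    # return_as_list=True returns the generated text instead of printing it.
    texts = gpt2.generate(sess,
                          run_name='run1',
                          prefix=prefix,
                          length=200,
                          temperature=0.7,
                          nsamples=1,
                          batch_size=1,
                          return_as_list=True)
    results.append(texts[0])

gpt2.reset_session(sess)  # tear the session down once, at the very end
```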