I used gpt-2-keyword-generation to take my dataset and tokenize it. In the end, the file was about 700MB. When I try to train with any model size, the Colab notebook runs out of memory. I know my dataset is pretty big, but is there anything I can do to get away with it?
Not on the free tier, as far as I know. They used to give extra memory if your instance crashed, but they don't anymore.
You could try using the TPU-edited version.
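If you want to stay on the free tier, another thing you could try (just a sketch, not something from this thread) is shrinking the training file before finetuning so less of it has to sit in Colab RAM. The file names and sampling fraction below are hypothetical, and it assumes each keyword-tagged document sits on its own line:

```python
import random

# Hypothetical paths/fraction: adjust to your own dataset.
SRC = "keywords_encoded.txt"   # the ~700MB file produced by gpt-2-keyword-generation
DST = "keywords_sample.txt"    # smaller file to actually finetune on
KEEP = 0.25                    # keep roughly a quarter of the documents

random.seed(0)  # reproducible sample
with open(SRC, encoding="utf-8") as src, open(DST, "w", encoding="utf-8") as dst:
    for line in src:
        # Treat each line as one document and keep a random fraction of them,
        # so the file loaded during training is a fraction of the original size.
        if random.random() < KEEP:
            dst.write(line)
```

Then point the training notebook at the smaller file instead of the full 700MB one; you can raise `KEEP` until you hit the memory ceiling again.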