Out Of Memory On Google Colab When Training With Big Dataset #257

Open
nikhilanayak opened this issue Feb 24, 2021 · 1 comment

Comments

@nikhilanayak

I used gpt-2-keyword-generation to tokenize my dataset. The resulting file was about 700MB. When I try to train with any model size, the Colab notebook runs out of memory. I know my dataset is pretty big, but is there anything I can do to work around the memory limit?
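One generic workaround, independent of the library used here, is to shard the tokenized file so only one piece has to sit in memory at a time. Below is a minimal sketch in plain Python, not taken from this project's code; the file name `dataset.txt`, the shard size, and the `shard_` prefix are all illustrative assumptions.

```python
# Sketch: split a large line-delimited dataset file into smaller shards
# so each shard fits comfortably in Colab RAM. File name and shard size
# below are placeholders, not values from this thread.

def split_dataset(path="dataset.txt", shard_lines=200_000, prefix="shard"):
    """Write the input file out as numbered shards of `shard_lines` lines each."""
    shard_idx, buffer = 0, []
    with open(path, "r", encoding="utf-8") as src:
        for line in src:
            buffer.append(line)
            if len(buffer) >= shard_lines:
                with open(f"{prefix}_{shard_idx:03d}.txt", "w", encoding="utf-8") as dst:
                    dst.writelines(buffer)
                shard_idx, buffer = shard_idx + 1, []
    if buffer:  # flush the final partial shard
        with open(f"{prefix}_{shard_idx:03d}.txt", "w", encoding="utf-8") as dst:
            dst.writelines(buffer)

if __name__ == "__main__":
    split_dataset()
```

Each shard can then be trained on in turn, resuming from the previous checkpoint between runs, which keeps peak memory bounded by the shard size rather than the full 700MB file.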

@777yeet

777yeet commented Jun 19, 2021

Not on the free tier, as far as I know. Colab used to grant extra memory when an instance crashed, but it no longer does.
You could try using the TPU-edited version of the notebook instead.
