fix: Fixed GPU RAM estimation #64
Conversation
Codecov Report

```
@@            Coverage Diff             @@
##             main      #64      +/-   ##
==========================================
- Coverage   94.35%   92.93%   -1.43%
==========================================
  Files          10       10
  Lines         656      665       +9
==========================================
- Hits          619      618       -1
- Misses         37       47      +10
```
I checked out this branch first and installed it in the notebook with the following command:

and got this result:

But it still prints a negative size:
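The exact install command is not shown above; a typical way to try out a PR branch locally might look like the following sketch (the repository URL and local branch name are assumptions, not taken from this thread):

```shell
# Hypothetical sketch: fetch the PR head into a local branch and install in editable mode.
git clone https://github.com/frgfm/torch-scan.git
cd torch-scan
git fetch origin pull/64/head:fix-ram-estimation
git checkout fix-ram-estimation
pip install -e .
```

Installing in editable mode (`pip install -e .`) matters here: if a stale copy of the package is installed under a different directory name, the notebook may silently import the old code instead of the patched branch.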
Thanks, but are you positive this is the snippet you used to install it?
Let me know if that fixes the problem :)
D'oh! I missed that option. The reason for "torchscan" is that it's the name of the directory I unzipped. Thanks for letting me know :)
Script

```python
netG = Generator().to(device)
summary(netG, (nz, 1, 1))

netD = Discriminator().to(device)
summary(netD, (3, 64, 64))
```

Output
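For context on the negative sizes reported earlier in the thread: a signed difference between two memory counters can go negative when memory is freed or cached between measurements. A minimal pure-Python sketch of a clamped estimate (hypothetical helper names, not torchscan's actual internals) would be:

```python
# Bytes per element for common tensor dtypes (illustrative subset).
DTYPE_BYTES = {"float32": 4, "float16": 2, "int64": 8}

def layer_ram_bytes(num_params: int, output_elems: int, dtype: str = "float32") -> int:
    """Estimate RAM for one layer: parameters plus output activations."""
    bytes_per_elem = DTYPE_BYTES[dtype]
    return (num_params + output_elems) * bytes_per_elem

def delta_ram(before: int, after: int) -> int:
    """Difference of two allocator counters, clamped at zero so that
    memory freed between the two readings never reports a negative size."""
    return max(0, after - before)

print(layer_ram_bytes(num_params=1000, output_elems=4096))  # 20384
print(delta_ram(before=2048, after=1024))  # 0, not -1024
```

This is only a sketch of the underlying arithmetic; the actual fix in this PR may address the estimation differently.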
Installed the version at commit 76aca8b. There are no more negatives 👍
Ah, perfect :)
This PR fixes the GPU RAM estimation problem by:
What this PR will not solve:
Closes #63
cc @joonas-yoon