Is there a simple way to set the random seed to a constant at the beginning of finetuning, so that finetuning the same model on the same corpus for the same number of iterations produces the same parameters each time? I have tried calling `tf.random.set_random_seed(0)` just before `sess = gpt2.start_tf_sess()`, but that doesn't seem to have had any effect.

If I have to dig into the source code to do this, any tips on where/how to do so would be appreciated.
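For concreteness, here is a minimal sketch of the seeding I've attempted, extended with the Python/NumPy seeds I suspect also matter (the batch sampler appears to use NumPy's RNG, as far as I can tell). Placing the graph-level seed inside the session's graph context is my guess at why my earlier attempt failed, not something I've confirmed; `SEED`, `corpus.txt`, and the `finetune` arguments are placeholders:

```python
import random

import numpy as np
import tensorflow as tf
import gpt_2_simple as gpt2

SEED = 0             # placeholder constant
random.seed(SEED)    # Python-level RNG
np.random.seed(SEED) # NumPy RNG, which I believe drives batch sampling

sess = gpt2.start_tf_sess()

# Set the graph-level seed on the graph the session actually uses; calling
# tf.random.set_random_seed() before the session exists may target a
# different default graph, which would explain why it had no effect.
with sess.graph.as_default():
    tf.random.set_random_seed(SEED)

# Placeholder corpus and hyperparameters, just to show where finetuning
# happens relative to the seeding.
gpt2.finetune(sess, "corpus.txt", model_name="124M", steps=1000)
```

Even if all of these seeds are set, my understanding is that some GPU kernels (e.g. certain cuDNN/CUDA reduction ops) are nondeterministic, so bit-identical parameters may not be achievable on a GPU regardless of seeding. I'd be happy to be corrected on that.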