
Disallow skipping dynamo #109476

Closed

Conversation

@tugsbayasgalan (Contributor) commented Sep 18, 2023

Stack from ghstack (oldest at bottom):

Based on William's recent diff on preserving node metadata during retracing, we no longer need to skip Dynamo when retracing. This softens our previous restriction of not allowing any new user-side constraints, because Dynamo can now analyze those constraints for us. As a result, re-export can technically happen with any new constraints. This raises another question: is it OK to use looser constraints when retracing? If we allow looser constraints, we can diverge from eager behaviour because, for example, we may have eliminated unsafe control flow based on the earlier assumptions. But we can also argue this is acceptable if we treat the exported callable as independent of its original source code.
We could ban loose constraints inside export, but my concern is that doing special-case checks on ExportedProgram breaks the abstraction.

[ghstack-poisoned]
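
To make the tradeoff concrete, the sketch below shows a retrace of an exported module under a looser constraint. It is a minimal illustration, assuming the present-day `torch.export` Dim/dynamic_shapes API rather than the constraints-based API in use when this PR landed, and the toy module `M` is purely hypothetical.

```python
# Minimal sketch (not code from this PR): first export specializes on a static
# batch size, then the unlifted module is re-exported (retraced) with a looser,
# dynamic constraint. Uses the torch.export Dim/dynamic_shapes API; the
# constraints-based API that existed when this PR landed is spelled differently.
import torch
from torch.export import Dim, export


class M(torch.nn.Module):
    def forward(self, x):
        # Shape-dependent control flow: the first export resolves this branch
        # statically and captures only the taken path.
        if x.shape[0] > 2:
            return x.cos()
        return x.sin()


x = torch.randn(4, 3)

# First export with a fixed batch size of 4: only the `cos` branch survives.
ep = export(M(), (x,))

# Retrace the unlifted module with a looser constraint (dynamic batch >= 3).
# Since Dynamo is no longer skipped here, it can re-analyze the new constraint;
# but the eliminated `sin` branch is gone, which is exactly how retracing with
# looser constraints can diverge from the original eager behaviour.
batch = Dim("batch", min=3)
ep2 = export(ep.module(), (x,), dynamic_shapes=({0: batch},))
print(ep2.graph_module.code)
```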
@pytorch-bot (bot) commented Sep 18, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/109476

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 48f9c76 with merge base 6b7b9c7:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

tugsbayasgalan added a commit that referenced this pull request Sep 18, 2023
ghstack-source-id: f44d59d4f8b99e0e10778e64fa046e1ce910bed0
Pull Request resolved: #109476
@tugsbayasgalan (Contributor, Author)

@tugsbayasgalan has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

tugsbayasgalan added a commit that referenced this pull request Sep 18, 2023
ghstack-source-id: bf93423a38db4818ff50b5eb039b56b02a148b1c
Pull Request resolved: #109476
@tugsbayasgalan tugsbayasgalan requested review from zhxchen17, avikchaudhuri and gmagogsfm and removed request for avikchaudhuri September 18, 2023 06:33
@tugsbayasgalan (Contributor, Author)

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk Trigger trunk jobs on your pull request label Sep 21, 2023
@pytorchmergebot (Collaborator)

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@pytorchmergebot (Collaborator)

Merge failed

Reason: 1 mandatory check(s) failed. The first few are:

Dig deeper by viewing the failures on hud

Details for Dev Infra team (raised by workflow job)

Failing merge rule: Core Maintainers

@tugsbayasgalan (Contributor, Author)

@pytorchbot rebase

@pytorchmergebot (Collaborator)

@pytorchbot started a rebase job onto refs/remotes/origin/viable/strict. Check the current status here

@pytorchmergebot (Collaborator)

Successfully rebased gh/tugsbayasgalan/153/orig onto refs/remotes/origin/viable/strict, please pull locally before adding more changes (for example, via ghstack checkout https://github.com/pytorch/pytorch/pull/109476)

pytorchmergebot pushed a commit that referenced this pull request Sep 21, 2023
ghstack-source-id: 8925de6987da6801f7d5cecccf30e24cd192aaaa
Pull Request resolved: #109476
@tugsbayasgalan (Contributor, Author)

@pytorchbot merge

@pytorchmergebot (Collaborator)

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@atalman (Contributor) commented Sep 25, 2023

@pytorchbot revert -m "Failing internal CI" -c ghfirst

@pytorchmergebot (Collaborator)

@pytorchbot successfully started a revert job. Check the current status here.
Questions? Feedback? Please reach out to the PyTorch DevX Team

@pytorchmergebot (Collaborator)

@tugsbayasgalan your PR has been successfully reverted.

pytorchmergebot added a commit that referenced this pull request Sep 25, 2023
@facebook-github-bot facebook-github-bot deleted the gh/tugsbayasgalan/153/head branch September 27, 2023 14:24
@tugsbayasgalan tugsbayasgalan restored the gh/tugsbayasgalan/153/head branch September 28, 2023 00:38
@facebook-github-bot facebook-github-bot deleted the gh/tugsbayasgalan/153/head branch September 28, 2023 14:24
tugsbayasgalan added a commit that referenced this pull request Sep 28, 2023
Previous discussion: #109476

cc voznesenskym penguinwu EikanWang jgong5 Guobing-Chen XiaobingSuper zhuhaozhe blzheng Xia-Weiwen wenzhe-nrv jiayisunx chenyang78 aakhundov kadeng

[ghstack-poisoned]
tugsbayasgalan added a commit that referenced this pull request Nov 7, 2023
Previous discussion: #109476

In this PR, I made the following additions on top of the original PR:
1) The unlifted graph module now runs the runtime assertions in its forward call.
2) When we retrace, we run these assertions to verify that the user is tracing the module with inputs that satisfy the assumptions made during the first trace. To do this, I create a new graph module type with a modified call method, and the runtime assertions run under torchdynamo.disable so that they execute directly in eager mode; we do not want them traced into the graph. (A rough sketch of this idea follows below.)
3) Both ep.module() and capture_pre_autograd now return _UnliftedGraphModule.

cc voznesenskym penguinwu EikanWang jgong5 Guobing-Chen XiaobingSuper zhuhaozhe blzheng wenzhe-nrv jiayisunx chenyang78 aakhundov kadeng avikchaudhuri gmagogsfm zhxchen17 angelayi Xia-Weiwen

Differential Revision: [D51078056](https://our.internmc.facebook.com/intern/diff/D51078056)

[ghstack-poisoned]
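
As a rough illustration of point 2, the sketch below shows a wrapper whose forward runs recorded input assertions eagerly, under torch._dynamo.disable, before calling the captured graph. It is a hypothetical stand-in for the real _UnliftedGraphModule: a plain nn.Module wrapper is used instead of a new graph module type, and the `input_ranges` format and `_check_inputs` helper are invented for the example.

```python
# Hypothetical sketch of the idea in point 2: run input assertions eagerly
# (never traced) before calling the captured graph module. This is a plain
# wrapper, not the real _UnliftedGraphModule; the constraint format and the
# _check_inputs helper are invented for illustration.
import torch


class AssertingWrapper(torch.nn.Module):
    def __init__(self, gm: torch.nn.Module, input_ranges):
        super().__init__()
        self.gm = gm
        # e.g. {arg_index: {dim: (min, max)}}, recorded during the first trace.
        self.input_ranges = input_ranges

    @torch._dynamo.disable
    def _check_inputs(self, *args):
        # torch._dynamo.disable keeps this check in eager mode, so the
        # assertions never become part of a (re)traced graph.
        for idx, dims in self.input_ranges.items():
            for dim, (lo, hi) in dims.items():
                size = args[idx].shape[dim]
                if not (lo <= size <= hi):
                    raise RuntimeError(
                        f"arg {idx}, dim {dim}: size {size} is outside the "
                        f"range [{lo}, {hi}] assumed during the first trace"
                    )

    def forward(self, *args):
        self._check_inputs(*args)
        return self.gm(*args)
```

With such a wrapper, retracing with inputs that violate the first trace's assumptions fails immediately in eager mode instead of silently baking in looser behaviour.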
pytorchmergebot pushed a commit that referenced this pull request Nov 14, 2023
Previous discussion: #109476


Differential Revision: [D51078056](https://our.internmc.facebook.com/intern/diff/D51078056)
Pull Request resolved: #110222
Approved by: https://github.com/zhxchen17
Labels: ciflow/trunk, Merged, Reverted, topic: not user facing