Has anyone tried to use some modules from VLLM to replace torch.nn.functional.scaled_dot_product_attention? #4546
leprodeveloper asked this question in Q&A (unanswered, 0 replies).

I mean replacing `torch.nn.functional.scaled_dot_product_attention` only, so the rest of the code can be kept unchanged.
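For context, here is a minimal NumPy sketch of what `torch.nn.functional.scaled_dot_product_attention` computes in the basic case (no attention mask, no dropout, no causal option). This is not vLLM code; it just pins down the output that any swapped-in attention kernel, such as one taken from vLLM, would need to reproduce for the surrounding model code to keep working unchanged:

```python
import numpy as np

def sdpa_reference(q, k, v, scale=None):
    """Reference scaled dot-product attention: softmax(Q @ K^T * scale) @ V.

    q: (..., num_queries, head_dim)
    k: (..., num_keys, head_dim)
    v: (..., num_keys, head_dim)
    A drop-in replacement kernel must match this output (up to float tolerance).
    """
    if scale is None:
        scale = 1.0 / np.sqrt(q.shape[-1])  # default 1/sqrt(d), as in PyTorch
    scores = q @ np.swapaxes(k, -1, -2) * scale
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v
```

Checking a candidate replacement against this reference on a few random tensors is a quick way to verify that only the attention computation changed and the rest of the model can stay as-is.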