Issues: pytorch-labs/attention-gym
#83 Weird warning on compile: SingleProcess AUTOTUNE benchmarking takes... (opened Nov 19, 2024 by ViktorooReps)
#74 How to implement bidirectional ALiBi with padding using flex attention? (opened Nov 7, 2024 by sphmel)
#73 Is there any chance to call the backward function directly instead of using the PyTorch autograd mechanism? (opened Nov 7, 2024 by MayDomine)
#70 NotImplementedError: There was no rule registered for HOP flex_attention and mode (opened Nov 2, 2024 by LeoXinhaoLee)
#69 AssertionError: Captured buffers that require grad are not yet supported. (opened Nov 1, 2024 by pengzhenghao)
#66 How to manually check whether a given position or row is masked correctly? (opened Oct 28, 2024 by Leo-T-Zang)
#63 How to reason about the efficiency of different score/mask mod functions (opened Oct 22, 2024 by alex-hh)
#60 How to do KV Cache with FlexAttention and BlockMask by slicing? (opened Oct 21, 2024 by Leo-T-Zang)
#58 What is the best practice to save and load a BlockMask object? (opened Oct 20, 2024 by complexfilter)
#54 What is the expected GPU memory performance drop w.r.t. FlashAttention with block masks? (opened Oct 19, 2024 by arilato)
#44 Distributed Attention Methods [question] (opened Sep 20, 2024 by tsrikris)
#43 CUDA OOM issue when using approx tanh with softcapping score mod (opened Sep 18, 2024 by kebijuelun)
#42 [Feature request] End-to-end transformer example with flex attention [enhancement] (opened Sep 16, 2024 by vladkvit)