
Commit

skip oom configs
micmelesse committed Aug 7, 2024
1 parent f5d14f0 commit 261613e
Showing 1 changed file with 3 additions and 0 deletions.
tests/test_flash_attn.py (3 additions, 0 deletions)
@@ -1719,6 +1719,9 @@ def test_flash_attn_varlen_causal(
     if test_backward == True:
         pytest.skip("Backward Attention not supported on AMD yet")
 
+    if seqlen_q * seqlen_k >= 256 * 512:
+        pytest.skip(f"{seqlen_q}, {seqlen_k} leads to out of memory on AMD")
+
     if (
         max(seqlen_q, seqlen_k) >= 2048
         and torch.cuda.get_device_properties("cuda").total_memory <= 16 * 2**30
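
For context, the added guard skips any test configuration whose query and key lengths multiply to at least 256 * 512 = 131,072, the per-head size of the seqlen_q x seqlen_k attention-score matrix that the skip message reports as running out of memory on AMD. Below is a minimal sketch of the same rule in isolation; the 256 * 512 threshold comes from the diff, while the helper name should_skip_on_amd and the example values are illustrative only, not part of the commit.

# Minimal sketch (not part of the commit) of the skip rule added above.
# The 256 * 512 threshold is taken from the diff; the helper name and
# the sample (seqlen_q, seqlen_k) pairs below are illustrative only.
OOM_THRESHOLD = 256 * 512  # 131072 entries in the per-head score matrix

def should_skip_on_amd(seqlen_q: int, seqlen_k: int) -> bool:
    """Return True for (seqlen_q, seqlen_k) pairs the test now skips."""
    return seqlen_q * seqlen_k >= OOM_THRESHOLD

assert should_skip_on_amd(256, 512)      # exactly at the threshold: skipped
assert should_skip_on_amd(2048, 2048)    # large configs: skipped
assert not should_skip_on_amd(128, 512)  # below the threshold: still runs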
