TC llama recompile fix - no_grad to inference_mode #1387

Triggered via pull request December 17, 2024 08:33
Status: Failure
Total duration: 21s
Artifacts

ruff.yml

on: pull_request
Matrix: ruff

Annotations

2 errors and 1 warning

Error — Ruff (F401): vllm/platforms/hpu.py#L3
  vllm/platforms/hpu.py:3:8: F401 `torch` imported but unused

Error — ruff (3.12)
  Process completed with exit code 1.

Warning — ruff (3.12)
  ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
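The F401 failure above means `import torch` at line 3 of `vllm/platforms/hpu.py` is never referenced, so the fix is either to delete the import or, if it is kept for side effects, to mark it with `# noqa: F401`. As a rough illustration of what this check does (a minimal sketch, not Ruff's actual implementation), the following uses Python's `ast` module to flag top-level imports whose names are never loaded:

```python
import ast


def unused_imports(source: str) -> list[str]:
    """Return 'name:lineno' for imports never referenced (a rough F401 check)."""
    tree = ast.parse(source)
    imported: dict[str, int] = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                # `import a.b` binds the top-level name `a` unless aliased
                name = alias.asname or alias.name.split(".")[0]
                imported[name] = node.lineno
        elif isinstance(node, ast.ImportFrom):
            for alias in node.names:
                imported[alias.asname or alias.name] = node.lineno
    # Collect every name that is actually read somewhere in the module
    used = {
        node.id
        for node in ast.walk(tree)
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Load)
    }
    return [f"{name}:{line}" for name, line in imported.items() if name not in used]


# Mirrors the failing file's shape: `torch` imported but unused
snippet = "import os\nimport torch\n\nprint(os.getcwd())\n"
print(unused_imports(snippet))  # → ['torch:2']
```

In CI, the same outcome comes from running `ruff check .` locally before pushing; `ruff check --fix` can remove such unused imports automatically.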