Update README-sycl.md
Co-authored-by: Meng, Hengyu <[email protected]>
NeoZhangJianyu and airMeng authored Mar 20, 2024
1 parent 4fd4344 commit 37db7ef
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion README-sycl.md
@@ -29,7 +29,7 @@ For Intel CPU, recommend to use llama.cpp for X86 (Intel MKL building).
## News

- 2024.3
-   - New base line is ready: tag b2437.
+   - New base line is ready: [tag b2437](https://github.com/ggerganov/llama.cpp/tree/b2437).
- Support multiple cards: **--split-mode**: [none|layer]; [row] is not supported yet and is under development.
- Support assigning the main GPU with **--main-gpu**, replacing $GGML_SYCL_DEVICE (see the usage sketch after this diff).
- Support detecting all GPUs with Level Zero that have the same top **Max compute units**.
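For context on the options mentioned in the changed section, here is a minimal invocation sketch; it is not part of the commit, and the binary path and model file below are hypothetical placeholders that depend on your own SYCL build.

```sh
# Minimal sketch, assuming a SYCL build of llama.cpp
# (./build/bin/main and the model path are hypothetical placeholders).

# Pick GPU 0 as the main device (replaces the old GGML_SYCL_DEVICE variable)
# and split the model across the detected GPUs by layer.
./build/bin/main -m models/llama-2-7b.Q4_0.gguf \
  -ngl 33 \
  --split-mode layer \
  --main-gpu 0 \
  -p "Hello"
```

With `--split-mode none`, only the GPU selected by `--main-gpu` is used.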

0 comments on commit 37db7ef
