
feat(tabby): Add experimental vulkan support #1588

Merged: 4 commits into main on Mar 1, 2024
Conversation

@boxbeam (Contributor) commented Feb 29, 2024

No description provided.

@boxbeam boxbeam requested a review from wsxiaoys February 29, 2024 22:05
codecov bot commented Feb 29, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 50.33%. Comparing base (7f90102) to head (affcb48).
Report is 7 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1588      +/-   ##
==========================================
+ Coverage   49.98%   50.33%   +0.34%     
==========================================
  Files         112      112              
  Lines        8994     9068      +74     
==========================================
+ Hits         4496     4564      +68     
- Misses       4498     4504       +6     


@wsxiaoys (Member) commented Mar 1, 2024

Related:
#631
#124

@wsxiaoys wsxiaoys enabled auto-merge (squash) March 1, 2024 21:30
@wsxiaoys wsxiaoys disabled auto-merge March 1, 2024 21:30
@wsxiaoys wsxiaoys enabled auto-merge (squash) March 1, 2024 21:31
@wsxiaoys wsxiaoys merged commit 60f472d into main Mar 1, 2024
7 checks passed
@wsxiaoys wsxiaoys deleted the vulkan branch March 1, 2024 21:47
wsxiaoys pushed a commit that referenced this pull request Mar 1, 2024
* feat(tabby): Add experimental vulkan support

* Add rocm device

* Fix feature flags

* Vulkan support working
@amne commented Apr 19, 2024

In src/engine.cc:365, the llama_backend_init(...) call is missing the mandatory bool argument. This breaks the build for me, and I don't understand how the checks passed:

cargo build --features cuda
  cargo:warning=src/engine.cc: In constructor ‘llama::{anonymous}::BackendInitializer::BackendInitializer()’:

  cargo:warning=src/engine.cc:365:23: error: too few arguments to function ‘void llama_backend_init(bool)’

  cargo:warning=  365 |     llama_backend_init();

  cargo:warning=      |     ~~~~~~~~~~~~~~~~~~^~

  cargo:warning=In file included from src/engine.cc:11:

  cargo:warning=llama.cpp/llama.h:267:20: note: declared here

  cargo:warning=  267 |     LLAMA_API void llama_backend_init(bool numa);

  cargo:warning=      |                    ^~~~~~~~~~~~~~~~~~

  exit status: 0
  exit status: 1

@boxbeam (Contributor, Author) commented Apr 19, 2024

@amne I'm wondering if your llama.cpp submodule is up to date. If you cloned Tabby a while ago, you need to run git submodule update --init --recursive. The required parameter was removed in the version we are now using.
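
For context, a minimal sketch of the API difference, assuming a recent llama.cpp revision (the exact names below, such as llama_numa_init and GGML_NUMA_STRATEGY_DISABLED, should be checked against the pinned submodule's llama.h):

    // Sketch only: illustrates the signature change between llama.cpp revisions.
    #include "llama.h"

    int main() {
        // Older revisions declared:
        //   LLAMA_API void llama_backend_init(bool numa);
        // Newer revisions drop the argument and move NUMA setup into a
        // separate call, which is why engine.cc now calls it with no args:
        llama_backend_init();
        llama_numa_init(GGML_NUMA_STRATEGY_DISABLED);  // optional NUMA setup

        // ... create the model and context, run inference ...

        llama_backend_free();
        return 0;
    }

If the build still fails after git submodule update --init --recursive, the "too few arguments" error above likely means the compiler is still picking up an older llama.h from a stale checkout.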
