Misc. bug: All llama executables exit immediately without console output #10929
Comments
What happens when you append … ? Unfortunately, I don't have a Windows machine to test.
Sorry for the duplicate issue: #10944
Also, some binaries do work, such as llama-gguf.exe.
If you are able to share a stack trace, it would be very helpful; it would allow us to pinpoint the issue.
I had a similar problem a while ago; if I remember correctly, I had to reinstall the Microsoft Visual C++ redistributable.
Can you check if #10960 solves this problem?
Now that I see b4388 contains this commit, I tried llama-b4388-bin-win-avx2-x64.zip, and unfortunately no, it still doesn't work. I tried running the b4388 llama-server.exe under NTtrace and got something; not sure if it'll help:
@Ikaron Could you run the following command before executing the tool?
I had the same situation too, and reinstalling the MSVC++ redistributable resolved it.
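For anyone hitting the same silent-exit symptom, one way to check whether the MSVC++ runtime is actually installed is to try loading its core DLLs directly. The sketch below is a hypothetical diagnostic, not part of llama.cpp; the DLL names assume the VS 2015–2022 runtime, and the check simply reports which of them fail to load:

```python
import ctypes
import sys

# Core DLLs shipped by the VS 2015-2022 MSVC++ redistributable (assumption).
RUNTIME_DLLS = ["vcruntime140.dll", "msvcp140.dll", "vcruntime140_1.dll"]

def check_msvc_runtime(dlls=RUNTIME_DLLS):
    """Return the subset of MSVC runtime DLLs that fail to load.

    Only meaningful on Windows; on other platforms it returns an
    empty list rather than attempting to load PE DLLs.
    """
    if sys.platform != "win32":
        return []
    missing = []
    for name in dlls:
        try:
            ctypes.WinDLL(name)  # raises OSError if the DLL cannot be found
        except OSError:
            missing.append(name)
    return missing

if __name__ == "__main__":
    print("missing runtime DLLs:", check_msvc_runtime())
```

If this reports any missing DLLs, installing the Visual C++ redistributable (as the comments above suggest) is the likely fix.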
Name and Version
Multiple.
SYCL build as well as a CPU-only build from commit eb5c3dc (git rev-parse HEAD).
Also the prebuilt Windows SYCL binary, commit 3bcd40b, tag b4040.
NOTE:
The prebuilt binary at commit fb76ec3, tag b3038, WORKS!
Operating systems
Windows
Which llama.cpp modules do you know to be affected?
llama-cli, llama-server
Problem description & steps to reproduce
I originally thought this was a problem with the SYCL builds, but I also compiled CPU-only with the same result. Note that "main.exe" from the old SYCL prebuilt works as expected.
Not sure if it's relevant, but I don't recognise the OpenMP installs that were found.
Steps to reproduce:
Build via any of:
./examples/sycl/win-build-sycl.bat
cmake -B build -G "Ninja" -DGGML_SYCL=ON -DCMAKE_C_COMPILER=cl -DCMAKE_CXX_COMPILER=icx -DCMAKE_BUILD_TYPE=Release -DGGML_SYCL_F16=ON
cmake --build build --config Release -j
(Also, build warning:)
icx: warning: unknown argument ignored in clang-cl: '-machine:x64' [-Wunknown-argument]
Use the "Intel oneAPI command prompt for Intel 64 for Visual Studio 2022" command prompt, then run any llama exe
OR
Execute "C:\Program Files (x86)\Intel\oneAPI\setvars.bat" intel64, then run any llama exe
OR
Run examples\sycl\win-run-llama2.bat
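Since the affected binaries exit without printing anything, the process exit code is the quickest signal to capture. In a Windows command prompt, `echo %ERRORLEVEL%` right after the run shows it; an exit code of 0xC0000135 (STATUS_DLL_NOT_FOUND) would point at a missing runtime DLL, which matches the "no console output" symptom. The small wrapper below does the same thing from Python; the executable path in the `__main__` block is an assumption, adjust it to your build directory:

```python
import subprocess
import sys

def run_and_report(cmd):
    """Run a command and return its exit code, or None if the binary
    itself cannot be found. On Windows, an immediate exit with code
    0xC0000135 (STATUS_DLL_NOT_FOUND) indicates a required DLL is
    missing, even though nothing is printed to the console."""
    try:
        proc = subprocess.run(cmd, capture_output=True, text=True)
    except FileNotFoundError:
        return None
    # Forward whatever output the process did produce, if any.
    if proc.stdout:
        print(proc.stdout, end="")
    if proc.stderr:
        print(proc.stderr, end="", file=sys.stderr)
    return proc.returncode

if __name__ == "__main__":
    # Hypothetical path; point this at your actual build output.
    code = run_and_report(["build/bin/llama-cli.exe", "--version"])
    print("exit code:", None if code is None else hex(code & 0xFFFFFFFF))
```

Including the hex exit code in a report like this one would help maintainers distinguish a loader failure from an early crash.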
Build config (CPU):
Build config (SYCL):
First Bad Commit
No response
Relevant log output