
Does it work on Windows #31

Open
PierrunoYT opened this issue Feb 1, 2024 · 6 comments

Comments

@PierrunoYT

I saw that Ollama does not support Windows yet.

@Kevsnz
Contributor

Kevsnz commented Feb 4, 2024

The Llama Coder extension works on Windows just fine, but you need to run Ollama in another environment (Linux or macOS) and configure the extension to use that.

On Windows you could try WSL; Ollama apparently supports it (see the Ollama README).
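A minimal sketch of that setup, assuming Ollama runs inside WSL (or on a remote Linux/macOS machine) and the extension on Windows points at it; the exact extension setting name is an assumption and may differ in your version:

# Inside WSL or on the remote machine: bind Ollama to all interfaces so the
# Windows host running VS Code can reach it (default port 11434).
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Pull a code-completion model for the extension to use.
ollama pull codellama

# Then, in the Llama Coder settings, point the endpoint (assumed setting:
# Inference: Endpoint) at http://<wsl-or-remote-address>:11434.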

@leolivier

You can install Ollama on Windows using WSL2. See the Ollama install-on-Linux instructions, which also apply to WSL2.
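For reference, the one-line Linux installer from the Ollama README, run from a WSL2 shell:

curl -fsSL https://ollama.com/install.sh | sh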

@leolivier

Ollama now has a Windows version.

@TaylorN15

I have Ollama and the Llama Coder extension installed on Windows. I can see Ollama starting up and working, but it doesn't work with the Llama Coder extension...

@leolivier

Did you pull a model in Ollama? In a Windows terminal, run e.g.:
ollama pull codellama

@TaylorN15

> Did you pull a model in Ollama? In a Windows terminal, run e.g.:
> ollama pull codellama

Yes, I pulled the codellama:7b-code-q4_K_M model and it runs fine when I run it from cmd.
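A couple of things worth checking in that case (a sketch, assuming the default Ollama port and the standard HTTP API):

# Confirm the Ollama API is reachable from Windows on the default port.
curl http://localhost:11434/api/tags

# List the locally available models and make sure the tag the extension is
# configured to use matches exactly (e.g. codellama:7b-code-q4_K_M).
ollama list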
