
Documentation on how to use? #14

Open
sherodtaylor opened this issue Dec 18, 2023 · 7 comments

Comments

@sherodtaylor

I've downloaded Ollama and pulled the model locally, but I'm not sure what I'm expecting to happen. There is no guidance on what is supposed to happen or how to use the extension.

Is it supposed to run on save, or is there a way to run a command or trigger generation manually?

@sccrosby

I was wondering the same thing. Once you start the Ollama service, e.g. `brew services start ollama`, it will be running in the background. Then start up VS Code with the extension. Go to the extension settings and select a model, e.g. `codellama:7b-code-q4_K_M`; it will download (~4 GB) and then start suggesting autocompletions. Unfortunately, on my machine (M1, 32 GB RAM) I found this setup too slow to be useful compared to Copilot, Codeium, etc.
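
For anyone else setting this up, a minimal sketch of those steps on macOS with a Homebrew install of Ollama (the model tag is just the example above; use whatever model you pick in the extension settings):

```bash
# Start the Ollama server in the background (Homebrew install on macOS)
brew services start ollama

# Optionally pre-pull the model so the extension doesn't have to download ~4 GB on first use
ollama pull codellama:7b-code-q4_K_M

# Confirm the server is running and the model is available
ollama list
```

After that, open VS Code with the extension installed and select the same model in its settings.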

@sherodtaylor
Author

^ So I got it to work. It was a bit slow, but it works on tab completion. I also found that looking at the logs helps. I think some notification or a "thinking" gear would help with the UI interaction.

@ex3ndr
Owner

ex3ndr commented Dec 22, 2023

@sherodtaylor In fact there IS a loading gear at the bottom! That's the only UI that is implemented right now.

@sherodtaylor
Author

Oh nice! I'm wondering about something like a gear cursor, because the loading indicator at the bottom is easy to miss.

@prabbit237

prabbit237 commented Mar 22, 2024

Ok, I STILL don't see what it's doing/supposed to do. I loaded it up in a virgin install of VSCode, along with PlatformIO and C/C++ and nothing else. I start typing a line and get two suggestions for digitalW... as shown. I disable llama coder and try typing the same digitalW and get the same two suggestions. If I hit TAB, in either case, I get "digitalWrite" and then.....nothing(?)

So what is it supposed to actually DO?

[screenshots: the two digitalW... suggestions with Llama Coder enabled and then disabled]

If I go to PowerShell, run ollama, type digitalwrite, and hit Enter, I actually get something useful (well, maybe not useful in this case, but at least something applicable, and with a bit of oddity when I make a typo).
[screenshot: ollama running interactively in PowerShell]
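
For reference, this is roughly what that terminal test looks like; I'm assuming the same codellama:7b-code-q4_K_M model mentioned earlier, but any pulled model works:

```bash
# Open an interactive session with the model, then type a prompt (e.g. "digitalwrite") and hit Enter
ollama run codellama:7b-code-q4_K_M
```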

What I was expecting was something like Cody-AI (even if not as full-featured).

[screenshot: Cody-AI completion example]

@prabbit237

Never mind... apparently my setup is just extremely slow. But after letting it sit, I did get some sort of response.

[screenshot: the response that eventually appeared]

@mediashock

Is there a way to choose between completions, or do you only get one? I can't seem to generate different results.
