Wen gguf?
#352
-
You can import GGUF into Ollama, at least according to their docs.
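For anyone landing here, a minimal sketch of the GGUF import flow from Ollama's docs (the file name and model name below are placeholders, not anything Perplexica or Ollama ships):

```sh
# Point a Modelfile at a local GGUF you downloaded (path is a placeholder)
echo "FROM ./magnum-123b-instruct-q8_0.gguf" > Modelfile

# Build a local Ollama model from that Modelfile, then run it
ollama create magnum-123b -f Modelfile
ollama run magnum-123b
```

Once the model is created this way, it shows up like any other Ollama model, so Perplexica should be able to select it through its normal Ollama integration.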
-
If I'm reading the description correctly, does Perplexica currently only support Ollama?
Are there any plans to have Perplexica act as middleware, or somehow be able to run GGUF or EXL2 files I download from Hugging Face?
I'm really digging the search engine work you guys are doing... but being limited to Ollama shies me away. If you were to add that ability, I'd drop Oobabooga in a heartbeat and switch over to Perplexica.
EDIT: Oh, screw it. I now see the value of Perplexica even though I can't run some of my favorite models (e.g. Magnum 123B Instruct Q8 GGUF). There's nothing stopping me from using Perplexica side by side with Oobabooga to enhance the work I do there with my larger models.
This could be a powerful new search engine that isn't limited to the manipulated and often-censored results out of Google and Bing. I was originally using Yandex for those searches (nothing illegal or immoral, just certain topics that are banned because they're considered WrongThink in the current year). I'll be downloading and testing out Perplexica now. But I still hope you guys offer a way in the future to run any model off of Hugging Face, either as middleware or by running the actual model.