Windows fix #11
-
Hi @alexpinel, this is a great idea. For some time, I have been looking for something that would let me use natural language for desktop search. I currently use X1, which is a particularly good keyword search tool, but I often need to find something with a query like: "find a PowerPoint presentation that describes different broadband access technologies". I installed Dot on my ageing Windows laptop and tried the query above, but ended up stuck on the "dot is typing..." message. I was not sure if this was due to my laptop being underpowered or whether I had done something wrong. I shall check the former by trying it on my desktop later. However, it raises a couple of questions for me:
I hope you can help me (or if I have totally missed something, please let me know). Keep up the good work!
-
Hi! I still get the "dot is typing..." freeze. Was this fixed after all? Thanks, Greg
-
Hi, sorry for the late reply! I have spent the last few weeks working on it and am hoping to push an update in around a week.

It turns out the "Dot is typing..." issue is mainly a compatibility problem that stems from decisions I made at the beginning of the project, which, in hindsight, might have been a bit absurd. When I started Dot, the Node.js packages for running LLMs (mainly LangChain) were barely usable and nowhere near the capabilities of the Python alternatives (which is still an issue in my opinion). So I decided to bundle a complete Python binary with the app to run the LLM. It was a very brute-force solution, and among other things it made compatibility a constant issue.

I have spent the last few weeks completely replacing the Python backend with newer Node.js modules, and it seems to be working well! It has also made the LLMs run faster and allows for token streaming, which makes responses feel much faster. The conversion has introduced a few downsides, though: the LLM currently has no conversation memory, and loading files seems to take more RAM. I am planning to fix both later on.

I am hoping to release the update next week. I just need to make sure Text to Speech and Speech Recognition also work fine, and that should sort out many of the issues encountered so far.

Also, here are the answers to @HSB-collab's questions (once again, sorry for the late reply; I have been quite busy with exams):
1- Yes! It will load files in subdirectories. I also want to make it work by selecting individual files, since in most cases the documents of interest are not all in the same directory.

Please feel free to ask more questions, and I will try to keep everyone updated on any new developments :)
Alex
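For anyone curious what the token-streaming change means in practice, here is a minimal, hypothetical sketch of the pattern: the backend yields tokens as the model produces them and the UI appends each one immediately, instead of waiting for the full answer. `streamTokens` is a stand-in for whatever streaming API the Node.js LLM binding actually exposes; this is not Dot's real code.

```ts
// Minimal sketch of the token-streaming pattern described above.
// `streamTokens` is a hypothetical stand-in for the binding's streaming API;
// it is NOT Dot's actual implementation.

async function* streamTokens(_prompt: string): AsyncGenerator<string> {
  // In the real app tokens would come from the model as it generates them;
  // here a few canned tokens stand in to show the control flow.
  for (const token of ["Broadband ", "access ", "technologies ", "include ", "DSL..."]) {
    await new Promise((resolve) => setTimeout(resolve, 100)); // simulate generation latency
    yield token;
  }
}

async function answer(prompt: string, onToken: (t: string) => void): Promise<string> {
  let full = "";
  for await (const token of streamTokens(prompt)) {
    full += token;
    onToken(token); // e.g. forward each token to the renderer so the UI updates immediately
  }
  return full;
}

// Usage: print tokens as they arrive instead of waiting for the full reply.
answer("Summarise broadband access technologies", (t) => process.stdout.write(t))
  .then(() => process.stdout.write("\n"));
```

The point of the pattern is perceived latency: the first token appears almost immediately, so the UI never sits on a static "Dot is typing..." state for the whole generation.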
-
Hi! After some back and forth, it looks like I managed to fix the problems with the Windows version.
The new release (available on the website here) should address the issue where the app was stuck on "dot is typing..." for Windows users. The cause was a few missing dependencies required for llama.cpp to function properly.
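As an illustration of why a missing native dependency can show up as an endless "dot is typing..." rather than an error message, a defensive load check like the hedged sketch below surfaces the real cause at startup. The module name `node-llama-cpp` and the Visual C++ redistributable example are assumptions for illustration, not confirmed details of Dot's setup.

```ts
// Hedged sketch: fail loudly if the native llama.cpp binding cannot be loaded,
// instead of leaving the UI stuck on "dot is typing...".
// The binding name is an assumption; kept in a variable so the sketch
// type-checks even if the package is not installed.
const BINDING_NAME = "node-llama-cpp";

async function loadLlamaBinding(): Promise<unknown> {
  try {
    // Native addons typically throw at load time when a required runtime library is missing.
    return await import(BINDING_NAME);
  } catch (err) {
    // Surface the real cause in the UI/logs rather than hanging silently.
    throw new Error(
      "Failed to load the llama.cpp binding. On Windows this is often a missing " +
        "runtime dependency (for example the Visual C++ redistributable). " +
        `Original error: ${String(err)}`
    );
  }
}

// Usage: call once at startup and show the message to the user if it throws.
loadLlamaBinding().catch((e) => console.error(e.message));
```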
Also, I recently learned just how expensive it is to code-sign an app for Windows, and unfortunately that is waaaay beyond my budget right now, which means Windows Defender might not like Dot at first...
Please let me know if you are facing any issues!
This discussion was created from the release Windows fix.