Oh, that horrible Electron UI. Under Windows it pegs a core on my CPU at all times!

If you're just working as a single user via the OpenAI-compatible API, you might want to consider koboldcpp. It bundles a GUI launcher for configuration, then runs in text-only mode. You can also tell it to just run a saved configuration, bypassing the GUI; I've successfully run it as a system service on Windows using nssm.
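A minimal sketch of the service setup described above, using nssm. The paths, service name, and the `--config` flag pointing at a saved `.kcpps` file are illustrative assumptions; check your install location and `koboldcpp.exe --help` for the exact option names.

```shell
# Hypothetical example: register koboldcpp as a Windows service via nssm,
# launching it headless from a configuration saved by the GUI launcher.
# All paths below are placeholders for your own install.
nssm install koboldcpp "C:\llm\koboldcpp.exe" "--config C:\llm\saved.kcpps"
nssm set koboldcpp AppDirectory "C:\llm"
nssm start koboldcpp
```

Once registered this way, the server comes back up on reboot without anyone clicking through the launcher.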

https://github.com/LostRuins/koboldcpp/releases

Though there are a lot of roleplay-centric gimmicks in its feature set, its context-shifting feature is singular. It caches the intermediate state from your last query and extends it to build the next one. As a result you save on prompt-processing time with large contexts, and any conversation that has been pushed out of the context window still indirectly influences the current exchange.
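The caching idea above can be sketched as prefix reuse over the token stream: compare the new prompt against the tokens whose state is already cached, keep the shared prefix, and process only the new tail. This is an illustrative toy, not koboldcpp's actual implementation (real context shifting also handles the cache when the front of the window is trimmed); all names here are hypothetical.

```python
def common_prefix_len(a, b):
    """Length of the shared token prefix between two token lists."""
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

class ContextCache:
    """Toy model of a KV cache keyed by the token sequence it covers."""

    def __init__(self):
        self.tokens = []  # tokens whose intermediate state is already computed

    def plan(self, new_tokens):
        """Return (reused, to_process): how much cached work survives."""
        keep = common_prefix_len(self.tokens, new_tokens)
        to_process = new_tokens[keep:]
        self.tokens = list(new_tokens)
        return keep, to_process

cache = ContextCache()
# First query: nothing cached, the whole prompt must be processed.
reused1, work1 = cache.plan([1, 2, 3, 4])
# Follow-up extends the conversation: only the new tail is processed.
reused2, work2 = cache.plan([1, 2, 3, 4, 5, 6])
```

With a multi-thousand-token context, that reused prefix is where the prompt-processing savings come from.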



> Oh, that horrible Electron UI. Under Windows it pegs a core on my CPU at all times!

Worse, I'd say, considering what people use LM Studio for, is the VRAM it occupies even when the UI and everything else is idle. Somehow it uses 500MB of VRAM while doing nothing, while Firefox with ~60 active tabs uses 480MB, and gnome-shell itself sits around 450MB while being responsible for quite a bit more than LM Studio.

Still, LM Studio is probably the best all-in-one GUI around for local LLM usage, unless you go the terminal route.


