
I'd stay away from ollama; just use llama.cpp. It is more up to date, better performing, and more flexible.


But you can't just switch between installed models like you can in ollama, can you?




