I agree that some sort of decentralized model repository would be awesome, but ollama works with local files too, so I'm not too worried about it. I've used many LLM backends, and ollama is my favorite so far, but given how fast everything is moving, that could change in the future.
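For what it's worth, here's a minimal sketch of the local-file workflow I mean, assuming ollama is installed and its server is running on the default port (11434). The GGUF filename is just a placeholder for whatever weights you already have on disk:

```python
# Register a local GGUF file with ollama and query it over its HTTP API.
# Assumes: ollama installed, server on localhost:11434, and a local GGUF
# at ./mistral-7b.Q4_K_M.gguf (placeholder path -- use your own file).

import json
import subprocess
import urllib.request

# A Modelfile's FROM line can point at a local GGUF instead of the registry.
with open("Modelfile", "w") as f:
    f.write("FROM ./mistral-7b.Q4_K_M.gguf\n")

# Register the local weights under a name ollama can serve.
subprocess.run(["ollama", "create", "local-mistral", "-f", "Modelfile"], check=True)

# Query the registered model through ollama's generate endpoint.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "local-mistral",
        "prompt": "Why is the sky blue?",
        "stream": False,  # return one JSON object instead of a stream
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Once it's registered like that, `ollama run local-mistral` works from the CLI too, so nothing ties you to their hosted repository.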
I've thought the same thing, actually, but I haven't really looked into who's behind Ollama or how the repository is managed yet. It's a really great project.
What does ollama add on top of llama.cpp?
I use KoboldCPP, which also works very well on CPU.
And Oobabooga's UI (with llama.cpp as a CPU backend) is also easy to set up.
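If it helps anyone, here's a rough sketch of querying a running KoboldCPP instance from Python, assuming the defaults (port 5001 and the Kobold `/api/v1/generate` endpoint). The request fields are from memory, so double-check them against your version:

```python
# Query a running KoboldCPP instance over its Kobold-style API.
# Assumes it was started with something like:
#   python koboldcpp.py --model ./model.gguf
# and is serving on the default port 5001 (adjust to your setup).

import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:5001/api/v1/generate",
    data=json.dumps({
        "prompt": "Explain what a GGUF file is in one sentence.",
        "max_length": 80,    # number of tokens to generate
        "temperature": 0.7,
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    # The Kobold API wraps output as {"results": [{"text": ...}]}
    print(json.loads(resp.read())["results"][0]["text"])
```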