I’ve just rediscovered Ollama, and it’s come a long way: it has reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a .deb package! It also works on Windows and macOS, so it can help everyone.
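For anyone curious what “simply installing” looks like in practice, here’s a rough sketch of a Linux install and first run using Ollama’s official install script (the model names below are just examples; pick whatever model suits your hardware):

```shell
# Install Ollama on Linux via the official convenience script
# (on systemd distros this also sets up the background service)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model and start an interactive chat session;
# GPU acceleration is detected automatically where available
ollama run llama3

# Or just download a model without starting a chat
ollama pull mistral

# See which models you have locally
ollama list
```

From there it exposes a local API on port 11434 that other tools can talk to.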
I’d like to see Lemmy become useful for specific technical niches, instead of leaving people to hunt for the best existing community — a subjective choice that makes information difficult to find. So I created !Ollama@lemmy.world for everyone to discuss, ask questions, and help each other out with Ollama!
So please join, subscribe, and feel free to post questions, tips, and projects, and help out where you can!
Thanks!
Fuck AI
People keep bringing LLM trash into the fediverse and then complaining when they aren’t put on a pedestal. Another community to throw in the trash bin.
!localllama@poweruser.forum has already been going for some time. If you want, I can make you a mod there and help you.
It’s just 3 posts from you; the others are from your Reddit archival bots, and there are only 7 subscribers. “Going on” is a bit of an exaggeration. Some people may still block your instances because of their history.
Why are you still running all these instances? Most seem really dead to me. I’m just curious.
Yes, they are still running. alien.top has been blocked by some, but the topic-specific instances have no reason to be a source of issues.
Just think of it this way: as slow as the existing community is, the community you want to build is even further behind. If we join forces, we can go a lot further than by trying to keep things separate.