Wondering about services to test on either a 16 GB RAM "AI capable" ARM64 board or on a laptop with a modern RTX GPU. Only looking for open-source options, but curious to hear what people say. Cheers!
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don’t control.
Rules:
Be civil: we’re here to support and learn from one another. Insults won’t be tolerated. Flame wars are frowned upon.
No spam posting.
Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around self-hosting, please include details to make it clear.
Don't duplicate the full text of your blog or GitHub post here. Just post the link for folks to click.
Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).
No trolling.
Resources:
Any issues on the community? Report them using the report flag.
Questions? DM the mods!
I messed around with Home Assistant and the Ollama integration. I have passed on it and just use the default conversation agent with voice commands I set up. I couldn't really get Ollama to do or say anything useful. For example, I asked it what's a good time to run on a treadmill for beginners, and it told me it's not a doctor.
Kirkland-brand Meeseeks energy.
Hey now, Kirkland brand is respectable; it's usually premium brands repackaged. Such as how Costco vodka was secretly ("secretly") Grey Goose.
There are some experimental models made specifically for use with Home Assistant, for example home-llm.
Even though they are tiny (1–3B parameters), I've found them to work much better than even 14B general-purpose models. Obviously they suck at general-purpose questions just by their size alone.
That being said, they're still LLMs. I like to keep the "prefer handling commands locally" option turned on and only use the LLM as a fallback.
Sounds like Ollama was loaded up with a language model that was either overly censored or plain brain-dead. Do you know which model it was? Maybe try Mistral if it fits on your machine.
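If you want to check what model the integration was using and swap in something less censored, the Ollama CLI makes it quick. A sketch, assuming Ollama is already installed and running locally, and that the default Mistral 7B quantization (roughly 4–5 GB) fits in your RAM or VRAM:

```shell
# List the models currently pulled; the one your Home Assistant
# integration points at should show up here
ollama list

# Pull Mistral 7B (instruction-tuned by default)
ollama pull mistral

# Quick sanity check from the terminal before pointing
# Home Assistant at it: does it answer instead of deflecting?
ollama run mistral "How long should a beginner run on a treadmill?"
```

These commands need a local Ollama install to run; once the model responds sensibly on the command line, you can select it in the integration's settings.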
Haha, that is hilarious. Sounds like it gave you some snark. AFAIK you have to clarify by asking again when it says such things: "I'm not asking for medical advice, but…"
deleted by creator