If you’re using the Home Assistant voice assistant mechanism (not Alexa/Google/etc.), how’s it working for you?

Given there are a number of knobs you can turn, which do you use and what works well?

  • Wake word model. There are the default models and custom ones.
  • Conversation agent and model
  • Speech-to-text models (e.g. Speech-to-Phrase or Whisper)
  • Text-to-speech models
  • barkingspiders@infosec.pub · 5 hours ago

    We’ve been using the previews since they shipped. The Mycroft wake word has worked well enough for the whole family. We tried the chatbot fallback, but the intent parser’s syntax is strict enough that we were getting routed to the LLM way more than we wanted: for example, asking it to turn on a light and having Claude tell us it couldn’t do that. It fails faster and more reliably with just the intent parser.
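    If you’re curious how strict the parser actually is, you can poke the conversation agent directly over Home Assistant’s REST API and see whether a phrase resolves locally before bolting on any LLM fallback. A minimal sketch in Python; the URL and token are placeholders for your own instance:

    ```python
    import requests

    # Placeholders: point these at your own instance and a long-lived access token.
    HA_URL = "http://homeassistant.local:8123"
    TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

    def ask_assist(text: str) -> dict:
        """Send a phrase to the conversation agent and return its parsed response."""
        resp = requests.post(
            f"{HA_URL}/api/conversation/process",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"text": text, "language": "en"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    # A matched intent comes back as "action_done"; an unmatched phrase is
    # the fast "error" failure described above.
    result = ask_assist("turn on the kitchen light")
    print(result["response"]["response_type"])
    ```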

    Our favorite use case is shopping lists. “Hey Mycroft, add greens to the groceries list” is great and won me some WAF. I also regularly use timers, some custom commands (“hey Mycroft, I fed the dog”), and managing lights with scenes (“hey Mycroft, turn on Daytime”).
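    The shopping-list command bottoms out in an ordinary service call, so you can script the same thing the wake word triggers. Another hedged sketch, assuming the shopping_list integration is enabled (same placeholder URL/token as above):

    ```python
    import requests

    HA_URL = "http://homeassistant.local:8123"  # placeholder
    TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"      # placeholder

    def add_to_shopping_list(item: str) -> None:
        """Equivalent of saying 'hey Mycroft, add <item> to the groceries list'."""
        resp = requests.post(
            f"{HA_URL}/api/services/shopping_list/add_item",
            headers={"Authorization": f"Bearer {TOKEN}"},
            json={"name": item},
            timeout=10,
        )
        resp.raise_for_status()

    add_to_shopping_list("greens")
    ```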

    I’m hoping to one day transition to a local LLM that’s fine-tuned for Home Assistant-specific tasks, and it looks like some good ones will arrive soon. The existing implementations haven’t won me over yet.

    Dunno, I’m a big fan, the wife doesn’t hate it, and I’m really optimistic about the future of these. I think HA is going about them the right way, and we’ll see good things. It’s probably a little rough right now if you’re not willing to put up with the quirks, but I think it’s just going to keep getting better.