Interesting. I use HA myself (though not the AI/LLM side yet). I think it's a different layer: Home Memory is the physical reality of the whole house, not just what's wired and smart. How do you picture "running inside"? Chat from the HA UI, or just running on the same hardware? I haven't dug into the feasibility yet. The part I'm fairly sure about: this only shines with a model that reliably uses tools. 23 MCP tools is a lot for a small local model; a Claude- or GPT-tier model handles it fine. What conversation agent are you running in HA?
I've been running Home Assistant for five years now. No turning back; it's addictive (in a good sense). I haven't had a chance to start with AI/LLMs in HA either, nor have I ever used speech with HA. It will all come this year, I hope.

The other day, digging through a cluttered drawer looking for something, it dawned on me that the perfect solution would be if I could talk to an AI and explain that item X is in drawer Y of cabinet Z in room A. That would be a perfect interface to an inventory management application. And since I'm using HA more and more for absolutely everything in my life, the perfect place to keep all that data would be HA itself. Sooner or later, all that data will be useful in HA anyway, one way or another.

So yes, if I need to build separate talk-to-an-LLM-from-every-room infrastructure, that makes it much harder to include other projects in the pipeline. The HA mobile app already has talk-to-HA functionality, which I can't use for Home Memory. And, frankly, having some home-related data in Home Assistant and other home-related data in Home Memory feels like a split-brain situation to me. So ideally, Home Memory should be an integration/add-on/HACS package for Home Assistant, in my humble opinion.

What you did there with a Windows server is a great start, and I will definitely test it out, but eventually I strongly believe it should be fully integrated into Home Assistant. Let me know if I can be of help. Thanks.
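The drawer example above maps to a very small data model. A minimal sketch of what such a tool exposed to the LLM might manage — note that `store_item`, `find_item`, and the `Location` fields are hypothetical names for illustration, not part of Home Memory or Home Assistant:

```python
# Hypothetical in-memory inventory store. A real HA integration would
# persist this via Home Assistant's storage helpers, not a module dict.
from dataclasses import dataclass

@dataclass
class Location:
    room: str        # e.g. "room A"
    container: str   # e.g. "cabinet Z"
    slot: str        # e.g. "drawer Y"

_inventory: dict[str, Location] = {}

def store_item(name: str, room: str, container: str, slot: str) -> str:
    """What the LLM would call after hearing
    'item X is in drawer Y of cabinet Z in room A'."""
    _inventory[name.lower()] = Location(room, container, slot)
    return f"Stored {name} in {slot} of {container} ({room})."

def find_item(name: str) -> str:
    """What the LLM would call when asked where something is."""
    loc = _inventory.get(name.lower())
    if loc is None:
        return f"I have no record of {name}."
    return f"{name} is in {loc.slot} of {loc.container} in {loc.room}."
```

For example, after `store_item("spare keys", "hallway", "cabinet", "top drawer")`, a later `find_item("spare keys")` answers with the stored location. Exposing these two functions as MCP tools (or HA intents) is the thin layer the conversation above is really about.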