jogai_san@lemmy.world to Selfhosted@lemmy.world, English · 1 month ago
Self-Host Weekly (30 January 2026) (selfh.st) · 19 comments
SanPe_@lemmy.world, English · 1 month ago
The “else-hosted” LLM AI is really not my thing, but self-hosted even less…
irmadlad@lemmy.world, English · 1 month ago
If I had the proper equipment, I’d run AI if it were self-contained and not pinging out to another LLM.
David J. Atkinson@c.im · 1 month ago
@irmadlad @selfhosted That is precisely the challenge. I’m not sure it is possible.
irmadlad@lemmy.world, English · 1 month ago
I mean, I can run a few of the private AI stacks, but they’re so excruciatingly slow as to make it not worth the time. I would want something pretty responsive.