TKay @TKay - 3d
Was anyone able to run stacks with a local LLM like Ollama + a model? @alex confirmed that it's possible, but I'm not sure which model to use. Is there any specific configuration I need to turn on in the model itself? #asknostr
Thank you. I'm unable to open the link, though. Strange.
Strange. The link opens on my phone but not on my computer.
Ah, this is the normal setup. I can run this just fine. I want to replace the default AI provider with a local AI.
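In case it helps, here's a minimal sketch of what I mean, assuming the app lets you point an OpenAI-compatible provider at a custom base URL (I don't know stacks' exact settings). Ollama exposes an OpenAI-compatible API at http://localhost:11434/v1 by default; `llama3.1` is just an example model you'd pull first with `ollama pull llama3.1`:

```python
# Minimal sketch: talk to a local Ollama model through its OpenAI-compatible endpoint.
# Assumptions: Ollama is running on its default port (11434) and the model
# "llama3.1" has already been pulled. Swap in whatever provider settings the app uses.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama instead of the hosted provider
    api_key="ollama",  # Ollama ignores the key, but the client requires a non-empty value
)

resp = client.chat.completions.create(
    model="llama3.1",  # any locally pulled model name works here
    messages=[{"role": "user", "content": "Hello from a local model"}],
)
print(resp.choices[0].message.content)
```

If the app's provider settings accept a base URL, model name, and API key, plugging in these three values should be all the "configuration" needed on the model side.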