      TKay @TKay - 6mo

      Was anyone able to run stacks with a local LLM like Ollama + a model? @alex confirmed that it's possible, but I'm not sure which model to use. Are there any specific configurations I need to enable in the model itself? #asknostr
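
      A common way to wire this up: Ollama's local server exposes an OpenAI-compatible API at `http://localhost:11434/v1` by default, so any client that lets you override the base URL and model name can be pointed at it. A minimal sketch below builds such a request; the model name `llama3` and the client wiring are assumptions for illustration, not something confirmed in this thread:

```python
import json
import urllib.request

# Ollama's default OpenAI-compatible endpoint (assumes `ollama serve` is running locally).
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request aimed at a local Ollama server."""
    payload = {
        "model": model,  # e.g. "llama3" -- must already be fetched with `ollama pull llama3`
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("llama3", "Hello from Nostr")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

      Sending the request with `urllib.request.urlopen(req)` only works once the Ollama server is running; apps that hardcode their AI provider usually need a base-URL or endpoint setting exposed before this swap is possible.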


      TKay @TKay - 6mo

      Thank you. I am unable to open the link, though. Strange.


      TKay @TKay - 6mo

      Strange. The link opens on my phone but not on my computer.


      TKay @TKay - 6mo

      Ah, this is the normal setup. I can run this just fine. I want to replace the default AI provider with a local AI.

