100% LOCAL USE???

#265
by Borna123454321 - opened

Is there a way to run all of this entirely locally with Ollama or LM Studio, without any external APIs? And if we use, for example, LM Studio, could there be a model selector? When I run `curl http://192.168.56.1:1234/v1/models`, all of my models are listed. I like this project, but with these limitations I would prefer to run it locally.
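For reference, here is a minimal sketch of talking to an OpenAI-compatible local endpoint like the one LM Studio exposes. The base URL comes from the curl example above; the endpoint paths and response shape follow the standard OpenAI-compatible API, not anything specific to this project, so treat it as an illustration of what "model selection against a local server" could look like.

```python
import requests

# Assumed LM Studio OpenAI-compatible endpoint, taken from the curl example above.
BASE_URL = "http://192.168.56.1:1234/v1"

# List the models the local server currently serves (same data as the curl call).
models = requests.get(f"{BASE_URL}/models", timeout=10).json()
model_ids = [m["id"] for m in models.get("data", [])]
print("Available local models:", model_ids)

# Send a chat completion to a selected model, entirely locally.
payload = {
    "model": model_ids[0],  # pick whichever listed model you want here
    "messages": [{"role": "user", "content": "Hello from a fully local setup!"}],
}
resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=60)
print(resp.json()["choices"][0]["message"]["content"])
```

If the project lets you override its API base URL and model name, pointing those settings at a local address like this is usually all that is needed to avoid external APIs.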

I really want to use this with LM Studio.
