Use the model in LM Studio
Make sure you have enough RAM/VRAM to run the model.
Download and install LM Studio.
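As a rough way to check whether your machine has enough memory, you can estimate the weight footprint from the parameter count and the bits per weight of the quantization. The figures below (117B parameters, ~4.5 bits per weight, 2 GB overhead) are illustrative assumptions, not exact numbers for any particular GGUF file:

```python
def estimate_model_memory_gb(params_billion: float,
                             bits_per_weight: float,
                             overhead_gb: float = 2.0) -> float:
    """Back-of-envelope memory footprint: weights plus a small runtime overhead.

    GGUF quantizations commonly range from ~4 bits (Q4) up to 16 bits (F16)
    per weight; the overhead figure is a rough allowance for KV cache and
    runtime buffers, not a measured value.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 + overhead_gb

# A ~4.5-bit quantization of a ~117B-parameter model:
print(round(estimate_model_memory_gb(117, 4.5), 1))  # → 67.8
```

If the estimate exceeds your combined RAM/VRAM, pick a lower-bit quantization.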
Discover models
In LM Studio, click the "Discover" icon. The "Mission Control" popup window will be displayed.
In the "Mission Control" search bar, type "John1604/gpt-oss-120b-gguf" and check "GGUF"; the model should appear in the results.
Download the model.
Load the model.
Ask questions.
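Besides the chat UI, LM Studio can serve a loaded model over an OpenAI-compatible local HTTP server (enabled from its Developer tab, default `http://localhost:1234`). The sketch below shows one way to query it from Python using only the standard library; the model identifier passed in the payload is an assumption, so use whatever name LM Studio shows for the loaded model:

```python
import json
import urllib.request

# Assumes LM Studio's local server is enabled on its default port.
BASE_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "gpt-oss-120b") -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "model": model,  # assumed identifier; check what LM Studio displays
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt: str) -> str:
    """POST the prompt to the local server and return the reply text."""
    data = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        BASE_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Usage (with the model loaded and the server running):
#   print(ask("Explain GGUF in one sentence."))
```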
Downloads last month: 109
Model tree for John1604/gpt-oss-120b-gguf
Base model: openai/gpt-oss-120b