Qwen3 Language Model

This repository hosts a Qwen3-based large language model; Qwen3 is developed by Alibaba Cloud's Tongyi Lab.

Model Details

  • Model Type: Causal Language Model
  • Architecture: Qwen3 (model_type "qwen3" in transformers)
  • Parameters: ~2B (per the uploaded Safetensors weights)
  • Precision: F16 (Safetensors)
  • Context Length: 32768 tokens
  • License: Apache 2.0
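
You can verify the variant-specific numbers above against the checkpoint itself by reading the model config. A minimal sketch, assuming the standard transformers Qwen3 config fields (the repo ID is the one this card belongs to):

from transformers import AutoConfig

config = AutoConfig.from_pretrained("Chuxia-sys/qwen3-tourism-merged-v2")
print(config.model_type)                # expected: "qwen3"
print(config.max_position_embeddings)   # context length, expected: 32768
print(config.num_hidden_layers, config.hidden_size)  # rough sense of model scale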

Usage

You can use this model with the Hugging Face transformers library (Qwen3 support was added in transformers 4.51.0, so use a recent version):

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "your-username/your-repo-name"  # Replace with your repo ID

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    trust_remote_code=True
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
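
The call above treats the model as a plain completion model. If this checkpoint is instruction-tuned, prompting through the tokenizer's chat template usually gives better results. A hedged sketch, assuming the repo ships the standard Qwen3 chat template and reusing model and tokenizer from above; the example question is illustrative only:

messages = [
    {"role": "user", "content": "Suggest a two-day itinerary for Hangzhou."}
]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,  # append the assistant header so the model answers
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
# decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))

If the merged tokenizer kept the official Qwen3 template options, apply_chat_template also accepts enable_thinking=False to suppress the model's reasoning trace.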
