Quantum-ToT

Model Details

Quantum-ToT is a fine-tuned variant of Qwen3-1.7B, optimized for Chain-of-Thought (CoT) reasoning in quantum mechanics and quantum computing contexts. This model was trained using the moremilk/CoT_Reasoning_Quantum_Physics_And_Computing dataset — a curated collection of question–answer pairs that go beyond surface-level definitions to show the logical reasoning process behind quantum concepts.

The goal of this fine-tuning is to enhance the model’s ability to:

  • Explain quantum principles with structured, step-by-step logic
  • Reason through conceptual problems in quantum physics and computing
  • Support educational and research applications that require interpretable reasoning chains

Uses

Direct Use

  • Educational assistance in quantum physics and quantum computing
  • AI tutors or reasoning assistants for STEM learning
  • Conceptual reasoning benchmarks involving quantum phenomena
  • Research in reasoning-aware model behavior and CoT interpretability

Out of Scope

  • Predicting new or unverified physical phenomena
  • Running quantum simulations or algorithmic derivations
  • Hardware-level quantum design
  • Real-time physics predictions

Bias, Risks, and Limitations

  • May hallucinate if prompted outside the quantum domain
  • Not suitable for advanced quantum algorithm design or experimental predictions

How to Get Started with the Model

Use the code below to get started with the model.

from transformers import AutoTokenizer, AutoModelForCausalLM, TextStreamer
from peft import PeftModel

# Load the Qwen3-1.7B base model and attach the Quantum-ToT LoRA adapter.
tokenizer = AutoTokenizer.from_pretrained("unsloth/Qwen3-1.7B")
base_model = AutoModelForCausalLM.from_pretrained(
    "unsloth/Qwen3-1.7B",
    device_map={"": 0},
)

model = PeftModel.from_pretrained(base_model, "khazarai/Quantum-ToT")

question = """
Explain the Heisenberg Uncertainty Principle in detail, including its mathematical formulation, physical implications, and common misconceptions.
"""

messages = [
    {"role": "user", "content": question}
]

# Build the chat prompt; enable_thinking=True asks Qwen3 to emit its
# <think> reasoning trace before the final answer.
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)

# Stream the response token by token.
_ = model.generate(
    **tokenizer(text, return_tensors="pt").to("cuda"),
    max_new_tokens=3000,
    temperature=0.6,
    top_p=0.95,
    top_k=20,
    streamer=TextStreamer(tokenizer, skip_prompt=True),
)
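
If you prefer to separate the reasoning trace from the final answer instead of streaming raw output, you can generate without a streamer and split on Qwen3's </think> marker. This is a minimal sketch that assumes the adapter preserves the base model's <think>…</think> thinking format:

# Sketch: generate without streaming, then split reasoning from answer.
inputs = tokenizer(text, return_tensors="pt").to("cuda")
output_ids = model.generate(
    **inputs,
    max_new_tokens=3000,
    temperature=0.6,
    top_p=0.95,
    top_k=20,
)
# Decode only the newly generated tokens.
generated = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:],
    skip_special_tokens=True,
)

# Assumes the <think>...</think> markers survive decoding; otherwise the
# whole string is treated as the answer.
reasoning, sep, answer = generated.partition("</think>")
if not sep:
    reasoning, answer = "", generated
print("Reasoning:\n", reasoning.replace("<think>", "").strip())
print("Answer:\n", answer.strip())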

To use the transformers pipeline API instead:

from transformers import pipeline, AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model and attach the Quantum-ToT adapter.
tokenizer = AutoTokenizer.from_pretrained("unsloth/Qwen3-1.7B")
base_model = AutoModelForCausalLM.from_pretrained("unsloth/Qwen3-1.7B")
model = PeftModel.from_pretrained(base_model, "khazarai/Quantum-ToT")

question = """
Explain the Heisenberg Uncertainty Principle in detail, including its mathematical formulation, physical implications, and common misconceptions.
"""

# Wrap the adapted model in a text-generation pipeline.
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
messages = [
    {"role": "user", "content": question}
]
pipe(messages)
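
The pipeline call accepts the same sampling arguments as generate. For example, with values mirrored from the script above (and noting that with recent transformers versions the assistant reply is appended as the last message in generated_text):

outputs = pipe(
    messages,
    max_new_tokens=3000,
    do_sample=True,
    temperature=0.6,
    top_p=0.95,
    top_k=20,
)
# With chat-style input, the reply is the final message in the returned conversation.
print(outputs[0]["generated_text"][-1]["content"])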

Dataset

Dataset: moremilk/CoT_Reasoning_Quantum_Physics_And_Computing

This dataset contains rich reasoning-based question–answer pairs covering:

  • Core quantum principles: superposition, entanglement, measurement
  • Effects of quantum gates (Hadamard, Pauli-X/Y/Z, etc.) on qubits (see the short NumPy sketch after this list)
  • Multi-qubit reasoning (e.g., Bell states, entangled systems)
  • Basic quantum algorithms and logical operations
  • Probabilistic interpretation of measurement outcomes
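
As a rough illustration of the gate-level behaviour these questions reason about (not part of the model card's own code), a few lines of NumPy reproduce the Hadamard, Pauli-X, and Bell-state effects listed above:

import numpy as np

# Computational basis state |0> and common single-qubit gates.
ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
X = np.array([[0, 1], [1, 0]], dtype=complex)                # Pauli-X
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

plus = H @ ket0   # equal superposition (|0> + |1>) / sqrt(2)
ket1 = X @ ket0   # bit flip: |0> -> |1>

# H on the first qubit followed by CNOT entangles |00> into the Bell state
# (|00> + |11>) / sqrt(2); measurement probabilities follow the Born rule.
bell = CNOT @ np.kron(H, I2) @ np.kron(ket0, ket0)
print(np.round(np.abs(bell) ** 2, 3))  # [0.5 0.  0.  0.5]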

Each entry includes:

  • think block → model’s internal reasoning process
  • answer block → final concise explanation or solution

The dataset focuses on conceptual understanding rather than heavy mathematical derivations or complex quantum hardware design.
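
To inspect these entries directly, the dataset can be loaded with the datasets library. This is a sketch only; the exact column names holding the think and answer blocks are not documented here, so print the schema rather than assuming field names:

from datasets import load_dataset

# Load the fine-tuning dataset (the "train" split name is an assumption).
ds = load_dataset("moremilk/CoT_Reasoning_Quantum_Physics_And_Computing", split="train")

# Print the schema and one example to see where the think/answer blocks live.
print(ds.column_names)
print(ds[0])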

Framework versions

  • PEFT 0.16.0
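
The scripts above also need transformers and accelerate (for device_map), plus datasets if you load the training data. A typical install, with only the PEFT version pinned by this card, might look like:

pip install "peft==0.16.0" transformers accelerate datasets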