Update README.md
README.md CHANGED
@@ -62,7 +62,7 @@ We recommend using **vLLM** to speed up inference.
 Run the command below to start an OpenAI-compatible API service:
 
 ```bash
-vllm serve "/PATH/CapRL-
+vllm serve "/PATH/CapRL-InternVL3.5-8B" \
 --trust-remote-code \
 --tensor-parallel-size=1 \
 --pipeline-parallel-size=1 \