ktou -> groonga
README.md CHANGED

@@ -6686,12 +6686,12 @@ Invoke the llama.cpp server or the CLI.

### CLI:
```bash
-llama-cli --hf-repo ktou/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -p "The meaning to life and the universe is"
+llama-cli --hf-repo groonga/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -p "The meaning to life and the universe is"
```

### Server:
```bash
-llama-server --hf-repo ktou/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -c 2048
+llama-server --hf-repo groonga/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -c 2048
```

Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
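
Once `llama-server` is running (by default it listens on 127.0.0.1:8080), you can send requests to it over HTTP. A minimal sketch of a completion request against the default endpoint, assuming `--host`/`--port` were not overridden:

```bash
# Query the running llama-server over HTTP (default host/port assumed).
curl http://127.0.0.1:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "The meaning to life and the universe is", "n_predict": 64}'
```
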
@@ -6708,9 +6708,9 @@ cd llama.cpp && LLAMA_CURL=1 make

Step 3: Run inference through the main binary.
```
-./llama-cli --hf-repo ktou/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -p "The meaning to life and the universe is"
+./llama-cli --hf-repo groonga/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
-./llama-server --hf-repo ktou/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -c 2048
+./llama-server --hf-repo groonga/multilingual-e5-base-Q4_K_M-GGUF --hf-file multilingual-e5-base-q4_k_m.gguf -c 2048
```
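
The `--hf-repo`/`--hf-file` flags download the GGUF on first use, which is why the build step above enables `LLAMA_CURL=1`. If you would rather fetch the file once and point the binaries at a local path, a rough equivalent, assuming the `huggingface_hub` package (which provides `huggingface-cli`) is installed:

```bash
# Download the quantized GGUF once, then run from the local file.
huggingface-cli download groonga/multilingual-e5-base-Q4_K_M-GGUF \
  multilingual-e5-base-q4_k_m.gguf --local-dir .

./llama-cli -m ./multilingual-e5-base-q4_k_m.gguf -p "The meaning to life and the universe is"
# or
./llama-server -m ./multilingual-e5-base-q4_k_m.gguf -c 2048
```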