---
license: other
license_name: lfm1.0
license_link: LICENSE
tags:
- liquid
- lfm2
- edge
base_model:
- LiquidAI/LFM2-350M-PII-Extract-JP
---
# LFM2-350M-PII-Extract-JP-GGUF
Based on [LFM2-350M](https://huggingface.co/LiquidAI/LFM2-350M), this checkpoint is designed to **extract personally identifiable information (PII) from Japanese text and output it in JSON format.**
The output can then be used to mask sensitive information in contracts, emails, personal medical reports, insurance bills, and similar documents, directly on-device.
Find more details in the original model card: https://huggingface.co/LiquidAI/LFM2-350M-PII-Extract-JP
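As a minimal illustration of the masking step, the sketch below applies the model's JSON output to the source text. The exact output schema (entity keys, and whether each category maps to a string or a list) is an assumption here; refer to the original model card for the precise format. The example sentence and extraction result are made up for illustration only.
```python
import json

def mask_pii(text: str, model_json: str, placeholder: str = "[MASKED]") -> str:
    """Replace every extracted PII span in `text` with a placeholder.

    Assumes the model returns a JSON object mapping entity categories
    (e.g. "address", "company_name") to the extracted strings; adjust
    to the actual schema documented in the original model card.
    """
    entities = json.loads(model_json)
    for values in entities.values():
        # Accept either a single string or a list of strings per category.
        if isinstance(values, str):
            values = [values]
        for value in values:
            if value:
                text = text.replace(value, placeholder)
    return text

# Illustrative input and output only (not real model output).
source = "山田太郎は東京都港区1-2-3に住んでいます。"
extracted = '{"name": ["山田太郎"], "address": ["東京都港区1-2-3"]}'
print(mask_pii(source, extracted))
```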
## 🏃 How to run LFM2
Example usage with [llama.cpp](https://github.com/ggml-org/llama.cpp); a programmatic `llama-server` sketch follows the CLI examples below:
- Extracting addresses, company/institution names, email addresses, personal names, and phone numbers from Japanese text (default behavior):
```
llama-cli -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF -st --temp 0.0 --json-schema {} --jinja
```
- Specifying a particular quantization scheme (e.g. `Q8_0`):
```
llama-cli -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF:Q8_0 -st --temp 0.0 --json-schema {} --jinja
```
Several quantization variants are available (`Q4_0`, `Q4_K_M`, `Q5_K_M`, `Q6_K`, `Q8_0`, and `F16`).
- Extracting only particular entities (e.g. `address` and `company_name`):
```
llama-cli -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF -st --temp 0.0 --json-schema {} -sys "Extract , "
```
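For programmatic use, the same GGUF can be served with `llama-server` (also part of llama.cpp) and queried through its OpenAI-compatible chat endpoint. The sketch below is an assumption-laden example rather than an official recipe: it presumes a server started with `llama-server -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF --jinja` on llama.cpp's default host and port, and that the Japanese text is passed as a plain user message, mirroring the `llama-cli` examples above.
```python
import json
import requests

# Assumes: llama-server -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF --jinja
# is running locally on llama.cpp's default host/port.
URL = "http://127.0.0.1:8080/v1/chat/completions"

japanese_text = "ここに解析したい日本語の文書を入れます。"  # placeholder document to scan

response = requests.post(
    URL,
    json={
        "messages": [{"role": "user", "content": japanese_text}],
        "temperature": 0.0,  # deterministic extraction, as in the CLI examples
    },
    timeout=60,
)
response.raise_for_status()

content = response.json()["choices"][0]["message"]["content"]
print(json.loads(content))  # extracted PII as a Python dict
```
The server also supports grammar/JSON-schema constrained decoding comparable to the `--json-schema {}` flag used above; consult the llama.cpp server documentation for the exact request fields.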