Liquid AI

LFM2-350M-PII-Extract-JP-GGUF

Based on LFM2-350M, this checkpoint is designed to extract personally identifiable information (PII) from Japanese text and output it in JSON format. The extracted JSON can then be used to mask sensitive information in contracts, emails, personal medical reports, insurance bills, and similar documents, directly on-device.

Find more details in the original model card: https://huggingface.co/LiquidAI/LFM2-350M-PII-Extract-JP
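As an illustration of the masking step described above, the sketch below replaces extracted PII strings with entity-tag placeholders. The output schema assumed here (a JSON object mapping entity types to lists of matched strings) and the sample extraction result are illustrative, not actual model output:

```python
import json

def mask_pii(text: str, extraction_json: str) -> str:
    """Replace every extracted PII string with an <entity_type> placeholder.

    `extraction_json` is assumed to map entity types to lists of
    matched strings, e.g. {"human_name": ["山田太郎"]}.
    """
    entities = json.loads(extraction_json)
    for entity_type, values in entities.items():
        for value in values:
            text = text.replace(value, f"<{entity_type}>")
    return text

# Hand-written example (not actual model output):
sample = "山田太郎様 (03-1234-5678) 宛にご連絡ください。"
extracted = '{"human_name": ["山田太郎"], "phone_number": ["03-1234-5678"]}'
print(mask_pii(sample, extracted))
# <human_name>様 (<phone_number>) 宛にご連絡ください。
```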

πŸƒ How to run LFM2

Example usage with llama.cpp:

  • Extracting addresses, company/institution names, email addresses, human names, and phone numbers from Japanese text:

    llama-cli -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF -st --temp 0.0 --json-schema '{}' --jinja
    
  • Specifying a particular quantization scheme (e.g. Q8_0):

    llama-cli -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF:Q8_0 -st --temp 0.0 --json-schema '{}' --jinja
    

    Several quantization variants are available (Q4_0, Q4_K_M, Q5_K_M, Q6_K, Q8_0, and F16).

  • Extracting only particular entities (e.g. only address and company_name):

    llama-cli -hf LiquidAI/LFM2-350M-PII-Extract-JP-GGUF -st --temp 0.0 --json-schema '{}' --jinja -sys "Extract <address>, <company_name>"
    
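The entity list passed via -sys above can be assembled programmatically. This small helper is hypothetical (not part of the model card); the tag names it uses (address, company_name, email, human_name, phone_number) are assumed from the card's examples:

```python
def pii_system_prompt(entity_types):
    """Build a system prompt restricting extraction to the given entity tags."""
    tags = ", ".join(f"<{t}>" for t in entity_types)
    return f"Extract {tags}"

print(pii_system_prompt(["address", "company_name"]))
# Extract <address>, <company_name>
```

The returned string is what the example above passes to llama-cli via the -sys flag.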
Model size: 0.4B params · Architecture: lfm2 · Format: GGUF (4-, 5-, 6-, 8-, and 16-bit quantizations)

