
# swtx SIMCSE RoBERTa WWM Ext Chinese

This model provides simplified Chinese sentence embeddings based on SimCSE (Simple Contrastive Learning of Sentence Embeddings). The pretrained Chinese RoBERTa WWM Ext model is used as the token encoder.

## How to use

```python
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and encoder from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("swtx/simcse-chinese-roberta-wwm-ext")
model = AutoModel.from_pretrained("swtx/simcse-chinese-roberta-wwm-ext")
```
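
The loaded model returns token-level hidden states, so a pooling step is still needed to obtain sentence embeddings. The sketch below is a minimal example, assuming [CLS] pooling (commonly used with SimCSE) and illustrative Chinese sentences; it encodes a small batch and compares the embeddings with cosine similarity.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("swtx/simcse-chinese-roberta-wwm-ext")
model = AutoModel.from_pretrained("swtx/simcse-chinese-roberta-wwm-ext")
model.eval()

# Example sentences (chosen for illustration only)
sentences = ["今天天气很好", "今天是个好天气", "我喜欢读书"]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Assumption: take the [CLS] token representation as the sentence embedding
embeddings = outputs.last_hidden_state[:, 0]

# Cosine similarity between the first sentence and the rest
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)
similarities = embeddings[0] @ embeddings[1:].T
print(similarities)
```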