Text Classification
Transformers
TensorBoard
Safetensors
bert
Generated from Trainer
sentiment-analysis
text-embeddings-inference
Instructions to use DerivedFunction01/bert-imdb with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use DerivedFunction01/bert-imdb with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="DerivedFunction01/bert-imdb")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("DerivedFunction01/bert-imdb")
model = AutoModelForSequenceClassification.from_pretrained("DerivedFunction01/bert-imdb")
```

- Notebooks
- Google Colab
- Kaggle
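The `text-classification` pipeline returns a list of `{"label": ..., "score": ...}` dicts per input. A minimal sketch of picking the top prediction from that output — note the `LABEL_0`/`LABEL_1` names and the `top_label` helper are assumptions for illustration; the actual label names depend on this model's `id2label` config:

```python
# Hypothetical post-processing helper for pipeline output.
# A text-classification pipeline returns results shaped like:
#   [{"label": "...", "score": 0.98}, ...]
def top_label(results):
    """Return the (label, score) pair with the highest score."""
    best = max(results, key=lambda r: r["score"])
    return best["label"], best["score"]

# Mocked pipeline output (label names are an assumption, not taken
# from the model's actual config):
sample = [
    {"label": "LABEL_1", "score": 0.98},
    {"label": "LABEL_0", "score": 0.02},
]
print(top_label(sample))  # → ('LABEL_1', 0.98)
```

In practice you would pass text straight to the pipeline, e.g. `pipe("A wonderful film!")`, and apply the same selection to its return value.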