# MobileNetV3-Large for Bacterial Colony Classification
This model is a fine-tuned version of MobileNetV3-Large on the DIBaS (Digital Images of Bacterial Species) dataset, classifying bacterial colony images into 33 species.
## Model Description
- Model Architecture: MobileNetV3-Large (pretrained on ImageNet)
- Task: Multi-class image classification (33 bacterial species)
- Dataset: DIBaS - 660+ microscopy images of bacterial colonies
- Framework: PyTorch + timm
## Performance
| Metric | Value |
|---|---|
| Validation Accuracy | 95.45% |
| Macro F1-Score | 0.954 |
| Parameters | 4.24M |
| Model Size | 16.4 MB |
| GPU Latency | 4.39 ms (RTX 4070 SUPER) |
| CPU Latency | 20.33 ms |
## Comparison with Other Architectures
| Model | Params (M) | Val Accuracy |
|---|---|---|
| MobileNetV3-Large | 4.24 | 95.45% |
| ResNet50 | 23.58 | 93.94% |
| EfficientNet-B3 | 10.75 | 92.42% |
| ViT-Tiny | 5.53 | 78.79% |
## Intended Use
This model is designed for:
- Automated bacterial species identification from colony images
- Research in clinical microbiology
- Educational purposes in microbiology labs
- Mobile/edge deployment for rapid identification
## Training Details
- Optimizer: AdamW (lr=1e-3, weight_decay=1e-4)
- Epochs: 20
- Batch Size: 32
- Image Size: 224×224
- Augmentation: RandomResizedCrop, HorizontalFlip
- Hardware: NVIDIA RTX 4070 SUPER
- Mixed Precision: Enabled (AMP)
- Train/Val/Test Split: 70/20/10 (stratified, seed=42)
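The 70/20/10 stratified split above can be sketched in plain Python. This is an illustrative re-implementation, not the repository's actual data-loading code; the sample tuples and class counts below are made up:

```python
import random
from collections import defaultdict

def stratified_split(samples, train=0.7, val=0.2, seed=42):
    """Split (path, label) pairs into train/val/test, preserving class ratios."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for sample in samples:
        by_label[sample[1]].append(sample)
    splits = {"train": [], "val": [], "test": []}
    for label, items in by_label.items():
        rng.shuffle(items)          # shuffle within each class only
        n_train = int(len(items) * train)
        n_val = int(len(items) * val)
        splits["train"].extend(items[:n_train])
        splits["val"].extend(items[n_train:n_train + n_val])
        splits["test"].extend(items[n_train + n_val:])  # remainder goes to test
    return splits

# Toy example: 20 images for each of 3 hypothetical classes
samples = [(f"img_{c}_{i}.jpg", c) for c in range(3) for i in range(20)]
splits = stratified_split(samples)
print(len(splits["train"]), len(splits["val"]), len(splits["test"]))  # 42 12 6
```

Splitting within each class (rather than over the pooled list) keeps every species represented in all three subsets, which matters for a dataset with only ~20 images per class.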
## How to Use

### With timm (Recommended)
```python
import timm
import torch
from PIL import Image
from torchvision import transforms

# Load model
model = timm.create_model('mobilenetv3_large_100', pretrained=False, num_classes=33)
state_dict = torch.load('pytorch_model.pth', map_location='cpu')
model.load_state_dict(state_dict['model_state'])
model.eval()

# Preprocessing (ImageNet normalization statistics)
transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Inference
image = Image.open('bacteria_image.jpg').convert('RGB')
input_tensor = transform(image).unsqueeze(0)
with torch.no_grad():
    outputs = model(input_tensor)
predicted_class = outputs.argmax(dim=1).item()
confidence = torch.softmax(outputs, dim=1)[0, predicted_class].item()

# CLASS_NAMES is defined in the "Class Labels" section below
print(f"Predicted class: {CLASS_NAMES[predicted_class]}")
print(f"Confidence: {confidence:.2%}")
```
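The softmax/argmax step at the end of the snippet can be checked with a small, dependency-free helper. The logit values here are made up; the helper is a sketch for understanding the math, not a replacement for `torch.softmax`:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)                              # subtract max to avoid overflow
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_k(logits, k=3):
    """Return (class_index, probability) pairs for the k highest-scoring classes."""
    probs = softmax(logits)
    return sorted(enumerate(probs), key=lambda pair: pair[1], reverse=True)[:k]

# Illustrative logits for a 5-class toy model
logits = [0.5, 2.0, -1.0, 3.5, 0.0]
for idx, prob in top_k(logits):
    print(f"class {idx}: {prob:.2%}")
```

Reporting the top-3 predictions rather than only the argmax can be useful here, since several morphologically similar species (see Limitations) tend to share probability mass.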
## Class Labels

```python
CLASS_NAMES = [
    "Acinetobacter_baumannii", "Actinomyces_israelii", "Bacteroides_fragilis",
    "Bifidobacterium_spp", "Candida_albicans", "Clostridium_perfringens",
    "Enterococcus_faecalis", "Enterococcus_faecium", "Escherichia_coli",
    "Fusobacterium", "Lactobacillus_casei", "Lactobacillus_crispatus",
    "Lactobacillus_delbrueckii", "Lactobacillus_gasseri", "Lactobacillus_jensenii",
    "Lactobacillus_johnsonii", "Lactobacillus_paracasei", "Lactobacillus_plantarum",
    "Lactobacillus_reuteri", "Lactobacillus_rhamnosus", "Lactobacillus_salivarius",
    "Listeria_monocytogenes", "Micrococcus_spp", "Neisseria_gonorrhoeae",
    "Porphyromonas_gingivalis", "Propionibacterium_acnes", "Proteus",
    "Pseudomonas_aeruginosa", "Staphylococcus_aureus", "Staphylococcus_epidermidis",
    "Staphylococcus_saprophyticus", "Streptococcus_agalactiae", "Veillonella"
]
```
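For display purposes, the underscore-separated labels above can be converted to readable species names with a one-line helper. This formatting convention is my own, not part of the model:

```python
def pretty_label(class_names, index):
    """Convert an underscore-separated class name to a display string."""
    return class_names[index].replace("_", " ")

# Works with the CLASS_NAMES list above; a short excerpt for illustration:
names = ["Escherichia_coli", "Staphylococcus_aureus"]
print(pretty_label(names, 0))  # Escherichia coli
```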
## Limitations
- Trained on single laboratory/microscope setup (DIBaS dataset)
- May not generalize to different imaging conditions or staining methods
- Some morphologically similar species (e.g., Staphylococcus variants) may be confused
- Not validated for clinical diagnostic use
## Citation

```bibtex
@inproceedings{hoflaz2025bacterial,
  title={Lightweight CNNs Outperform Vision Transformers for Bacterial Colony Classification},
  author={Hoflaz, Ibrahim},
  booktitle={IEEE Conference},
  year={2025}
}
```
## References
- DIBaS Dataset: http://misztal.edu.pl/software/databases/dibas/
- Zieliński, B., et al. "Deep learning approach to bacterial colony classification." PLoS ONE 12.9 (2017): e0184554.
## Model Card Contact
- GitHub: ihoflaz/bacterial-colony-classification
- Author: Ibrahim Hoflaz