| Model | Task | Params | Updated | Downloads | Likes |
|---|---|---|---|---|---|
| timm/vit_large_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 926 | - |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 174 | 2 |
| timm/vit_large_patch14_clip_224.laion2b_ft_in12k | Image Classification | 0.3B | Jan 21 | 1.22k | - |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k | Image Classification | 0.6B | Jan 21 | 2.59k | 1 |
| timm/vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 308 | 2 |
| timm/vit_large_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 288 | 1 |
| timm/vit_large_patch14_clip_224.openai_ft_in1k | Image Classification | 0.3B | Jan 21 | 3.56k | 1 |
| timm/vit_large_patch14_clip_224.openai_ft_in12k_in1k | Image Classification | 0.3B | Jan 21 | 1.71k | 38 |
| timm/vit_base_patch32_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 1.28k | 2 |
| timm/vit_base_patch32_clip_384.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 78 | - |
| timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 304k | 4 |
| thaonguyen274/vit-base-patch16-224-finetuned-imageclassification | Image Classification | - | Nov 6, 2022 | 5 | - |