| Model | Task | Params | Updated | Downloads | Likes |
|---|---|---|---|---|---|
| timm/vit_huge_patch14_clip_336.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 310 | 2 |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.6B | Jan 21 | 167 | 2 |
| timm/vit_huge_patch14_clip_224.laion2b_ft_in12k | Image Classification | 0.6B | Jan 21 | 2.47k | 1 |
| timm/vit_base_patch32_clip_448.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 196k | 4 |
| timm/vit_base_patch32_clip_384.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 104 | |
| timm/vit_base_patch32_clip_384.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 60 | |
| timm/vit_base_patch32_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 1.43k | 2 |
| timm/vit_base_patch32_224.augreg_in21k_ft_in1k | Image Classification | 0.1B | Jan 21 | 18.5k | 2 |
| timm/vit_base_patch16_clip_384.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 37 | 1 |
| timm/vit_base_patch16_clip_384.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 989 | 4 |
| timm/vit_base_patch16_clip_384.laion2b_ft_in1k | Image Classification | 0.1B | Jan 21 | 361 | 5 |
| timm/vit_base_patch16_clip_224.openai_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 3.85k | |
| timm/vit_base_patch16_clip_224.openai_ft_in1k | Image Classification | 0.1B | Jan 21 | 2.61k | 1 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in12k_in1k | Image Classification | 0.1B | Jan 21 | 1.56k | 2 |
| timm/vit_base_patch16_clip_224.laion2b_ft_in1k | Image Classification | 0.1B | Jan 21 | 1.04k | 1 |