Output contains many \n · 2 comments · #7 opened about 14 hours ago by purpledeerz
Unable to run with vLLM · 7 comments · #6 opened 7 days ago by rak-r
Update config.json · 1 comment · #5 opened 7 days ago by prince-canuma
Run with llamacpp · ❤️🔥 2 · 6 comments · #4 opened 7 days ago by AbacusGauge
Congrats! · ❤️ 4 · 2 comments · #3 opened 8 days ago by CyborgPaloma
Adding `transformers` as the library name · 🚀 2 · #2 opened 11 days ago by ariG23498
How to run inference without vLLM? (e.g., with standard transformers) · 1 comment · #1 opened 11 days ago by Fezz04