I've created a space for chatting with Gemma 2 using llama.cpp
- Choose between the 27B IT and 9B IT models
- Fast inference using llama.cpp (see the sketch below)
- gokaygokay/Gemma-2-llamacpp
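For reference, here is a minimal sketch of chatting with a Gemma 2 GGUF model through llama-cpp-python locally; the model filename and generation settings are assumptions, not the space's actual configuration.

```python
# Minimal sketch: local chat with a Gemma 2 GGUF model via llama-cpp-python.
# The model path below is hypothetical; point it at whatever GGUF file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="gemma-2-9b-it-Q4_K_M.gguf",  # hypothetical local GGUF file
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

messages = [{"role": "user", "content": "Hello! What can you do?"}]
out = llm.create_chat_completion(messages=messages, max_tokens=256)
print(out["choices"][0]["message"]["content"])
```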