Minimum requirements for running the model on a GPU with CUDA
Hello Guys,
Congratulations on the great work.
Could you please share the minimum requirements Meltemi needs in order to run on a GPU with CUDA?
For example, would an NVIDIA GPU with 4 GB of dedicated memory be able to run the model efficiently?
Also, can the model run in inference mode on a plain CPU, or is that not possible?
Thank you in advance.
Rafail
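For a rough sense of the numbers, here is a back-of-the-envelope estimate of the GPU memory needed just for the model weights. The 7B parameter count is an assumption about Meltemi's size, and the figures ignore activations, KV cache, and framework overhead, which add to the total:

```python
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory for model weights alone, in GiB."""
    return n_params * bytes_per_param / 1024**3

n_params = 7e9  # assumed parameter count for a 7B model

# Typical storage precisions: fp16/bf16, 8-bit, and 4-bit quantization.
for label, bpp in [("fp16/bf16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"{label}: ~{weight_memory_gb(n_params, bpp):.1f} GiB")
```

By this estimate a 7B model needs roughly 13 GiB in fp16 and still over 3 GiB at 4-bit, so a 4 GB card is borderline even with aggressive quantization; offloading some layers to the CPU is the usual workaround.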
Thank you for the quick response. I will try the GGUF version.
I had been waiting for a Greek LLM for a very long time. Congratulations! Sorry for my ignorance, but can this model be used for Retrieval-Augmented Generation (RAG)?
Like any other model, it can be used as the "generator" in a RAG pipeline.
That said, for the storage and retrieval side you will need embeddings that handle Greek well, which is not true of every embedding model out there, so it may take some trial and error.
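As a minimal sketch of that pipeline: retrieve the most relevant documents for a query, then feed them to the generator as context. The bag-of-words "embedding" and the stub generator below are placeholders only; in a real setup you would swap in a Greek-capable embedding model for retrieval and Meltemi itself for generation:

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Placeholder embedding: bag-of-words token counts.
    # A real RAG setup would use a Greek-capable embedding model here.
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def generate(question: str, context: list[str]) -> str:
    # Stub generator: a real pipeline would prompt the LLM with the
    # retrieved context, e.g. "Context: ... Question: ... Answer:".
    return f"Context: {' | '.join(context)}\nQuestion: {question}"

docs = [
    "Meltemi is a Greek large language model.",
    "The weather in Athens is sunny today.",
]
context = retrieve("What is Meltemi?", docs)
print(generate("What is Meltemi?", context))
```

The key design point is that the generator and the retriever are independent components, which is why Meltemi can slot into the generator role while the embedding model is chosen separately for Greek.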