Model Page: Gemma
Fine-tuned Gemma with OpenAI Function Call Support
A fine-tuned version of Gemma 7B Instruct that supports direct function calling. This capability mirrors the function-calling interface of OpenAI's models, enabling Gemma to interact with external data sources and perform more complex tasks, such as fetching real-time information or integrating with custom databases for enriched AI-powered applications.
Features
- Direct Function Calls: Gemma now supports structured function calls, allowing external APIs and databases to be integrated directly into the conversational flow. This makes it possible to execute custom searches, retrieve data from the web or specific databases, and summarize or explain content in depth. A minimal usage sketch follows this list.
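Below is a minimal sketch of how such a call might look with `transformers`. The repository ID, the prompt wording, the `get_current_weather` schema, and the exact output format are all assumptions for illustration; this card does not specify the fine-tune's prompt template, so adapt it to the actual one.

```python
# Hypothetical sketch: repo ID, prompt format, and tool schema are placeholders.
import json
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/gemma-7b-it-function-calling"  # placeholder repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# OpenAI-style function (tool) schema the model may choose to call.
tools = [{
    "name": "get_current_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}]

# Embed the schema in the prompt; the fine-tune is expected to answer with a
# JSON object naming the function and its arguments.
prompt = (
    "You can call the following functions. Respond with a JSON object "
    'of the form {"name": ..., "arguments": ...}.\n'
    f"Functions: {json.dumps(tools)}\n"
    "User: What is the weather in Paris right now?"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
# Illustrative expected output: {"name": "get_current_weather", "arguments": {"city": "Paris"}}
```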
Fine-tuned Quantized Models
Updating: quantized variants will be listed here as they are released.
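While the quantized checkpoints are still being published, one common way to run the model within limited memory is on-the-fly 4-bit quantization via `bitsandbytes`. This is a sketch under that assumption, not a description of the released quantized weights; the repo ID is again a placeholder.

```python
# Hypothetical sketch: assumes bitsandbytes and accelerate are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "your-org/gemma-7b-it-function-calling"  # placeholder repo ID

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,               # quantize weights to 4-bit at load time
    bnb_4bit_quant_type="nf4",       # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)
```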
Model Description
Gemma is a family of lightweight, state-of-the-art open models from Google, built from the same research and technology used to create the Gemini models. They are text-to-text, decoder-only large language models, available in English, with open weights, pre-trained variants, and instruction-tuned variants. Gemma models are well-suited for a variety of text generation tasks, including question answering, summarization, and reasoning. Their relatively small size makes it possible to deploy them in environments with limited resources such as a laptop, desktop, or your own cloud infrastructure, democratizing access to state-of-the-art AI models and helping foster innovation for everyone.