language: en
license: mit
tags:
- text-generation-inference
- transformers
- Phi3
- kubernetes
base_model: unsloth/Phi-3-mini-4k-instruct
datasets:
- andyburgin/kubefix
Phi-3-mini-4k-instruct-kubefix-v0.1-gguf: Fine-Tuned Phi3 for Kubernetes fault resolution
This model is intended for use with K8sGPT for fault analysis and resolution. Ultimately, the resulting LLM is intended to be self-hosted in a GPU-free environment, running under local-ai in Kubernetes.
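As an illustration of that intended setup, the sketch below sends a Kubernetes error message to a local-ai instance through its OpenAI-compatible chat completions API, which is the same kind of endpoint K8sGPT's local-ai backend can be pointed at. The service URL and the model name the GGUF file is registered under are assumptions; adjust them to match your deployment.

```python
import requests

# Assumed in-cluster service address for a local-ai deployment; change to match your install.
LOCALAI_URL = "http://local-ai.local-ai.svc.cluster.local:8080/v1/chat/completions"

payload = {
    # Assumed model name: whatever the GGUF file is registered as in your local-ai config.
    "model": "phi-3-mini-4k-instruct-kubefix-v0.1",
    "messages": [
        {
            "role": "user",
            "content": "Simplify and explain this Kubernetes error: "
                       "Back-off restarting failed container",
        }
    ],
    "temperature": 0.2,
}

response = requests.post(LOCALAI_URL, json=payload, timeout=120)
print(response.json()["choices"][0]["message"]["content"])
```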
The model was finetuned on andyburgin/kubefix, which contains a series of question and answer pairs generated from the English markdown files of a subset of the Kubernetes documentation. The Q&A pairs were generated from the documents using an open-source model (to avoid the licensing issues that come with some free models or SaaS services); after much trial and error, the openchat-3.5-0106 model was found to be the least problematic.
For a detailed description of the method used to generate the andyburgin/kubefix dataset and this model, please see the kubefix-llm repo.
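As a rough illustration of that approach (not the actual pipeline, which lives in the kubefix-llm repo), the sketch below asks a locally loaded openchat-3.5-0106 GGUF to produce a Q&A pair from a single documentation page. The model filename and the documentation path are placeholders.

```python
from llama_cpp import Llama

# Illustrative sketch only; the real generation pipeline is in the kubefix-llm repo.
# Both file paths below are placeholders.
generator = Llama(model_path="openchat-3.5-0106.Q5_K_M.gguf", n_ctx=4096)

with open("content/en/docs/concepts/workloads/pods/pod-lifecycle.md") as f:
    doc = f.read()

prompt = (
    "Read the following Kubernetes documentation excerpt and write one question a "
    "cluster operator might ask, followed by a concise answer based only on the "
    "excerpt.\n\n" + doc[:3000]
)

result = generator.create_chat_completion(
    messages=[{"role": "user", "content": prompt}],
    max_tokens=512,
    temperature=0.7,
)
print(result["choices"][0]["message"]["content"])
```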
Model & Development
- Developed by: andyburgin
- License: mit
- Finetuned from model: unsloth/Phi-3-mini-4k-instruct
Key Features
- Kubernetes Focus: Optimised for fault analysis for Kubernetes clusters with K8sGPT.
- Knowledge Base: Trained on a dataset generated from a subset of the Kubernetes documentation.
- Text Generation: Generates informative and potentially helpful responses (see the sketch after this list).
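For quick local experimentation outside a cluster, the GGUF file can also be loaded directly with llama-cpp-python. A minimal sketch, assuming the v0.1 GGUF has been downloaded to the working directory (the filename is a placeholder):

```python
from llama_cpp import Llama

# Load the downloaded GGUF (filename is a placeholder) with the model's 4k context window.
llm = Llama(model_path="Phi-3-mini-4k-instruct-kubefix-v0.1.gguf", n_ctx=4096)

answer = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": "My deployment's pods are stuck in ImagePullBackOff. "
                       "What are the most likely causes and how do I fix them?",
        }
    ],
    max_tokens=256,
    temperature=0.2,
)
print(answer["choices"][0]["message"]["content"])
```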
Important Note
This model and dataset are under active development; v0.1 is the very first release and is likely to need significant optimisation and further development.
License
This model is distributed under the MIT License.
Contributing
Contributions to this repository are welcome! If you have improvements or suggestions, feel free to open a pull request.
Disclaimer
Please note: the dataset and resultant model should be considered highly experimental and used with caution; use at your own risk.