
Model Fine-tuning and Optimized Prompt Templates

#2
by limcheekin - opened

Hi there,

Thank you for sharing the model.

Currently, I am utilizing instruction prompts from the GitHub repository located at https://github.com/google-research/FLAN/blob/main/flan/v2/flan_templates_branched.py for my conversational chatbot.
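For context, here is a minimal sketch of how I apply one of those instruction templates before generation. The model id and template string below are placeholders rather than my exact setup:

```python
# Minimal sketch: apply a FLAN-style instruction template, then generate.
# The model id and template below are placeholders, not my exact chatbot config.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/flan-t5-xl"  # placeholder; swap in this model's repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# An instruction-style template in the spirit of flan_templates_branched.py,
# filled in with the user's question.
template = "Answer the following question.\n\n{question}"
prompt = template.format(question="What is instruction fine-tuning?")

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```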

I am curious if the model has been fine-tuned using specific sets of optimized prompt templates or structures. If there are any accompanying write-ups or blog posts detailing the process of fine-tuning the model from flan-t5-xl, I would greatly appreciate it if you could share them with me.
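In case it helps frame the question, something along these lines is what I have in mind. This is only a rough sketch assuming a standard Seq2SeqTrainer workflow and a hypothetical prompt/response dataset, not a description of how this model was actually trained:

```python
# Rough sketch only: assumes a hypothetical JSONL dataset with "prompt" and
# "response" columns and a standard Seq2SeqTrainer setup; not this model's recipe.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_model = "google/flan-t5-xl"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSeq2SeqLM.from_pretrained(base_model)

# Hypothetical training file with "prompt" and "response" fields per line.
dataset = load_dataset("json", data_files="train.jsonl")["train"]

def preprocess(batch):
    model_inputs = tokenizer(batch["prompt"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["response"], truncation=True, max_length=256)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True, remove_columns=dataset.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-xl-finetuned",
    per_device_train_batch_size=4,
    learning_rate=1e-4,
    num_train_epochs=3,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```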

Lastly, I would be grateful if you could share the optimized prompt templates/structures that work best with the model, so that I can get the best generation quality out of it.

I look forward to hearing from you.

Best regards.

Did you find anything on how to fine-tune this model?
