---
license: mit
library_name: transformers
tags:
- text-generation
- meditation
---

# Fine-Tuned Meditation Text Generation Model

This model is fine-tuned to generate text on meditation and mindfulness topics. It is compatible with the Hugging Face Transformers library and optimized for text generation tasks.

## Intended Use

This model is designed to assist users by generating informative or calming text about meditation, mindfulness, and relaxation practices. It can be used to create content for meditation guides, descriptions, and other wellness-oriented resources.

## Example Usage with Hugging Face Transformers

To use this model for text generation, load it with the Hugging Face `pipeline` and generate responses from prompts related to meditation and mindfulness.

### Code Example

Install the required libraries if you haven't already:

```bash
pip install transformers torch
```

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Load the model and tokenizer
model_name = "Phoenix21/fine-tuned-meditation-model"  # Replace with your model path on Hugging Face
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Create a text generation pipeline
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Example prompt
prompt = "Meditation is a powerful tool for managing stress because"
output = generator(prompt, max_length=100, do_sample=True, temperature=0.7)

# Print generated text
print(output[0]["generated_text"])
```
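
### Working with the Output

By default, the `text-generation` pipeline returns the prompt followed by the generated continuation in a single string. If you only want the newly generated text, a small helper can strip the prompt; the function below is a hypothetical sketch, not part of the model or Transformers:

```python
def strip_prompt(generated_text: str, prompt: str) -> str:
    """Remove the original prompt from a pipeline result, if present."""
    if generated_text.startswith(prompt):
        return generated_text[len(prompt):].lstrip()
    return generated_text

# Example with a mock pipeline result (real output will vary between runs)
prompt = "Meditation is a powerful tool for managing stress because"
generated = prompt + " it slows the breath and calms the mind."
print(strip_prompt(generated, prompt))
```

Alternatively, the pipeline accepts `return_full_text=False`, which makes it return only the continuation without the prompt.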