---
license: mit
language:
- en
base_model:
- tuner007/pegasus_paraphrase
tags:
- humaniser
- ai
- aidetection
- text-generation
- paraphrasing
- nlp
- transformers
- pegasus
library_name: transformers
pipeline_tag: text2text-generation
widget:
- text: >-
The train was unusually empty as Aarav boarded it late one evening, the
dim overhead lights casting long shadows. He settled into a corner seat,
staring out at the fleeting city lights, when he noticed a leather-bound
journal lying on the seat beside him. Curious, he opened it to find pages
filled with beautiful sketches of places he’d never seen and short notes
signed only with the name "S." Each entry felt like a glimpse into a
stranger's soul—a story of travels, heartbreaks, and quiet moments of joy.
As the train approached his stop, Aarav hesitated, then tucked the journal
into his bag, determined to return it. What he didn’t realize was that
finding the journal would lead him to a serendipitous encounter with the
artist, someone who would change his life forever.
---
# Model Card: Humaneyes Text Paraphraser

## Model Description

Humaneyes is an advanced text paraphrasing model built on the Pegasus transformer architecture. It is designed to generate high-quality, contextually aware paraphrases while preserving the original text's paragraph structure and semantic meaning.
## Model Details
- Developed by: Eemansleepdeprived
- Model type: Text-to-text generation (Paraphrasing)
- Language(s): English
- Base model: tuner007/pegasus_paraphrase (a paraphrasing fine-tune of Google's Pegasus Large)
- Input format: Plain text
- Output format: Paraphrased text
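For a quick end-to-end check of the plain-text-in, paraphrased-text-out interface, this repository's checkpoint can also be loaded through the `transformers` pipeline API. The sketch below is illustrative only; the input sentence is just a placeholder.

```python
from transformers import pipeline

# Minimal sketch: load this card's checkpoint via the text2text-generation pipeline
paraphraser = pipeline("text2text-generation", model="Eemansleepdeprived/Humaneyes")

# Plain text in, paraphrased text out (the sentence is a placeholder)
result = paraphraser("The meeting was postponed because several attendees were unavailable.")
print(result[0]["generated_text"])
```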
## Intended Use

### Primary Use Cases
- Academic writing: Helping researchers and students rephrase text
- Content creation: Assisting writers in generating alternative text variations
- Language learning: Providing examples of different ways to express ideas
### Potential Limitations
- May not perfectly preserve highly technical or domain-specific language
- Performance can vary depending on input text complexity
- Not recommended for rephrasing professional legal or medical documents
## Performance and Evaluation

### Key Features
- Preserves paragraph structure
- Maintains semantic meaning
- Handles various text lengths and complexities
- Supports sentence-level paraphrasing
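To illustrate how sentence-level paraphrasing can be combined with paragraph preservation, the sketch below splits the input into paragraphs and sentences, paraphrases each sentence, and reassembles the text. The helper names and the simple splitting heuristics are assumptions for illustration, not part of the model itself.

```python
import re
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

tokenizer = PegasusTokenizer.from_pretrained("Eemansleepdeprived/Humaneyes")
model = PegasusForConditionalGeneration.from_pretrained("Eemansleepdeprived/Humaneyes")

def paraphrase_sentence(sentence: str) -> str:
    # Paraphrase a single sentence with default generation settings
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    outputs = model.generate(**inputs)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

def paraphrase_text(text: str) -> str:
    # Split on blank lines so the original paragraph structure is kept
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    rewritten = []
    for paragraph in paragraphs:
        # Naive sentence split; a proper sentence tokenizer can be swapped in
        sentences = re.split(r"(?<=[.!?])\s+", paragraph)
        rewritten.append(" ".join(paraphrase_sentence(s) for s in sentences if s))
    return "\n\n".join(rewritten)
```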
### Evaluation Metrics
- Semantic similarity
- Readability
- Grammatical correctness
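The card does not state how these metrics are computed. As one plausible setup (an assumption, not the published evaluation), semantic similarity between a source text and its paraphrase could be estimated with sentence embeddings, for example via the sentence-transformers library; the embedding model named below is an arbitrary choice.

```python
from sentence_transformers import SentenceTransformer, util

# Illustrative semantic-similarity check; the embedding model is an arbitrary
# choice and not necessarily the one used to evaluate Humaneyes.
embedder = SentenceTransformer("all-MiniLM-L6-v2")

source = "The train was unusually empty as Aarav boarded it late one evening."
paraphrase = "Aarav got on the unusually empty train late in the evening."

embeddings = embedder.encode([source, paraphrase], convert_to_tensor=True)
similarity = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"Semantic similarity: {similarity:.3f}")
```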
## Training Data

### Training Methodology
- Base model: Pegasus, pretrained on a large and diverse corpus of English text
- Fine-tuning: Further fine-tuned for paraphrasing on top of tuner007/pegasus_paraphrase
### Dataset Characteristics
- Diverse text sources
- Multiple domains and writing styles
## Ethical Considerations

### Bias and Fairness
- Regular assessments for potential biases in paraphrasing
- Commitment to continuous improvement of model fairness
### Usage Guidelines
- Intended for supportive, creative purposes
- Not designed to replace original authorship
- Users are encouraged to provide proper attribution and maintain original thinking
## Limitations and Potential Biases
- May occasionally produce text that diverges significantly from the original
- Could introduce subtle semantic shifts
- Performance may vary across different text domains
## How to Use

### Example Usage
```python
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

# Load the paraphrasing model and tokenizer from the Hugging Face Hub
tokenizer = PegasusTokenizer.from_pretrained('Eemansleepdeprived/Humaneyes')
model = PegasusForConditionalGeneration.from_pretrained('Eemansleepdeprived/Humaneyes')

# Tokenize the input, generate a paraphrase, and decode it back to text
input_text = "Your original text goes here."
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs)
paraphrased_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
```
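Continuing the snippet above, generation can be tuned through standard `generate` arguments such as beam search and multiple return sequences. The values below are illustrative settings often used with Pegasus-based paraphrasers, not parameters published for this model.

```python
# Illustrative settings (assumed, not official): beam search with several
# candidate paraphrases to choose from.
outputs = model.generate(
    **inputs,
    max_length=60,
    num_beams=10,
    num_return_sequences=3,
)
for candidate in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(candidate)
```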
## Contact and Collaboration

For questions, feedback, or collaboration opportunities, please contact Eemansleepdeprived at [email protected].
## License
This model is released under the MIT License.