---
license: apache-2.0
datasets:
- ajibawa-2023/General-Stories-Collection
language:
- en
tags:
- story
- art
- general audience
- knowledge
---
**General-Stories-Mistral-7B**

This model is based on my dataset [General-Stories-Collection](https://huggingface.co/datasets/ajibawa-2023/General-Stories-Collection), which contains **1.3 million** stories written especially for a general audience.

After an extensive training period spanning over 15 days, this model has been honed to deliver captivating narratives with broad appeal.
Trained on a large synthetic dataset of approximately **1.3 million** stories tailored for a diverse readership, it has a strong grasp of narrative structure and themes.

We're excited to introduce this powerful tool, ready to spark imagination and entertain readers worldwide with its versatile storytelling capabilities.

As we embark on this exciting journey of AI storytelling, we invite you to explore the endless possibilities our model has to offer. Whether you're a writer seeking inspiration, a reader in search of a captivating tale, or a creative mind eager to push the boundaries of storytelling, our model is here to inspire, entertain, and enrich your literary experience.

Kindly note that this is the qLoRA version.


**GGUF & Exllama**

Standard Q_K & GGUF: TBA

Exllama: TBA



**Training**

The entire dataset was trained on 4 x A100 80GB GPUs. Training for 2 epochs took more than 30 hours. The Axolotl codebase was used for training. The model was fine-tuned on Mistral-7B-v0.1.
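
For reference, the sketch below shows what a QLoRA setup for this recipe could look like in plain `transformers`/`peft`. It is only an illustration: the actual training used Axolotl, and the LoRA rank, alpha, target modules, and other hyperparameters shown here are assumptions, not the values used for this model.

```python
# Illustrative QLoRA setup only -- actual training used Axolotl; all
# hyperparameters below are assumptions, not the real training config.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_model = "mistralai/Mistral-7B-v0.1"

# Load the base model in 4-bit NF4, as QLoRA does.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    base_model, quantization_config=bnb_config, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(base_model)

# Attach trainable LoRA adapters on top of the frozen 4-bit weights.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,                      # placeholder rank
    lora_alpha=32,             # placeholder alpha
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
# Training itself (2 epochs over ~1.3M stories) would then run through a
# standard Trainer / SFT loop on the ChatML-formatted dataset.
```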

**Example Prompt:**

This model uses the **ChatML** prompt format.

```
<|im_start|>system
You are a Helpful Assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```
You can modify the above prompt as per your requirements.
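
As a quick example of how this prompt format can be used for inference with `transformers` (a minimal sketch; the repository id below is assumed from this card's title, and the generation settings are placeholders):

```python
# Minimal inference sketch using the ChatML format shown above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ajibawa-2023/General-Stories-Mistral-7B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

prompt = (
    "<|im_start|>system\n"
    "You are a Helpful Assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Write a short story about a lighthouse keeper who befriends a whale.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.8)

# Print only the newly generated story, not the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```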


I want to say a special thanks to the open-source community for helping and guiding me to better understand AI and model development.

Thank you for your love & support.

**Example Output**

Example 1



![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/mVLGRiYKzFCC2wAJOejLP.jpeg)



Example 2


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/FwCUW9FDDnmBpdnqraWNF.jpeg)



Example 3


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/w0D_eX3xG6MnX5wWD8LT9.jpeg)


Example 4


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/HaJ91YQ9d57SGv7BwTcv_.jpeg)