julien-c HF staff committed on
Commit
3924c27
1 Parent(s): f785a3a

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/Naveen-k/KanBERTo/README.md

Files changed (1)
  1. README.md +28 -0
README.md ADDED
@@ -0,0 +1,28 @@
---
language: kn
---

# Welcome to KanBERTo (ಕನ್ಬರ್ಟೋ)

## Model Description

> This is a small language model for the [Kannada](https://en.wikipedia.org/wiki/Kannada) language, trained on 1M data samples taken from the [OSCAR corpus](https://traces1.inria.fr/oscar/files/compressed-orig/kn.txt.gz).

## Training params

- **Dataset** - 1M data samples from the [OSCAR corpus](https://traces1.inria.fr/oscar/) are used to train this model. Even though the full Kannada dataset is 1.7 GB, I picked only 1M samples from it due to resource constraints for training. If you are interested in collaborating and have the computational resources to train on the full dataset, you are most welcome to do so.

- **Preprocessing** - ByteLevelBPETokenizer is used to tokenize the sentences at the byte level, and the vocabulary size is set to 52k as per the standard values given by 🤗 (see the training sketch after this list).
- **Hyperparameters**
  - __ByteLevelBPETokenizer__: vocabulary size = 52_000 and min_frequency = 2
  - __Trainer__:
    - num_train_epochs=12 - trained for 12 epochs
    - per_gpu_train_batch_size=64 - batch size for the data samples is 64
    - save_steps=10_000 - save the model every 10k steps
    - save_total_limit=2 - keep at most 2 saved checkpoints
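
The card ships no training code, but the hyperparameters above map directly onto the 🤗 `tokenizers` and `transformers` APIs. Below is a minimal sketch, assuming a RoBERTa-style masked language model (the architecture is not stated on this card) and a hypothetical `kn.txt` file holding the 1M OSCAR samples:

```python
import os

from tokenizers import ByteLevelBPETokenizer
from transformers import (
    DataCollatorForLanguageModeling,
    LineByLineTextDataset,
    RobertaConfig,
    RobertaForMaskedLM,
    RobertaTokenizerFast,
    Trainer,
    TrainingArguments,
)

# 1. Train the byte-level BPE tokenizer on the raw Kannada text
#    ("kn.txt" is a hypothetical path to the 1M-sample OSCAR subset).
os.makedirs("KanBERTo", exist_ok=True)
bpe = ByteLevelBPETokenizer()
bpe.train(
    files=["kn.txt"],
    vocab_size=52_000,
    min_frequency=2,
    special_tokens=["<s>", "<pad>", "</s>", "<unk>", "<mask>"],
)
bpe.save_model("KanBERTo")  # writes vocab.json and merges.txt

# 2. Build a small masked language model (assumed RoBERTa-style config).
tokenizer = RobertaTokenizerFast.from_pretrained("KanBERTo", model_max_length=512)
model = RobertaForMaskedLM(RobertaConfig(vocab_size=52_000))

dataset = LineByLineTextDataset(tokenizer=tokenizer, file_path="kn.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

# 3. Trainer settings from the list above.
args = TrainingArguments(
    output_dir="KanBERTo",
    num_train_epochs=12,             # 12 passes over the 1M samples
    per_device_train_batch_size=64,  # the card's per_gpu_train_batch_size, renamed in newer transformers
    save_steps=10_000,               # checkpoint every 10k steps
    save_total_limit=2,              # keep only the 2 most recent checkpoints
)

Trainer(model=model, args=args, data_collator=collator, train_dataset=dataset).train()
```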

**Intended uses & limitations**

This is for anyone who wants to make use of Kannada language models for various tasks like language generation, translation, and many more use cases.
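
As a quick usage sketch: assuming the model is hosted under the `Naveen-k/KanBERTo` repo ID and trained as a masked language model, it can be loaded with the fill-mask pipeline (the example sentence is illustrative):

```python
from transformers import pipeline

# Load KanBERTo for masked-token prediction (fill-mask).
fill_mask = pipeline("fill-mask", model="Naveen-k/KanBERTo")

# "I speak <mask>." - the pipeline suggests Kannada tokens for the blank.
for prediction in fill_mask("ನಾನು <mask> ಮಾತನಾಡುತ್ತೇನೆ."):
    print(prediction["token_str"], prediction["score"])
```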

**Whatever else is helpful!**

If you are interested in collaboration, feel free to reach me: [Naveen](mailto:[email protected])