sultan committed
Commit 9034f7b
1 Parent(s): 15a9455

Update README.md

Files changed (1):
  1. README.md +8 -4
README.md CHANGED
@@ -50,12 +50,9 @@ For the XL-Sum task, we choose our best run for each model using the eval set. W
 
 # FineTuning our efficient ArabicT5-49GB-Small model with Torch on 3070 laptop GPU ###
 
-If you are running your code on a laptop GPU (e.g., a gaming laptop) or have limited GPU memory, we recommend using our ArabicT5-49GB-Small model, which was the only model on the list that we were able to run on a 3070 laptop GPU with a batch size of 8. We managed to achieve an F1 score of 85.391 (slightly better than our FLAX code) on the TyDi QA task. See the notebook below for reference:
-
 [![Open In Colab][COLAB]](https://colab.research.google.com/github/salrowili/ArabicT5/blob/main/ArabicT5_49GB_Small_on_3070_Laptop_GPU.ipynb)
 
-
-
+If you are running your code on a laptop GPU (e.g., a gaming laptop) or have limited GPU memory, we recommend using our ArabicT5-49GB-Small model, which was the only model on the list that we were able to run on a 3070 laptop GPU with a batch size of 8. We managed to achieve an F1 score of 85.391 (slightly better than our FLAX code) on the TyDi QA task. See the notebook above for reference.
 
 
 # FineTuning our ArabicT5 model on generative and abstractive tasks with FLAX ###
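For readers who want the shape of that Torch run without opening the notebook, here is a minimal sketch of one fine-tuning step. The model ID `sultan/ArabicT5-49GB-small`, the learning rate, and the toy inputs are illustrative assumptions; the actual TyDi QA preprocessing lives in the linked Colab.

```python
# Minimal sketch of a single fine-tuning step with PyTorch (illustrative only).
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "sultan/ArabicT5-49GB-small"  # assumed ID; check the Hub for the exact name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)  # assumed learning rate

# Toy QA-style batch; batch size 8 is what fit on the 3070 laptop GPU.
inputs = tokenizer(
    ["question: ... context: ..."] * 8,
    return_tensors="pt", padding=True, truncation=True,
).to(device)
labels = tokenizer(["answer text"] * 8, return_tensors="pt", padding=True).input_ids.to(device)
# In real training, replace pad token ids in labels with -100 so the loss ignores them.

model.train()
loss = model(input_ids=inputs.input_ids, attention_mask=inputs.attention_mask, labels=labels).loss
loss.backward()   # standard seq2seq cross-entropy
optimizer.step()
optimizer.zero_grad()
```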
@@ -65,6 +62,13 @@ If you are running your code on a laptop GPU (e.g., a gaming laptop) or limited
 [COLAB]: https://colab.research.google.com/assets/colab-badge.svg
 
 
+# FineTuning ArabicT5 on TPUv3-8 with free Kaggle ###
+
+
+https://www.kaggle.com/code/sultanalrowili/arabict5-on-tydi-with-free-tpuv3-8-with-kaggle
+
+
+
 # Continual Pre-Training of ArabicT5 with T5x
 If you want to continue pre-training ArabicT5 on your own data, we have uploaded the raw t5x checkpoint here: https://huggingface.co/sultan/ArabicT5-49GB-base/blob/main/arabict5_49GB_base_t5x.tar.gz
 We will soon share a tutorial on how you can do that for free with a Kaggle TPU.
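The checkpoint link above points at the Hub's blob/ web view; to fetch the raw file you need the resolve/ form of the URL. Here is a minimal sketch of downloading and unpacking it; the resolve/ URL and the local paths are additions, not part of the README.

```python
# Fetch and unpack the raw t5x checkpoint (paths are illustrative).
import tarfile
import urllib.request

url = ("https://huggingface.co/sultan/ArabicT5-49GB-base/resolve/main/"
       "arabict5_49GB_base_t5x.tar.gz")  # blob/ swapped for resolve/ to get the raw archive
urllib.request.urlretrieve(url, "arabict5_49GB_base_t5x.tar.gz")

with tarfile.open("arabict5_49GB_base_t5x.tar.gz", "r:gz") as tar:
    tar.extractall("arabict5_t5x_checkpoint")  # point t5x's initial-checkpoint flag at this directory
```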
 
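As a quick sanity check for the new Kaggle section, the sketch below confirms the TPUv3-8 is visible and loads the model with FLAX. This is an assumed setup, not the linked notebook's exact code, and it presumes the Hub repo ships Flax weights.

```python
# Assumed setup for a Kaggle TPUv3-8 session (not the notebook's exact code).
import jax
from transformers import AutoTokenizer, FlaxT5ForConditionalGeneration

print(jax.devices())  # a Kaggle TPUv3-8 should report 8 TPU cores

model_id = "sultan/ArabicT5-49GB-base"  # ID taken from the checkpoint link in this diff
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = FlaxT5ForConditionalGeneration.from_pretrained(model_id)  # requires Flax weights on the Hub

# Fine-tuning on TyDi QA from here follows the usual Flax recipe:
# replicate the params across the 8 cores and run the train step under jax.pmap.
```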