Collection
- Attention Is All You Need (Paper • 1706.03762 • Published • 44)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Paper • 1810.04805 • Published • 14)
- RoBERTa: A Robustly Optimized BERT Pretraining Approach (Paper • 1907.11692 • Published • 7)
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Paper • 1910.01108 • Published • 14)
Taufiq Dwi Purnomo (taufiqdp)
AI & ML interests: SLM, VLM
Collections: 1 • Spaces: 7 • Models: 7
- taufiqdp/mobilenetv4_conv_medium.e500_r256_in1k-emotion • Image Classification • Updated • 5
- taufiqdp/mobilenetv4_conv_small.e2400_r224_in1k_nsfw_classifier • Image Classification • Updated • 5
- taufiqdp/convnext-eurosat • Image Classification • Updated • 8
- taufiqdp/train-checkpoint • Updated
- taufiqdp/gemma-2b-q8_0-gguf • Updated • 4
- taufiqdp/stablelm-2-1_6b-indo-lora • Updated • 7
- taufiqdp/indonesian-sentiment • Text Classification • Updated • 87 • 1