# t5-brokarry-unknown
This model is a fine-tuned version of paust/pko-t5-large for intent classification and entity recognition on Brokarry (브로캐리) user utterances.
## Model description
| Intent (의도) | Entities (개체) |
|---|---|
| 일반대화 (general conversation) | (none) |
| 전화연결 (call connection) | 대상 (target) |
| 장소안내 (place guidance) | 장소 (place), 대상 (target) |
| 날씨예보 (weather forecast) | 언제 (when), 시간 (time), 장소 (place), 대상 (target), 조건 (condition) |
| 화물추천 (freight recommendation) | 언제 (when), 시간 (time), 상차 (loading), 하차 (unloading), 조건 (condition) |
| Unknown | (none) |

\*대상 (target): loading site (상차지) / unloading site (하차지)
## How to use
```python
import requests

API_URL = "https://api-inference.huggingface.co/models/yeye776/t5-brokarry-unknown"
headers = {"Authorization": "Bearer hf_key"}  # replace hf_key with your Hugging Face access token

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

# The prompt asks the model to classify the intent of a Brokarry-related
# user utterance and recognize its entities.
prompt = "브로캐리에 관련된 이용자의 대화인데 분류 및 인식 해줘! :"
user_input = "내일 심야 상차지가 분당인 화물 추천해줘"  # "Recommend freight with a loading site in Bundang late tomorrow night"

output = query({
    "inputs": prompt + user_input,
    "options": {"wait_for_model": True},  # wait for the model to load instead of failing immediately
})
print(output)
```
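The model can also be loaded locally with the `transformers` library. The snippet below is a minimal sketch assuming the standard T5 sequence-to-sequence setup; the exact format of the generated string (intent and entity labels) depends on how the fine-tuning targets were formatted.

```python
# Minimal local-inference sketch (assumption: standard T5 seq2seq usage;
# the format of the generated intent/entity string depends on the fine-tuning targets).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "yeye776/t5-brokarry-unknown"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = "브로캐리에 관련된 이용자의 대화인데 분류 및 인식 해줘! :"
user_input = "내일 심야 상차지가 분당인 화물 추천해줘"

inputs = tokenizer(prompt + user_input, return_tensors="pt")
generated = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```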
## Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0007
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.06
- num_epochs: 10
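
As a reference only, these settings map onto a Hugging Face `Seq2SeqTrainingArguments` configuration roughly like the sketch below; the actual training script is not included with this model, and `output_dir` is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

# Rough reconstruction of the reported hyperparameters (illustrative only;
# the real training script is not published with this model).
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-brokarry-unknown",   # placeholder
    learning_rate=7e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,      # effective train batch size: 4 * 8 = 32
    num_train_epochs=10,
    lr_scheduler_type="cosine",
    warmup_ratio=0.06,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```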
## Training output
- Global steps: 130
- Training loss: 0.4425
- Train runtime: 836.443 s (about 14 minutes)
## Framework versions
- Transformers 4.35.0
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1