---
library_name: transformers
license: other
base_model: NousResearch/Hermes-3-Llama-3.1-8B
tags:
  - llama-factory
  - full
  - unsloth
  - generated_from_trainer
model-index:
  - name: kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor
    results: []
---

# kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor

This is my personal toy project for Chuseok (Korean Thanksgiving Day).

This model is a fine-tuned version of NousResearch/Hermes-3-Llama-3.1-8B on the Korean_synthetic_financial_dataset_21K.
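
The model can be loaded with the standard transformers APIs. The snippet below is a minimal inference sketch, assuming a CUDA-capable GPU and that the tokenizer carries over a chat template from the Hermes-3 base model; the example prompt and generation settings are illustrative only.

```python
# Minimal inference sketch (assumptions: GPU available, tokenizer provides a chat template).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative Korean finance question.
messages = [{"role": "user", "content": "κΈˆλ¦¬κ°€ 였λ₯Ό λ•ŒλŠ” μ–΄λ–€ 투자 μ „λž΅μ΄ μ’‹μ„κΉŒμš”?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```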

## Model description

Everything in the data pipeline ran automatically, without any user intervention.

Starting from finance-related PDF data collected directly from the web, the raw text was cleaned with the `meta-llama/Meta-Llama-3.1-70B-Instruct-FP8` model (the FP8 variant was chosen for budget reasons). Synthetic data was then generated from the cleaned text, and the quality of the generated data was evaluated with the `meta-llama/Llama-Guard-3-8B` and `RLHFlow/ArmoRM-Llama3-8B-v0.1` models. Finally, `Alibaba-NLP/gte-large-en-v1.5` was used to extract embeddings, and Faiss was applied to perform Jaccard-distance-based nearest-neighbor analysis, producing the final dataset of 21k diverse and refined samples.

## Task duration

3 days (2024-09-14 to 2024-09-16)

## Evaluation

None (I had to take the Thanksgiving holiday off).

## Sample

*(sample image)*

## Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1