
spow12/MK_Nemo_12B

Model Description

This model is a supervised fine-tuned version of mistralai/Mistral-Nemo-Instruct-2407 for Korean, trained with DeepSpeed and TRL.
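The exact training script, data, and hyperparameters are not published. The sketch below only illustrates the general shape of an SFT run with TRL and DeepSpeed; the dataset path, DeepSpeed config, and all hyperparameters are placeholders, not the values used for this model.

# Hypothetical SFT sketch with TRL + DeepSpeed (placeholder data and hyperparameters).
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder dataset; the actual ~130K-sample Korean mix is partly private.
dataset = load_dataset("json", data_files="korean_sft_data.jsonl", split="train")

training_args = SFTConfig(
    output_dir="Mistral-Nemo-Instruct-2407_sft_ko",
    per_device_train_batch_size=1,     # assumed
    gradient_accumulation_steps=16,    # assumed
    learning_rate=1e-5,                # assumed
    num_train_epochs=1,                # assumed
    bf16=True,
    deepspeed="ds_config.json",        # placeholder DeepSpeed ZeRO config
)

trainer = SFTTrainer(
    model="mistralai/Mistral-Nemo-Instruct-2407",
    args=training_args,
    train_dataset=dataset,
)
trainer.train()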

Merge configuration

models:
    - model: anthracite-org/magnum-v4-12b
    - model: mistralai/Mistral-Nemo-Instruct-2407
    - model: spow12/Mistral-Nemo-Instruct-2407_sft_ver_4.4(private)
    - model: werty1248/Mistral-Nemo-NT-Ko-12B-dpo
merge_method: model_stock
base_model: spow12/Mistral-Nemo-Instruct-2407_sft_ver_4.4(private)
dtype: bfloat16
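This is a mergekit-style model_stock configuration. One of the source models is private, so the merge cannot be reproduced exactly, but with mergekit installed and the block above saved as merge_config.yaml, a run would look roughly like:

pip install mergekit
mergekit-yaml merge_config.yaml ./MK_Nemo_12B --cuda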

Training Data

  • Trained on a mix of public and private data (about 130K samples)

Usage

import torch
from transformers import TextStreamer, pipeline, AutoTokenizer, AutoModelForCausalLM

model_id = 'spow12/MK_Nemo_12B'
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",  # Optional; requires flash-attn to be installed
    device_map='auto',
)
model.eval()

# The model is already placed on devices via device_map, so the pipeline needs no device_map of its own.
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

generation_configs = dict(
    max_new_tokens=2048,
    num_return_sequences=1, 
    temperature=0.75,
    # repetition_penalty=1.1,
    do_sample=True,
    top_k=20,
    top_p=0.9,
    min_p=0.1,
    eos_token_id=tokenizer.eos_token_id,
    pad_token_id=tokenizer.eos_token_id,
    streamer=TextStreamer(tokenizer),  # Optional; when using a streamer, num_beams must stay at 1
)

# System prompt (Korean). Roughly: "You are a friendly chatbot and must answer the user's requests
# as thoroughly and kindly as possible. Carefully analyze the information the user provides, quickly
# grasp their intent, and generate answers accordingly. Always respond in very natural Korean."
sys_message = """당신은 μΉœμ ˆν•œ μ±—λ΄‡μœΌλ‘œμ„œ μƒλŒ€λ°©μ˜ μš”μ²­μ— μ΅œλŒ€ν•œ μžμ„Έν•˜κ³  μΉœμ ˆν•˜κ²Œ λ‹΅ν•΄μ•Όν•©λ‹ˆλ‹€. 
μ‚¬μš©μžκ°€ μ œκ³΅ν•˜λŠ” 정보λ₯Ό μ„Έμ‹¬ν•˜κ²Œ λΆ„μ„ν•˜μ—¬ μ‚¬μš©μžμ˜ μ˜λ„λ₯Ό μ‹ μ†ν•˜κ²Œ νŒŒμ•…ν•˜κ³  그에 따라 닡변을 μƒμ„±ν•΄μ•Όν•©λ‹ˆλ‹€.  

항상 맀우 μžμ—°μŠ€λŸ¬μš΄ ν•œκ΅­μ–΄λ‘œ μ‘λ‹΅ν•˜μ„Έμš”."""

message = [
    {
        'role': "system",
        'content': sys_message
    },
    {
        'role': 'user',
        'content': "ν˜„μž¬μ˜ κ²½μ œμƒν™©μ— λŒ€ν•΄ μ–΄λ–»κ²Œ 생각해?"  # "What do you think about the current economic situation?"
    }
]
conversation = pipe(message, **generation_configs)
conversation[-1]

# Example output (in Korean):
ν˜„μž¬μ˜ κ²½μ œμƒν™©μ€ κ°κ΅­λ§ˆλ‹€ λ‹€λ₯΄λ©°, μ „λ°˜μ μœΌλ‘œλŠ” μ½”λ‘œλ‚˜19 팬데믹의 영ν–₯으둜 큰 타격을 받은 μƒνƒœμž…λ‹ˆλ‹€. λ§Žμ€ κ΅­κ°€μ—μ„œ 경제 μ„±μž₯λ₯ μ΄ κ°μ†Œν•˜κ³  μ‹€μ—…λ₯ μ΄ μƒμŠΉν–ˆμŠ΅λ‹ˆλ‹€. κ·ΈλŸ¬λ‚˜ 각ꡭ μ •λΆ€λŠ” μž¬μ •κ³Ό 톡화 정책을 톡해 경제λ₯Ό μ§€μ§€ν•˜κ³  λ³΅κ΅¬ν•˜κΈ° μœ„ν•΄ λ…Έλ ₯ν•˜κ³  μžˆμŠ΅λ‹ˆλ‹€. μ½”λ‘œλ‚˜19 λ°±μ‹ μ˜ 개발과 배포가 경제 νšŒλ³΅μ— 도움이 될 κ²ƒμœΌλ‘œ κΈ°λŒ€λ˜κ³  μžˆμŠ΅λ‹ˆλ‹€. κ·ΈλŸ¬λ‚˜ μ½”λ‘œλ‚˜19 μ΄μ „μ˜ 경제 μ„±μž₯λ₯ μ„ νšŒλ³΅ν•˜κΈ° μœ„ν•΄μ„œλŠ” μ‹œκ°„μ΄ 걸릴 수 μžˆμŠ΅λ‹ˆλ‹€. μž₯κΈ°μ μœΌλ‘œλŠ” μ €μ„±μž₯κ³Ό κ³ μΈν”Œλ ˆμ΄μ…˜μ΄ 계속될 수 μžˆλŠ” μœ„ν—˜λ„ μžˆμŠ΅λ‹ˆλ‹€. λ”°λΌμ„œ 각ꡭ은 μ½”λ‘œλ‚˜19 μ΄ν›„μ˜ μ„Έκ³„μ—μ„œ μƒˆλ‘œμš΄ 경제 λͺ¨λΈμ„ λͺ¨μƒ‰ν•˜κ³ , 디지털화와 녹색 경제 μ „ν™˜μ„ κ°€μ†ν™”ν•˜λŠ” λ“± λ―Έλž˜μ— λŒ€λΉ„ν•˜λŠ” λ…Έλ ₯이 ν•„μš”ν•©λ‹ˆλ‹€.