---
datasets:
- FreedomIntelligence/alpaca-gpt4-indonesian
- FreedomIntelligence/evol-instruct-indonesian
- FreedomIntelligence/sharegpt-indonesian
- jakartaresearch/indoqa
language:
- id
library_name: transformers
pipeline_tag: text-generation
---

# Notebook Info

**Reference**:

- https://huggingface.co/docs/transformers/chat_templating
- https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/ai-services/openai/includes/chat-markup-language.md
- https://huggingface.co/datasets/FreedomIntelligence/alpaca-gpt4-indonesian
- https://huggingface.co/datasets/FreedomIntelligence/sharegpt-indonesian
- https://huggingface.co/datasets/FreedomIntelligence/evol-instruct-indonesian
- https://huggingface.co/datasets/jakartaresearch/indoqa

**Task**:

Chat or Conversational

**Input**:

The user's prompt, already formatted with the chat template, as a single string.

**Output**:

The model's generated text, returned as a string.

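As a rough illustration of this input/output contract, the sketch below builds the templated prompt with `apply_chat_template` and decodes only the newly generated tokens. The model id and the example prompt are placeholders, not values taken from the notebook.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "model-id"  # placeholder for this repository's Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build the chat-templated input string from a conversation.
messages = [{"role": "user", "content": "Apa ibu kota Indonesia?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Generate and decode only the model's reply.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
reply = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(reply)
```
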
**Experiment**:

- Use `bos_token` and `eos_token` to replace `<|im_start|>` and `<|im_end|>` in ChatML (inspired by https://asmirnov.xyz/doppelganger); see the template sketch after this list.
- Use left padding and left truncation to conform to `max_length`.
- Set `max_length = 256` in the training process, which consumes 33.7 GB of memory; a tokenizer sketch covering the padding, truncation, and length settings follows the template example.

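The marker swap in the first bullet could look roughly like the following: a ChatML-style Jinja chat template in which the tokenizer's own `bos_token` / `eos_token` stand in for `<|im_start|>` / `<|im_end|>`. This is a sketch, not the exact template shipped with the model; the checkpoint name is a placeholder, and it assumes the tokenizer defines both special tokens.

```python
from transformers import AutoTokenizer

# Placeholder checkpoint; any tokenizer that defines bos_token and eos_token works here.
tokenizer = AutoTokenizer.from_pretrained("model-id")

# ChatML layout, but with bos_token/eos_token standing in for <|im_start|>/<|im_end|>.
tokenizer.chat_template = (
    "{% for message in messages %}"
    "{{ bos_token + message['role'] + '\n' + message['content'] + eos_token + '\n' }}"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{ bos_token + 'assistant\n' }}{% endif %}"
)

messages = [{"role": "user", "content": "Halo!"}]
print(tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```
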
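For the padding, truncation, and length bullets, a minimal tokenizer-configuration sketch might look like this; the checkpoint name and the sample text are placeholders, and only the left-sided padding/truncation and `max_length = 256` come from the notes above.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("model-id")  # placeholder checkpoint

# Pad and truncate from the left so the most recent turns of a conversation
# survive when the templated text exceeds max_length.
tokenizer.padding_side = "left"
tokenizer.truncation_side = "left"

batch = tokenizer(
    ["contoh teks percakapan yang sudah diberi template"],  # placeholder sample
    padding="max_length",
    truncation=True,
    max_length=256,  # the training value noted above (~33.7 GB of memory)
    return_tensors="pt",
)
print(batch["input_ids"].shape)  # torch.Size([1, 256])
```
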
**Notebook**:

- https://drive.google.com/file/d/11FiaWxGt2HxUirZrHTNLaVmiqrUwejwV/view?usp=drive_link