---
license: apache-2.0
pipeline_tag: text-generation
---

# SongComposer

[💻GitHub Repo](https://github.com/pjlab-songcomposer/songcomposer) [📖Paper](https://arxiv.org/abs/2402.17645)
**SongComposer** is a large language model (LLM) based on [InternLM2](https://github.com/InternLM/InternLM) for lyric and melody composition in song generation.

We release the SongComposer series in two versions:

- SongComposer_pretrain: the pretrained SongComposer, initialized from InternLM2, which acquires basic knowledge of lyrics and melody.
- SongComposer_sft: the finetuned SongComposer for *instruction-following song generation*, including lyric-to-melody, melody-to-lyric, song continuation, and text-to-song (a hedged loading sketch appears at the end of this card).

### Import from Transformers

To load the SongComposer_pretrain model using Transformers, use the following code. The prompt encodes each song line as `|`-separated tuples of comma-separated fields with quantized duration tokens such as `<137>` (see the format sketch at the end of this card):

```python
from transformers import AutoTokenizer, AutoModel

ckpt_path = "Mar2Ding/songcomposer_pretrain"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()

prompt = ' Total 7 lines. The first line:可,,<137>,<79>|惜,,<137>,<79>|这,,<137>,<88>|是,,<121>,<79>|属,,<121>,<79>|于,,<214>,<88>|你,,<141>,<79>|的,,<130>,<79>|风,,<151>,<79>|景, ,<181><137>,<79>\n'
model.inference_pretrain(prompt, tokenizer, model)
```

### Open Source License

The code is licensed under Apache-2.0, while the model weights are fully open for academic research and also allow free commercial use.
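### Loading SongComposer_sft (sketch)

The SFT variant is the one intended for instruction-following generation. The snippet below is a minimal sketch, not an official example: the checkpoint path `Mar2Ding/songcomposer_sft`, the `inference` entry point, and the prompt text are assumptions patterned on the pretrain example above; consult the repository files for the exact API.

```python
from transformers import AutoTokenizer, AutoModel

# Assumed checkpoint path for the SFT release (verify on the Hub).
ckpt_path = "Mar2Ding/songcomposer_sft"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()

# An illustrative text-to-song instruction; the SFT model is described as
# also handling lyric-to-melody, melody-to-lyric, and song continuation.
prompt = 'Write a song about a quiet morning by the sea.'

# `inference` is an assumed instruction-following counterpart of
# `inference_pretrain`, called here with the same argument pattern as the
# pretrain example above; check the remote code for the real signature.
model.inference(prompt, tokenizer, model)
```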
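### Prompt format (sketch)

In the pretrain prompt above, each song line is a sequence of `|`-separated units, and each unit is a comma-separated tuple whose last two fields are quantized duration tokens such as `<137>`. The parser below is a minimal sketch under that assumption; the field names (`lyric`, `pitch`, `duration`, `rest`) follow the tuple format described in the paper and are not an official API.

```python
# Minimal sketch: split a SongComposer-style line into its tuples.
# Field names are assumptions based on the paper's tuple format.
def parse_line(line: str):
    tuples = []
    for unit in line.strip().split('|'):
        # Each unit looks like '可,,<137>,<79>' (four comma-separated
        # fields; the second field is empty in this example).
        lyric, pitch, duration, rest = (f.strip() for f in unit.split(','))
        tuples.append({'lyric': lyric, 'pitch': pitch,
                       'duration': duration, 'rest': rest})
    return tuples

example = '可,,<137>,<79>|惜,,<137>,<79>|这,,<137>,<88>'
for t in parse_line(example):
    print(t)
```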