agent404 committed
Commit 3db6f38
1 Parent(s): b8be370

Update README.md

Files changed (1)
  1. README.md +5 -4
README.md CHANGED
````diff
@@ -25,11 +25,12 @@ margin. Our work reveals that LLMs can be an excellent compressor for music, but
 
 <!-- <audio controls src="https://cdn-uploads.huggingface.co/production/uploads/5fd6f670053c8345eddc1b68/8NSONUjIF7KGUCfwzPCd9.mpga"></audio> -->
 
-<iframe width="787" height="528" src="https://www.youtube.com/embed/zt3l49K55Io" title="ChatMusician: Fostering Intrinsic Musical Abilities Into LLM" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
+<video src="https://www.youtube.com/embed/zt3l49K55Io" width="320" height="240" controls></video>
+<!-- <iframe width="787" height="528" src="https://www.youtube.com/embed/zt3l49K55Io" title="ChatMusician: Fostering Intrinsic Musical Abilities Into LLM" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
 
 ## Usage
 
-You can use the models through Huggingface's Transformers library. Check our Github repo for more advanced use: [https://github.com/hf-lin/ChatMusician](https://github.com/hf-lin/ChatMusician)
+You can use the models through Huggingface's Transformers library. Check our Github repo for more advanced use: [https://github.com/hf-lin/ChatMusician](https://github.com/hf-lin/ChatMusician) -->
 
 ## CLI demo
 ```python
@@ -37,8 +38,8 @@ from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig
 from string import Template
 prompt_template = Template("Human: ${inst} </s> Assistant: ")
 
-tokenizer = AutoTokenizer.from_pretrained("m-a-p/ChatMusician-v1-sft-78k", trust_remote_code=True)
-model = AutoModelForCausalLM.from_pretrained("m-a-p/ChatMusician-v1-sft-78k", trust_remote_code=True).eval()
+tokenizer = AutoTokenizer.from_pretrained("m-a-p/ChatMusician", trust_remote_code=True)
+model = AutoModelForCausalLM.from_pretrained("m-a-p/ChatMusician", trust_remote_code=True).eval()
 model.cuda()
 generation_config = GenerationConfig(
     temperature=0.2,
````
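
For reference, the updated CLI demo can be completed into a runnable script roughly as follows. This is a minimal sketch: the model id `m-a-p/ChatMusician`, the prompt template, `trust_remote_code=True`, `model.cuda()`, and `temperature=0.2` come from the diff above, while the remaining generation settings, the example instruction, and the decoding step are illustrative assumptions rather than part of the README.

```python
# Minimal sketch of the updated CLI demo. Model id, prompt template, and
# temperature=0.2 are taken from the diff; other generation settings and the
# example instruction are assumptions for illustration.
from string import Template

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig

prompt_template = Template("Human: ${inst} </s> Assistant: ")

tokenizer = AutoTokenizer.from_pretrained("m-a-p/ChatMusician", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("m-a-p/ChatMusician", trust_remote_code=True).eval()
model.cuda()

generation_config = GenerationConfig(
    temperature=0.2,      # from the README's CLI demo
    do_sample=True,
    top_p=0.9,            # assumed value, not in the diff
    max_new_tokens=512,   # assumed value, not in the diff
)

# Fill the prompt template with a user instruction and tokenize it.
instruction = "Write a short folk melody in ABC notation."
prompt = prompt_template.safe_substitute({"inst": instruction})
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, generation_config=generation_config)

# Decode only the newly generated tokens, skipping the prompt.
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(response)
```

Sampling parameters such as `top_p` and `max_new_tokens` can be tuned freely; the final slice drops the prompt tokens so only the assistant's reply is printed.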