Update tool use chat template
#57 opened 5 days ago by Rocketknight1

Truly great model for text-based operations like analysing and researching
4 · #56 opened 21 days ago by bkieser

"triu_tril_cuda_template" not implemented for 'BFloat16'
3 · #52 opened 3 months ago by Ashmal

Prompt format for fine-tuning
#51 opened 3 months ago by skevja

Request: DOI
#50 opened 3 months ago by gagan3012

Please document pretraining datasets
#49 opened 3 months ago by markding

Instruct-finetuning dataset
5 · #43 opened 3 months ago by Andriy

Context length is not 128k
2 · #41 opened 3 months ago by pseudotensor

Is there a best way to infer this model from multiple small memory GPUs?
1 · #39 opened 3 months ago by hongdouzi

Configuring command-r-gptq
#33 opened 4 months ago by Cyleux

Configuring Command-R for long context tasks
3 · #32 opened 4 months ago by beam-me-up-scotty

Any recommendations for fine-tuning on two 40GB A100s?
#31 opened 4 months ago by omarabb315

Any recommended frontend to run this model?
2 · #30 opened 4 months ago by DrNicefellow

[AUTOMATED] Model Memory Requirements
#26 opened 4 months ago by model-sizer-bot

[AUTOMATED] Model Memory Requirements
#25 opened 4 months ago by model-sizer-bot

Error "sharded is not supported for AutoModel" when deploying on sagemaker endpoint
#22 opened 4 months ago by LorenzoCevolaniAXA

Create SASA
#21 opened 4 months ago by kevin12123141

*zsh: killed* on macbookpro M2 with 24GB
5 · #14 opened 4 months ago by aleksandrvin

gguf is required :)
12 · #11 opened 4 months ago by flymonk