not working
#1 by stormchaser - opened
This is not working; it gives the errors "failed to load model" and "unknown model architecture 'stablelm'".
stormchaser changed discussion title from "share the conversion script" to "not working"
The PR hasn't been merged yet. Switch to the #3586 branch: if you use the GitHub CLI, run "gh pr checkout 3586" inside your llama.cpp directory and then recompile.
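If you don't have the GitHub CLI, the same PR branch can be fetched with plain git; a minimal sketch (the remote name "origin", the local branch name, and the make-based build are assumptions):

```shell
# Inside an existing llama.cpp checkout: fetch PR #3586 into a local branch
git fetch origin pull/3586/head:stablelm-support
git checkout stablelm-support
# Recompile (make-based build assumed; CMake also works)
make
```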
Support is coming to llama.cpp very shortly. For now you'll have to build the dev branch yourself:
git clone https://github.com/Galunid/llama.cpp -b stablelm-support
cd llama.cpp