I get an error after selecting the template!
```
Traceback (most recent call last):
  File "C:\Users\Administrator\text-generation-webui\server.py", line 68, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "C:\Users\Administrator\text-generation-webui\modules\models.py", line 232, in llamacpp_loader
    from modules.llamacpp_model import LlamaCppModel
  File "C:\Users\Administrator\text-generation-webui\modules\llamacpp_model.py", line 11, in <module>
    import llama_cpp
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\__init__.py", line 1, in <module>
    from .llama_cpp import *
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 1292, in <module>
    llama_backend_init(c_bool(False))
  File "C:\Users\Administrator\.conda\envs\textgen\lib\site-packages\llama_cpp\llama_cpp.py", line 403, in llama_backend_init
    return _lib.llama_backend_init(numa)
OSError: [WinError -1073741795] Windows Error 0xc000001d
```
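For context: Windows error 0xc000001d is STATUS_ILLEGAL_INSTRUCTION, which typically means the installed llama-cpp-python wheel was compiled with CPU instructions (usually AVX/AVX2) that the machine's processor does not support. A common workaround, sketched below as an assumption rather than a confirmed fix for this thread, is to rebuild the package from source with those instruction sets disabled (the `LLAMA_*` CMake flag names assume an older llama.cpp; newer builds use `GGML_*` equivalents):

```shell
:: Windows cmd: rebuild llama-cpp-python without AVX2/FMA/F16C so the
:: resulting binary avoids instructions this CPU may lack.
set CMAKE_ARGS=-DLLAMA_AVX2=OFF -DLLAMA_FMA=OFF -DLLAMA_F16C=OFF
set FORCE_CMAKE=1
pip install llama-cpp-python --force-reinstall --no-cache-dir
```

If the crash persists even with AVX2 disabled, the CPU may also lack plain AVX, in which case adding `-DLLAMA_AVX=OFF` to `CMAKE_ARGS` is worth trying.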
Hello, thank you for your message. I am not familiar with text-generation-webui or its compatibility and functionality, but I built a simple Space that lets you interact with the model, so you can try it out here:
https://huggingface.co/spaces/s3nh/s3nh-chinese-alpaca-2-7b-GGML
All best,
Damian
I have it running successfully now. How can I use text-generation-webui for further fine-tuning (secondary training)?