MMedAgent_demo / model_worker_c80683.log
2024-09-30 08:49:49 | INFO | model_worker | args: Namespace(host='0.0.0.0', port=40000, worker_address='http://localhost:40000', controller_address='http://localhost:20001', model_path='/home/jack/Projects/yixin-llm/merge_med_llava_3', model_base=None, model_name=None, device='cuda', multi_modal=False, limit_model_concurrency=5, stream_interval=1, no_register=False, load_8bit=False, load_4bit=False)
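
The Namespace line above records the worker's full launch configuration: it listens on 0.0.0.0:40000, registers with a controller at http://localhost:20001, and serves the merged model from /home/jack/Projects/yixin-llm/merge_med_llava_3 on CUDA with a concurrency limit of 5. A minimal argparse sketch that would yield such a Namespace is shown below; this is an assumption rather than the actual llava.serve.model_worker code, with flag names inferred from the attribute names and placeholder defaults.

    # Sketch only: flag names inferred from the Namespace attributes; defaults are placeholders.
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--host", type=str, default="localhost")
    parser.add_argument("--port", type=int, default=40000)
    parser.add_argument("--worker-address", type=str, default="http://localhost:40000")
    parser.add_argument("--controller-address", type=str, default="http://localhost:20001")
    parser.add_argument("--model-path", type=str, required=True)
    parser.add_argument("--model-base", type=str, default=None)
    parser.add_argument("--model-name", type=str, default=None)
    parser.add_argument("--device", type=str, default="cuda")
    parser.add_argument("--multi-modal", action="store_true")
    parser.add_argument("--limit-model-concurrency", type=int, default=5)
    parser.add_argument("--stream-interval", type=int, default=1)
    parser.add_argument("--no-register", action="store_true")
    parser.add_argument("--load-8bit", action="store_true")
    parser.add_argument("--load-4bit", action="store_true")
    args = parser.parse_args()  # argparse maps --worker-address to args.worker_address, etc.
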
2024-09-30 08:49:49 | INFO | model_worker | Loading the model merge_med_llava_3 on worker c80683 ...
2024-09-30 08:49:49 | WARNING | transformers.models.llama.tokenization_llama | You are using the legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This means that tokens that come after special tokens will not be properly handled. We recommend you to read the related pull request available at https://github.com/huggingface/transformers/pull/24565
2024-09-30 08:49:50 | ERROR | stderr | /home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
2024-09-30 08:49:50 | ERROR | stderr | warnings.warn(
2024-09-30 08:49:50 | ERROR | stderr | Loading checkpoint shards: 0%| | 0/2 [00:00<?, ?it/s]
2024-09-30 08:49:57 | ERROR | stderr | Loading checkpoint shards: 50%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–Œ | 1/2 [00:06<00:06, 6.58s/it]
2024-09-30 08:49:59 | ERROR | stderr | Loading checkpoint shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:08<00:00, 3.97s/it]
2024-09-30 08:49:59 | ERROR | stderr | Loading checkpoint shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:08<00:00, 4.36s/it]
2024-09-30 08:49:59 | ERROR | stderr |
2024-09-30 08:50:01 | INFO | model_worker | Register to controller
2024-09-30 08:50:01 | ERROR | stderr | INFO: Started server process [1161256]
2024-09-30 08:50:01 | ERROR | stderr | INFO: Waiting for application startup.
2024-09-30 08:50:01 | ERROR | stderr | INFO: Application startup complete.
2024-09-30 08:50:01 | ERROR | stderr | INFO: Uvicorn running on http://0.0.0.0:40000 (Press CTRL+C to quit)
2024-09-30 08:50:16 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:50:31 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:50:47 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:51:02 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:51:17 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:51:32 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:51:47 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:52:02 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:52:17 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:52:32 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:52:47 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:53:02 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:53:17 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:53:32 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:53:47 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:54:02 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:54:17 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:54:32 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:54:47 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:55:02 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
2024-09-30 08:55:17 | INFO | model_worker | Send heart beat. Models: ['merge_med_llava_3']. Semaphore: None. global_counter: 0
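
The repeated "Send heart beat" lines above show the worker's keep-alive loop: roughly every 15 seconds it reports its served models and queue length to the controller so it stays registered. A minimal sketch of such a loop follows; it is an assumption modeled on FastChat-style workers, not the actual MMedAgent code, and the /receive_heart_beat endpoint name is likewise assumed.

    # Sketch only: background thread that keeps the worker registered with the controller.
    import threading
    import time

    import requests  # assumed HTTP client

    HEARTBEAT_INTERVAL = 15  # seconds, inferred from the ~15 s spacing in the log

    def heartbeat_loop(controller_address: str, worker_address: str) -> None:
        while True:
            time.sleep(HEARTBEAT_INTERVAL)
            try:
                requests.post(
                    controller_address + "/receive_heart_beat",  # endpoint name is an assumption
                    json={"worker_name": worker_address, "queue_length": 0},
                    timeout=5,
                )
            except requests.exceptions.RequestException:
                pass  # controller unreachable; retry on the next interval

    threading.Thread(
        target=heartbeat_loop,
        args=("http://localhost:20001", "http://localhost:40000"),
        daemon=True,
    ).start()
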
2024-09-30 08:55:20 | ERROR | stderr | INFO: Shutting down
2024-09-30 08:55:20 | ERROR | stderr | INFO: Waiting for application shutdown.
2024-09-30 08:55:20 | ERROR | stderr | INFO: Application shutdown complete.
2024-09-30 08:55:20 | ERROR | stderr | INFO: Finished server process [1161256]
2024-09-30 08:55:20 | ERROR | stderr | Traceback (most recent call last):
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/runpy.py", line 196, in _run_module_as_main
2024-09-30 08:55:20 | ERROR | stderr | return _run_code(code, main_globals, None,
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/runpy.py", line 86, in _run_code
2024-09-30 08:55:20 | ERROR | stderr | exec(code, run_globals)
2024-09-30 08:55:20 | ERROR | stderr | File "/data1/jackdata/yixin-llm-data/yptests/MMedAgent_demo/llava/serve/model_worker.py", line 285, in <module>
2024-09-30 08:55:20 | ERROR | stderr | uvicorn.run(app, host=args.host, port=args.port, log_level="info")
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/main.py", line 575, in run
2024-09-30 08:55:20 | ERROR | stderr | server.run()
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/server.py", line 65, in run
2024-09-30 08:55:20 | ERROR | stderr | return asyncio.run(self.serve(sockets=sockets))
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/asyncio/runners.py", line 44, in run
2024-09-30 08:55:20 | ERROR | stderr | return loop.run_until_complete(main)
2024-09-30 08:55:20 | ERROR | stderr | File "uvloop/loop.pyx", line 1511, in uvloop.loop.Loop.run_until_complete
2024-09-30 08:55:20 | ERROR | stderr | File "uvloop/loop.pyx", line 1504, in uvloop.loop.Loop.run_until_complete
2024-09-30 08:55:20 | ERROR | stderr | File "uvloop/loop.pyx", line 1377, in uvloop.loop.Loop.run_forever
2024-09-30 08:55:20 | ERROR | stderr | File "uvloop/loop.pyx", line 555, in uvloop.loop.Loop._run
2024-09-30 08:55:20 | ERROR | stderr | File "uvloop/loop.pyx", line 474, in uvloop.loop.Loop._on_idle
2024-09-30 08:55:20 | ERROR | stderr | File "uvloop/cbhandles.pyx", line 83, in uvloop.loop.Handle._run
2024-09-30 08:55:20 | ERROR | stderr | File "uvloop/cbhandles.pyx", line 63, in uvloop.loop.Handle._run
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/server.py", line 68, in serve
2024-09-30 08:55:20 | ERROR | stderr | with self.capture_signals():
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/contextlib.py", line 142, in __exit__
2024-09-30 08:55:20 | ERROR | stderr | next(self.gen)
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/site-packages/uvicorn/server.py", line 328, in capture_signals
2024-09-30 08:55:20 | ERROR | stderr | signal.raise_signal(captured_signal)
2024-09-30 08:55:20 | ERROR | stderr | KeyboardInterrupt
2024-09-30 08:55:20 | ERROR | stderr | Exception ignored in: <module 'threading' from '/home/jack/anaconda3/envs/llavaplus/lib/python3.10/threading.py'>
2024-09-30 08:55:20 | ERROR | stderr | Traceback (most recent call last):
2024-09-30 08:55:20 | ERROR | stderr | File "/home/jack/anaconda3/envs/llavaplus/lib/python3.10/threading.py", line 1567, in _shutdown
2024-09-30 08:55:20 | ERROR | stderr | lock.acquire()
2024-09-30 08:55:20 | ERROR | stderr | KeyboardInterrupt:
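
The traceback above is the expected result of stopping the worker with Ctrl+C rather than a crash: uvicorn captures SIGINT, completes its shutdown sequence (the "Shutting down" / "Application shutdown complete" lines), and then re-raises the captured signal when leaving capture_signals, which surfaces as the KeyboardInterrupt. A minimal sketch of that capture-and-re-raise pattern, not uvicorn's actual implementation, is:

    # Sketch only: swallow SIGINT while cleanup runs, then re-raise it afterwards.
    import contextlib
    import signal

    @contextlib.contextmanager
    def capture_sigint():
        captured = []
        original = signal.signal(signal.SIGINT, lambda signum, frame: captured.append(signum))
        try:
            yield
        finally:
            signal.signal(signal.SIGINT, original)
            if captured:
                signal.raise_signal(captured[-1])  # this re-raise appears as KeyboardInterrupt

    if __name__ == "__main__":
        with capture_sigint():
            pass  # server shutdown work would run here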