Description
Command used:
python3 -m chat.server.model_worker --model-path models/testgpt --device mps
The execution log is as follows:
2024-01-22 17:08:04 | INFO | model_worker | args: Namespace(host='localhost', port=21002, worker_address='http://localhost:21002', controller_address='http://localhost:21001', model_path='models/testgpt', revision='main', device='mps', gpus=None, num_gpus=1, max_gpu_memory=None, load_8bit=False, cpu_offloading=False, gptq_ckpt=None, gptq_wbits=16, gptq_groupsize=-1, gptq_act_order=False, awq_ckpt=None, awq_wbits=16, awq_groupsize=-1, model_names=None, conv_template=None, embed_in_truncate=False, limit_worker_concurrency=5, stream_interval=2, no_register=False)
2024-01-22 17:08:04 | INFO | stdout | testgpt!!!!!!
2024-01-22 17:08:04 | INFO | model_worker | Loading the model ['testgpt'] on worker c61383a5 ...
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 286, in hf_raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | response.raise_for_status()
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | raise HTTPError(http_error_msg, response=self)
2024-01-22 17:08:04 | ERROR | stderr | requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/models/testgpt/resolve/main/tokenizer_config.json
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | The above exception was the direct cause of the following exception:
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/utils/hub.py", line 429, in cached_file
2024-01-22 17:08:04 | ERROR | stderr | resolved_file = hf_hub_download(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
2024-01-22 17:08:04 | ERROR | stderr | return fn(*args, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1368, in hf_hub_download
2024-01-22 17:08:04 | ERROR | stderr | raise head_call_error
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1238, in hf_hub_download
2024-01-22 17:08:04 | ERROR | stderr | metadata = get_hf_file_metadata(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
2024-01-22 17:08:04 | ERROR | stderr | return fn(*args, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
2024-01-22 17:08:04 | ERROR | stderr | r = _request_wrapper(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
2024-01-22 17:08:04 | ERROR | stderr | response = _request_wrapper(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
2024-01-22 17:08:04 | ERROR | stderr | hf_raise_for_status(response)
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 323, in hf_raise_for_status
2024-01-22 17:08:04 | ERROR | stderr | raise RepositoryNotFoundError(message, response) from e
2024-01-22 17:08:04 | ERROR | stderr | huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65ae3074-2287e248178f4d250c755ad1;7b85dc4b-9482-4cb3-95c0-2ce23b61028f)
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Repository Not Found for url: https://huggingface.co/models/testgpt/resolve/main/tokenizer_config.json.
2024-01-22 17:08:04 | ERROR | stderr | Please make sure you specified the correct repo_id and repo_type.
2024-01-22 17:08:04 | ERROR | stderr | If you are trying to access a private or gated repo, make sure you are authenticated.
2024-01-22 17:08:04 | ERROR | stderr | Invalid username or password.
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | The above exception was the direct cause of the following exception:
2024-01-22 17:08:04 | ERROR | stderr |
2024-01-22 17:08:04 | ERROR | stderr | Traceback (most recent call last):
2024-01-22 17:08:04 | ERROR | stderr | File "", line 198, in _run_module_as_main
2024-01-22 17:08:04 | ERROR | stderr | File "", line 88, in _run_code
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 521, in
2024-01-22 17:08:04 | ERROR | stderr | args, worker = create_model_worker()
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 498, in create_model_worker
2024-01-22 17:08:04 | ERROR | stderr | worker = ModelWorker(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/server/model_worker.py", line 213, in init
2024-01-22 17:08:04 | ERROR | stderr | self.model, self.tokenizer = load_model(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 278, in load_model
2024-01-22 17:08:04 | ERROR | stderr | model, tokenizer = adapter.load_model(model_path, kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 1589, in load_model
2024-01-22 17:08:04 | ERROR | stderr | model, tokenizer = super().load_model(model_path, from_pretrained_kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/Users/fanjunyang/Documents/Tools/Test-Agent/chat/model/model_adapter.py", line 62, in load_model
2024-01-22 17:08:04 | ERROR | stderr | tokenizer = AutoTokenizer.from_pretrained(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 686, in from_pretrained
2024-01-22 17:08:04 | ERROR | stderr | tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 519, in get_tokenizer_config
2024-01-22 17:08:04 | ERROR | stderr | resolved_config_file = cached_file(
2024-01-22 17:08:04 | ERROR | stderr | ^^^^^^^^^^^^
2024-01-22 17:08:04 | ERROR | stderr | File "/opt/homebrew/lib/python3.11/site-packages/transformers/utils/hub.py", line 450, in cached_file
2024-01-22 17:08:04 | ERROR | stderr | raise EnvironmentError(
2024-01-22 17:08:04 | ERROR | stderr | OSError: models/testgpt is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
2024-01-22 17:08:04 | ERROR | stderr | If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>
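For reference, here is a minimal diagnostic sketch (my own, not part of the Test-Agent code) of the path resolution that produces this traceback: transformers only treats `models/testgpt` as a local folder if that directory exists relative to the process's working directory; otherwise `AutoTokenizer.from_pretrained` falls back to treating the string as a Hub repo id and requests `https://huggingface.co/models/testgpt/...`, which returns the 401 / RepositoryNotFoundError seen above. Only `model_path` comes from the command; everything else is an assumption for illustration.

```python
# Hypothetical diagnostic sketch (not part of the Test-Agent repo): check which
# branch transformers will take for the given --model-path value.
import os

from transformers import AutoTokenizer

model_path = "models/testgpt"  # same value passed via --model-path

if os.path.isdir(model_path):
    # A real local folder: transformers reads tokenizer_config.json from disk
    # and never contacts huggingface.co.
    tokenizer = AutoTokenizer.from_pretrained(model_path)
    print(f"Loaded tokenizer from local folder: {model_path}")
else:
    # Not a directory relative to the current working directory: transformers
    # treats the string as a Hub repo id and requests
    # https://huggingface.co/models/testgpt/..., which fails with 401 because
    # no such public repository exists.
    print(f"{model_path!r} is not a directory relative to {os.getcwd()!r}; "
          "transformers will query the Hugging Face Hub instead.")
```

Running the worker from a directory where `models/testgpt` does not resolve (or where the folder is missing its tokenizer files) would produce exactly this traceback, so checking `os.path.isdir` on the model path from the launch directory is a quick way to confirm which branch is being taken.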