This repository was archived by the owner on Oct 25, 2024. It is now read-only.

Commit 3a09877

Fix modeling_auto trust_remote_code issue (#1355)
* Update modeling_auto.py

Signed-off-by: Wang, Chang <chang1.wang@intel.com>
1 parent 6567d46 commit 3a09877

File tree

1 file changed

+2
-1
lines changed


intel_extension_for_transformers/transformers/modeling/modeling_auto.py

Lines changed: 2 additions & 1 deletion
@@ -204,7 +204,8 @@ def from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs):
         elif kwargs.get("use_neural_speed", None) is not None:
             use_neural_speed = kwargs.pop("use_neural_speed", True) and not use_xpu
         else:
-            config = transformers.AutoConfig.from_pretrained(pretrained_model_name_or_path)
+            config = transformers.AutoConfig.from_pretrained(pretrained_model_name_or_path,
+                                                             trust_remote_code=kwargs.get("trust_remote_code", False))
         if hasattr(config, "model_type") == False:
             logger.error("Can't get the model_type. Please check the correct model_type")
             exit(0)
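The point of this change is that the wrapper must forward `trust_remote_code` from its own `**kwargs` to the downstream `AutoConfig.from_pretrained` call; before the fix the flag was silently dropped, so loading models with custom code on the Hub could fail. A minimal sketch of the forwarding pattern, using a hypothetical `load_config` stand-in rather than the real transformers API:

```python
# Hypothetical stand-in illustrating the kwargs-forwarding pattern from the diff.
def load_config(path, **kwargs):
    # Before the fix: trust_remote_code was never read, so the downstream
    # loader always saw its default. After the fix: it is read from kwargs
    # with a safe default of False, matching kwargs.get("trust_remote_code", False).
    trust_remote_code = kwargs.get("trust_remote_code", False)
    return {"path": path, "trust_remote_code": trust_remote_code}

print(load_config("my-model"))                          # default stays False
print(load_config("my-model", trust_remote_code=True))  # flag is forwarded
```

Using `kwargs.get` rather than `kwargs.pop` leaves the flag in place for any later call in the same method that also needs it.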

0 commit comments
