Under transformers==4.42.0, a bug showed up when the model was installed automatically (auto-downloaded from Hugging Face).
2025-07-02 14:07:08,641 - __main__ - ERROR - Model loading failed: Failed to import transformers.models.llama.tokenization_llama_fast because of the following error (look up to see its traceback):
(unicode error) 'utf-8' codec can't decode bytes in position 3344-3345: invalid continuation byte (tokenization_llama_fast.py, line 120)
ERROR: Traceback (most recent call last):
  File "/root/miniconda3/envs/my_conda_env/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1560, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/root/miniconda3/envs/my_conda_env/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 879, in exec_module
  File "<frozen importlib._bootstrap_external>", line 1017, in get_code
  File "<frozen importlib._bootstrap_external>", line 947, in source_to_code
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/root/miniconda3/envs/my_conda_env/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 120
    """
       ^
SyntaxError: (unicode error) 'utf-8' codec can't decode bytes in position 3344-3345: invalid continuation byte
On the surface this is clearly a character-encoding problem, but there is no simple, targeted fix to be found online. If you locate and open the file, line 120 of tokenization_llama_fast.py does not look wrong at all, and you will notice that the line is actually just part of a comment (a docstring). The simplest fix is to delete that comment; after that everything runs normally. Very simple. A small script that automates the same cleanup is sketched below.
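The sketch below is a minimal example under a couple of assumptions: the top-level import of transformers still succeeds (in the log above, the failure only happens when the tokenizer module itself is imported), and the undecodable bytes sit entirely inside the comment/docstring, so dropping them loses no real code. The path is derived from the installed package; adjust it if your layout differs.

import transformers
from pathlib import Path

# Locate the file named in the traceback inside the installed package.
path = Path(transformers.__file__).parent / "models" / "llama" / "tokenization_llama_fast.py"
print(f"Cleaning {path}")

raw = path.read_bytes()
# errors="ignore" drops the invalid continuation bytes (reported at position
# 3344-3345); since they are inside a comment, the code itself is untouched.
cleaned = raw.decode("utf-8", errors="ignore")
path.write_text(cleaned, encoding="utf-8")

Stripping only the undecodable bytes is less invasive than deleting the whole comment by hand, and after running the script the original model-loading code should import the tokenizer normally.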