
Installation Error - Failed building wheel for tokenizers #2831

@victorlongo

Description

🐛 Bug

Information

Model I am using (Bert, XLNet ...): N/A

Language I am using the model on (English, Chinese ...): N/A

The problem arises when using:

  • the official example scripts: (give details below)

The problem arises during the installation of transformers on Microsoft Windows 10 Pro, version 10.0.17763.

After creating and activating the virtual environment, installing transformers fails with the following error:

    error: Can not find Rust compiler
    ERROR: Failed building wheel for tokenizers
    Failed to build tokenizers
    ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly

tokenizers is a Rust library with Python bindings, so when pip does not find a prebuilt wheel for the current Python version and platform, it falls back to a PEP 517 source build, which requires the Rust compiler.
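
A quick check on the same shell (a minimal sketch, not part of the original report) shows whether a Rust toolchain is visible at all; if these commands are not found, the tokenizers source build fails exactly as above:

    # Check whether a Rust toolchain is on PATH (sketch; not from the original report).
    rustc --version
    cargo --version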

The task I am working on is:
[x] transformers installation

To reproduce

Steps to reproduce the behavior:

  1. From the command-line interface, create and activate a virtual environment by following the steps at https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
  2. Install transformers from source, following the "From source" section at https://github.com/huggingface/transformers (the full command transcript and error output are shown below, followed by a possible workaround sketch):
     -m pip --version
     -m pip install --upgrade pip
     -m pip install --user virtualenv
     -m venv env
     .\env\Scripts\activate
     pip install transformers

     ERROR: Command errored out with exit status 1:
      command: 'c:\users\vbrandao\env\scripts\python.exe' 'c:\users\vbrandao\env\lib\site-packages\pip\_vendor\pep517\_in_process.py' build_wheel 'C:\Users\vbrandao\AppData\Local\Temp\tmpj6evjmze'
          cwd: C:\Users\vbrandao\AppData\Local\Temp\pip-install-sza2_lmj\tokenizers
     Complete output (10 lines):
     running bdist_wheel
     running build
     running build_py
     creating build
     creating build\lib
     creating build\lib\tokenizers
     copying tokenizers\__init__.py -> build\lib\tokenizers
     running build_ext
     running build_rust
     error: Can not find Rust compiler
     ----------------------------------------
     ERROR: Failed building wheel for tokenizers
     Failed to build tokenizers
     ERROR: Could not build wheels for tokenizers which use PEP 517 and cannot be installed directly
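
A minimal workaround sketch, assuming the only problem is the missing Rust toolchain: install rustup (the official Rust installer from https://rustup.rs), open a new shell so the toolchain is on PATH, and retry inside the virtual environment. The download URL and flags below are just one non-interactive way to do this and are not from the original report; note that rustup's default toolchain on Windows also expects the Microsoft Visual C++ Build Tools to be installed. Commands are PowerShell-style, matching the transcript above.

     # Install the Rust toolchain via rustup (sketch; any other way of installing rustup works too).
     curl.exe -o rustup-init.exe https://win.rustup.rs/x86_64
     .\rustup-init.exe -y
     # Open a new shell so %USERPROFILE%\.cargo\bin is on PATH, re-activate the venv, then retry.
     .\env\Scripts\activate
     rustc --version
     pip install transformers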

Expected behavior

Installation of transformers should complete successfully.
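
For reference, a quick way to confirm a successful install from the activated environment (a sketch, not part of the original report):

    # Both packages should import and report their versions once the build/install succeeds.
    python -c "import transformers, tokenizers; print(transformers.__version__, tokenizers.__version__)"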

Environment info

  • transformers version: N/A - installation step
  • Platform: Command Line Interface / Virtual Env
  • Python version: python 3.8
  • PyTorch version (GPU?): N/A
  • Tensorflow version (GPU?): N/A
  • Using GPU in script?: N/A
  • Using distributed or parallel set-up in script?: N/A
(Screenshot attached: tokenizers_intallation_error)
