pip install -r requirements failed #1
Comments
Hello, you can try the following solutions:
I have installed Rust and updated pip, but it still raises a problem:
note: This error originates from a subprocess, and is likely not a problem with pip.
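If Rust is installed but pip still reports this, the compiler is often simply not on the PATH of the shell that runs pip. A minimal sketch of the usual rustup setup, assuming a Linux/macOS shell (rustup.rs is the official installer):

```sh
# install the Rust toolchain via rustup (official installer)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
# make rustc/cargo visible in the current shell session
source "$HOME/.cargo/env"
# verify before retrying pip; the build subprocess needs rustc on PATH
rustc --version
cargo --version
```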
Perhaps you can try changing the Python version:
conda create -n py3.11 python=3.11
conda activate py3.11
pip install tokenizers
Or look for solutions in the huggingface/tokenizers repository on GitHub.
Yes, I managed to pip install successfully by downgrading Python to 3.11.
If you cannot connect to Hugging Face to download models, you can use domestic model platforms instead, such as WiseModel or ModelScope.
Yes, I solved the problem by using a mirror website.
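As a concrete example of the mirror route, huggingface_hub honors the HF_ENDPOINT environment variable, so downloads can be redirected without code changes. A sketch, assuming the public hf-mirror.com mirror (which specific mirror the commenter used is not stated):

```sh
# route huggingface_hub / transformers downloads through a mirror
export HF_ENDPOINT=https://hf-mirror.com
# example download; 'gpt2' is just an illustrative repo id
python -c "from huggingface_hub import snapshot_download; snapshot_download('gpt2')"
```

The full build log from the original report follows for reference: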
Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error
× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [51 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-cpython-312
creating build/lib.linux-x86_64-cpython-312/tokenizers
copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers
creating build/lib.linux-x86_64-cpython-312/tokenizers/models
copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/models
creating build/lib.linux-x86_64-cpython-312/tokenizers/decoders
copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/decoders
creating build/lib.linux-x86_64-cpython-312/tokenizers/normalizers
copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/normalizers
creating build/lib.linux-x86_64-cpython-312/tokenizers/pre_tokenizers
copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/pre_tokenizers
creating build/lib.linux-x86_64-cpython-312/tokenizers/processors
copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/processors
creating build/lib.linux-x86_64-cpython-312/tokenizers/trainers
copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/trainers
creating build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
creating build/lib.linux-x86_64-cpython-312/tokenizers/tools
copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/tools
copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-cpython-312/tokenizers/tools
copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers
copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/models
copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/decoders
copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/normalizers
copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/pre_tokenizers
copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/processors
copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/trainers
copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-cpython-312/tokenizers/tools
running build_ext
running build_rust
error: can't find Rust compiler
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
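In short, the wheel build fails at build_rust because no Rust compiler is visible to the build subprocess. For anyone staying on Python 3.12 instead of downgrading, a hedged retry sequence after installing Rust (the flags shown are standard pip options):

```sh
# confirm the toolchain is visible in the same shell that will run pip
rustc --version && cargo --version
# retry without pip's cache so a previously failed build isn't reused
pip install --no-cache-dir tokenizers
```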