pip install -r requirments failed #1

Closed
cccchou opened this issue Jul 26, 2024 · 6 comments

cccchou commented Jul 26, 2024

Building wheels for collected packages: tokenizers
Building wheel for tokenizers (pyproject.toml) ... error
error: subprocess-exited-with-error

× Building wheel for tokenizers (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [51 lines of output]
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-cpython-312
creating build/lib.linux-x86_64-cpython-312/tokenizers
copying py_src/tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers
creating build/lib.linux-x86_64-cpython-312/tokenizers/models
copying py_src/tokenizers/models/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/models
creating build/lib.linux-x86_64-cpython-312/tokenizers/decoders
copying py_src/tokenizers/decoders/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/decoders
creating build/lib.linux-x86_64-cpython-312/tokenizers/normalizers
copying py_src/tokenizers/normalizers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/normalizers
creating build/lib.linux-x86_64-cpython-312/tokenizers/pre_tokenizers
copying py_src/tokenizers/pre_tokenizers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/pre_tokenizers
creating build/lib.linux-x86_64-cpython-312/tokenizers/processors
copying py_src/tokenizers/processors/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/processors
creating build/lib.linux-x86_64-cpython-312/tokenizers/trainers
copying py_src/tokenizers/trainers/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/trainers
creating build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_bpe.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/bert_wordpiece.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/sentencepiece_unigram.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/base_tokenizer.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/char_level_bpe.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
copying py_src/tokenizers/implementations/byte_level_bpe.py -> build/lib.linux-x86_64-cpython-312/tokenizers/implementations
creating build/lib.linux-x86_64-cpython-312/tokenizers/tools
copying py_src/tokenizers/tools/__init__.py -> build/lib.linux-x86_64-cpython-312/tokenizers/tools
copying py_src/tokenizers/tools/visualizer.py -> build/lib.linux-x86_64-cpython-312/tokenizers/tools
copying py_src/tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers
copying py_src/tokenizers/models/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/models
copying py_src/tokenizers/decoders/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/decoders
copying py_src/tokenizers/normalizers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/normalizers
copying py_src/tokenizers/pre_tokenizers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/pre_tokenizers
copying py_src/tokenizers/processors/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/processors
copying py_src/tokenizers/trainers/__init__.pyi -> build/lib.linux-x86_64-cpython-312/tokenizers/trainers
copying py_src/tokenizers/tools/visualizer-styles.css -> build/lib.linux-x86_64-cpython-312/tokenizers/tools
running build_ext
running build_rust
error: can't find Rust compiler

  If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.

  To update pip, run:

      pip install --upgrade pip

  and then retry package installation.

  If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

@coderxyd
Contributor

Hello, you can try the following solutions (example commands below):
1. Update pip: pip install --upgrade pip
2. Install a Rust compiler: https://www.rust-lang.org/tools/install
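
A minimal sketch of that flow, assuming a Linux shell, the official rustup installer, and a requirements.txt in the project root (adjust names to your setup):

# upgrade pip so it can use a prebuilt wheel where one exists
pip install --upgrade pip
# install the Rust toolchain with rustup, then put cargo/rustc on PATH for this shell
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
source "$HOME/.cargo/env"
# retry the failed install
pip install -r requirements.txt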

coderxyd reopened this Jul 26, 2024

cccchou commented Jul 26, 2024

> Hello, you can try the following solutions: 1. Update pip: pip install --upgrade pip 2. Install a Rust compiler: https://www.rust-lang.org/tools/install

I have installed Rust and updated pip, but it raises a problem:
  warning: variable does not need to be mutable
     --> tokenizers-lib/src/models/unigram/model.rs:265:21
      |
  265 |                 let mut target_node = &mut best_path_ends_at[key_pos];
      |                     ----^^^^^^^^^^^
      |                     |
      |                     help: remove this `mut`
      |
      = note: `#[warn(unused_mut)]` on by default

  warning: variable does not need to be mutable
     --> tokenizers-lib/src/models/unigram/model.rs:282:21
      |
  282 |                 let mut target_node = &mut best_path_ends_at[starts_at + mblen];
      |                     ----^^^^^^^^^^^
      |                     |
      |                     help: remove this `mut`

  warning: variable does not need to be mutable
     --> tokenizers-lib/src/pre_tokenizers/byte_level.rs:200:59
      |
  200 |     encoding.process_tokens_with_offsets_mut(|(i, (token, mut offsets))| {
      |                                                           ----^^^^^^^
      |                                                           |
      |                                                           help: remove this `mut`

  error: casting `&T` to `&mut T` is undefined behavior, even if the reference is unused, consider instead using an `UnsafeCell`
     --> tokenizers-lib/src/models/bpe/trainer.rs:526:47
      |
  522 |                     let w = &words[*i] as *const _ as *mut _;
      |                             -------------------------------- casting happend here
  ...
  526 |                         let word: &mut Word = &mut (*w);
      |                                               ^^^^^^^^^
      |
      = note: for more information, visit <https://doc.rust-lang.org/book/ch15-05-interior-mutability.html>
      = note: `#[deny(invalid_reference_casting)]` on by default

  warning: `tokenizers` (lib) generated 3 warnings
  error: could not compile `tokenizers` (lib) due to 1 previous error; 3 warnings emitted

  Caused by:
    process didn't exit successfully: `/home/alex/.rustup/toolchains/stable-x86_64-unknown-linux-gnu/bin/rustc --crate-name tokenizers --edition=2018 tokenizers-lib/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -C embed-bitcode=no --cfg 'feature="cached-path"' --cfg 'feature="clap"' --cfg 'feature="cli"' --cfg 'feature="default"' --cfg 'feature="dirs"' --cfg 'feature="esaxx_fast"' --cfg 'feature="http"' --cfg 'feature="indicatif"' --cfg 'feature="onig"' --cfg 'feature="progressbar"' --cfg 'feature="reqwest"' --check-cfg 'cfg(docsrs)' --check-cfg 'cfg(feature, values("cached-path", "clap", "cli", "default", "dirs", "esaxx_fast", "fancy-regex", "http", "indicatif", "onig", "progressbar", "reqwest", "unstable_wasm"))' -C metadata=ca82cc11ddbd27ad -C extra-filename=-ca82cc11ddbd27ad --out-dir /tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps -C strip=debuginfo -L dependency=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps --extern aho_corasick=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libaho_corasick-058ca2dc14da8460.rmeta --extern cached_path=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libcached_path-0ddc233bd0f68097.rmeta --extern clap=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libclap-1c4a241bb9a5ea97.rmeta --extern derive_builder=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libderive_builder-30b036efe70af586.rmeta --extern dirs=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libdirs-b311a36f526b825d.rmeta --extern esaxx_rs=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libesaxx_rs-c65ab84d25070fb0.rmeta --extern getrandom=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libgetrandom-5ae740cfed4fe951.rmeta --extern indicatif=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libindicatif-eac1b29ca288ba51.rmeta --extern itertools=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libitertools-c16f098d485dff7c.rmeta --extern lazy_static=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/liblazy_static-61c19818fd703d35.rmeta --extern log=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/liblog-2b0fd75752e87a82.rmeta --extern macro_rules_attribute=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libmacro_rules_attribute-ff063e4c1dc7b613.rmeta --extern monostate=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libmonostate-465817acb0a4eda0.rmeta --extern onig=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libonig-c5a5b8c2c12ad002.rmeta --extern paste=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libpaste-0c52bc493755067c.so --extern rand=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/librand-5f4fa53845062e70.rmeta --extern 
rayon=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/librayon-5d621dc00426f097.rmeta --extern rayon_cond=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/librayon_cond-935d56f420972abd.rmeta --extern regex=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libregex-09715098661a60b8.rmeta --extern regex_syntax=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libregex_syntax-538adb3fb8143d33.rmeta --extern reqwest=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libreqwest-1c0c17a79c545c37.rmeta --extern serde=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libserde-252b70978046a684.rmeta --extern serde_json=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libserde_json-9ead1eedf9e4df53.rmeta --extern spm_precompiled=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libspm_precompiled-c06319c8751ae99a.rmeta --extern thiserror=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libthiserror-a5fb447d5a8fc752.rmeta --extern unicode_normalization_alignments=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libunicode_normalization_alignments-05d295e16055f4ea.rmeta --extern unicode_segmentation=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libunicode_segmentation-15110a3e2499edda.rmeta --extern unicode_categories=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/deps/libunicode_categories-c5e00e0e8b967371.rmeta -L native=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/build/bzip2-sys-cf2a1788a126262c/out/lib -L native=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/build/zstd-sys-1eedde4d4448ea63/out -L native=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/build/esaxx-rs-1ffcd584795bbc51/out -L native=/tmp/pip-install-x4c1a76i/tokenizers_0fef19d3d5d449bb83986f9e86f5458f/target/release/build/onig_sys-1328865d5c179d9c/out` (exit status: 1)
  error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects

@coderxyd
Contributor

Perhaps you can try changing the Python version; an older Python such as 3.11 is more likely to have a prebuilt tokenizers wheel available, so pip would not need to compile the Rust code.

conda create -n py3.11 python=3.11
conda activate py3.11
pip install tokenizers

Or look for some solutions in the huggingface/tokenizers repository on GitHub.
huggingface/tokenizers#1050
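
A related sketch if you prefer to stay on the current Python: pip's standard --only-binary option makes it refuse source builds, so it fails fast instead of invoking the Rust compiler when no matching wheel exists:

pip install --upgrade pip
# refuse source builds; error out if no prebuilt wheel matches this Python version
pip install --only-binary=:all: tokenizers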

cccchou commented Jul 26, 2024

> Perhaps you can try changing the Python version.
>
> conda create -n py3.11 python=3.11
> conda activate py3.11
> pip install tokenizers
>
> Or look for some solutions in the huggingface/tokenizers repository on GitHub. huggingface/tokenizers#1050

Yes, I installed it successfully with pip after downgrading Python to 3.11,
but when I download the model it raises a problem with the internet connection:
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the files on the Hub and we cannot find the appropriate snapshot folder for the specified revision on the local disk. Please check your internet connection and try again.
How can I download the model?

@coderxyd
Contributor

If you cannot connect to Hugging Face to download, you can use model platforms hosted in mainland China instead, such as WiseModel or ModelScope.

cccchou commented Jul 26, 2024

> If you cannot connect to Hugging Face to download, you can use model platforms hosted in mainland China instead, such as WiseModel or ModelScope.

Yes, I solved the problem by using a mirror website.
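
For reference, a minimal sketch of that mirror setup, assuming the hf-mirror.com endpoint (huggingface_hub reads the HF_ENDPOINT environment variable; any compatible mirror works the same way, and your_script.py is a hypothetical placeholder):

# point huggingface_hub at a mirror for this shell session
export HF_ENDPOINT=https://hf-mirror.com
# re-run the script that downloads the model
python your_script.py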

zxlzr closed this as completed Jul 27, 2024