rust-bert is a Rust implementation of transformer-based natural language processing (NLP) models that provides ready-to-use pipelines for tasks such as text classification, summarization, and question answering. The project ports many capabilities of the Hugging Face Transformers ecosystem to Rust, allowing developers to run state-of-the-art models such as BERT, GPT-2, and DistilBERT directly within Rust applications with high performance and memory efficiency. Model execution is handled through Rust machine learning infrastructure, using the tch-rs bindings to libtorch or ONNX Runtime as backends. The library also bundles tokenization utilities, model architectures, and task-specific pipelines that simplify the development of NLP applications. Because Rust is known for its safety and performance, the project is well suited to deploying modern NLP models in production systems written in Rust.

Features

  • Rust implementation of transformer-based NLP models
  • Support for models such as BERT, DistilBERT, and GPT-2
  • Ready-to-use pipelines for common NLP tasks
  • Integration with tch-rs and ONNX Runtime for model execution
  • Multithreaded tokenization for high-performance preprocessing
  • Production-friendly deployment in Rust applications

Categories

Machine Learning

License

Apache License 2.0



Additional Project Details

Programming Language

Rust

Related Categories

Rust Machine Learning Software

Registered

16 hours ago