Burn Central

Description

Burn Central is a new way of using Burn. It aims to provide a central platform for experiment tracking, model sharing, and deployment for all Burn users!

This repository contains the SDK for the project. It provides macros that hook into your training code and send training data to the application. To use this project, you must first create an account on the application.

You also need the new burn-cli.

Installation

Add Burn Central to your Cargo.toml:

[dependencies]
burn-central = "0.5.0"

Quick Start

Currently, we only support training. Here's how to integrate Burn Central into your training workflow:

1. Register your training function

Use the #[register] macro to register your training function:

use burn_central::{
    experiment::ExperimentRun,
    macros::register,
    runtime::{Args, ArtifactLoader, Model, MultiDevice},
};
use burn::prelude::*;
use burn::tensor::backend::AutodiffBackend;

#[register(training, name = "mnist")]
pub fn training<B: AutodiffBackend>(
    client: &ExperimentRun,
    Args(training_config): Args<YourExperimentConfig>,
    MultiDevice(devices): MultiDevice<B>,
    loader: ArtifactLoader<ModelArtifact<B>>,
) -> Result<Model<ModelArtifact<B::InnerBackend>>, String> {
    // Log your configuration
    client.log_config("Training Config", &training_config)
        .expect("Logging config failed");

    // Your training logic here...
    let artifact_dir = "/tmp/mnist-training";
    let model = train::<B>(client, artifact_dir, &training_config, devices[0].clone())?;

    Ok(Model(ModelArtifact {
        model_record: model.into_record(),
        config: training_config,
    }))
}
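
The snippet above references a few user-defined items (YourExperimentConfig, ModelArtifact, and a train helper) that are not part of burn-central itself. Below is a minimal sketch of what the types might look like using standard burn derives; the MnistModel struct, field names, and defaults are placeholders, and ModelArtifact may need extra derives depending on burn-central's artifact traits:

use burn::nn::Linear;
use burn::prelude::*;

// Hypothetical experiment configuration; burn's Config derive provides
// new(), builder-style setters, and (de)serialization.
#[derive(Config)]
pub struct YourExperimentConfig {
    #[config(default = 10)]
    pub num_epochs: usize,
    #[config(default = 1.0e-3)]
    pub learning_rate: f64,
}

// Hypothetical model; #[derive(Module)] also generates MnistModelRecord<B>,
// the serializable record of its weights.
#[derive(Module, Debug)]
pub struct MnistModel<B: Backend> {
    linear: Linear<B>,
}

// Hypothetical artifact bundling the trained weights with the config that
// produced them, returned from the registered training function.
pub struct ModelArtifact<B: Backend> {
    pub model_record: MnistModelRecord<B>,
    pub config: YourExperimentConfig,
}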

2. Integrate with your Learner

To enable experiment tracking, add three components to your LearnerBuilder (client below is the &ExperimentRun passed into your registered function):

use burn_central::integration::{
    RemoteMetricLogger,
    remote_interrupter,
    RemoteCheckpointRecorder,
};
use burn::train::{LearnerBuilder, LearningStrategy, metric::{AccuracyMetric, LossMetric}};

let learner = LearnerBuilder::new(artifact_dir)
    .metric_train_numeric(AccuracyMetric::new())
    .metric_valid_numeric(AccuracyMetric::new())
    .metric_train_numeric(LossMetric::new())
    .metric_valid_numeric(LossMetric::new())
    // Remote metric logging
    .with_metric_logger(RemoteMetricLogger::new(client))
    // Remote checkpoint saving
    .with_file_checkpointer(RemoteCheckpointRecorder::new(client))
    // Remote interruption handling
    .with_interrupter(remote_interrupter(client))
    .num_epochs(config.num_epochs)
    .summary()
    .build(
        model.init::<B>(&device),
        optimizer.init(),
        learning_rate,
        LearningStrategy::SingleDevice(device),
    );
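
The builder above assumes model, optimizer, device, and learning_rate are already in scope inside the registered training function. A hedged sketch of those bindings, where MnistModelConfig is a placeholder model config for the MnistModel sketched in step 1 (AdamConfig and LinearConfig are standard burn types):

use burn::nn::LinearConfig;
use burn::optim::AdamConfig;
use burn::prelude::*;

// Placeholder config for the MnistModel sketched in step 1.
#[derive(Config)]
pub struct MnistModelConfig {
    #[config(default = 784)]
    pub input_size: usize,
    #[config(default = 10)]
    pub num_classes: usize,
}

impl MnistModelConfig {
    // Build the model on the given device.
    pub fn init<B: Backend>(&self, device: &B::Device) -> MnistModel<B> {
        MnistModel {
            linear: LinearConfig::new(self.input_size, self.num_classes).init(device),
        }
    }
}

// Bindings used by the LearnerBuilder call, inside the registered function:
let device = devices[0].clone();                   // from the MultiDevice extractor
let learning_rate = training_config.learning_rate; // from YourExperimentConfig
let model = MnistModelConfig::new();
let optimizer = AdamConfig::new();                 // the builder calls optimizer.init()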

3. Run your training

Once integrated, run your training using the burn-cli to automatically track metrics, checkpoints, and logs on Burn Central.

Requirements

This crate uses the Rust 2024 edition, which requires Rust 1.85 or newer.

Contribution

Contributions to this repository are welcome. You can also open issues for features you would like to see.

License

Licensed under either of:

Apache License, Version 2.0
MIT License

at your option.
