Renmin University of China
Stars
Code for the paper "Beyond Autoregression: Discrete Diffusion for Complex Reasoning and Planning"
Code for "FlowMM: Generating Materials with Riemannian Flow Matching" and "FlowLLM: Flow Matching for Material Generation with Large Language Models as Base Distributions"
EDM2 and Autoguidance -- Official PyTorch implementation
Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"
Minimal, clean code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization, with PyTorch/CUDA
Gemini is a modern LaTeX beamerposter theme 🖼
Release for Improved Denoising Diffusion Probabilistic Models
Minimal, clean code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization.
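The core of BPE training as implemented in repos like the two above is simple: repeatedly find the most frequent adjacent token pair and merge it into a new token. A minimal sketch in plain Python (generic illustration only, not code from either repository; the function names are my own):

```python
from collections import Counter


def most_common_pair(ids):
    # Count adjacent pairs and return the most frequent one.
    return Counter(zip(ids, ids[1:])).most_common(1)[0][0]


def merge(ids, pair, new_id):
    # Replace every non-overlapping occurrence of `pair` with `new_id`.
    out, i = [], 0
    while i < len(ids):
        if i + 1 < len(ids) and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out


def bpe_train(text, num_merges):
    # Start from raw UTF-8 bytes (ids 0..255); each merge mints a new id.
    ids = list(text.encode("utf-8"))
    merges = {}
    for step in range(num_merges):
        pair = most_common_pair(ids)
        new_id = 256 + step
        merges[pair] = new_id
        ids = merge(ids, pair, new_id)
    return ids, merges
```

Encoding a new string then just replays the learned merges in order; decoding inverts them back to bytes.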
v objective diffusion inference code for PyTorch.
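The "v objective" here refers, as far as I can tell, to the velocity parameterization of Salimans & Ho (2022), under a variance-preserving schedule with $\alpha_t^2 + \sigma_t^2 = 1$:

```latex
% Noising process and velocity target:
x_t = \alpha_t x_0 + \sigma_t \epsilon, \qquad
v_t \equiv \alpha_t \epsilon - \sigma_t x_0 .
% Given a v prediction, the clean sample is recovered as
x_0 = \alpha_t x_t - \sigma_t v_t ,
% which follows by substitution, using \alpha_t^2 + \sigma_t^2 = 1.
```

The network is trained to regress $v_t$ instead of $\epsilon$ or $x_0$, which behaves better at both extremes of the noise schedule.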
Karras et al. (2022) diffusion models for PyTorch
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX.
A PyTorch implementation of the paper "All are Worth Words: A ViT Backbone for Diffusion Models".
Official PyTorch implementation for "Unifying Bayesian Flow Networks and Diffusion Models through Stochastic Differential Equations"
PyTorch implementation for "Training and Inference on Any-Order Autoregressive Models the Right Way", NeurIPS 2022 Oral, TPM 2023 Best Paper Honorable Mention
Elucidating the Design Space of Diffusion-Based Generative Models (EDM)
[ICML 2024 Best Paper] Discrete Diffusion Modeling by Estimating the Ratios of the Data Distribution (https://arxiv.org/abs/2310.16834)
Template-based docx report creation
A PyTorch implementation of MAGE: MAsked Generative Encoder to Unify Representation Learning and Image Synthesis
✨✨Latest Advances on Multimodal Large Language Models
High-fidelity performance metrics for generative models in PyTorch
VQVAEs, GumbelSoftmaxes and friends
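The Gumbel-Softmax trick mentioned in the entry above draws approximately one-hot samples from a categorical distribution in a differentiable way: add Gumbel noise to the logits, then take a temperature-scaled softmax. A minimal NumPy sketch (a generic illustration under my own function name, not code from the repo):

```python
import numpy as np


def gumbel_softmax(logits, tau=1.0, rng=None):
    # Sample Gumbel(0, 1) noise via the inverse-CDF: -log(-log(U)).
    if rng is None:
        rng = np.random.default_rng()
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    # Temperature-scaled softmax over the perturbed logits.
    y = (logits + g) / tau
    y = y - y.max()  # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()
```

As `tau -> 0` the samples approach one-hot vectors; in practice a straight-through variant (hard `argmax` forward, soft gradient backward) is often used, e.g. when training discrete VQ-style codes.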
A simple Bayesian Flow model for MNIST.
This is the official code release for Bayesian Flow Networks.
A simple implementation of Bayesian Flow Networks (BFN)
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.