- @Lightning-AI, University of Wisconsin-Madison
- Madison, WI
- https://magazine.sebastianraschka.com
- @rasbt
- in/sebastianraschka
Highlights
- Pro
Stars
Named Entity Recognition with a decoder-only (autoregressive) LLM using HuggingFace
Command line program to validate and convert CITATION.cff files.
A general fine-tuning kit geared toward diffusion models.
CLI/GUI for managing the battery charging status for Apple silicon (M1, M2, M3) Macs
Machine Learning Journal for Intermediate to Advanced Topics.
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
Finetune a pre-trained GPT-2 model to generate personalized product recommendations for users, based on product reviews and metadata
The most streamlined road map to learn ML fundamentals for free.
A reactive notebook for Python — run reproducible experiments, execute as a script, deploy as an app, and version with git.
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
Make PyTorch models up to 40% faster! Thunder is a source-to-source compiler for PyTorch. It enables using different hardware executors at once, across one or thousands of GPUs.
Transform datasets at scale. Optimize datasets for fast AI model training.
Reference implementation for DPO (Direct Preference Optimization); a sketch of the DPO loss appears after this list.
Machine Learning Engineering Open Book
Full Stack Graph Machine Learning: Theory, Practice, Tools and Techniques
Distributed Reinforcement Learning accelerated by Lightning Fabric
NeurIPS Large Language Model Efficiency Challenge: 1 LLM + 1GPU + 1Day
Scientific Python Library Development Guide and Cookiecutter
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
This repository is a curated collection of links to various courses and resources about Artificial Intelligence (AI)
Large Language Models for All, 🦙 Cult and More, Stay in touch!
Fast & Simple repository for pre-training and fine-tuning T5-style models
Train to 94% on CIFAR-10 in <6.3 seconds on a single A100. Or ~95.79% in ~110 seconds (or less!)
Advanced AI Explainability for computer vision. Support for CNNs, Vision Transformers, Classification, Object detection, Segmentation, Image similarity and more.
Lightning HPO & Training Studio App
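
The DPO entry above points to a reference implementation of Direct Preference Optimization. As a rough illustration only (a minimal sketch in plain PyTorch, not code from that repository; the function name `dpo_loss` and its arguments are hypothetical), the core DPO loss reduces to a logistic loss over policy-vs-reference log-probability ratios:

```python
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    """DPO loss from per-sequence log-probabilities (each of shape (batch,)).

    Hypothetical helper for illustration; each tensor holds the summed
    log-probability of the chosen/rejected response under the trainable
    policy or the frozen reference model.
    """
    # Log-ratios of policy vs. reference for preferred and dispreferred responses
    chosen_logratios = policy_chosen_logps - ref_chosen_logps
    rejected_logratios = policy_rejected_logps - ref_rejected_logps

    # DPO objective: -log sigmoid(beta * (chosen_ratio - rejected_ratio))
    logits = beta * (chosen_logratios - rejected_logratios)
    return -F.logsigmoid(logits).mean()
```

Here `beta` controls how strongly the policy is penalized for drifting from the reference model; larger values keep it closer to the reference.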