Stars
🌻 The collaborative editing software that runs Wikipedia. Mirror from https://gerrit.wikimedia.org/g/mediawiki/core. See https://mediawiki.org/wiki/Developer_access for contributing.
🎬 YYeTs (人人影视) bot and website, containing the full YYeTs resource library plus cloud-drive shares from many community members
Mooncake is the serving platform for Kimi, a leading LLM service provided by Moonshot AI.
Puck is a high-performance ANN (approximate nearest neighbor) search engine
A high-throughput and memory-efficient inference and serving engine for LLMs
Fast and memory-efficient exact attention
ChatLaw: A powerful LLM tailored for Chinese legal applications (Chinese legal large language model)
A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
Concurrently chat with ChatGPT, Bing Chat, Bard, Alpaca, Vicuna, Claude, ChatGLM, MOSS, iFlytek Spark (讯飞星火), ERNIE Bot (文心一言), and more to discover the best answers
ImageBind: One Embedding Space to Bind Them All
[ICML 2023] SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
AudioGPT: Understanding and Generating Speech, Music, Sound, and Talking Head
Universal LLM Deployment Engine with ML Compilation
The Official Python Client for Lamini's API
An open-source tool-augmented conversational language model from Fudan University
Semantic cache for LLMs. Fully integrated with LangChain and llama_index.
Provides end-to-end model development pipelines for LLMs and Multimodal models that can be launched on-prem or cloud-native.
🦜🔗 Build context-aware reasoning applications
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
Source code for Twitter's Recommendation Algorithm
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".
microsoft / Megatron-DeepSpeed
Forked from NVIDIA/Megatron-LM. Ongoing research training transformer language models at scale, including: BERT & GPT-2
OneFlow is a deep learning framework designed to be user-friendly, scalable and efficient.
Code and documentation to train Stanford's Alpaca models, and generate the data.
Aligning pretrained language models with instruction data generated by themselves.
GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)