Github-Ranking-AI

A list of the most popular AI-topic repositories on GitHub, ranked by the number of stars they have received. Updated automatically every day.


GitHub Ranking

Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | vllm | 79836 | 16730 | Python | 1987 | A high-throughput and memory-efficient inference and serving engine for LLMs | 2026-05-13T06:08:01Z |
| 2 | LlamaFactory | 71200 | 8699 | Python | 958 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2026-05-12T03:35:23Z |
| 3 | sglang | 27734 | 5868 | Python | 644 | SGLang is a high-performance serving framework for large language models and multimodal models. | 2026-05-13T06:26:56Z |
| 4 | ms-swift | 14091 | 1406 | Python | 990 | Use PEFT or full-parameter training to CPT/SFT/DPO/GRPO 600+ LLMs (Qwen3.6, DeepSeek-R1, GLM-5.1, InternLM3, Llama4, …) and 300+ MLLMs (Qwen3-VL, Qwen3-Omni, InternVL3.5, Ovis2.5, GLM4.5v, Gemma4, Llava, Phi4, …) (AAAI 2025). | 2026-05-13T06:07:11Z |
| 5 | TensorRT-LLM | 13620 | 2378 | Python | 588 | TensorRT LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT LLM also contains components to create Python and C++ runtimes that orchestrate the inference execution in a performant way. | 2026-05-13T06:25:09Z |
| 6 | flashinfer | 5606 | 971 | Python | 345 | FlashInfer: Kernel Library for LLM Serving | 2026-05-13T05:29:19Z |
| 7 | MoeKoeMusic | 5605 | 367 | Vue | 0 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux / Web :electron: | 2026-05-09T09:03:01Z |
| 8 | Bangumi | 5461 | 163 | TypeScript | 33 | :electron: An unofficial https://bgm.tv UI-first app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit, Douban-like tracker dedicated to ACG; redesigned for mobile with many enhanced features that are hard to implement on the web, plus extensive customization options. Currently available on iOS / Android. | 2026-05-12T18:06:19Z |
| 9 | xtuner | 5128 | 420 | Python | 238 | A Next-Generation Training Engine Built for Ultra-Large MoE Models | 2026-05-13T04:57:40Z |
| 10 | trace.moe | 4984 | 261 | None | 0 | Trace back an anime scene with a screenshot | 2026-04-06T15:47:36Z |
| 11 | fastllm | 4548 | 453 | C++ | 279 | fastllm is a high-performance LLM inference library with no backend dependencies. It supports tensor-parallel inference for dense models and mixed-mode inference for MoE models; any GPU with 10GB+ VRAM can run full-scale DeepSeek. A dual-socket 9004/9005 server with a single GPU can serve the original full-precision DeepSeek model at 20 tps single-concurrency; the INT4-quantized model reaches 30 tps single-concurrency and 60+ tps under concurrent load. | 2026-05-13T03:20:32Z |
| 12 | GLM-4.5 | 4339 | 453 | Python | 24 | GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models | 2026-02-01T08:28:10Z |
| 13 | Moeditor | 4114 | 271 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 14 | flash-moe | 3844 | 479 | Objective-C | 10 | Running a big model on a small laptop | 2026-03-19T17:21:57Z |
| 15 | Moe-Counter | 2896 | 288 | JavaScript | 3 | Moe counter badge with multiple themes! | 2026-04-16T03:39:37Z |
| 16 | moemail | 2529 | 2233 | TypeScript | 46 | A cute temporary email service built with the NextJS + Cloudflare technology stack 🎉 | 2026-03-30T09:35:05Z |
| 17 | MoeGoe | 2415 | 245 | Python | 28 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 18 | MoE-LLaVA | 2314 | 141 | Python | 65 | [TMM 2025 🔥] Mixture-of-Experts for Large Vision-Language Models | 2025-07-15T07:59:33Z |
| 19 | MoBA | 2117 | 146 | Python | 10 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 20 | ICEdit | 2102 | 115 | Python | 23 | [NeurIPS 2025] Image editing is worth a single LoRA! 0.1% training data for fantastic image editing! Surpasses GPT-4o in ID persistence. MoE checkpoint released! Only 4GB VRAM is enough to run! | 2025-12-19T19:08:02Z |
| 21 | DeepSeek-MoE | 1929 | 308 | Python | 17 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 22 | fastmoe | 1848 | 206 | Python | 27 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 23 | OpenMoE | 1677 | 87 | Python | 6 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 24 | paimon-moe | 1515 | 281 | JavaScript | 310 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database. Also tracks your progress with a todo list and wish counter. | 2026-04-28T10:41:45Z |
| 25 | uccl | 1358 | 145 | C++ | 53 | UCCL is an efficient communication library for GPUs, covering collectives, P2P (e.g., KV cache transfer, RL weight transfer), and EP (e.g., GPU-driven) | 2026-05-13T05:44:55Z |
| 26 | moepush | 1344 | 428 | TypeScript | 14 | A cute message push service built with the NextJS + Cloudflare technology stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 27 | MOE | 1321 | 139 | C++ | 170 | A global, black-box optimization engine for real-world metric optimization. | 2023-03-24T11:00:32Z |
| 28 | SpikingBrain-7B | 1318 | 186 | Python | 7 | Spiking brain-inspired large models, integrating hybrid efficient attention, MoE modules, and spike encoding into the architecture | 2026-05-02T15:59:25Z |
| 29 | mixture-of-experts | 1243 | 112 | Python | 6 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538) | 2024-04-19T08:22:39Z |
| 30 | Uni-MoE | 1108 | 69 | Python | 26 | Uni-MoE: Lychee's Large Multimodal Model Family. | 2025-12-22T02:32:34Z |
| 31 | Aria | 1087 | 87 | Jupyter Notebook | 32 | Codebase for Aria - an Open Multimodal Native MoE | 2025-01-22T03:25:37Z |
| 32 | MoeMemosAndroid | 1085 | 119 | Kotlin | 92 | An app to help you capture thoughts and ideas | 2026-05-01T16:26:34Z |
| 33 | SmartImage | 1018 | 55 | C# | 4 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2026-05-02T15:40:32Z |
| 34 | llama-moe | 1001 | 60 | Python | 6 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 35 | MoeTTS | 996 | 75 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, HiFi-GAN, VITS, and Diff-SVC | 2023-03-03T07:30:05Z |
| 36 | MiniMind-in-Depth | 995 | 86 | None | 6 | A source-code walkthrough of the lightweight LLM MiniMind, covering the full pipeline: tokenizer, RoPE, MoE, KV cache, pretraining, SFT, LoRA, DPO, and more | 2025-06-16T14:13:15Z |
| 37 | Tutel | 990 | 108 | C | 54 | Tutel MoE: Optimized Mixture-of-Experts Library, supporting GptOss/DeepSeek/Kimi-K2/Qwen3 using FP8/NVFP4/MXFP4 | 2026-05-11T16:43:18Z |
| 38 | Time-MoE | 959 | 113 | Python | 15 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2026-03-21T16:00:55Z |
| 39 | moebius | 935 | 52 | JavaScript | 40 | Modern ANSI & ASCII Art Editor | 2024-05-02T15:54:35Z |
| 40 | Adan | 816 | 70 | Python | 6 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2025-06-08T14:35:41Z |
| 41 | Hunyuan-A13B | 812 | 117 | Python | 24 | Tencent Hunyuan A13B (Hunyuan-A13B for short), an innovative, open-source LLM built on a fine-grained MoE architecture. | 2025-07-08T08:45:27Z |
| 42 | DeepSeek-671B-SFT-Guide | 806 | 96 | Python | 1 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as practical experience and conclusions. | 2025-03-13T03:51:33Z |
| 43 | cudnn-frontend | 801 | 160 | Python | 41 | cudnn_frontend provides a C++ wrapper for the cuDNN backend API and samples showing how to use it | 2026-05-08T20:03:09Z |
| 44 | MoeMemos | 785 | 69 | Swift | 67 | An app to help you capture thoughts and ideas | 2026-05-01T15:23:59Z |
| 45 | MixtralKit | 774 | 75 | Python | 12 | A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI | 2023-12-15T19:10:55Z |
| 46 | moe-theme.el | 770 | 66 | Emacs Lisp | 15 | A customizable, colorful, eye-candy theme for Emacs users. Moe, moe, kyun! | 2026-03-04T15:21:30Z |
| 47 | diy-llm | 755 | 82 | Jupyter Notebook | 1 | 🎓 A systematic course on building large language models; 🛠️ covers pretraining data engineering, tokenizers, Transformers, MoE, GPU programming (CUDA/Triton), distributed training, scaling laws, inference optimization, and alignment (SFT/RLHF/GRPO); 🚀 six progressive, code-driven assignments that build a full-stack understanding of LLMs | 2026-05-09T08:29:54Z |
| 48 | moe | 712 | 33 | Nim | 39 | A command-line editor inspired by Vim. Written in Nim. | 2026-05-12T18:24:42Z |
| 49 | sonic-moe | 683 | 85 | Python | 8 | Accelerating MoE with IO- and tile-aware optimizations | 2026-05-13T06:09:13Z |
| 50 | Awesome-Mixture-of-Experts-Papers | 663 | 47 | None | 3 | A curated reading list of research on Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 51 | MoePeek | 659 | 38 | Swift | 9 | A lightweight macOS selection translator built with pure Swift 6, featuring on-device Apple Translate for privacy, a 5MB install size, and stable ~50MB memory usage. | 2026-04-03T05:56:57Z |
| 52 | moedict-webkit | 648 | 99 | Objective-C | 102 | The Moedict (萌典) dictionary website | 2026-05-02T03:06:52Z |
| 53 | SwiftLM | 640 | 34 | Swift | 0 | ⚡ Native MLX Swift LLM inference server for Apple Silicon. OpenAI-compatible API, SSD streaming for 100B+ MoE models, TurboQuant KV cache compression, macOS + iOS iPhone app. | 2026-05-13T06:07:30Z |
| 54 | MoeList | 637 | 23 | Kotlin | 32 | Another unofficial Android MAL client | 2026-05-06T22:31:14Z |
| 55 | vtbs.moe | 631 | 36 | Vue | 34 | Virtual YouTubers on bilibili | 2025-07-31T13:39:09Z |
| 56 | satania.moe | 616 | 54 | HTML | 3 | Satania IS the BEST waifu, no really, she is; if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 57 | Chinese-Mixtral | 612 | 43 | Python | 0 | Chinese Mixtral Mixture-of-Experts LLMs | 2026-04-19T00:59:54Z |
| 58 | moebius | 609 | 42 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 59 | moebooru | 600 | 79 | Ruby | 30 | Moebooru, a fork of danbooru1 that has been heavily modified | 2026-05-09T14:11:01Z |
| 60 | MoeGoe_GUI | 573 | 69 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 61 | trace.moe-telegram-bot | 552 | 79 | TypeScript | 0 | This Telegram bot can tell you the anime when you send it a screenshot | 2026-05-12T00:27:32Z |
| 62 | BitSoulStockSkill | 539 | 44 | Python | 0 | An all-in-one A-share (Chinese stock market) Skill by BitSoul: bundled free historical data, 100+ mainstream industry factors, a complete backtesting framework, MoE-based stock screening and buy/sell decisions, plus fun extras such as factor mining. Installation, trial, and co-development are all welcome! | 2026-03-21T08:19:00Z |
| 63 | moerail | 524 | 41 | JavaScript | 22 | Railway station code lookup × EMU trainset circulation lookup | 2025-08-13T12:55:25Z |
| 64 | YOLO-Master | 504 | 57 | Python | 16 | [CVPR 2026] 🚀 Official code for the paper "YOLO-Master: MOE-Accelerated with Specialized Transformers for Enhanced Real-time Detection." (YOLO = You Only Look Once) | 2026-05-13T06:01:38Z |
| 65 | LPLB | 503 | 35 | Python | 1 | An early-research-stage expert-parallel load balancer for MoE models based on linear programming. | 2025-11-19T07:20:35Z |
| 66 | step_into_llm | 478 | 127 | Jupyter Notebook | 27 | MindSpore online courses: Step into LLM | 2025-12-22T11:46:46Z |
| 67 | MOE | 423 | 78 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 68 | DiT-MoE | 420 | 19 | Python | 7 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 69 | hydra-moe | 415 | 16 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 70 | MoeSR | 412 | 12 | JavaScript | 8 | An application specialized in image super-resolution for ACGN illustrations and visual novel CG. | 2026-03-09T14:07:16Z |
| 71 | MoeLoaderP | 409 | 28 | C# | 12 | 🖼 Anime-style image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, Safebooru, Danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-05-19T13:20:58Z |
| 72 | eLLM | 409 | 41 | Rust | 0 | eLLM can run LLM inference on CPUs faster than on GPUs | 2026-05-13T06:20:47Z |
| 73 | Awesome-Efficient-Arch | 407 | 33 | None | 0 | Speed Always Wins: A Survey on Efficient Architectures for Large Language Models | 2025-11-11T09:47:37Z |
| 74 | moe-sticker-bot | 404 | 59 | Go | 37 | A Telegram bot that imports LINE/Kakao stickers or creates and manages new sticker sets. | 2024-06-06T15:28:28Z |
| 75 | awesome-moe-inference | 387 | 15 | None | 0 | Curated collection of papers on MoE model inference | 2026-03-12T01:59:19Z |
| 76 | st-moe-pytorch | 384 | 33 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch | 2024-06-17T00:48:47Z |
| 77 | nmoe | 384 | 33 | Python | 2 | MoE training for me and you and maybe other people | 2026-03-15T22:23:47Z |
| 78 | WThermostatBeca | 371 | 72 | C++ | 4 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 79 | pixiv.moe | 370 | 42 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows illustrations on pixiv.net ordered by popularity. | 2023-03-08T06:54:34Z |
| 80 | FreeMoe | 363 | 8 | None | 8 | Unlock App VIP | 2026-01-30T05:50:54Z |
| 81 | Lvllm | 358 | 34 | Python | 1 | LvLLM is a NUMA extension of vLLM that makes full use of CPU and memory resources, reduces GPU memory requirements, and features an efficient GPU-parallel and NUMA-parallel architecture, supporting hybrid inference for large MoE models. | 2026-04-28T13:07:10Z |
| 82 | MOEAFramework | 356 | 129 | Java | 0 | A free and open-source Java framework for multiobjective optimization | 2026-01-21T16:26:02Z |
| 83 | notify.moe | 352 | 46 | Go | 86 | :dancer: Anime tracker, database, and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 84 | soft-moe-pytorch | 346 | 10 | Python | 4 | Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch | 2025-04-02T12:47:40Z |
| 85 | dialogue.moe | 344 | 11 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 86 | MoH | 312 | 15 | Python | 5 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 87 | MoE-Infinity | 304 | 29 | Python | 8 | PyTorch library for cost-effective, fast, and easy serving of MoE models. | 2026-05-04T18:57:20Z |
| 88 | android-app | 303 | 30 | Kotlin | 4 | Official LISTEN.moe Android app | 2026-05-13T02:44:32Z |
| 89 | moell-blog | 302 | 79 | PHP | 2 | A blog built with Laravel that supports Markdown syntax | 2022-07-31T11:51:54Z |
| 90 | Project_Chronos | 302 | 49 | Python | 0 | ⚡ Zero-stall MoE inference via lookahead prediction & async DMA prefetching. Optimized for SSD I/O with hybrid MLA + sliding-window attention. | 2026-04-26T08:21:10Z |
| 91 | kimi-agent-internals | 299 | 93 | Fluent | 0 | Extracted artifacts from the Kimi OK-Computer (and other agents) system, for AI studies in agentic architecture. | 2026-03-29T01:36:16Z |
| 92 | moeSS | 296 | 107 | PHP | 11 | moe SS front end for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 93 | apex-quant | 288 | 24 | Shell | 2 | Adaptive Precision for EXpert models: MoE-aware mixed-precision quantization | 2026-05-04T16:25:51Z |
| 94 | WAM-Diff | 284 | 51 | Python | 1 | WAM-Diff: A Masked Diffusion VLA Framework with MoE and Online Reinforcement Learning for Autonomous Driving | 2026-02-01T03:59:10Z |
| 95 | moe | 277 | 47 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 96 | parameter-efficient-moe | 277 | 17 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 97 | Cornell-MOE | 275 | 64 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 98 | MoE-Adapters4CL | 274 | 27 | Python | 7 | Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" | 2025-09-18T08:38:29Z |
| 99 | Ling-V2 | 271 | 17 | Python | 4 | Ling-V2 is a MoE LLM provided and open-sourced by InclusionAI. | 2025-10-04T06:15:38Z |
| 100 | fiddler | 265 | 35 | Python | 3 | [ICLR'25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |