Github-Ranking-AI

A list of the most popular AI-topic repositories on GitHub, ranked by the number of stars they have received. Updated automatically every day.


GitHub Ranking

Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | vllm | 72274 | 14038 | Python | 1707 | A high-throughput and memory-efficient inference and serving engine for LLMs | 2026-03-07T04:27:05Z |
| 2 | LlamaFactory | 67984 | 8293 | Python | 904 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2026-03-06T10:44:57Z |
| 3 | sglang | 24185 | 4687 | Python | 588 | SGLang is a high-performance serving framework for large language models and multimodal models. | 2026-03-07T04:34:47Z |
| 4 | TensorRT-LLM | 13023 | 2152 | Python | 536 | TensorRT LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT LLM also contains components to create Python and C++ runtimes that orchestrate the inference execution in a performant way. | 2026-03-07T04:27:23Z |
| 5 | ms-swift | 12951 | 1235 | Python | 843 | Use PEFT or full-parameter training to CPT/SFT/DPO/GRPO 600+ LLMs (Qwen3.5, DeepSeek-R1, GLM-5, InternLM3, Llama4, …) and 300+ MLLMs (Qwen3-VL, Qwen3-Omni, InternVL3.5, Ovis2.5, GLM4.5v, Llava, Phi4, …) (AAAI 2025). | 2026-03-06T10:55:18Z |
| 6 | Bangumi | 5278 | 158 | TypeScript | 32 | An unofficial, UI-first https://bgm.tv app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit, Douban-style anime-tracking client for ACG, redesigned for mobile with many enhanced features that are hard to achieve on the web version and extensive customization options. Currently available for iOS / Android. | 2026-03-03T17:18:16Z |
| 7 | MoeKoeMusic | 5152 | 340 | Vue | 2 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux / Web | 2026-03-05T09:13:59Z |
| 8 | flashinfer | 5093 | 762 | Python | 333 | FlashInfer: Kernel Library for LLM Serving | 2026-03-07T04:03:46Z |
| 9 | xtuner | 5092 | 404 | Python | 234 | A next-generation training engine built for ultra-large MoE models | 2026-03-06T13:29:54Z |
| 10 | trace.moe | 4951 | 260 | None | 0 | Trace back an anime scene with a screenshot | 2026-03-01T12:18:31Z |
| 11 | GLM-4.5 | 4247 | 439 | Python | 26 | GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models | 2026-02-01T08:28:10Z |
| 12 | fastllm | 4170 | 414 | C++ | 319 | fastllm is a high-performance LLM inference library with no backend dependencies. It supports both tensor-parallel inference of dense models and hybrid-mode inference of MoE models; any GPU with 10 GB+ VRAM can serve the full DeepSeek model. On a dual-socket 9004/9005 server with a single GPU, the full-precision model reaches 20 tps at single concurrency; the INT4-quantized model reaches 30 tps at single concurrency and 60+ tps under higher concurrency. | 2026-03-05T12:04:18Z |
| 13 | Moeditor | 4120 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 14 | Moe-Counter | 2801 | 284 | JavaScript | 3 | Moe counter badge with multiple themes! | 2026-02-05T07:16:09Z |
| 15 | MoeGoe | 2419 | 246 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 16 | MoE-LLaVA | 2303 | 142 | Python | 65 | [TMM 2025] Mixture-of-Experts for Large Vision-Language Models | 2025-07-15T07:59:33Z |
| 17 | ICEdit | 2080 | 113 | Python | 22 | [NeurIPS 2025] Image editing is worth a single LoRA! 0.1% training data for fantastic image editing! Surpasses GPT-4o in ID persistence. MoE checkpoint released; only 4 GB VRAM is enough to run! | 2025-12-19T19:08:02Z |
| 18 | MoBA | 2075 | 136 | Python | 10 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 19 | moemail | 2067 | 1495 | TypeScript | 39 | A cute temporary email service built with the NextJS + Cloudflare technology stack 🎉 | 2026-02-13T06:37:37Z |
| 20 | DeepSeek-MoE | 1895 | 307 | Python | 17 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 21 | fastmoe | 1845 | 202 | Python | 27 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 22 | OpenMoE | 1664 | 85 | Python | 6 | A family of open-source Mixture-of-Experts (MoE) large language models | 2024-03-08T15:08:26Z |
| 23 | paimon-moe | 1508 | 278 | JavaScript | 308 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter. | 2026-02-25T04:22:48Z |
| 24 | MOE | 1319 | 140 | C++ | 170 | A global, black-box optimization engine for real-world metric optimization. | 2023-03-24T11:00:32Z |
| 25 | moepush | 1316 | 414 | TypeScript | 13 | A cute message push service built with the NextJS + Cloudflare technology stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 26 | SpikingBrain-7B | 1266 | 171 | Python | 8 | Spiking brain-inspired large models, integrating hybrid efficient attention, MoE modules, and spike encoding into the architecture | 2025-12-01T11:13:32Z |
| 27 | mixture-of-experts | 1232 | 110 | Python | 6 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538) | 2024-04-19T08:22:39Z |
| 28 | uccl | 1232 | 127 | C++ | 50 | UCCL is an efficient communication library for GPUs, covering collectives, P2P (e.g., KV cache transfer, RL weight transfer), and EP (e.g., GPU-driven) | 2026-03-07T01:08:59Z |
| 29 | Aria | 1084 | 87 | Jupyter Notebook | 31 | Codebase for Aria, an open multimodal-native MoE | 2025-01-22T03:25:37Z |
| 30 | Uni-MoE | 1081 | 65 | Python | 26 | Uni-MoE: Lychee's large multimodal model family. | 2025-12-22T02:32:34Z |
| 31 | MoeMemosAndroid | 1023 | 112 | Kotlin | 83 | An app to help you capture thoughts and ideas | 2026-02-24T13:19:13Z |
| 32 | llama-moe | 1002 | 62 | Python | 6 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 33 | MoeTTS | 996 | 75 | None | 0 | Speech-synthesis model / inference GUI repo for galgame characters, based on Tacotron2, HiFi-GAN, VITS, and Diff-SVC | 2023-03-03T07:30:05Z |
| 34 | SmartImage | 978 | 51 | C# | 4 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2026-03-01T02:13:12Z |
| 35 | Tutel | 972 | 107 | C | 54 | Tutel MoE: an optimized Mixture-of-Experts library supporting GptOss/DeepSeek/Kimi-K2/Qwen3 using FP8/NVFP4/MXFP4 | 2026-03-06T15:12:49Z |
| 36 | Time-MoE | 918 | 106 | Python | 13 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2025-12-12T08:11:45Z |
| 37 | moebius | 910 | 49 | JavaScript | 40 | Modern ANSI & ASCII art editor | 2024-05-02T15:54:35Z |
| 38 | Hunyuan-A13B | 812 | 119 | Python | 24 | Tencent Hunyuan-A13B, an innovative and open-source LLM built on a fine-grained MoE architecture. | 2025-07-08T08:45:27Z |
| 39 | Adan | 808 | 71 | Python | 5 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2025-06-08T14:35:41Z |
| 40 | DeepSeek-671B-SFT-Guide | 797 | 96 | Python | 1 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as practical experiences and conclusions. | 2025-03-13T03:51:33Z |
| 41 | MixtralKit | 771 | 75 | Python | 12 | A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI | 2023-12-15T19:10:55Z |
| 42 | moe-theme.el | 768 | 66 | Emacs Lisp | 15 | A customizable, colorful, eye-candy theme for Emacs users. Moe, moe, kyun! | 2026-03-04T15:21:30Z |
| 43 | MiniMind-in-Depth | 766 | 66 | None | 5 | Source-code walkthrough of the lightweight LLM MiniMind, covering the full pipeline: tokenizer, RoPE, MoE, KV cache, pretraining, SFT, LoRA, DPO, and more | 2025-06-16T14:13:15Z |
| 44 | MoeMemos | 731 | 63 | Swift | 50 | An app to help you capture thoughts and ideas | 2026-02-27T06:44:39Z |
| 45 | moe | 701 | 33 | Nim | 59 | A command-line editor inspired by Vim. Written in Nim. | 2026-03-06T04:01:46Z |
| 46 | Awesome-Mixture-of-Experts-Papers | 661 | 45 | None | 1 | A curated reading list of research on Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 47 | moedict-webkit | 644 | 99 | Objective-C | 103 | The Moedict (萌典) website | 2026-03-07T00:49:31Z |
| 48 | vtbs.moe | 632 | 36 | Vue | 33 | Virtual YouTubers on bilibili | 2025-07-31T13:39:09Z |
| 49 | MoeList | 628 | 21 | Kotlin | 30 | Another unofficial Android MAL client | 2026-03-04T13:59:50Z |
| 50 | satania.moe | 612 | 55 | HTML | 3 | Satania IS the BEST waifu, no really, she is; if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 51 | moebius | 610 | 42 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 52 | Chinese-Mixtral | 610 | 42 | Python | 0 | Chinese Mixtral MoE LLMs | 2024-04-30T04:29:06Z |
| 53 | sonic-moe | 601 | 59 | Python | 10 | Accelerating MoE with IO- and tile-aware optimizations | 2026-02-27T05:42:53Z |
| 54 | moebooru | 592 | 81 | Ruby | 29 | Moebooru, a heavily modified fork of danbooru1 | 2026-02-28T12:48:55Z |
| 55 | MoeGoe_GUI | 572 | 69 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 56 | trace.moe-telegram-bot | 550 | 78 | TypeScript | 0 | This Telegram bot can identify an anime when you send it a screenshot | 2026-03-07T02:02:14Z |
| 57 | moerail | 520 | 42 | JavaScript | 21 | Chinese railway station code lookup × EMU trainset circulation lookup | 2025-08-13T12:55:25Z |
| 58 | LPLB | 499 | 32 | Python | 1 | An early-research-stage expert-parallel load balancer for MoE models based on linear programming. | 2025-11-19T07:20:35Z |
| 59 | step_into_llm | 483 | 127 | Jupyter Notebook | 27 | MindSpore online courses: Step into LLM | 2025-12-22T11:46:46Z |
| 60 | MOE | 423 | 78 | Java | 18 | Make Opensource Easy: tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 61 | hydra-moe | 415 | 16 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 62 | DiT-MoE | 415 | 19 | Python | 7 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 63 | Awesome-Efficient-Arch | 400 | 33 | None | 0 | Speed Always Wins: A Survey on Efficient Architectures for Large Language Models | 2025-11-11T09:47:37Z |
| 64 | MoeLoaderP | 399 | 28 | C# | 11 | 🖼 Anime picture downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, Safebooru, Danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-05-19T13:20:58Z |
| 65 | YOLO-Master | 398 | 43 | Python | 16 | [CVPR 2026] Official code for the paper "YOLO-Master: MOE-Accelerated with Specialized Transformers for Enhanced Real-time Detection" (YOLO = You Only Look Once) | 2026-02-26T03:10:05Z |
| 66 | MoeSR | 386 | 9 | JavaScript | 8 | An application specialized in image super-resolution for ACGN illustrations and visual novel CG. | 2026-02-25T15:05:54Z |
| 67 | moe-sticker-bot | 379 | 37 | Go | 33 | A Telegram bot that imports LINE/Kakao stickers and creates/manages new sticker sets. | 2024-06-06T15:28:28Z |
| 68 | WThermostatBeca | 371 | 73 | C++ | 4 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 69 | pixiv.moe | 370 | 42 | TypeScript | 0 | 😘 A Pinterest-style layout site showing pixiv.net illustrations ordered by popularity. | 2023-03-08T06:54:34Z |
| 70 | nmoe | 366 | 31 | Python | 2 | MoE training for Me and You and maybe other people | 2026-02-07T06:24:47Z |
| 71 | MoePeek | 365 | 22 | Swift | 5 | A lightweight macOS selection translator built with pure Swift 6, featuring on-device Apple Translate for privacy, only a 5 MB install size, and stable ~50 MB memory usage. | 2026-03-02T16:02:25Z |
| 72 | MOEAFramework | 353 | 128 | Java | 0 | A free and open-source Java framework for multiobjective optimization | 2026-01-21T16:26:02Z |
| 73 | notify.moe | 350 | 46 | Go | 86 | Anime tracker, database, and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 74 | awesome-moe-inference | 346 | 14 | None | 0 | Curated collection of papers on MoE model inference | 2025-10-20T01:30:05Z |
| 75 | dialogue.moe | 342 | 11 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 76 | MoH | 304 | 15 | Python | 5 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 77 | moell-blog | 302 | 79 | PHP | 2 | A blog built on Laravel with Markdown support | 2022-07-31T11:51:54Z |
| 78 | moeSS | 298 | 107 | PHP | 11 | moe SS front end for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 79 | MoE-Infinity | 285 | 24 | Python | 12 | PyTorch library for cost-effective, fast, and easy serving of MoE models. | 2026-03-03T10:41:44Z |
| 80 | android-app | 282 | 27 | Kotlin | 4 | Official LISTEN.moe Android app | 2026-03-01T15:54:51Z |
| 81 | moe | 277 | 47 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 82 | Cornell-MOE | 275 | 65 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 83 | parameter-efficient-moe | 274 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 84 | MoE-Adapters4CL | 269 | 27 | Python | 7 | Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" | 2025-09-18T08:38:29Z |
| 85 | GRIN-MoE | 264 | 14 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 86 | MoE-plus-plus | 264 | 13 | Python | 1 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 87 | fiddler | 262 | 32 | Python | 2 | [ICLR '25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 88 | Ling-V2 | 262 | 18 | Python | 4 | Ling-V2 is a MoE LLM provided and open-sourced by InclusionAI. | 2025-10-04T06:15:38Z |
| 89 | kimi-agent-internals | 252 | 70 | Fluent | 0 | Extracted artifacts from the Kimi OK-Computer (and other agents) system, for AI studies in agentic architecture. | 2026-02-06T23:07:30Z |
| 90 | inferflow | 251 | 23 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 91 | MoeQuest | 250 | 76 | Java | 1 | A Material Design-style app for browsing "meizi" (girl) picture collections. | 2017-02-14T14:13:53Z |
| 92 | FreeMoe | 250 | 6 | None | 7 | Unlock app VIP features | 2026-01-30T05:50:54Z |
| 93 | MoeLoader-Delta | 247 | 36 | C# | 52 | An improved fork of MoeLoader | 2021-07-22T20:47:41Z |
| 94 | Lvllm | 247 | 22 | Python | 6 | LvLLM is a NUMA extension of vllm that makes full use of CPU and memory resources, reduces GPU memory requirements, and features an efficient GPU-parallel and NUMA-parallel architecture, supporting hybrid inference for large MoE models. | 2026-03-06T14:09:57Z |
| 95 | moeins | 246 | 70 | PHP | 2 | Moeins: an online video streaming app | 2018-10-31T01:47:27Z |
| 96 | Ling | 240 | 20 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-05-14T06:34:57Z |
| 97 | transformers-qwen3-moe-fused | 240 | 12 | Python | 3 | Fused Qwen3 MoE layer for faster training, compatible with Transformers, LoRA, bnb 4-bit quantization, and Unsloth. Also possible to train LoRA over GGUF. | 2026-02-19T06:15:21Z |
| 98 | gdx-pay | 238 | 88 | Java | 7 | A libGDX cross-platform API for in-app purchasing. | 2025-12-29T06:13:40Z |
| 99 | ComfyUI-WanMoeKSampler | 232 | 18 | Python | 16 | Modification of the KSampler for running models like Wan2.2 A14B | 2025-10-22T21:09:42Z |
| 100 | moebius | 231 | 4 | PHP | 0 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
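A star ranking like the one above can be fetched from GitHub's public search API (`GET /search/repositories` with `sort=stars`). A minimal sketch, assuming that is how the daily update is produced; the repository's actual update script is not shown here, and `build_search_url` is a hypothetical helper:

```python
import urllib.parse


def build_search_url(topic: str, per_page: int = 100) -> str:
    """Build a GitHub search API URL returning the most-starred repos for a topic."""
    params = urllib.parse.urlencode({
        "q": f"topic:{topic}",  # restrict to repositories tagged with the topic
        "sort": "stars",        # rank by star count
        "order": "desc",        # highest first
        "per_page": per_page,   # GitHub caps this at 100 items per request
    })
    return f"https://api.github.com/search/repositories?{params}"


# The query behind a "Top 100 Stars in MoE"-style table:
print(build_search_url("moe"))
```

Each item in the JSON response carries the fields shown in the table columns: `stargazers_count`, `forks_count`, `language`, `open_issues_count`, `description`, and `pushed_at` (last commit).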