# Github-Ranking-AI

A list of the most popular AI-topic repositories on GitHub, ranked by the number of stars they have received. Updated automatically every day.
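As a rough illustration of how such a ranking can be regenerated daily, the sketch below queries the GitHub Search API for a topic and prints one markdown table row per repository. This is an assumption-laden sketch, not this repository's actual update script: the `top_repos_by_topic` helper, the `topic:moe` query, and the row format are all illustrative.

```python
# Minimal sketch (assumed, not the repo's real script): rank repositories
# for a GitHub topic by star count via the public Search API.
import requests

API = "https://api.github.com/search/repositories"

def top_repos_by_topic(topic: str, limit: int = 100) -> list[dict]:
    """Return up to `limit` repositories tagged `topic`, most stars first."""
    resp = requests.get(
        API,
        params={"q": f"topic:{topic}", "sort": "stars",
                "order": "desc", "per_page": min(limit, 100)},
        headers={"Accept": "application/vnd.github+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["items"][:limit]

if __name__ == "__main__":
    # Emit one markdown table row per repo, matching the columns used below.
    for rank, r in enumerate(top_repos_by_topic("moe"), start=1):
        print(f"| {rank} | {r['name']} | {r['stargazers_count']} "
              f"| {r['forks_count']} | {r['language']} "
              f"| {r['open_issues_count']} | {r['description']} "
              f"| {r['pushed_at']} |")
```

Note that the search endpoint caps `per_page` at 100 and rate-limits unauthenticated requests; passing a personal access token in the `Authorization` header raises the limit.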


## Github Ranking

## Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | LLaMA-Factory | 59589 | 7295 | Python | 675 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-09-30T10:11:50Z |
| 2 | sglang | 18525 | 3056 | Python | 559 | SGLang is a fast serving framework for large language models and vision language models. | 2025-10-01T03:07:59Z |
| 3 | TensorRT-LLM | 11749 | 1773 | C++ | 719 | TensorRT LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT LLM also contains components to create Python and C++ runtimes that orchestrate the inference execution in a performant way. | 2025-10-01T02:12:11Z |
| 4 | ms-swift | 10165 | 893 | Python | 575 | Use PEFT or full-parameter training to CPT/SFT/DPO/GRPO 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, GLM4.5, InternLM3, DeepSeek-R1, …) and 200+ MLLMs (Qwen3-VL, Qwen3-Omni, InternVL3.5, Ovis2.5, Llava, GLM4v, Phi4, …) (AAAI 2025). | 2025-10-01T03:23:24Z |
| 5 | xtuner | 4913 | 371 | Python | 231 | A Next-Generation Training Engine Built for Ultra-Large MoE Models | 2025-09-30T08:59:48Z |
| 6 | Bangumi | 4815 | 152 | TypeScript | 25 | :electron: An unofficial UI-first app client for https://bgm.tv, built with React Native. An ad-free, hobby-driven, non-profit, Douban-style anime-tracking third-party client for bgm.tv, dedicated to ACG. Redesigned for mobile, with many enhanced features that are hard to achieve on the web version, plus extensive customization options. Currently supports iOS / Android. | 2025-09-30T20:43:46Z |
| 7 | trace.moe | 4792 | 253 | None | 0 | Anime Scene Search by Image | 2025-09-15T13:48:26Z |
| 8 | Moeditor | 4129 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 9 | fastllm | 3971 | 408 | C++ | 288 | fastllm is a high-performance LLM inference library with no backend dependencies. It supports tensor-parallel inference of dense models and mixed-mode inference of MoE models; any GPU with 10 GB+ VRAM can run the full DeepSeek model. A dual-socket 9004/9005 server plus a single GPU can serve the original full-precision DeepSeek model at 20 tps per stream; the INT4-quantized model reaches 30 tps single-stream and 60+ tps under concurrency. | 2025-09-26T11:52:11Z |
| 10 | flashinfer | 3825 | 524 | Cuda | 196 | FlashInfer: Kernel Library for LLM Serving | 2025-10-01T01:51:01Z |
| 11 | MoeKoeMusic | 3787 | 237 | Vue | 17 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux / Web :electron: | 2025-09-11T07:49:50Z |
| 12 | GLM-4.5 | 2771 | 278 | Python | 19 | GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models | 2025-09-30T07:56:41Z |
| 13 | Moe-Counter | 2541 | 263 | JavaScript | 4 | Moe counter badge with multiple themes! | 2025-08-12T08:16:18Z |
| 14 | MoeGoe | 2398 | 245 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 15 | MoE-LLaVA | 2247 | 141 | Python | 64 | [TMM 2025] Mixture-of-Experts for Large Vision-Language Models | 2025-07-15T07:59:33Z |
| 16 | ICEdit | 1963 | 110 | Python | 23 | [NeurIPS 2025] Image editing is worth a single LoRA! 0.1% training data for fantastic image editing! Surpasses GPT-4o in ID persistence. MoE ckpt released! Only 4GB VRAM is enough to run! | 2025-09-19T15:58:27Z |
| 17 | MoBA | 1910 | 114 | Python | 9 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 18 | DeepSeek-MoE | 1804 | 292 | Python | 17 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 19 | fastmoe | 1795 | 196 | Python | 27 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 20 | OpenMoE | 1605 | 82 | Python | 6 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 21 | moemail | 1594 | 941 | TypeScript | 29 | A cute temporary-email service built on the NextJS + Cloudflare stack 🎉 | 2025-09-17T14:42:06Z |
| 22 | paimon-moe | 1473 | 275 | JavaScript | 297 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database. Also tracks your progress with a todo list and wish counter. | 2025-09-30T21:59:02Z |
| 23 | Cortex | 1426 | 110 | Python | 1 | Building a MoE LLM from scratch: a complete walkthrough from pre-training to DPO | 2025-09-28T04:56:55Z |
| 24 | MOE | 1319 | 138 | C++ | 170 | A global, black box optimization engine for real world metric optimization. | 2023-03-24T11:00:32Z |
| 25 | mixture-of-experts | 1181 | 110 | Python | 5 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538) | 2024-04-19T08:22:39Z |
| 26 | moepush | 1168 | 301 | TypeScript | 11 | A cute message-push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 27 | Aria | 1072 | 86 | Jupyter Notebook | 31 | Codebase for Aria - an Open Multimodal Native MoE | 2025-01-22T03:25:37Z |
| 28 | llama-moe | 991 | 61 | Python | 6 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 29 | MoeTTS | 990 | 77 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 30 | Tutel | 926 | 104 | C | 53 | Tutel MoE: Optimized Mixture-of-Experts Library, supporting GptOss/DeepSeek/Kimi-K2/Qwen3 using FP8/NVFP4/MXFP4 | 2025-09-15T17:08:08Z |
| 31 | moebius | 855 | 49 | JavaScript | 40 | Modern ANSI & ASCII Art Editor | 2024-05-02T15:54:35Z |
| 32 | MoeMemosAndroid | 854 | 91 | Kotlin | 81 | An app to help you capture thoughts and ideas | 2025-08-26T17:31:03Z |
| 33 | Adan | 798 | 69 | Python | 3 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2025-06-08T14:35:41Z |
| 34 | Time-MoE | 779 | 79 | Python | 11 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2025-06-17T02:47:16Z |
| 35 | MixtralKit | 770 | 77 | Python | 12 | A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI | 2023-12-15T19:10:55Z |
| 36 | UMOE-Scaling-Unified-Multimodal-LLMs | 768 | 47 | Python | 13 | The repository of the Uni-MoE model series | 2025-09-18T12:35:40Z |
| 37 | DeepSeek-671B-SFT-Guide | 764 | 94 | Python | 2 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as practical experience and conclusions. | 2025-03-13T03:51:33Z |
| 38 | moe-theme.el | 761 | 64 | Emacs Lisp | 15 | A customizable colorful eye-candy theme for Emacsers. Moe, moe, kyun! | 2025-05-27T06:12:05Z |
| 39 | Hunyuan-A13B | 754 | 94 | Python | 13 | Tencent Hunyuan A13B (Hunyuan-A13B for short), an innovative and open-source LLM built on a fine-grained MoE architecture. | 2025-07-08T08:45:27Z |
| 40 | moe | 690 | 35 | Nim | 83 | A command-line editor inspired by Vim. Written in Nim. | 2025-09-30T23:34:47Z |
| 41 | Awesome-Mixture-of-Experts-Papers | 644 | 45 | None | 1 | A curated reading list of research in Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 42 | SmartImage | 641 | 30 | C# | 7 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-09-24T08:36:11Z |
| 43 | moedict-webkit | 632 | 101 | Objective-C | 102 | The Moedict (萌典) dictionary website | 2025-09-18T03:03:16Z |
| 44 | vtbs.moe | 628 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2025-07-31T13:39:09Z |
| 45 | satania.moe | 613 | 56 | HTML | 3 | Satania IS the BEST waifu, no really, she is, if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 46 | MoeMemos | 610 | 56 | Swift | 67 | An app to help you capture thoughts and ideas | 2025-08-27T15:54:28Z |
| 47 | moebius | 610 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 48 | Chinese-Mixtral | 609 | 44 | Python | 0 | Chinese Mixtral Mixture-of-Experts LLMs | 2024-04-30T04:29:06Z |
| 49 | MoeList | 599 | 21 | Kotlin | 29 | Another unofficial Android MAL client | 2025-09-28T08:58:08Z |
| 50 | MoeGoe_GUI | 569 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 51 | moebooru | 569 | 81 | Ruby | 25 | Moebooru, a fork of danbooru1 that has been heavily modified | 2025-08-14T04:32:09Z |
| 52 | trace.moe-telegram-bot | 541 | 77 | JavaScript | 0 | This Telegram bot can identify the anime when you send it a screenshot | 2025-09-25T03:35:19Z |
| 53 | moerail | 485 | 38 | JavaScript | 15 | Railway station code lookup × EMU trainset circulation lookup | 2025-08-13T12:55:25Z |
| 54 | step_into_llm | 476 | 122 | Jupyter Notebook | 36 | MindSpore online courses: Step into LLM | 2025-08-21T02:57:09Z |
| 55 | MOE | 423 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 56 | hydra-moe | 416 | 16 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 57 | DiT-MoE | 387 | 19 | Python | 6 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 58 | MoeLoaderP | 380 | 26 | C# | 11 | 🖼 Anime image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-05-19T13:20:58Z |
| 59 | pixiv.moe | 366 | 41 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows illustrations from pixiv.net ordered by popularity. | 2023-03-08T06:54:34Z |
| 60 | st-moe-pytorch | 363 | 32 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in Pytorch | 2024-06-17T00:48:47Z |
| 61 | WThermostatBeca | 361 | 71 | C++ | 3 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 62 | moe-sticker-bot | 360 | 37 | Go | 30 | A Telegram bot that imports LINE/kakao stickers or creates/manages new sticker sets. | 2024-06-06T15:28:28Z |
| 63 | notify.moe | 352 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 64 | MOEAFramework | 343 | 128 | Java | 0 | A Free and Open Source Java Framework for Multiobjective Optimization | 2025-07-30T12:40:46Z |
| 65 | Awesome-Efficient-Arch | 337 | 30 | None | 0 | Speed Always Wins: A Survey on Efficient Architectures for Large Language Models | 2025-08-29T09:05:29Z |
| 66 | dialogue.moe | 336 | 9 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 67 | MoeSR | 334 | 10 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and Visual Novel CG. | 2025-08-06T14:15:38Z |
| 68 | soft-moe-pytorch | 325 | 9 | Python | 4 | Implementation of Soft MoE, proposed by Brain's Vision team, in Pytorch | 2025-04-02T12:47:40Z |
| 69 | moell-blog | 302 | 80 | PHP | 2 | A blog built with Laravel that supports Markdown syntax | 2022-07-31T11:51:54Z |
| 70 | moeSS | 298 | 107 | PHP | 11 | moe SS Front End for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 71 | moe | 279 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 72 | android-app | 275 | 25 | Kotlin | 5 | Official LISTEN.moe Android app | 2025-09-28T01:59:00Z |
| 73 | MoH | 275 | 14 | Python | 4 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 74 | awesome-moe-inference | 272 | 11 | None | 0 | Curated collection of papers on MoE model inference | 2025-09-15T12:26:47Z |
| 75 | Cornell-MOE | 270 | 63 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 76 | parameter-efficient-moe | 269 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 77 | GRIN-MoE | 264 | 13 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 78 | MoeQuest | 251 | 76 | Java | 1 | A Material Design-style app for browsing meizi (girl) pictures. | 2017-02-14T14:13:53Z |
| 79 | MoeLoader-Delta | 249 | 37 | C# | 52 | Improved branching version of MoeLoader | 2021-07-22T20:47:41Z |
| 80 | inferflow | 248 | 25 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 81 | MoE-Adapters4CL | 245 | 19 | Python | 8 | Code for the paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" (CVPR 2024) | 2025-09-18T08:38:29Z |
| 82 | MoE-Infinity | 243 | 18 | Python | 10 | PyTorch library for cost-effective, fast and easy serving of MoE models. | 2025-07-07T12:54:02Z |
| 83 | moeins | 241 | 70 | PHP | 2 | Moeins (萌音影视): an online video-streaming app | 2018-10-31T01:47:27Z |
| 84 | MoE-plus-plus | 239 | 13 | Python | 1 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 85 | fiddler | 234 | 24 | Python | 2 | [ICLR'25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 86 | gdx-pay | 233 | 88 | Java | 8 | A libGDX cross-platform API for in-app purchasing. | 2025-07-26T21:09:56Z |
| 87 | moebius | 232 | 4 | PHP | 4 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 88 | MiniMind-in-Depth | 232 | 28 | None | 2 | A source-code walkthrough of the lightweight LLM MiniMind, covering the full pipeline: tokenizer, RoPE, MoE, KV Cache, pretraining, SFT, LoRA, DPO, and more | 2025-06-16T14:13:15Z |
| 89 | ModuleFormer | 224 | 12 | Python | 2 | ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters. | 2025-09-18T00:30:52Z |
| 90 | CoE | 220 | 27 | Python | 3 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-09-14T21:58:07Z |
| 91 | Ling | 203 | 17 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-05-14T06:34:57Z |
| 92 | moe | 201 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 93 | LLaVA-MoD | 200 | 15 | Python | 3 | [ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation | 2025-03-31T09:41:38Z |
| 94 | MoePhoto | 191 | 22 | Python | 5 | MoePhoto Image Toolbox | 2024-09-23T06:35:27Z |
| 95 | Yuan2.0-M32 | 189 | 38 | Python | 6 | Mixture-of-Experts (MoE) Language Model | 2024-09-09T09:14:15Z |
| 96 | MixLoRA | 188 | 18 | Python | 3 | State-of-the-art Parameter-Efficient MoE Fine-tuning Method | 2024-08-22T08:02:04Z |
| 97 | MOELoRA-peft | 182 | 20 | Python | 7 | [SIGIR'24] The official implementation code of MOELoRA. | 2024-07-22T07:32:43Z |
| 98 | transformers-qwen3-moe-fused | 179 | 6 | Python | 5 | Fused Qwen3 MoE layer for faster training, compatible with HF Transformers, LoRA, 4-bit quant, Unsloth | 2025-09-23T04:31:30Z |
| 99 | SMoE-Stereo | 175 | 12 | Python | 2 | [ICCV 2025 Highlight] Learning Robust Stereo Matching in the Wild with Selective Mixture-of-Experts | 2025-07-24T04:31:01Z |
| 100 | MoeRanker | 172 | 14 | JavaScript | 9 | Moe attribute sorter | 2025-06-30T03:16:02Z |