# Github-Ranking-AI

A list of the most popular AI-topic repositories on GitHub, ranked by the number of stars they have received and updated automatically every day.
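
This page does not include the update script itself, but as a rough illustration of how a per-topic star ranking like the table below can be assembled, here is a minimal sketch against the GitHub Search REST API. The `top_repos_by_topic` helper is hypothetical, not part of this project, and unauthenticated requests are heavily rate-limited.

```python
# Minimal sketch (not this repo's actual update script) of fetching a
# "Top 100 stars" list for a GitHub topic via the Search API.
import requests


def top_repos_by_topic(topic: str, limit: int = 100) -> list[dict]:
    """Return up to `limit` of the most-starred repos tagged with `topic`."""
    resp = requests.get(
        "https://api.github.com/search/repositories",
        params={
            "q": f"topic:{topic}",  # search by repository topic
            "sort": "stars",        # rank by star count
            "order": "desc",
            "per_page": limit,      # the API caps per_page at 100
        },
        headers={"Accept": "application/vnd.github+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["items"]


if __name__ == "__main__":
    # Print the same columns the table below uses.
    for rank, repo in enumerate(top_repos_by_topic("moe"), start=1):
        print(rank, repo["name"], repo["stargazers_count"],
              repo["forks_count"], repo["language"],
              repo["open_issues_count"], repo["description"],
              repo["pushed_at"])
```

Since the Search API returns at most 100 results per page when sorted by stars, a single request is enough to fill a top-100 table like the one below.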

## Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | vllm | 60655 | 10700 | Python | 1848 | A high-throughput and memory-efficient inference and serving engine for LLMs | 2025-10-22T03:30:03Z |
| 2 | LLaMA-Factory | 60576 | 7335 | Python | 719 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-10-21T10:26:12Z |
| 3 | sglang | 19092 | 3126 | Python | 528 | SGLang is a fast serving framework for large language models and vision language models. | 2025-10-22T03:32:52Z |
| 4 | TensorRT-LLM | 11919 | 1811 | C++ | 735 | TensorRT LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT LLM also contains components to create Python and C++ runtimes that orchestrate the inference execution in a performant way. | 2025-10-22T03:48:19Z |
| 5 | ms-swift | 10558 | 911 | Python | 601 | Use PEFT or Full-parameter to CPT/SFT/DPO/GRPO 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, GLM4.5, InternLM3, DeepSeek-R1, …) and 200+ MLLMs (Qwen3-VL, Qwen3-Omni, InternVL3.5, Ovis2.5, Llava, GLM4v, Phi4, …) (AAAI 2025). | 2025-10-22T02:59:50Z |
| 6 | xtuner | 4943 | 377 | Python | 227 | A Next-Generation Training Engine Built for Ultra-Large MoE Models | 2025-10-21T11:12:06Z |
| 7 | Bangumi | 4907 | 154 | TypeScript | 24 | :electron: An unofficial UI-first app client for https://bgm.tv, built with React Native for Android and iOS. An ad-free, hobby-driven, non-profit, Douban-like anime-tracking third-party client dedicated to ACG, redesigned for mobile with many enhanced features that are hard to implement on the web version, plus extensive customization options. Currently supports iOS / Android. | 2025-10-21T14:43:10Z |
| 8 | trace.moe | 4812 | 255 | None | 0 | Anime Scene Search by Image | 2025-10-10T13:31:43Z |
| 9 | Moeditor | 4131 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 10 | fastllm | 4030 | 412 | C++ | 294 | fastllm is a high-performance LLM inference library with no backend dependencies. It supports both tensor-parallel inference for dense models and mixed-mode inference for MoE models; any GPU with 10 GB+ VRAM can run the full DeepSeek model. A dual-socket 9004/9005 server plus a single GPU can serve the original full-precision DeepSeek model at 20 tps single-concurrency; the INT4-quantized model reaches 30 tps single-concurrency and 60+ tps under multiple concurrent requests. | 2025-09-26T11:52:11Z |
| 11 | flashinfer | 3941 | 537 | Cuda | 212 | FlashInfer: Kernel Library for LLM Serving | 2025-10-22T03:48:39Z |
| 12 | MoeKoeMusic | 3918 | 247 | Vue | 12 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux / Web :electron: | 2025-10-17T08:08:15Z |
| 13 | GLM-4.5 | 3051 | 307 | Python | 20 | GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models | 2025-10-11T03:42:50Z |
| 14 | Moe-Counter | 2570 | 264 | JavaScript | 4 | Moe counter badge with multiple themes! | 2025-08-12T08:16:18Z |
| 15 | MoeGoe | 2397 | 245 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 16 | MoE-LLaVA | 2254 | 140 | Python | 64 | [TMM 2025 🔥] Mixture-of-Experts for Large Vision-Language Models | 2025-07-15T07:59:33Z |
| 17 | ICEdit | 1995 | 110 | Python | 24 | [NeurIPS 2025] Image editing is worth a single LoRA! 0.1% training data for fantastic image editing! Surpasses GPT-4o in ID persistence. MoE checkpoint released! Only 4 GB VRAM is enough to run! | 2025-10-13T01:59:51Z |
| 18 | MoBA | 1940 | 116 | Python | 9 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 19 | DeepSeek-MoE | 1810 | 293 | Python | 17 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 20 | fastmoe | 1807 | 196 | Python | 27 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 21 | Cortex | 1646 | 130 | Python | 0 | A personal build of a MoE LLM: the complete pipeline from pretraining to DPO | 2025-10-21T11:40:47Z |
| 22 | moemail | 1628 | 979 | TypeScript | 32 | A cute temporary-email service built on the NextJS + Cloudflare stack 🎉 | 2025-10-21T17:04:28Z |
| 23 | OpenMoE | 1614 | 83 | Python | 6 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 24 | paimon-moe | 1478 | 272 | JavaScript | 299 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter. | 2025-09-30T21:59:02Z |
| 25 | MOE | 1319 | 138 | C++ | 170 | A global, black-box optimization engine for real-world metric optimization. | 2023-03-24T11:00:32Z |
| 26 | mixture-of-experts | 1190 | 110 | Python | 5 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538) | 2024-04-19T08:22:39Z |
| 27 | moepush | 1184 | 310 | TypeScript | 11 | A cute message-push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 28 | Aria | 1072 | 86 | Jupyter Notebook | 31 | Codebase for Aria - an Open Multimodal Native MoE | 2025-01-22T03:25:37Z |
| 29 | llama-moe | 994 | 61 | Python | 6 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 30 | MoeTTS | 992 | 77 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 31 | Tutel | 932 | 104 | C | 53 | Tutel MoE: Optimized Mixture-of-Experts Library, supporting GptOss/DeepSeek/Kimi-K2/Qwen3 using FP8/NVFP4/MXFP4 | 2025-10-06T07:54:59Z |
| 32 | MoeMemosAndroid | 872 | 93 | Kotlin | 84 | An app to help you capture thoughts and ideas | 2025-08-26T17:31:03Z |
| 33 | moebius | 864 | 48 | JavaScript | 40 | Modern ANSI & ASCII Art Editor | 2024-05-02T15:54:35Z |
| 34 | Hunyuan-A13B | 801 | 114 | Python | 22 | Tencent Hunyuan A13B (Hunyuan-A13B for short), an innovative, open-source LLM built on a fine-grained MoE architecture. | 2025-07-08T08:45:27Z |
| 35 | Adan | 801 | 69 | Python | 4 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2025-06-08T14:35:41Z |
| 36 | Time-MoE | 799 | 81 | Python | 11 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2025-06-17T02:47:16Z |
| 37 | Uni-MoE | 790 | 48 | Python | 15 | The repository of the Uni-MoE model series | 2025-10-21T07:20:26Z |
| 38 | MixtralKit | 770 | 77 | Python | 12 | A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI | 2023-12-15T19:10:55Z |
| 39 | DeepSeek-671B-SFT-Guide | 766 | 94 | Python | 2 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as practical experience and conclusions. | 2025-03-13T03:51:33Z |
| 40 | moe-theme.el | 761 | 64 | Emacs Lisp | 15 | A customizable, colorful, eye-candy theme for Emacsers. Moe, moe, kyun! | 2025-05-27T06:12:05Z |
| 41 | moe | 688 | 35 | Nim | 83 | A command-line editor inspired by Vim, written in Nim. | 2025-10-22T00:03:28Z |
| 42 | Awesome-Mixture-of-Experts-Papers | 648 | 45 | None | 1 | A curated reading list of research in Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 43 | SmartImage | 645 | 30 | C# | 7 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-10-09T02:46:42Z |
| 44 | moedict-webkit | 634 | 100 | Objective-C | 102 | Website of the Moedict (萌典) Chinese dictionary | 2025-10-07T06:22:07Z |
| 45 | vtbs.moe | 627 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2025-07-31T13:39:09Z |
| 46 | MoeMemos | 619 | 56 | Swift | 69 | An app to help you capture thoughts and ideas | 2025-08-27T15:54:28Z |
| 47 | satania.moe | 610 | 56 | HTML | 3 | Satania IS the BEST waifu, no really, she is; if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 48 | Chinese-Mixtral | 610 | 44 | Python | 0 | Chinese Mixtral MoE LLMs | 2024-04-30T04:29:06Z |
| 49 | moebius | 609 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 50 | MoeList | 601 | 20 | Kotlin | 29 | Another unofficial Android MAL client | 2025-10-21T10:48:01Z |
| 51 | MoeGoe_GUI | 568 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 52 | moebooru | 568 | 81 | Ruby | 25 | Moebooru, a fork of danbooru1 that has been heavily modified | 2025-10-10T19:08:11Z |
| 53 | trace.moe-telegram-bot | 541 | 78 | JavaScript | 0 | This Telegram bot can identify the anime when you send it a screenshot | 2025-10-15T09:44:32Z |
| 54 | moerail | 490 | 38 | JavaScript | 16 | Railway station code lookup × EMU train circulation lookup | 2025-08-13T12:55:25Z |
| 55 | step_into_llm | 478 | 122 | Jupyter Notebook | 36 | MindSpore online courses: Step into LLM | 2025-08-21T02:57:09Z |
| 56 | MOE | 422 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 57 | hydra-moe | 415 | 16 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 58 | DiT-MoE | 392 | 19 | Python | 6 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 59 | MoeLoaderP | 382 | 26 | C# | 11 | 🖼 Anime image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-05-19T13:20:58Z |
| 60 | pixiv.moe | 366 | 41 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows illustrations from pixiv.net, ordered by popularity. | 2023-03-08T06:54:34Z |
| 61 | st-moe-pytorch | 365 | 33 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch | 2024-06-17T00:48:47Z |
| 62 | WThermostatBeca | 364 | 71 | C++ | 3 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 63 | moe-sticker-bot | 363 | 37 | Go | 30 | A Telegram bot that imports LINE/kakao stickers or creates/manages new sticker sets. | 2024-06-06T15:28:28Z |
| 64 | notify.moe | 350 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 65 | MOEAFramework | 346 | 128 | Java | 0 | A Free and Open Source Java Framework for Multiobjective Optimization | 2025-07-30T12:40:46Z |
| 66 | Awesome-Efficient-Arch | 346 | 30 | None | 0 | Speed Always Wins: A Survey on Efficient Architectures for Large Language Models | 2025-08-29T09:05:29Z |
| 67 | MoeSR | 338 | 10 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and visual novel CG. | 2025-08-06T14:15:38Z |
| 68 | dialogue.moe | 337 | 11 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 69 | soft-moe-pytorch | 328 | 9 | Python | 4 | Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch | 2025-04-02T12:47:40Z |
| 70 | moell-blog | 302 | 80 | PHP | 2 | A blog built with Laravel that supports Markdown syntax | 2022-07-31T11:51:54Z |
| 71 | moeSS | 298 | 107 | PHP | 11 | moe SS Front End for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 72 | awesome-moe-inference | 289 | 11 | None | 0 | Curated collection of papers on MoE model inference | 2025-10-20T01:30:05Z |
| 73 | MoH | 283 | 14 | Python | 4 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 74 | MiniMind-in-Depth | 280 | 34 | None | 2 | A source-code walkthrough of the lightweight LLM MiniMind, covering the full pipeline: tokenizer, RoPE, MoE, KV cache, pretraining, SFT, LoRA, DPO, and more | 2025-06-16T14:13:15Z |
| 75 | moe | 279 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 76 | android-app | 276 | 25 | Kotlin | 5 | Official LISTEN.moe Android app | 2025-10-19T03:23:33Z |
| 77 | parameter-efficient-moe | 271 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 78 | Cornell-MOE | 270 | 63 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 79 | GRIN-MoE | 264 | 13 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 80 | MoE-Adapters4CL | 253 | 19 | Python | 8 | Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" | 2025-09-18T08:38:29Z |
| 81 | MoE-Infinity | 252 | 18 | Python | 12 | PyTorch library for cost-effective, fast and easy serving of MoE models. | 2025-10-15T17:52:58Z |
| 82 | MoeQuest | 251 | 76 | Java | 1 | A Material Design style meizi (girl-picture) app. | 2017-02-14T14:13:53Z |
| 83 | MoeLoader-Delta | 248 | 37 | C# | 52 | An improved fork of MoeLoader | 2021-07-22T20:47:41Z |
| 84 | inferflow | 248 | 25 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 85 | MoE-plus-plus | 247 | 13 | Python | 1 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 86 | moeins | 241 | 70 | PHP | 2 | Moeins (萌音影视), an online video streaming app | 2018-10-31T01:47:27Z |
| 87 | fiddler | 238 | 24 | Python | 2 | [ICLR'25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 88 | gdx-pay | 233 | 88 | Java | 7 | A libGDX cross-platform API for in-app purchasing. | 2025-10-10T06:47:44Z |
| 89 | moebius | 231 | 4 | PHP | 4 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 90 | Ling | 227 | 20 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-05-14T06:34:57Z |
| 91 | ModuleFormer | 224 | 11 | Python | 2 | ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters. | 2025-09-18T00:30:52Z |
| 92 | CoE | 221 | 27 | Python | 3 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-09-14T21:58:07Z |
| 93 | LLaVA-MoD | 204 | 15 | Python | 2 | [ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation | 2025-03-31T09:41:38Z |
| 94 | moe | 201 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 95 | transformers-qwen3-moe-fused | 194 | 9 | Python | 5 | Fused Qwen3 MoE layer for faster training, compatible with HF Transformers, LoRA, 4-bit quant, Unsloth | 2025-10-21T23:00:59Z |
| 96 | MixLoRA | 192 | 18 | Python | 3 | State-of-the-art parameter-efficient MoE fine-tuning method | 2024-08-22T08:02:04Z |
| 97 | MoePhoto | 191 | 22 | Python | 5 | MoePhoto Image Toolbox (萌图工具箱) | 2024-09-23T06:35:27Z |
| 98 | Yuan2.0-M32 | 189 | 38 | Python | 6 | Mixture-of-Experts (MoE) Language Model | 2024-09-09T09:14:15Z |
| 99 | MOELoRA-peft | 184 | 20 | Python | 7 | [SIGIR'24] The official implementation of MOELoRA. | 2024-07-22T07:32:43Z |
| 100 | SMoE-Stereo | 178 | 12 | Python | 2 | [ICCV 2025 Highlight] 🌟🌟🌟 Learning Robust Stereo Matching in the Wild with Selective Mixture-of-Experts | 2025-07-24T04:31:01Z |