Github-Ranking-AI

A ranking of the most popular AI-topic repositories on GitHub, ordered by star count and updated automatically every day.
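A ranking like this can be reproduced with GitHub's public search API. The sketch below is an illustration under assumptions, not the repository's actual update script: the `topic:moe` query, the `to_row` helper, and the use of `pushed_at` for the "Last Commit" column are all assumptions.

```python
import json
import urllib.request

# Assumed query: top 100 most-starred repos under the "moe" topic.
API = ("https://api.github.com/search/repositories"
       "?q=topic:moe&sort=stars&order=desc&per_page=100")

def to_row(rank, repo):
    """Format one GitHub repository record as a Markdown table row.

    Missing language/description fields are rendered as "None",
    and literal pipes in descriptions are escaped.
    """
    return "| {} | {} | {} | {} | {} | {} | {} | {} |".format(
        rank,
        repo["name"],
        repo["stargazers_count"],
        repo["forks_count"],
        repo.get("language") or "None",
        repo["open_issues_count"],
        (repo.get("description") or "None").replace("|", "\\|"),
        repo["pushed_at"],  # assumption: used as the "Last Commit" timestamp
    )

def fetch_ranking():
    """Fetch the ranking and return it as a list of Markdown rows."""
    req = urllib.request.Request(API,
                                 headers={"Accept": "application/vnd.github+json"})
    with urllib.request.urlopen(req) as resp:
        items = json.load(resp)["items"]
    return [to_row(i + 1, repo) for i, repo in enumerate(items)]
```

Unauthenticated search requests are rate-limited, so a real daily-update job would typically add a token via an `Authorization` header.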

Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
|---|---|---|---|---|---|---|---|
| 1 | LLaMA-Factory | 57492 | 7049 | Python | 613 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-09-03T09:22:55Z |
| 2 | sglang | 17642 | 2835 | Python | 587 | SGLang is a fast serving framework for large language models and vision language models. | 2025-09-05T03:33:30Z |
| 3 | TensorRT-LLM | 11510 | 1717 | C++ | 751 | TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations for efficient inference on NVIDIA GPUs. TensorRT-LLM also contains components to create Python and C++ runtimes that orchestrate inference execution in a performant way. | 2025-09-05T03:05:27Z |
| 4 | ms-swift | 9670 | 851 | Python | 625 | Use PEFT or Full-parameter to CPT/SFT/DPO/GRPO 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, GLM4.5, InternLM3, DeepSeek-R1, …) and 200+ MLLMs (Qwen2.5-VL, Qwen2.5-Omni, InternVL3.5, Ovis2.5, Llava, GLM4v, Phi4, …) (AAAI 2025). | 2025-09-04T10:33:50Z |
| 5 | trace.moe | 4771 | 251 | None | 0 | Anime Scene Search by Image | 2025-08-15T11:34:45Z |
| 6 | Bangumi | 4718 | 153 | TypeScript | 25 | :electron: An unofficial https://bgm.tv UI-first app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit ACG tracker (similar to Douban) and third-party client for bgm.tv, redesigned for mobile with many built-in enhancements that are hard to implement on the web, plus extensive customization options. Currently available for iOS / Android. | 2025-08-29T20:39:21Z |
| 7 | Moeditor | 4135 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 8 | fastllm | 3889 | 399 | C++ | 271 | fastllm is a high-performance LLM inference library with no backend dependencies. It supports tensor-parallel inference for dense models and mixed-mode inference for MoE models; any GPU with 10 GB+ VRAM can run full DeepSeek. A dual-socket 9004/9005 server with a single GPU serves the original full-precision DeepSeek model at 20 tps single-stream; the INT4-quantized model reaches 30 tps single-stream and 60+ tps under concurrency. | 2025-09-05T01:24:50Z |
| 9 | flashinfer | 3675 | 478 | Cuda | 179 | FlashInfer: Kernel Library for LLM Serving | 2025-09-04T20:47:15Z |
| 10 | MoeKoeMusic | 3530 | 220 | Vue | 13 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux / Web :electron: | 2025-09-02T06:29:54Z |
| 11 | Moe-Counter | 2512 | 257 | JavaScript | 4 | Moe counter badge with multiple themes! | 2025-08-12T08:16:18Z |
| 12 | GLM-4.5 | 2410 | 242 | Python | 25 | GLM-4.5: An open-source large language model designed for intelligent agents by Z.ai | 2025-08-25T06:55:03Z |
| 13 | MoeGoe | 2394 | 247 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 14 | MoE-LLaVA | 2231 | 140 | Python | 64 | [TMM 2025 🔥] Mixture-of-Experts for Large Vision-Language Models | 2025-07-15T07:59:33Z |
| 15 | MoBA | 1888 | 115 | Python | 9 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 16 | DeepSeek-MoE | 1784 | 290 | Python | 17 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 17 | fastmoe | 1782 | 196 | Python | 27 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 18 | OpenMoE | 1584 | 80 | Python | 6 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 19 | moemail | 1563 | 896 | TypeScript | 26 | A cute temporary email service built on the NextJS + Cloudflare stack 🎉 | 2025-06-22T16:38:30Z |
| 20 | paimon-moe | 1466 | 274 | JavaScript | 290 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database. Also tracks your progress with a todo list and wish counter. | 2025-08-19T12:16:52Z |
| 21 | MOE | 1319 | 138 | C++ | 170 | A global, black box optimization engine for real world metric optimization. | 2023-03-24T11:00:32Z |
| 22 | mixture-of-experts | 1162 | 109 | Python | 5 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538 | 2024-04-19T08:22:39Z |
| 23 | moepush | 1130 | 286 | TypeScript | 11 | A cute message push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 24 | Aria | 1067 | 85 | Jupyter Notebook | 31 | Codebase for Aria - an Open Multimodal Native MoE | 2025-01-22T03:25:37Z |
| 25 | MoeTTS | 989 | 77 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 26 | llama-moe | 984 | 61 | Python | 6 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 27 | Tutel | 906 | 100 | C | 52 | Tutel MoE: Optimized Mixture-of-Experts Library, supporting GptOss/DeepSeek/Kimi-K2/Qwen3 using FP8/NVFP4/MXFP4 | 2025-08-28T05:51:21Z |
| 28 | MoeMemosAndroid | 839 | 87 | Kotlin | 80 | An app to help you capture thoughts and ideas | 2025-08-26T17:31:03Z |
| 29 | moebius | 839 | 49 | JavaScript | 39 | Modern ANSI & ASCII Art Editor | 2024-05-02T15:54:35Z |
| 30 | Adan | 795 | 69 | Python | 3 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2025-06-08T14:35:41Z |
| 31 | Cortex | 776 | 65 | Python | 1 | Building a personal MoE LLM: a complete hands-on walkthrough from pretraining to DPO | 2025-09-04T09:55:37Z |
| 32 | MixtralKit | 769 | 78 | Python | 12 | A toolkit for inference and evaluation of "mixtral-8x7b-32kseqlen" from Mistral AI | 2023-12-15T19:10:55Z |
| 33 | moe-theme.el | 760 | 64 | Emacs Lisp | 15 | A customizable colorful eye-candy theme for Emacser. Moe, moe, kyun! | 2025-05-27T06:12:05Z |
| 34 | UMOE-Scaling-Unified-Multimodal-LLMs | 756 | 47 | Python | 13 | The code for "Uni-MoE: Scaling Unified Multimodal Models with Mixture of Experts" | 2025-08-06T10:54:43Z |
| 35 | Time-MoE | 752 | 76 | Python | 11 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2025-06-17T02:47:16Z |
| 36 | DeepSeek-671B-SFT-Guide | 750 | 94 | Python | 2 | An open-source solution for full parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as some practical experiences and conclusions. | 2025-03-13T03:51:33Z |
| 37 | Hunyuan-A13B | 741 | 87 | Python | 12 | Tencent Hunyuan A13B (Hunyuan-A13B for short), an innovative open-source LLM built on a fine-grained MoE architecture. | 2025-07-08T08:45:27Z |
| 38 | moe | 683 | 35 | Nim | 83 | A command line based editor inspired by Vim. Written in Nim. | 2025-08-14T22:50:02Z |
| 39 | Awesome-Mixture-of-Experts-Papers | 642 | 45 | None | 1 | A curated reading list of research in Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 40 | moedict-webkit | 630 | 99 | Objective-C | 102 | The Moedict (萌典) dictionary website | 2025-08-10T01:48:58Z |
| 41 | vtbs.moe | 629 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2025-07-31T13:39:09Z |
| 42 | SmartImage | 629 | 29 | C# | 7 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-09-05T03:37:12Z |
| 43 | satania.moe | 611 | 56 | HTML | 3 | Satania IS the BEST waifu, no really, she is; if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 44 | moebius | 609 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 45 | Chinese-Mixtral | 608 | 44 | Python | 0 | Chinese Mixtral MoE LLMs | 2024-04-30T04:29:06Z |
| 46 | MoeMemos | 598 | 53 | Swift | 67 | An app to help you capture thoughts and ideas | 2025-08-27T15:54:28Z |
| 47 | MoeList | 594 | 21 | Kotlin | 29 | Another unofficial Android MAL client | 2025-09-02T16:59:44Z |
| 48 | MoeGoe_GUI | 570 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 49 | moebooru | 565 | 81 | Ruby | 25 | Moebooru, a fork of danbooru1 that has been heavily modified | 2025-08-14T04:32:09Z |
| 50 | trace.moe-telegram-bot | 536 | 76 | JavaScript | 0 | This Telegram bot can identify the anime when you send it a screenshot | 2025-09-04T13:20:39Z |
| 51 | moerail | 482 | 37 | JavaScript | 15 | Railway station code lookup × EMU trainset circulation lookup | 2025-08-13T12:55:25Z |
| 52 | step_into_llm | 477 | 123 | Jupyter Notebook | 36 | MindSpore online courses: Step into LLM | 2025-08-21T02:57:09Z |
| 53 | MOE | 423 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 54 | hydra-moe | 416 | 16 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 55 | MoeLoaderP | 380 | 26 | C# | 11 | 🖼 Anime image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-05-19T13:20:58Z |
| 56 | DiT-MoE | 372 | 18 | Python | 6 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 57 | pixiv.moe | 365 | 42 | TypeScript | 0 | 😘 A Pinterest-style layout site showing illustrations from pixiv.net ordered by popularity. | 2023-03-08T06:54:34Z |
| 58 | st-moe-pytorch | 359 | 32 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch | 2024-06-17T00:48:47Z |
| 59 | WThermostatBeca | 358 | 71 | C++ | 2 | Open-source firmware replacement for Tuya Wi-Fi thermostats from Beca and Moes with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 60 | moe-sticker-bot | 356 | 37 | Go | 30 | A Telegram bot that imports LINE/kakao stickers or creates/manages new sticker sets. | 2024-06-06T15:28:28Z |
| 61 | notify.moe | 354 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 62 | MOEAFramework | 343 | 129 | Java | 0 | A free and open-source Java framework for multiobjective optimization | 2025-07-30T12:40:46Z |
| 63 | dialogue.moe | 334 | 9 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 64 | MoeSR | 320 | 10 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and visual novel CGs. | 2025-08-06T14:15:38Z |
| 65 | soft-moe-pytorch | 317 | 9 | Python | 4 | Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch | 2025-04-02T12:47:40Z |
| 66 | Awesome-Efficient-Arch | 308 | 28 | None | 0 | Speed Always Wins: A Survey on Efficient Architectures for Large Language Models | 2025-08-29T09:05:29Z |
| 67 | moell-blog | 302 | 80 | PHP | 2 | A blog built with Laravel that supports Markdown syntax | 2022-07-31T11:51:54Z |
| 68 | moeSS | 298 | 107 | PHP | 11 | moe SS Front End for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 69 | moe | 279 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 70 | android-app | 274 | 25 | Kotlin | 5 | Official LISTEN.moe Android app | 2025-08-31T11:29:35Z |
| 71 | Cornell-MOE | 270 | 63 | C++ | 25 | A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 72 | parameter-efficient-moe | 270 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 73 | MoH | 270 | 14 | Python | 4 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 74 | GRIN-MoE | 264 | 13 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 75 | awesome-moe-inference | 252 | 10 | None | 0 | Curated collection of papers on MoE model inference | 2025-08-01T03:34:19Z |
| 76 | MoeQuest | 251 | 76 | Java | 1 | A Material Design style "welfare" app for meizi images. | 2017-02-14T14:13:53Z |
| 77 | MoeLoader-Delta | 249 | 37 | C# | 52 | An improved fork of MoeLoader | 2021-07-22T20:47:41Z |
| 78 | inferflow | 247 | 25 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 79 | moeins | 241 | 70 | PHP | 2 | 萌音影视 - an online video streaming app | 2018-10-31T01:47:27Z |
| 80 | MoE-plus-plus | 240 | 11 | Python | 1 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 81 | MoE-Adapters4CL | 236 | 18 | Python | 7 | Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" | 2025-09-04T11:21:35Z |
| 82 | moebius | 233 | 4 | PHP | 4 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 83 | gdx-pay | 232 | 88 | Java | 8 | A libGDX cross-platform API for InApp purchasing. | 2025-07-26T21:09:56Z |
| 84 | MoE-Infinity | 230 | 17 | Python | 10 | PyTorch library for cost-effective, fast, and easy serving of MoE models. | 2025-07-07T12:54:02Z |
| 85 | fiddler | 227 | 24 | Python | 2 | [ICLR '25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 86 | ModuleFormer | 223 | 12 | Python | 2 | ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters. | 2024-04-10T18:16:32Z |
| 87 | CoE | 220 | 27 | Python | 3 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-06-25T23:14:46Z |
| 88 | moe | 201 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 89 | LLaVA-MoD | 193 | 13 | Python | 2 | [ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation | 2025-03-31T09:41:38Z |
| 90 | MoePhoto | 192 | 22 | Python | 5 | MoePhoto Image Toolbox (萌图工具箱) | 2024-09-23T06:35:27Z |
| 91 | Ling | 190 | 16 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-05-14T06:34:57Z |
| 92 | Yuan2.0-M32 | 189 | 40 | Python | 6 | Mixture-of-Experts (MoE) Language Model | 2024-09-09T09:14:15Z |
| 93 | MixLoRA | 184 | 18 | Python | 3 | State-of-the-art Parameter-Efficient MoE Fine-tuning Method | 2024-08-22T08:02:04Z |
| 94 | MiniMind-in-Depth | 178 | 25 | None | 1 | A source-code walkthrough of the lightweight LLM MiniMind, covering the full pipeline: tokenizer, RoPE, MoE, KV Cache, pretraining, SFT, LoRA, and DPO | 2025-06-16T14:13:15Z |
| 95 | MOELoRA-peft | 177 | 19 | Python | 7 | [SIGIR '24] The official implementation code of MOELoRA. | 2024-07-22T07:32:43Z |
| 96 | MoeRanker | 172 | 14 | JavaScript | 9 | Moe attribute sorter | 2025-06-30T03:16:02Z |
| 97 | SMoE-Stereo | 169 | 12 | Python | 2 | [ICCV 2025 Highlight] 🌟 Learning Robust Stereo Matching in the Wild with Selective Mixture-of-Experts | 2025-07-24T04:31:01Z |
| 98 | transformers-qwen3-moe-fused | 169 | 4 | Python | 3 | Fused Qwen3 MoE layer for faster training, compatible with HF Transformers, LoRA, 4-bit quant, Unsloth | 2025-09-04T03:32:05Z |
| 99 | ghost-theme-Moegi | 164 | 26 | Handlebars | 3 | An elegant & fresh Ghost theme. | 2023-10-16T16:09:28Z |
| 100 | Frequency_Aug_VAE_MoESR | 161 | 4 | Python | 12 | Latent-based SR using MoE and frequency augmented VAE decoder | 2023-11-26T10:33:36Z |