Github-Ranking-AI

A list of the most popular AI-topic repositories on GitHub, ranked by the number of stars they have received. Updated automatically every day.


Github Ranking

Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | vllm | 63417 | 11382 | Python | 1910 | A high-throughput and memory-efficient inference and serving engine for LLMs | 2025-11-19T03:12:31Z |
| 2 | LLaMA-Factory | 62686 | 7590 | Python | 774 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-11-18T16:53:28Z |
| 3 | sglang | 20240 | 3440 | Python | 579 | SGLang is a fast serving framework for large language models and vision language models. | 2025-11-19T03:47:43Z |
| 4 | TensorRT-LLM | 12170 | 1877 | C++ | 672 | TensorRT LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations to perform inference efficiently on NVIDIA GPUs. TensorRT LLM also contains components to create Python and C++ runtimes that orchestrate the inference execution in a performant way. | 2025-11-19T03:49:00Z |
| 5 | ms-swift | 11126 | 983 | Python | 756 | Use PEFT or Full-parameter to CPT/SFT/DPO/GRPO 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, GLM4.5, InternLM3, DeepSeek-R1, …) and 200+ MLLMs (Qwen3-VL, Qwen3-Omni, InternVL3.5, Ovis2.5, Llava, GLM4v, Phi4, …) (AAAI 2025). | 2025-11-18T18:31:01Z |
| 6 | xtuner | 4988 | 385 | Python | 230 | A Next-Generation Training Engine Built for Ultra-Large MoE Models | 2025-11-18T12:50:41Z |
| 7 | Bangumi | 4958 | 154 | TypeScript | 27 | :electron: An unofficial, UI-first https://bgm.tv app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit, Douban-style anime tracker and third-party bgm.tv client dedicated to ACG, redesigned for mobile with many enhanced features that are hard to implement on the web version and extensive customization options. Currently supports iOS / Android. | 2025-11-17T18:38:19Z |
| 8 | trace.moe | 4860 | 256 | None | 0 | Anime Scene Search by Image | 2025-10-10T13:31:43Z |
| 9 | MoeKoeMusic | 4176 | 264 | Vue | 16 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux / Web :electron: | 2025-11-11T09:18:31Z |
| 10 | Moeditor | 4130 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 11 | flashinfer | 4094 | 570 | Cuda | 237 | FlashInfer: Kernel Library for LLM Serving | 2025-11-19T02:17:00Z |
| 12 | fastllm | 4075 | 413 | C++ | 304 | fastllm is a high-performance LLM inference library with no backend dependencies. It supports both tensor-parallel inference of dense models and mixed-mode inference of MoE models; any GPU with 10 GB+ VRAM can serve the full DeepSeek model. On a dual-socket 9004/9005 server with a single GPU, the original full-precision DeepSeek model runs at 20 tps for a single request, the INT4-quantized model at 30 tps, and 60+ tps under concurrent load. | 2025-11-18T08:42:46Z |
| 13 | GLM-4.5 | 3212 | 332 | Python | 23 | GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models | 2025-10-11T03:42:50Z |
| 14 | Moe-Counter | 2625 | 270 | JavaScript | 4 | Moe counter badge with multiple themes! | 2025-08-12T08:16:18Z |
| 15 | MoeGoe | 2397 | 245 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 16 | MoE-LLaVA | 2273 | 140 | Python | 65 | 【TMM 2025🔥】 Mixture-of-Experts for Large Vision-Language Models | 2025-07-15T07:59:33Z |
| 17 | ICEdit | 2033 | 114 | Python | 26 | [NeurIPS 2025] Image editing is worth a single LoRA! 0.1% training data for fantastic image editing! Surpasses GPT-4o in ID persistence~ MoE ckpt released! Only 4GB VRAM is enough to run! | 2025-11-12T17:28:10Z |
| 18 | MoBA | 1984 | 124 | Python | 10 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 19 | Cortex | 1835 | 140 | Python | 1 | Building a personal MoE LLM: a complete walkthrough from pre-training to DPO | 2025-11-05T05:57:57Z |
| 20 | DeepSeek-MoE | 1834 | 293 | Python | 17 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 21 | fastmoe | 1813 | 197 | Python | 27 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 22 | moemail | 1706 | 1048 | TypeScript | 34 | A cute temporary email service built on the NextJS + Cloudflare stack 🎉 | 2025-10-25T17:57:13Z |
| 23 | OpenMoE | 1634 | 84 | Python | 6 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 24 | paimon-moe | 1484 | 275 | JavaScript | 303 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter. | 2025-10-22T05:43:36Z |
| 25 | MOE | 1320 | 140 | C++ | 170 | A global, black box optimization engine for real world metric optimization. | 2023-03-24T11:00:32Z |
| 26 | moepush | 1209 | 323 | TypeScript | 11 | A cute message push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 27 | mixture-of-experts | 1205 | 110 | Python | 6 | PyTorch Re-Implementation of “The Sparsely-Gated Mixture-of-Experts Layer” by Noam Shazeer et al. https://arxiv.org/abs/1701.06538 | 2024-04-19T08:22:39Z |
| 28 | Aria | 1081 | 85 | Jupyter Notebook | 31 | Codebase for Aria - an Open Multimodal Native MoE | 2025-01-22T03:25:37Z |
| 29 | llama-moe | 994 | 64 | Python | 6 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 30 | MoeTTS | 993 | 77 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 31 | Tutel | 938 | 106 | C | 54 | Tutel MoE: Optimized Mixture-of-Experts Library, supporting GptOss/DeepSeek/Kimi-K2/Qwen3 using FP8/NVFP4/MXFP4 | 2025-11-10T16:07:53Z |
| 32 | MoeMemosAndroid | 893 | 95 | Kotlin | 84 | An app to help you capture thoughts and ideas | 2025-11-02T16:02:08Z |
| 33 | moebius | 869 | 48 | JavaScript | 40 | Modern ANSI & ASCII Art Editor | 2024-05-02T15:54:35Z |
| 34 | Uni-MoE | 849 | 50 | Python | 20 | Uni-MoE: Lychee’s Large Multimodal Model Family. | 2025-11-18T07:04:32Z |
| 35 | Time-MoE | 823 | 87 | Python | 15 | [ICLR 2025 Spotlight] Official implementation of “Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts” | 2025-06-17T02:47:16Z |
| 36 | Hunyuan-A13B | 805 | 117 | Python | 24 | Tencent Hunyuan A13B (Hunyuan-A13B for short), an innovative and open-source LLM built on a fine-grained MoE architecture. | 2025-07-08T08:45:27Z |
| 37 | Adan | 803 | 69 | Python | 4 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2025-06-08T14:35:41Z |
| 38 | DeepSeek-671B-SFT-Guide | 779 | 94 | Python | 2 | An open-source solution for full parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as some practical experiences and conclusions. | 2025-03-13T03:51:33Z |
| 39 | MixtralKit | 773 | 77 | Python | 12 | A toolkit for inference and evaluation of ‘mixtral-8x7b-32kseqlen’ from Mistral AI | 2023-12-15T19:10:55Z |
| 40 | moe-theme.el | 761 | 65 | Emacs Lisp | 15 | A customizable colorful eye-candy theme for Emacsers. Moe, moe, kyun! | 2025-05-27T06:12:05Z |
| 41 | moe | 689 | 35 | Nim | 83 | A command line based editor inspired by Vim. Written in Nim. | 2025-11-11T23:03:59Z |
| 42 | Awesome-Mixture-of-Experts-Papers | 649 | 44 | None | 1 | A curated reading list of research in Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 43 | SmartImage | 648 | 31 | C# | 7 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-11-05T18:12:53Z |
| 44 | MoeMemos | 636 | 56 | Swift | 70 | An app to help you capture thoughts and ideas | 2025-11-06T15:58:10Z |
| 45 | moedict-webkit | 635 | 100 | Objective-C | 102 | The Moedict (萌典) dictionary website | 2025-10-07T06:22:07Z |
| 46 | vtbs.moe | 629 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2025-07-31T13:39:09Z |
| 47 | satania.moe | 612 | 55 | HTML | 3 | Satania IS the BEST waifu, no really, she is, if you don’t believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 48 | Chinese-Mixtral | 610 | 44 | Python | 0 | Chinese Mixtral Mixture-of-Experts (MoE) LLMs | 2024-04-30T04:29:06Z |
| 49 | moebius | 609 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 50 | MoeList | 603 | 20 | Kotlin | 30 | Another unofficial Android MAL client | 2025-11-17T23:36:49Z |
| 51 | moebooru | 575 | 81 | Ruby | 26 | Moebooru, a fork of danbooru1 that has been heavily modified | 2025-11-18T06:29:28Z |
| 52 | MoeGoe_GUI | 569 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 53 | trace.moe-telegram-bot | 544 | 78 | JavaScript | 0 | This Telegram bot can tell you the anime when you send it a screenshot | 2025-10-15T09:44:32Z |
| 54 | moerail | 495 | 39 | JavaScript | 16 | Railway station code lookup × EMU trainset circulation lookup | 2025-08-13T12:55:25Z |
| 55 | step_into_llm | 480 | 123 | Jupyter Notebook | 36 | MindSpore online courses: Step into LLM | 2025-11-19T02:22:45Z |
| 56 | MOE | 422 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 57 | MiniMind-in-Depth | 418 | 40 | None | 2 | Source-code walkthrough of the lightweight LLM MiniMind, covering the full pipeline: tokenizer, RoPE, MoE, KV Cache, pretraining, SFT, LoRA, and DPO | 2025-06-16T14:13:15Z |
| 58 | hydra-moe | 414 | 16 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 59 | DiT-MoE | 403 | 19 | Python | 7 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 60 | MoeLoaderP | 384 | 26 | C# | 11 | 🖼 Anime image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-05-19T13:20:58Z |
| 61 | st-moe-pytorch | 370 | 33 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in Pytorch | 2024-06-17T00:48:47Z |
| 62 | pixiv.moe | 367 | 41 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows pixiv.net illustrations ordered by popularity. | 2023-03-08T06:54:34Z |
| 63 | WThermostatBeca | 365 | 72 | C++ | 4 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 64 | moe-sticker-bot | 364 | 37 | Go | 30 | A Telegram bot that imports LINE/kakao stickers or creates/manages new sticker sets. | 2024-06-06T15:28:28Z |
| 65 | Awesome-Efficient-Arch | 359 | 31 | None | 0 | Speed Always Wins: A Survey on Efficient Architectures for Large Language Models | 2025-11-11T09:47:37Z |
| 66 | InferenceMAX | 357 | 49 | Python | 39 | Open Source Continuous Inference Benchmarking - GB200 NVL72 vs MI355X vs B200 vs H200 vs MI325X & soon™ TPUv6e/v7/Trainium2/3/GB300 NVL72 - DeepSeek 670B MoE, GPTOSS | 2025-11-19T01:00:30Z |
| 67 | notify.moe | 352 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 68 | MOEAFramework | 349 | 128 | Java | 0 | A Free and Open Source Java Framework for Multiobjective Optimization | 2025-07-30T12:40:46Z |
| 69 | MoeSR | 344 | 9 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and Visual Novel CG. | 2025-08-06T14:15:38Z |
| 70 | dialogue.moe | 337 | 11 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 71 | soft-moe-pytorch | 335 | 9 | Python | 4 | Implementation of Soft MoE, proposed by Brain’s Vision team, in Pytorch | 2025-04-02T12:47:40Z |
| 72 | awesome-moe-inference | 306 | 11 | None | 0 | Curated collection of papers on MoE model inference | 2025-10-20T01:30:05Z |
| 73 | moell-blog | 302 | 80 | PHP | 2 | A blog built with Laravel that supports Markdown syntax | 2022-07-31T11:51:54Z |
| 74 | moeSS | 298 | 107 | PHP | 11 | moe SS Front End for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 75 | MoH | 294 | 15 | Python | 4 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 76 | moe | 278 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 77 | android-app | 275 | 25 | Kotlin | 5 | Official LISTEN.moe Android app | 2025-11-16T17:21:54Z |
| 78 | Cornell-MOE | 272 | 64 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 79 | parameter-efficient-moe | 272 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 80 | GRIN-MoE | 264 | 13 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 81 | MoE-Infinity | 261 | 19 | Python | 13 | PyTorch library for cost-effective, fast and easy serving of MoE models. | 2025-10-15T17:52:58Z |
| 82 | MoE-Adapters4CL | 254 | 21 | Python | 8 | Code for the CVPR 2024 paper “Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters” | 2025-09-18T08:38:29Z |
| 83 | MoE-plus-plus | 252 | 13 | Python | 1 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 84 | MoeQuest | 251 | 76 | Java | 1 | A Material Design style meizi (girl picture) gallery app. | 2017-02-14T14:13:53Z |
| 85 | inferflow | 249 | 23 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 86 | MoeLoader-Delta | 248 | 37 | C# | 52 | An improved fork of MoeLoader | 2021-07-22T20:47:41Z |
| 87 | moeins | 242 | 70 | PHP | 2 | 萌音影视 - an online movie and TV streaming app | 2018-10-31T01:47:27Z |
| 88 | fiddler | 241 | 25 | Python | 2 | [ICLR’25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 89 | gdx-pay | 234 | 88 | Java | 7 | A libGDX cross-platform API for in-app purchasing. | 2025-10-10T06:47:44Z |
| 90 | Ling | 233 | 20 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-05-14T06:34:57Z |
| 91 | moebius | 231 | 4 | PHP | 4 | True coroutines for PHP>=8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 92 | Ling-V2 | 226 | 17 | Python | 4 | Ling-V2 is a MoE LLM provided and open-sourced by InclusionAI. | 2025-10-04T06:15:38Z |
| 93 | ModuleFormer | 224 | 11 | Python | 2 | ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters. | 2025-09-18T00:30:52Z |
| 94 | CoE | 223 | 28 | Python | 3 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-11-04T14:49:21Z |
| 95 | LLaVA-MoD | 208 | 15 | Python | 2 | [ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation | 2025-03-31T09:41:38Z |
| 96 | transformers-qwen3-moe-fused | 207 | 9 | Python | 6 | Fused Qwen3 MoE layer for faster training, compatible with HF Transformers, LoRA, 4-bit quant, Unsloth | 2025-11-06T07:57:50Z |
| 97 | moe | 201 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 98 | MixLoRA | 197 | 18 | Python | 3 | State-of-the-art Parameter-Efficient MoE Fine-tuning Method | 2024-08-22T08:02:04Z |
| 99 | ComfyUI-WanMoeKSampler | 194 | 17 | Python | 11 | Modification of the KSampler for running models like Wan2.2 a14B | 2025-10-22T21:09:42Z |
| 100 | Yuan2.0-M32 | 192 | 38 | Python | 6 | Mixture-of-Experts (MoE) Language Model | 2024-09-09T09:14:15Z |