# Github-Ranking-AI

A list of the most popular AI-topic repositories on GitHub, ranked by the number of stars they have received and updated automatically every day.


## Github Ranking

### Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | LLaMA-Factory | 51757 | 6256 | Python | 476 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-06-07T01:00:09Z |
| 2 | sglang | 14941 | 1935 | Python | 462 | SGLang is a fast serving framework for large language models and vision language models. | 2025-06-07T03:30:49Z |
| 3 | ms-swift | 7956 | 678 | Python | 697 | Use PEFT or full-parameter training to CPT/SFT/DPO/GRPO 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, InternLM3, DeepSeek-R1, …) and 200+ MLLMs (Qwen2.5-VL, Qwen2.5-Omni, Qwen2-Audio, Ovis2, InternVL3, Llava, GLM4v, Phi4, …) (AAAI 2025). | 2025-06-06T08:44:23Z |
| 4 | trace.moe | 4684 | 239 | None | 0 | Anime Scene Search by Image | 2024-10-13T03:00:58Z |
| 5 | Bangumi | 4402 | 146 | TypeScript | 22 | :electron: An unofficial https://bgm.tv UI-first app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit, Douban-style ACG tracker and third-party bgm.tv client, redesigned for mobile with many enhanced built-in features that are hard to achieve on the web, plus extensive customization options. Currently supports iOS / Android / WSA, mobile / basic pad layouts, light / dark themes, and the mobile web. | 2025-06-03T21:45:11Z |
| 6 | Moeditor | 4134 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 7 | fastllm | 3625 | 370 | C++ | 240 | fastllm is a high-performance LLM inference library with no backend dependencies. It supports tensor-parallel inference for dense models and mixed-mode inference for MoE models; any GPU with more than 10 GB of memory can run the full DeepSeek model. A dual-socket 9004/9005 server plus a single GPU can serve the original full-precision DeepSeek model at 20 tps per stream; the INT4-quantized model reaches 30 tps per stream and 60+ tps under concurrent load. | 2025-06-05T04:05:51Z |
| 8 | MoeKoeMusic | 2497 | 167 | Vue | 24 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux :electron: | 2025-06-07T03:31:54Z |
| 9 | MoeGoe | 2385 | 248 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 10 | Moe-Counter | 2333 | 239 | JavaScript | 6 | Moe counter badge with multiple themes! | 2025-02-06T06:16:00Z |
| 11 | MoE-LLaVA | 2175 | 136 | Python | 63 | Mixture-of-Experts for Large Vision-Language Models | 2024-12-03T09:08:16Z |
| 12 | MoBA | 1786 | 107 | Python | 6 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 13 | fastmoe | 1738 | 196 | Python | 26 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 14 | DeepSeek-MoE | 1716 | 280 | Python | 17 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 15 | OpenMoE | 1537 | 79 | Python | 5 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 16 | paimon-moe | 1455 | 275 | JavaScript | 277 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter. | 2025-05-27T10:13:47Z |
| 17 | MOE | 1316 | 139 | C++ | 170 | A global, black-box optimization engine for real-world metric optimization. | 2023-03-24T11:00:32Z |
| 18 | moemail | 1280 | 637 | TypeScript | 14 | A cute temporary email service built on the NextJS + Cloudflare stack 🎉 | 2025-06-05T16:04:52Z |
| 19 | mixture-of-experts | 1114 | 109 | Python | 5 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. (https://arxiv.org/abs/1701.06538) | 2024-04-19T08:22:39Z |
| 20 | moepush | 1054 | 218 | TypeScript | 10 | A cute message push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 21 | Aria | 1042 | 85 | Jupyter Notebook | 31 | Codebase for Aria, an open multimodal-native MoE | 2025-01-22T03:25:37Z |
| 22 | MoeTTS | 989 | 77 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 23 | llama-moe | 965 | 63 | Python | 6 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 24 | Tutel | 838 | 97 | C | 43 | Tutel MoE: an optimized Mixture-of-Experts library; supports DeepSeek FP8/FP4 | 2025-06-06T22:41:28Z |
| 25 | moebius | 810 | 47 | JavaScript | 39 | Modern ANSI & ASCII art editor | 2024-05-02T15:54:35Z |
| 26 | Adan | 792 | 68 | Python | 3 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2024-07-02T18:26:36Z |
| 27 | MoeMemosAndroid | 769 | 79 | Kotlin | 78 | An app to help you capture thoughts and ideas | 2025-06-04T10:28:57Z |
| 28 | MixtralKit | 767 | 79 | Python | 12 | A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI | 2023-12-15T19:10:55Z |
| 29 | moe-theme.el | 758 | 64 | Emacs Lisp | 14 | A customizable, colorful, eye-candy theme for Emacs users. Moe, moe, kyun! | 2025-05-27T06:12:05Z |
| 30 | UMOE-Scaling-Unified-Multimodal-LLMs | 723 | 44 | Python | 11 | Code for "Uni-MoE: Scaling Unified Multimodal Models with Mixture of Experts" | 2025-05-13T05:05:42Z |
| 31 | DeepSeek-671B-SFT-Guide | 693 | 88 | Python | 1 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, along with practical experience and conclusions. | 2025-03-13T03:51:33Z |
| 32 | moe | 685 | 33 | Nim | 85 | A command-line editor inspired by Vim. Written in Nim. | 2025-06-01T23:04:45Z |
| 33 | Time-MoE | 640 | 62 | Python | 11 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2025-05-21T12:26:57Z |
| 34 | Awesome-Mixture-of-Experts-Papers | 630 | 45 | None | 1 | A curated reading list of research in Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 35 | vtbs.moe | 622 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2024-09-10T06:07:07Z |
| 36 | moedict-webkit | 620 | 99 | Objective-C | 102 | The MoeDict (萌典) dictionary website | 2025-06-06T02:49:41Z |
| 37 | SmartImage | 618 | 27 | C# | 9 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-05-28T08:56:06Z |
| 38 | satania.moe | 614 | 57 | HTML | 3 | Satania IS the BEST waifu; no really, she is, and if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 39 | moebius | 607 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 40 | Chinese-Mixtral | 605 | 44 | Python | 0 | Chinese Mixtral MoE LLMs | 2024-04-30T04:29:06Z |
| 41 | MoeList | 572 | 20 | Kotlin | 26 | Another unofficial Android MAL client | 2025-05-30T12:24:05Z |
| 42 | MoeGoe_GUI | 571 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 43 | MoeMemos | 565 | 48 | Swift | 63 | An app to help you capture thoughts and ideas | 2025-05-05T19:00:42Z |
| 44 | moebooru | 555 | 84 | Ruby | 25 | Moebooru, a heavily modified fork of danbooru1 | 2025-06-06T10:37:15Z |
| 45 | trace.moe-telegram-bot | 536 | 78 | JavaScript | 0 | This Telegram bot can identify the anime when you send it a screenshot | 2025-04-28T10:02:09Z |
| 46 | step_into_llm | 468 | 119 | Jupyter Notebook | 27 | MindSpore online courses: Step into LLM | 2025-01-06T01:50:19Z |
| 47 | moerail | 454 | 34 | JavaScript | 16 | Chinese railway station code lookup × EMU trainset circulation lookup | 2023-02-27T03:37:18Z |
| 48 | MOE | 421 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 49 | hydra-moe | 414 | 16 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 50 | MoeLoaderP | 366 | 26 | C# | 11 | 🖼 Anime image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-05-19T13:20:58Z |
| 51 | pixiv.moe | 364 | 43 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows illustrations from pixiv.net ordered by popularity. | 2023-03-08T06:54:34Z |
| 52 | WThermostatBeca | 354 | 70 | C++ | 1 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 53 | notify.moe | 350 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 54 | moe-sticker-bot | 341 | 35 | Go | 27 | A Telegram bot that imports LINE/kakao stickers or creates and manages new sticker sets. | 2024-06-06T15:28:28Z |
| 55 | MOEAFramework | 338 | 127 | Java | 1 | A free and open-source Java framework for multiobjective optimization | 2025-06-05T13:18:55Z |
| 56 | st-moe-pytorch | 337 | 29 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch | 2024-06-17T00:48:47Z |
| 57 | DiT-MoE | 333 | 15 | Python | 5 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 58 | dialogue.moe | 330 | 8 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 59 | moell-blog | 301 | 80 | PHP | 2 | A Laravel-based blog with Markdown support | 2022-07-31T11:51:54Z |
| 60 | moeSS | 298 | 107 | PHP | 11 | moe SS front end for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 61 | soft-moe-pytorch | 296 | 8 | Python | 4 | Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch | 2025-04-02T12:47:40Z |
| 62 | MoeSR | 280 | 9 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and visual novel CG. | 2024-04-17T12:34:26Z |
| 63 | moe | 279 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 64 | Cornell-MOE | 269 | 63 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 65 | GRIN-MoE | 263 | 15 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 66 | android-app | 263 | 24 | Kotlin | 7 | Official LISTEN.moe Android app | 2025-06-06T20:05:40Z |
| 67 | parameter-efficient-moe | 260 | 17 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 68 | MoeQuest | 251 | 76 | Java | 1 | A Material Design style app for browsing meizi (cute girl) pictures. | 2017-02-14T14:13:53Z |
| 69 | MoH | 250 | 13 | Python | 4 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 70 | MoeLoader-Delta | 244 | 37 | C# | 52 | An improved fork of MoeLoader | 2021-07-22T20:47:41Z |
| 71 | moeins | 243 | 69 | PHP | 2 | 萌音影视, an online movie & TV app | 2018-10-31T01:47:27Z |
| 72 | inferflow | 242 | 25 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 73 | moebius | 232 | 4 | PHP | 4 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 74 | gdx-pay | 229 | 87 | Java | 8 | A libGDX cross-platform API for in-app purchasing. | 2025-01-02T19:28:20Z |
| 75 | MoE-plus-plus | 223 | 7 | Python | 0 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 76 | ModuleFormer | 221 | 11 | Python | 2 | ModuleFormer is a MoE-based architecture with two types of experts: stick-breaking attention heads and feedforward experts. The authors released a collection of ModuleFormer-based language models (MoLM) ranging from 4 billion to 8 billion parameters. | 2024-04-10T18:16:32Z |
| 77 | MoE-Adapters4CL | 219 | 16 | Python | 6 | Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" | 2024-11-17T05:47:00Z |
| 78 | fiddler | 211 | 21 | Python | 2 | [ICLR'25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 79 | moe | 201 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 80 | MoE-Infinity | 195 | 16 | Python | 7 | PyTorch library for cost-effective, fast, and easy serving of MoE models. | 2025-06-05T20:55:11Z |
| 81 | awesome-moe-inference | 193 | 7 | None | 0 | Curated collection of papers on MoE model inference | 2025-02-19T07:45:29Z |
| 82 | MoePhoto | 191 | 23 | Python | 5 | MoePhoto Image Toolbox (萌图工具箱) | 2024-09-23T06:35:27Z |
| 83 | Yuan2.0-M32 | 189 | 41 | Python | 6 | Mixture-of-Experts (MoE) language model | 2024-09-09T09:14:15Z |
| 84 | PlanMoE | 177 | 19 | Python | 5 | A repository aimed at accelerating the training of MoE models by offering a more efficient scheduling method. | 2025-02-14T01:39:37Z |
| 85 | CoE | 169 | 21 | Python | 1 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-05-21T06:02:34Z |
| 86 | MOEAD | 168 | 42 | Python | 2 | A study of the MOEA/D multi-objective differential evolution algorithm, with a Python implementation and an animated visualization of the process | 2022-06-22T02:07:23Z |
| 87 | MoeRanker | 167 | 13 | JavaScript | 9 | Moe attribute sorter | 2025-05-12T15:04:14Z |
| 88 | MOELoRA-peft | 167 | 18 | Python | 7 | [SIGIR'24] The official implementation of MOELoRA. | 2024-07-22T07:32:43Z |
| 89 | LLaVA-MoD | 168 | 10 | Python | 1 | [ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation | 2025-03-31T09:41:38Z |
| 90 | ghost-theme-Moegi | 164 | 26 | Handlebars | 3 | An elegant & fresh Ghost theme. | 2023-10-16T16:09:28Z |
| 91 | MixLoRA | 162 | 16 | Python | 3 | State-of-the-art parameter-efficient MoE fine-tuning method | 2024-08-22T08:02:04Z |
| 92 | Ling | 160 | 15 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-05-14T06:34:57Z |
| 93 | Frequency_Aug_VAE_MoESR | 160 | 4 | Python | 12 | Latent-based SR using MoE and a frequency-augmented VAE decoder | 2023-11-26T10:33:36Z |
| 94 | moedict-desktop | 153 | 15 | C++ | 6 | MoeDict desktop for macOS / Linux / Windows | 2016-10-14T06:49:17Z |
| 95 | guide.encode.moe | 151 | 20 | Markdown | 9 | A guide for fansubbing | 2024-04-14T10:27:37Z |
| 96 | Teyvat.moe | 150 | 43 | TypeScript | 102 | A flexible, community-driven interactive website for Genshin Impact. | 2021-08-02T01:43:13Z |
| 97 | awesome-adaptive-computation | 147 | 9 | None | 0 | A curated reading list of research in Adaptive Computation, Inference-Time Computation & Mixture of Experts (MoE). | 2025-01-01T13:49:09Z |
| 98 | moeda | 145 | 22 | JavaScript | 5 | :moneybag: :chart_with_upwards_trend: Foreign exchange rates and currency conversion from the CLI | 2023-06-25T15:30:33Z |
| 99 | Parameter-Efficient-MoE | 143 | 18 | Python | 3 | Parameter-Efficient Sparsity Crafting From Dense to Mixture-of-Experts for Instruction Tuning on General Tasks | 2024-09-20T02:18:30Z |
| 100 | MoeMusic | 142 | 52 | Java | 2 | A music management app based on the 萌否 (Moefou) website API | 2017-01-22T06:29:59Z |