| Rank | Name | Stars | Forks | Language | Open Issues | Description | Last Updated |
| ---- | ---- | ----- | ----- | -------- | ----------- | ----------- | ------------ |
| 1 | LLaMA-Factory | 48978 | 5967 | Python | 462 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-05-15T02:54:36Z |
| 2 | sglang | 14374 | 1762 | Python | 504 | SGLang is a fast serving framework for large language models and vision language models. | 2025-05-16T00:46:43Z |
| 3 | ms-swift | 7574 | 642 | Python | 698 | Use PEFT or Full-parameter to CPT/SFT/DPO/GRPO 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, InternLM3, GLM4, Mistral, Yi1.5, DeepSeek-R1, …) and 200+ MLLMs (Qwen2.5-VL, Qwen2.5-Omni, Qwen2-Audio, Ovis2, InternVL3, Llava, MiniCPM-V-2.6, GLM4v, Xcomposer2.5, DeepSeek-VL2, Phi4, GOT-OCR2, …). | 2025-05-16T03:14:47Z |
| 4 | trace.moe | 4665 | 237 | None | 0 | Anime Scene Search by Image | 2024-10-13T03:00:58Z |
| 5 | Bangumi | 4313 | 145 | TypeScript | 21 | :electron: An unofficial, UI-first https://bgm.tv app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit, ACG-focused, Douban-style anime-tracking third-party client for bgm.tv. Redesigned for mobile, with many enhanced features that are hard to implement on the web version, plus extensive customization options. Currently supports iOS / Android / WSA, mobile / basic tablet layouts, light / dark themes, and the mobile web. | 2025-05-15T00:49:58Z |
| 6 | Moeditor | 4136 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 7 | MoeGoe | 2388 | 247 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 8 | MoeKoeMusic | 2360 | 157 | Vue | 29 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux :electron: | 2025-05-11T06:30:20Z |
| 9 | Moe-Counter | 2282 | 235 | JavaScript | 6 | Moe counter badge with multiple themes! | 2025-02-06T06:16:00Z |
| 10 | MoE-LLaVA | 2158 | 133 | Python | 63 | Mixture-of-Experts for Large Vision-Language Models | 2024-12-03T09:08:16Z |
| 11 | MoBA | 1777 | 105 | Python | 5 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 12 | fastmoe | 1721 | 195 | Python | 26 | A fast MoE impl for PyTorch | 2025-02-10T06:04:33Z |
| 13 | DeepSeek-MoE | 1688 | 279 | Python | 16 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 14 | OpenMoE | 1527 | 78 | Python | 5 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 15 | paimon-moe | 1451 | 275 | JavaScript | 274 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter. | 2025-05-07T08:15:09Z |
| 16 | MOE | 1317 | 139 | C++ | 170 | A global, black box optimization engine for real world metric optimization. | 2023-03-24T11:00:32Z |
| 17 | moemail | 1204 | 559 | TypeScript | 15 | A cute temporary email service built on the NextJS + Cloudflare stack 🎉 | 2025-05-12T16:15:59Z |
| 18 | mixture-of-experts | 1103 | 108 | Python | 5 | PyTorch Re-Implementation of “The Sparsely-Gated Mixture-of-Experts Layer” by Noam Shazeer et al. https://arxiv.org/abs/1701.06538 | 2024-04-19T08:22:39Z |
| 19 | Aria | 1035 | 86 | Jupyter Notebook | 31 | Codebase for Aria - an Open Multimodal Native MoE | 2025-01-22T03:25:37Z |
| 20 | moepush | 1006 | 188 | TypeScript | 4 | A cute message-push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-05-10T11:42:44Z |
| 21 | MoeTTS | 989 | 77 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 22 | llama-moe | 961 | 58 | Python | 5 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 23 | Tutel | 821 | 97 | C | 43 | Tutel MoE: Optimized Mixture-of-Experts Library, Support DeepSeek FP8/FP4 | 2025-05-12T21:29:36Z |
| 24 | moebius | 814 | 46 | JavaScript | 39 | Modern ANSI & ASCII Art Editor | 2024-05-02T15:54:35Z |
| 25 | Adan | 790 | 67 | Python | 3 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2024-07-02T18:26:36Z |
| 26 | MixtralKit | 767 | 79 | Python | 12 | A toolkit for inference and evaluation of ‘mixtral-8x7b-32kseqlen’ from Mistral AI | 2023-12-15T19:10:55Z |
| 27 | moe-theme.el | 758 | 63 | Emacs Lisp | 14 | A customizable colorful eye-candy theme for Emacser. Moe, moe, kyun! | 2025-02-03T18:08:33Z |
| 28 | MoeMemosAndroid | 757 | 79 | Kotlin | 78 | An app to help you capture thoughts and ideas | 2025-02-15T07:11:32Z |
| 29 | UMOE-Scaling-Unified-Multimodal-LLMs | 722 | 44 | Python | 11 | Code for “Uni-MoE: Scaling Unified Multimodal Models with Mixture of Experts” | 2025-05-13T05:05:42Z |
| 30 | DeepSeek-671B-SFT-Guide | 682 | 86 | Python | 1 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as some practical experience and conclusions. | 2025-03-13T03:51:33Z |
| 31 | moe | 679 | 33 | Nim | 87 | A command line based editor inspired by Vim. Written in Nim. | 2025-05-16T00:20:31Z |
| 32 | Awesome-Mixture-of-Experts-Papers | 623 | 44 | None | 1 | A curated reading list of research in Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 33 | vtbs.moe | 621 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2024-09-10T06:07:07Z |
| 34 | moedict-webkit | 619 | 99 | Objective-C | 102 | The MoeDict (萌典) dictionary website | 2025-04-09T19:08:43Z |
| 35 | SmartImage | 615 | 27 | C# | 9 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-05-14T19:37:39Z |
| 36 | satania.moe | 614 | 57 | HTML | 3 | Satania IS the BEST waifu, no really, she is, if you don’t believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 37 | moebius | 607 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 38 | Time-MoE | 603 | 56 | Python | 15 | [ICLR 2025 Spotlight] Official implementation of “Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts” | 2025-05-08T04:01:06Z |
| 39 | Chinese-Mixtral | 603 | 44 | Python | 0 | Chinese Mixtral MoE LLMs | 2024-04-30T04:29:06Z |
| 40 | MoeGoe_GUI | 572 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 41 | MoeList | 555 | 20 | Kotlin | 26 | Another unofficial Android MAL client | 2025-05-11T08:45:49Z |
| 42 | MoeMemos | 552 | 47 | Swift | 63 | An app to help you capture thoughts and ideas | 2025-05-05T19:00:42Z |
| 43 | moebooru | 550 | 83 | Ruby | 25 | Moebooru, a fork of danbooru1 that has been heavily modified | 2025-05-15T16:24:26Z |
| 44 | trace.moe-telegram-bot | 535 | 79 | JavaScript | 0 | This Telegram bot can identify the anime when you send it a screenshot | 2025-04-28T10:02:09Z |
| 45 | step_into_llm | 465 | 114 | Jupyter Notebook | 27 | MindSpore online courses: Step into LLM | 2025-01-06T01:50:19Z |
| 46 | moerail | 444 | 33 | JavaScript | 16 | Railway station code lookup × EMU circulation lookup | 2023-02-27T03:37:18Z |
| 47 | MOE | 422 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 48 | hydra-moe | 412 | 15 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 49 | pixiv.moe | 364 | 43 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows illustrations from pixiv.net ordered by popularity. | 2023-03-08T06:54:34Z |
| 50 | MoeLoaderP | 361 | 26 | C# | 11 | 🖼 Anime image downloader for booru sites, Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-04-15T15:58:36Z |
| 51 | WThermostatBeca | 354 | 70 | C++ | 2 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 52 | notify.moe | 351 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 53 | MOEAFramework | 336 | 127 | Java | 1 | A Free and Open Source Java Framework for Multiobjective Optimization | 2025-05-09T12:39:25Z |
| 54 | moe-sticker-bot | 335 | 35 | Go | 26 | A Telegram bot that imports LINE/Kakao stickers or creates/manages new sticker sets. | 2024-06-06T15:28:28Z |
| 55 | st-moe-pytorch | 331 | 28 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch | 2024-06-17T00:48:47Z |
| 56 | dialogue.moe | 328 | 8 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 57 | DiT-MoE | 323 | 15 | Python | 5 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 58 | moell-blog | 301 | 80 | PHP | 2 | A blog built with Laravel that supports Markdown syntax | 2022-07-31T11:51:54Z |
| 59 | moeSS | 298 | 107 | PHP | 11 | moe SS Front End for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 60 | soft-moe-pytorch | 290 | 8 | Python | 4 | Implementation of Soft MoE, proposed by Brain’s Vision team, in PyTorch | 2025-04-02T12:47:40Z |
| 61 | moe | 279 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 62 | MoeSR | 279 | 8 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and Visual Novel CG | 2024-04-17T12:34:26Z |
| 63 | Cornell-MOE | 269 | 63 | C++ | 25 | A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 64 | android-app | 263 | 25 | Kotlin | 7 | Official LISTEN.moe Android app | 2025-05-14T03:20:10Z |
| 65 | GRIN-MoE | 262 | 15 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 66 | parameter-efficient-moe | 257 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 67 | MoeQuest | 251 | 76 | Java | 1 | A Material Design style welfare app for browsing meizi (girl) photos. | 2017-02-14T14:13:53Z |
| 68 | MoeLoader-Delta | 245 | 37 | C# | 52 | Improved branching version of MoeLoader | 2021-07-22T20:47:41Z |
| 69 | inferflow | 242 | 25 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 70 | moeins | 241 | 68 | PHP | 2 | 萌音影视 - an online video streaming app | 2018-10-31T01:47:27Z |
| 71 | MoH | 240 | 10 | Python | 4 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 72 | moebius | 232 | 4 | PHP | 4 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 73 | gdx-pay | 230 | 87 | Java | 8 | A libGDX cross-platform API for InApp purchasing. | 2025-01-02T19:28:20Z |
| 74 | ModuleFormer | 220 | 11 | Python | 2 | ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters. | 2024-04-10T18:16:32Z |
| 75 | MoE-plus-plus | 218 | 6 | Python | 0 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 76 | MoE-Adapters4CL | 214 | 16 | Python | 5 | Code for the CVPR 2024 paper “Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters” | 2024-11-17T05:47:00Z |
| 77 | fiddler | 209 | 20 | Python | 2 | [ICLR’25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 78 | moe | 203 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 79 | MoePhoto | 189 | 23 | Python | 5 | MoePhoto Image Toolbox (萌图工具箱) | 2024-09-23T06:35:27Z |
| 80 | Yuan2.0-M32 | 186 | 40 | Python | 6 | Mixture-of-Experts (MoE) Language Model | 2024-09-09T09:14:15Z |
| 81 | MoE-Infinity | 182 | 13 | Python | 7 | PyTorch library for cost-effective, fast and easy serving of MoE models. | 2025-05-12T18:36:24Z |
| 82 | PlanMoE | 177 | 19 | Python | 5 | This is a repository aimed at accelerating the training of MoE models, offering a more efficient scheduling method. | 2025-02-14T01:39:37Z |
| 83 | awesome-moe-inference | 175 | 7 | None | 0 | Curated collection of papers in MoE model inference | 2025-02-19T07:45:29Z |
| 84 | MOEAD | 168 | 42 | Python | 2 | MOEAD: learning the multi-objective differential evolution algorithm, with a Python implementation and a dynamic visualization of the process | 2022-06-22T02:07:23Z |
| 85 | MoeRanker | 164 | 13 | JavaScript | 9 | Moe attribute sorter | 2025-05-12T15:04:14Z |
| 86 | ghost-theme-Moegi | 164 | 26 | Handlebars | 3 | An elegant & fresh ghost theme. | 2023-10-16T16:09:28Z |
| 87 | MOELoRA-peft | 163 | 18 | Python | 7 | [SIGIR’24] The official implementation code of MOELoRA. | 2024-07-22T07:32:43Z |
| 88 | CoE | 162 | 21 | Python | 0 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-05-12T23:13:23Z |
| 89 | MixLoRA | 161 | 16 | Python | 3 | State-of-the-art Parameter-Efficient MoE Fine-tuning Method | 2024-08-22T08:02:04Z |
| 90 | Frequency_Aug_VAE_MoESR | 158 | 4 | Python | 12 | Latent-based SR using MoE and frequency augmented VAE decoder | 2023-11-26T10:33:36Z |
| 91 | LLaVA-MoD | 157 | 10 | Python | 0 | [ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation | 2025-03-31T09:41:38Z |
| 92 | moedict-desktop | 153 | 15 | C++ | 6 | MoeDict desktop app for macOS / Linux / Windows | 2016-10-14T06:49:17Z |
| 93 | Ling | 152 | 15 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-05-14T06:34:57Z |
| 94 | Teyvat.moe | 150 | 43 | TypeScript | 102 | A flexible, community-driven interactive website for Genshin Impact. | 2021-08-02T01:43:13Z |
| 95 | guide.encode.moe | 150 | 20 | Markdown | 9 | A guide for fansubbing | 2024-04-14T10:27:37Z |
| 96 | moeda | 145 | 22 | JavaScript | 5 | :moneybag: :chart_with_upwards_trend: Foreign exchange rates and currency conversion from the CLI | 2023-06-25T15:30:33Z |
| 97 | awesome-adaptive-computation | 144 | 9 | None | 0 | A curated reading list of research in Adaptive Computation, Inference-Time Computation & Mixture of Experts (MoE). | 2025-01-01T13:49:09Z |
| 98 | Parameter-Efficient-MoE | 143 | 18 | Python | 3 | Parameter-Efficient Sparsity Crafting From Dense to Mixture-of-Experts for Instruction Tuning on General Tasks | 2024-09-20T02:18:30Z |
| 99 | MoeMusic | 142 | 52 | Java | 2 | A music management application based on the Moefou (萌否) website API | 2017-01-22T06:29:59Z |
| 100 | makegirlsmoe.github.io | 141 | 35 | CSS | 0 | MakeGirls.moe Official Blog | 2017-08-21T15:06:57Z |
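
A table like the one above can be regenerated from the GitHub Search API. The following is a minimal sketch, not the script that produced this document: the search keyword `moe` and the stars-descending sort are assumptions inferred from the rows, and the column mapping to API fields (`stargazers_count`, `forks_count`, `open_issues_count`, `updated_at`) follows the public REST API.

```python
# Minimal sketch: rebuild a ranking table like the one above from the
# GitHub Search API. The keyword "moe" and the stars-descending sort are
# assumptions inferred from the rows; adjust as needed.
import requests

resp = requests.get(
    "https://api.github.com/search/repositories",
    params={"q": "moe", "sort": "stars", "order": "desc", "per_page": 100},
    headers={"Accept": "application/vnd.github+json"},
    timeout=30,
)
resp.raise_for_status()

print("| Rank | Name | Stars | Forks | Language | Open Issues | Description | Last Updated |")
print("| ---- | ---- | ----- | ----- | -------- | ----------- | ----------- | ------------ |")
for rank, repo in enumerate(resp.json()["items"], start=1):
    # Repos without a description or detected language show up as "None",
    # matching the placeholder used in the table above.
    desc = (repo["description"] or "None").replace("|", "\\|")
    print(f"| {rank} | {repo['name']} | {repo['stargazers_count']} "
          f"| {repo['forks_count']} | {repo['language'] or 'None'} "
          f"| {repo['open_issues_count']} | {desc} | {repo['updated_at']} |")
```

Unauthenticated requests are rate-limited; passing a token via an `Authorization: Bearer <token>` header raises the limit if the script is run regularly.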