| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | LLaMA-Factory | 45768 | 5596 | Python | 423 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-03-31T16:15:16Z |
| 2 | sglang | 12715 | 1403 | Python | 437 | SGLang is a fast serving framework for large language models and vision language models. | 2025-04-01T03:45:07Z |
| 3 | trace.moe | 4620 | 236 | None | 0 | Anime Scene Search by Image | 2024-10-13T03:00:58Z |
| 4 | Bangumi | 4159 | 143 | TypeScript | 24 | :electron: An unofficial, UI-first https://bgm.tv app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit, ACG-focused third-party bgm.tv client for tracking anime, similar to Douban. Redesigned for mobile, with many built-in enhancements that are hard to achieve on the web version, plus extensive customization options. Currently supports iOS / Android / WSA, mobile / simple pad layouts, light / dark themes, and the mobile web. | 2025-03-30T21:41:38Z |
| 5 | Moeditor | 4131 | 274 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 6 | MoeGoe | 2377 | 250 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 7 | Moe-Counter | 2191 | 236 | JavaScript | 6 | Moe counter badge with multiple themes! | 2025-02-06T06:16:00Z |
| 8 | MoE-LLaVA | 2131 | 134 | Python | 63 | Mixture-of-Experts for Large Vision-Language Models | 2024-12-03T09:08:16Z |
| 9 | MoeKoeMusic | 1853 | 127 | Vue | 30 | An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux :electron: | 2025-03-30T08:10:09Z |
| 10 | MoBA | 1700 | 101 | Python | 7 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-03-07T07:14:41Z |
| 11 | fastmoe | 1685 | 194 | Python | 25 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 12 | DeepSeek-MoE | 1616 | 275 | Python | 15 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 13 | OpenMoE | 1491 | 77 | Python | 5 | A family of open-sourced Mixture-of-Experts (MoE) Large Language Models | 2024-03-08T15:08:26Z |
| 14 | paimon-moe | 1437 | 275 | JavaScript | 267 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter. | 2025-03-26T02:17:48Z |
| 15 | MOE | 1312 | 140 | C++ | 170 | A global, black-box optimization engine for real-world metric optimization. | 2023-03-24T11:00:32Z |
| 16 | mixture-of-experts | 1084 | 109 | Python | 5 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538 | 2024-04-19T08:22:39Z |
| 17 | Aria | 1030 | 86 | Jupyter Notebook | 31 | Codebase for Aria - an Open Multimodal Native MoE | 2025-01-22T03:25:37Z |
| 18 | MoeTTS | 985 | 78 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 19 | llama-moe | 942 | 55 | Python | 5 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 20 | moemail | 907 | 358 | TypeScript | 11 | A cute temporary-email service built on the NextJS + Cloudflare stack 🎉 | 2025-03-31T15:00:50Z |
| 21 | moebius | 803 | 45 | JavaScript | 38 | Modern ANSI & ASCII Art Editor | 2024-05-02T15:54:35Z |
| 22 | Tutel | 790 | 95 | Python | 42 | Tutel MoE: An Optimized Mixture-of-Experts Implementation | 2025-03-31T17:48:56Z |
| 23 | Adan | 784 | 67 | Python | 3 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2024-07-02T18:26:36Z |
| 24 | MixtralKit | 767 | 80 | Python | 12 | A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI | 2023-12-15T19:10:55Z |
| 25 | moe-theme.el | 758 | 63 | Emacs Lisp | 14 | A customizable, colorful, eye-candy theme for Emacs users. Moe, moe, kyun! | 2025-02-03T18:08:33Z |
| 26 | MoeMemosAndroid | 728 | 77 | Kotlin | 71 | An app to help you capture thoughts and ideas | 2025-02-15T07:11:32Z |
| 27 | UMOE-Scaling-Unified-Multimodal-LLMs | 709 | 42 | Python | 11 | Code for "Uni-MoE: Scaling Unified Multimodal Models with Mixture of Experts" | 2025-01-27T13:40:11Z |
| 28 | moepush | 700 | 85 | TypeScript | 2 | A cute message-push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-03-29T16:11:36Z |
| 29 | moe | 677 | 33 | Nim | 84 | A command-line editor inspired by Vim. Written in Nim. | 2025-03-31T00:30:45Z |
| 30 | vtbs.moe | 620 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2024-09-10T06:07:07Z |
| 31 | moedict-webkit | 617 | 100 | Objective-C | 102 | The MoeDict (萌典) website | 2025-03-28T19:04:05Z |
| 32 | satania.moe | 615 | 57 | HTML | 3 | Satania IS the BEST waifu, no really, she is, if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 33 | SmartImage | 608 | 27 | C# | 7 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-03-26T19:57:47Z |
| 34 | moebius | 606 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 35 | Awesome-Mixture-of-Experts-Papers | 605 | 45 | None | 1 | A curated reading list of research in Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 36 | Chinese-Mixtral | 603 | 44 | Python | 0 | Chinese Mixtral mixture-of-experts (MoE) LLMs | 2024-04-30T04:29:06Z |
| 37 | DeepSeek-671B-SFT-Guide | 571 | 72 | Python | 0 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, along with practical experience and conclusions. | 2025-03-13T03:51:33Z |
| 38 | MoeGoe_GUI | 570 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 39 | MoeList | 545 | 19 | Kotlin | 25 | Another unofficial Android MAL client | 2025-03-30T16:05:34Z |
| 40 | moebooru | 542 | 82 | Ruby | 25 | Moebooru, a heavily modified fork of danbooru1 | 2025-03-29T18:33:28Z |
| 41 | MoeMemos | 535 | 47 | Swift | 59 | An app to help you capture thoughts and ideas | 2025-03-23T09:50:49Z |
| 42 | trace.moe-telegram-bot | 530 | 82 | JavaScript | 0 | This Telegram bot identifies the anime when you send it a screenshot | 2025-03-19T14:39:12Z |
| 43 | Time-MoE | 529 | 43 | Python | 14 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2025-03-30T13:28:06Z |
| 44 | step_into_llm | 455 | 110 | Jupyter Notebook | 27 | MindSpore online courses: Step into LLM | 2025-01-06T01:50:19Z |
| 45 | moerail | 433 | 30 | JavaScript | 14 | Railway station code lookup × EMU trainset circulation lookup | 2023-02-27T03:37:18Z |
| 46 | MOE | 421 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 47 | hydra-moe | 412 | 15 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 48 | pixiv.moe | 363 | 43 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows illustrations from pixiv.net ordered by popularity. | 2023-03-08T06:54:34Z |
| 49 | WThermostatBeca | 353 | 70 | C++ | 3 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 50 | notify.moe | 351 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 51 | MoeLoaderP | 350 | 24 | C# | 11 | 🖼 Anime image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2023-10-18T23:13:10Z |
| 52 | MOEAFramework | 335 | 128 | Java | 0 | A Free and Open Source Java Framework for Multiobjective Optimization | 2025-03-21T14:18:17Z |
| 53 | st-moe-pytorch | 324 | 28 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch | 2024-06-17T00:48:47Z |
| 54 | moe-sticker-bot | 323 | 35 | Go | 25 | A Telegram bot that imports LINE/Kakao stickers or creates/manages new sticker sets. | 2024-06-06T15:28:28Z |
| 55 | dialogue.moe | 322 | 8 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 56 | DiT-MoE | 303 | 14 | Python | 5 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 57 | moell-blog | 301 | 81 | PHP | 2 | A blog built with Laravel that supports Markdown syntax | 2022-07-31T11:51:54Z |
| 58 | moeSS | 298 | 107 | PHP | 11 | moe SS Front End for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 59 | moe | 279 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 60 | soft-moe-pytorch | 272 | 8 | Python | 4 | Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch | 2024-04-24T15:23:45Z |
| 61 | Cornell-MOE | 268 | 63 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 62 | GRIN-MoE | 261 | 16 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 63 | MoeSR | 261 | 8 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and visual novel CG | 2024-04-17T12:34:26Z |
| 64 | android-app | 260 | 25 | Kotlin | 7 | Official LISTEN.moe Android app | 2025-03-30T01:30:48Z |
| 65 | parameter-efficient-moe | 253 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 66 | MoeQuest | 251 | 76 | Java | 1 | A Material Design style "welfare" app for browsing meizi (girl) pictures. | 2017-02-14T14:13:53Z |
| 67 | MoeLoader-Delta | 243 | 37 | C# | 52 | An improved fork of MoeLoader | 2021-07-22T20:47:41Z |
| 68 | moeins | 241 | 68 | PHP | 2 | Moeins (萌音影视) - an online video streaming app | 2018-10-31T01:47:27Z |
| 69 | inferflow | 239 | 25 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 70 | moebius | 231 | 4 | PHP | 4 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 71 | MoH | 229 | 9 | Python | 3 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 72 | gdx-pay | 225 | 86 | Java | 8 | A libGDX cross-platform API for in-app purchasing. | 2025-01-02T19:28:20Z |
| 73 | ModuleFormer | 217 | 11 | Python | 2 | ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based language models (MoLM) ranging in scale from 4 billion to 8 billion parameters. | 2024-04-10T18:16:32Z |
| 74 | moe | 203 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 75 | fiddler | 202 | 18 | Python | 2 | [ICLR'25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 76 | MoE-Adapters4CL | 201 | 15 | Python | 4 | Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" | 2024-11-17T05:47:00Z |
| 77 | MoE-plus-plus | 198 | 6 | Python | 0 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 78 | MoePhoto | 189 | 23 | Python | 5 | MoePhoto Image Toolbox (萌图工具箱) | 2024-09-23T06:35:27Z |
| 79 | Yuan2.0-M32 | 185 | 41 | Python | 6 | Mixture-of-Experts (MoE) Language Model | 2024-09-09T09:14:15Z |
| 80 | PlanMoE | 177 | 19 | Python | 5 | A repository aimed at accelerating MoE model training by offering a more efficient scheduling method. | 2025-02-14T01:39:37Z |
| 81 | MOEAD | 164 | 42 | Python | 2 | A study of MOEA/D, the multi-objective differential evolution algorithm, with a Python implementation and an animated visualization of the process | 2022-06-22T02:07:23Z |
| 82 | ghost-theme-Moegi | 164 | 26 | Handlebars | 3 | An elegant & fresh Ghost theme. | 2023-10-16T16:09:28Z |
| 83 | MoE-Infinity | 159 | 12 | Python | 10 | PyTorch library for cost-effective, fast and easy serving of MoE models. | 2025-03-27T18:25:07Z |
| 84 | Frequency_Aug_VAE_MoESR | 155 | 4 | Python | 12 | Latent-based SR using MoE and a frequency-augmented VAE decoder | 2023-11-26T10:33:36Z |
| 85 | MOELoRA-peft | 154 | 19 | Python | 4 | [SIGIR'24] The official implementation code of MOELoRA. | 2024-07-22T07:32:43Z |
| 86 | MixLoRA | 152 | 15 | Python | 1 | State-of-the-art Parameter-Efficient MoE Fine-tuning Method | 2024-08-22T08:02:04Z |
| 87 | CoE | 152 | 18 | Python | 0 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-03-14T16:57:31Z |
| 88 | guide.encode.moe | 151 | 20 | Markdown | 9 | A guide for fansubbing | 2024-04-14T10:27:37Z |
| 89 | moedict-desktop | 151 | 15 | C++ | 6 | MoeDict Desktop for macOS / Linux / Windows | 2016-10-14T06:49:17Z |
| 90 | Teyvat.moe | 150 | 43 | TypeScript | 102 | A flexible, community-driven interactive website for Genshin Impact. | 2021-08-02T01:43:13Z |
| 91 | moeda | 143 | 22 | JavaScript | 5 | :moneybag: :chart_with_upwards_trend: Foreign exchange rates and currency conversion from the CLI | 2023-06-25T15:30:33Z |
| 92 | MoeMusic | 142 | 52 | Java | 2 | A music management application based on the Moefou (萌否) website API | 2017-01-22T06:29:59Z |
| 93 | MoEx | 142 | 19 | Python | 1 | MoEx (Moment Exchange) | 2021-06-24T02:52:22Z |
| 94 | Parameter-Efficient-MoE | 142 | 18 | Python | 3 | Parameter-Efficient Sparsity Crafting From Dense to Mixture-of-Experts for Instruction Tuning on General Tasks | 2024-09-20T02:18:30Z |
| 95 | makegirlsmoe.github.io | 141 | 35 | CSS | 0 | MakeGirls.moe Official Blog | 2017-08-21T15:06:57Z |
| 96 | moedict-data | 139 | 28 | None | 0 | Data files for the Ministry of Education's Revised Mandarin Chinese Dictionary (重編國語辭典); please report suggestions or bugs in moedict-process | 2023-02-17T00:42:39Z |
| 97 | awesome-adaptive-computation | 139 | 9 | None | 0 | A curated reading list of research in Adaptive Computation, Inference-Time Computation & Mixture of Experts (MoE). | 2025-01-01T13:49:09Z |
| 98 | nonebot-plugin-moegoe | 135 | 15 | Python | 9 | Make Genshin Impact characters speak via an API! | 2024-05-27T06:14:27Z |
| 99 | theindex | 134 | 27 | TypeScript | 1 | The frontend, editor panel, and API of TheIndex.moe | 2025-03-28T22:16:31Z |
| 100 | archbox | 131 | 5 | Shell | 1 | An easy-to-use Arch Linux chroot environment with features to integrate it with your existing Linux installation. Mirror of https://momodev.lemniskett.moe/lemniskett/archbox | 2022-05-28T15:46:32Z |
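
Several of the repositories above (fastmoe, Tutel, mixture-of-experts, st-moe-pytorch) implement variants of the sparsely-gated Mixture-of-Experts layer from Shazeer et al. (https://arxiv.org/abs/1701.06538). For orientation, here is a minimal top-k routing sketch in PyTorch; it is illustrative only, not code from any listed repository, and the names (`SimpleMoE`, `top_k`, etc.) are hypothetical.

```python
# Minimal sparsely-gated top-k MoE sketch (illustrative assumption,
# not taken from any repository listed above).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)  # router producing per-expert logits
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Each token is routed to its top-k experts.
        logits = self.gate(x)                            # (tokens, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)   # per-token expert selection
        weights = F.softmax(weights, dim=-1)             # renormalize over the chosen k
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                            # (tokens, top_k) bool
            token_ids, slot = mask.nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue                                 # expert received no tokens
            # topk indices are distinct per token, so token_ids has no duplicates here
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(x[token_ids])
        return out

# Usage: y = SimpleMoE(dim=512)(torch.randn(16, 512))
```

Production implementations such as fastmoe and Tutel differ mainly in how they batch tokens per expert and parallelize experts across devices; the routing math above is the common core.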