Github-Ranking-AI

A list of the most popular AI-topic repositories on GitHub, ranked by the number of stars they have received and updated automatically every day.
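A ranking like the one below can be reproduced with GitHub's public Search API (`GET /search/repositories` sorted by stars). The sketch that follows is illustrative only, not this repository's actual update script: the `topic:moe` query, the `requests` dependency, and the helper names `top_repos_by_topic` / `to_markdown_rows` are assumptions made for the example.

```python
import requests  # assumed dependency; any HTTP client would do


def top_repos_by_topic(topic: str, count: int = 100) -> list[dict]:
    """Fetch the most-starred repositories for a GitHub topic via the Search API."""
    resp = requests.get(
        "https://api.github.com/search/repositories",
        params={
            "q": f"topic:{topic}",  # e.g. "moe"; topic names are matched case-insensitively
            "sort": "stars",
            "order": "desc",
            "per_page": count,      # the Search API returns at most 100 items per page
        },
        headers={"Accept": "application/vnd.github+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["items"]


def to_markdown_rows(repos: list[dict]) -> str:
    """Render the ranking as a Markdown table in the column order used below."""
    lines = [
        "| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |",
        "| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |",
    ]
    for rank, repo in enumerate(repos, start=1):
        lines.append(
            "| {} | {} | {} | {} | {} | {} | {} | {} |".format(
                rank,
                repo["name"],
                repo["stargazers_count"],
                repo["forks_count"],
                repo["language"] or "None",
                repo["open_issues_count"],
                (repo["description"] or "None").replace("|", "\\|"),
                repo["pushed_at"],  # the "Last Commit" column maps to the API's pushed_at field
            )
        )
    return "\n".join(lines)


if __name__ == "__main__":
    print(to_markdown_rows(top_repos_by_topic("moe")))
```

Note that unauthenticated Search API requests are tightly rate-limited, so a daily automation job would normally pass a personal access token in the `Authorization` header.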


Github Ranking

Top 100 Stars in MoE

| Ranking | Project Name | Stars | Forks | Language | Open Issues | Description | Last Commit |
| ------- | ------------ | ----- | ----- | -------- | ----------- | ----------- | ----------- |
| 1 | LLaMA-Factory | 47566 | 5804 | Python | 425 | Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024) | 2025-04-24T08:31:06Z |
| 2 | sglang | 13526 | 1587 | Python | 513 | SGLang is a fast serving framework for large language models and vision language models. | 2025-04-24T23:09:59Z |
| 3 | trace.moe | 4651 | 237 | None | 0 | Anime Scene Search by Image | 2024-10-13T03:00:58Z |
| 4 | Bangumi | 4262 | 144 | TypeScript | 25 | :electron: An unofficial, UI-first https://bgm.tv app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit Douban-style ACG tracker and third-party bgm.tv client, redesigned for mobile with many built-in enhancements that are hard to implement on the web version, plus extensive customization options. Currently supports iOS / Android / WSA, mobile / simple pad layouts, light / dark themes, and the mobile web. | 2025-04-24T20:38:55Z |
| 5 | Moeditor | 4134 | 273 | JavaScript | 106 | (discontinued) Your all-purpose markdown editor. | 2020-07-07T01:08:32Z |
| 6 | MoeGoe | 2380 | 248 | Python | 27 | Executable file for VITS inference | 2023-08-22T07:17:37Z |
| 7 | Moe-Counter | 2242 | 235 | JavaScript | 6 | Moe counter badge with multiple themes! | 2025-02-06T06:16:00Z |
| 8 | MoeKoeMusic | 2176 | 144 | Vue | 22 | An open-source, concise, and aesthetically pleasing third-party KuGou client that supports Windows / macOS / Linux :electron: | 2025-04-22T09:31:34Z |
| 9 | MoE-LLaVA | 2151 | 134 | Python | 63 | Mixture-of-Experts for Large Vision-Language Models | 2024-12-03T09:08:16Z |
| 10 | MoBA | 1756 | 104 | Python | 6 | MoBA: Mixture of Block Attention for Long-Context LLMs | 2025-04-03T07:28:06Z |
| 11 | fastmoe | 1711 | 196 | Python | 26 | A fast MoE implementation for PyTorch | 2025-02-10T06:04:33Z |
| 12 | DeepSeek-MoE | 1659 | 277 | Python | 16 | DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models | 2024-01-16T12:18:10Z |
| 13 | OpenMoE | 1513 | 78 | Python | 5 | A family of open-source Mixture-of-Experts (MoE) large language models | 2024-03-08T15:08:26Z |
| 14 | paimon-moe | 1442 | 275 | JavaScript | 270 | Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter. | 2025-04-15T13:18:25Z |
| 15 | MOE | 1315 | 139 | C++ | 170 | A global, black-box optimization engine for real-world metric optimization. | 2023-03-24T11:00:32Z |
| 16 | mixture-of-experts | 1099 | 108 | Python | 5 | PyTorch re-implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538 | 2024-04-19T08:22:39Z |
| 17 | moemail | 1093 | 467 | TypeScript | 13 | A cute temporary email service built on the NextJS + Cloudflare stack 🎉 | 2025-04-06T13:01:19Z |
| 18 | Aria | 1029 | 87 | Jupyter Notebook | 31 | Codebase for Aria - an open multimodal native MoE | 2025-01-22T03:25:37Z |
| 19 | MoeTTS | 987 | 78 | None | 0 | Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, Hifigan, VITS and Diff-svc | 2023-03-03T07:30:05Z |
| 20 | llama-moe | 957 | 56 | Python | 5 | ⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024) | 2024-12-06T04:47:07Z |
| 21 | moepush | 939 | 157 | TypeScript | 4 | A cute message push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨ | 2025-04-20T04:19:32Z |
| 22 | moebius | 810 | 45 | JavaScript | 38 | Modern ANSI & ASCII art editor | 2024-05-02T15:54:35Z |
| 23 | Tutel | 808 | 96 | Python | 42 | Tutel MoE: an optimized Mixture-of-Experts library with DeepSeek FP8/FP4 support | 2025-04-24T15:58:58Z |
| 24 | Adan | 789 | 67 | Python | 3 | Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models | 2024-07-02T18:26:36Z |
| 25 | MixtralKit | 769 | 80 | Python | 12 | A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI | 2023-12-15T19:10:55Z |
| 26 | moe-theme.el | 759 | 63 | Emacs Lisp | 14 | A customizable, colorful, eye-candy theme for Emacs users. Moe, moe, kyun! | 2025-02-03T18:08:33Z |
| 27 | MoeMemosAndroid | 744 | 79 | Kotlin | 76 | An app to help you capture thoughts and ideas | 2025-02-15T07:11:32Z |
| 28 | UMOE-Scaling-Unified-Multimodal-LLMs | 717 | 44 | Python | 11 | Code for "Uni-MoE: Scaling Unified Multimodal Models with Mixture of Experts" | 2025-04-10T10:05:36Z |
| 29 | moe | 680 | 33 | Nim | 84 | A command-line editor inspired by Vim, written in Nim. | 2025-04-24T00:36:37Z |
| 30 | DeepSeek-671B-SFT-Guide | 660 | 81 | Python | 1 | An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as practical experience and conclusions. | 2025-03-13T03:51:33Z |
| 31 | vtbs.moe | 620 | 36 | Vue | 32 | Virtual YouTubers on bilibili | 2024-09-10T06:07:07Z |
| 32 | moedict-webkit | 619 | 99 | Objective-C | 102 | The MoeDict (萌典) website | 2025-04-09T19:08:43Z |
| 33 | Awesome-Mixture-of-Experts-Papers | 618 | 45 | None | 1 | A curated reading list of research on Mixture-of-Experts (MoE). | 2024-10-30T07:48:14Z |
| 34 | satania.moe | 616 | 57 | HTML | 3 | Satania IS the BEST waifu, no really, she is, if you don't believe me, this website will convince you | 2022-10-09T23:19:01Z |
| 35 | SmartImage | 609 | 27 | C# | 8 | Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more) | 2025-04-16T07:43:29Z |
| 36 | moebius | 607 | 43 | Elixir | 3 | A functional query tool for Elixir | 2024-10-23T18:55:45Z |
| 37 | Chinese-Mixtral | 604 | 44 | Python | 1 | Chinese Mixtral Mixture-of-Experts large language models (Chinese Mixtral MoE LLMs) | 2024-04-30T04:29:06Z |
| 38 | Time-MoE | 573 | 51 | Python | 12 | [ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts" | 2025-03-30T13:28:06Z |
| 39 | MoeGoe_GUI | 570 | 68 | C# | 8 | GUI for MoeGoe | 2023-08-22T07:32:08Z |
| 40 | moebooru | 547 | 83 | Ruby | 25 | Moebooru, a heavily modified fork of danbooru1 | 2025-04-22T06:38:32Z |
| 41 | MoeMemos | 546 | 47 | Swift | 60 | An app to help you capture thoughts and ideas | 2025-03-23T09:50:49Z |
| 42 | MoeList | 545 | 20 | Kotlin | 25 | Another unofficial Android MAL client | 2025-04-13T07:41:36Z |
| 43 | trace.moe-telegram-bot | 530 | 80 | JavaScript | 0 | This Telegram bot can identify the anime when you send it a screenshot | 2025-04-18T02:00:09Z |
| 44 | step_into_llm | 458 | 111 | Jupyter Notebook | 27 | MindSpore online courses: Step into LLM | 2025-01-06T01:50:19Z |
| 45 | moerail | 441 | 31 | JavaScript | 14 | Railway station code lookup × EMU trainset circulation lookup | 2023-02-27T03:37:18Z |
| 46 | MOE | 422 | 76 | Java | 18 | Make Opensource Easy - tools for synchronizing repositories | 2022-06-20T22:41:08Z |
| 47 | hydra-moe | 412 | 15 | Python | 10 | None | 2023-11-02T22:53:15Z |
| 48 | pixiv.moe | 364 | 43 | TypeScript | 0 | 😘 A Pinterest-style layout site that shows illustrations from pixiv.net ordered by popularity. | 2023-03-08T06:54:34Z |
| 49 | MoeLoaderP | 354 | 25 | C# | 11 | 🖼 Anime image downloader for booru sites, Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, safebooru, danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc. | 2025-04-15T15:58:36Z |
| 50 | WThermostatBeca | 354 | 70 | C++ | 3 | Open-source firmware replacement for Tuya WiFi thermostats from Beca and Moes, with Home Assistant autodiscovery | 2023-08-26T22:10:38Z |
| 51 | notify.moe | 351 | 45 | Go | 86 | :dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe | 2022-09-26T07:15:05Z |
| 52 | MOEAFramework | 336 | 127 | Java | 0 | A free and open-source Java framework for multiobjective optimization | 2025-04-15T13:10:59Z |
| 53 | st-moe-pytorch | 328 | 28 | Python | 4 | Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch | 2024-06-17T00:48:47Z |
| 54 | moe-sticker-bot | 328 | 35 | Go | 26 | A Telegram bot that imports LINE/Kakao stickers or creates and manages new sticker sets. | 2024-06-06T15:28:28Z |
| 55 | dialogue.moe | 325 | 8 | Python | 1 | None | 2022-12-14T14:50:38Z |
| 56 | DiT-MoE | 313 | 14 | Python | 5 | Scaling Diffusion Transformers with Mixture of Experts | 2024-09-09T02:12:12Z |
| 57 | moell-blog | 301 | 81 | PHP | 2 | A blog built on Laravel with Markdown support | 2022-07-31T11:51:54Z |
| 58 | moeSS | 298 | 107 | PHP | 11 | moe SS front end for https://github.com/mengskysama/shadowsocks/tree/manyuser | 2015-02-27T08:44:30Z |
| 59 | soft-moe-pytorch | 283 | 8 | Python | 4 | Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch | 2025-04-02T12:47:40Z |
| 60 | moe | 279 | 46 | Scala | 18 | An -OFun prototype of an Ultra Modern Perl 5 | 2013-09-27T18:39:18Z |
| 61 | Cornell-MOE | 269 | 63 | C++ | 25 | A Python library for state-of-the-art Bayesian optimization algorithms, with the core implemented in C++. | 2020-02-04T18:39:37Z |
| 62 | MoeSR | 268 | 8 | JavaScript | 7 | An application specialized in image super-resolution for ACGN illustrations and visual novel CG. | 2024-04-17T12:34:26Z |
| 63 | GRIN-MoE | 262 | 16 | None | 0 | GRadient-INformed MoE | 2024-09-25T18:46:48Z |
| 64 | android-app | 262 | 25 | Kotlin | 7 | Official LISTEN.moe Android app | 2025-04-24T22:46:48Z |
| 65 | parameter-efficient-moe | 255 | 16 | Python | 1 | None | 2023-10-31T19:21:15Z |
| 66 | MoeQuest | 251 | 76 | Java | 1 | A Material Design-style welfare app for meizi (girl) pictures. | 2017-02-14T14:13:53Z |
| 67 | MoeLoader-Delta | 244 | 37 | C# | 52 | An improved fork of MoeLoader | 2021-07-22T20:47:41Z |
| 68 | inferflow | 242 | 25 | C++ | 8 | Inferflow is an efficient and highly configurable inference engine for large language models (LLMs). | 2024-03-15T06:52:33Z |
| 69 | moeins | 241 | 68 | PHP | 2 | 萌音影视 - an online video streaming app | 2018-10-31T01:47:27Z |
| 70 | MoH | 237 | 10 | Python | 4 | MoH: Multi-Head Attention as Mixture-of-Head Attention | 2024-10-29T15:22:54Z |
| 71 | moebius | 232 | 4 | PHP | 4 | True coroutines for PHP >= 8.1 without worrying about event loops and callbacks. | 2022-06-08T23:18:45Z |
| 72 | gdx-pay | 228 | 87 | Java | 8 | A libGDX cross-platform API for in-app purchasing. | 2025-01-02T19:28:20Z |
| 73 | ModuleFormer | 220 | 11 | Python | 2 | ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based language models (MoLM) ranging in scale from 4 billion to 8 billion parameters. | 2024-04-10T18:16:32Z |
| 74 | MoE-Adapters4CL | 210 | 15 | Python | 4 | Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters" | 2024-11-17T05:47:00Z |
| 75 | MoE-plus-plus | 210 | 6 | Python | 0 | [ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts | 2024-10-16T06:21:31Z |
| 76 | fiddler | 208 | 20 | Python | 2 | [ICLR'25] Fast Inference of MoE Models with CPU-GPU Orchestration | 2024-11-18T00:25:45Z |
| 77 | moe | 203 | 22 | None | 1 | Misspelling Oblivious Word Embeddings | 2019-08-06T12:42:31Z |
| 78 | MoePhoto | 189 | 23 | Python | 5 | MoePhoto Image Toolbox (萌图工具箱) | 2024-09-23T06:35:27Z |
| 79 | Yuan2.0-M32 | 186 | 40 | Python | 6 | Mixture-of-Experts (MoE) language model | 2024-09-09T09:14:15Z |
| 80 | PlanMoE | 177 | 19 | Python | 5 | A repository aimed at accelerating the training of MoE models by offering a more efficient scheduling method. | 2025-02-14T01:39:37Z |
| 81 | MoE-Infinity | 171 | 13 | Python | 9 | A PyTorch library for cost-effective, fast, and easy serving of MoE models. | 2025-04-21T12:17:23Z |
| 82 | MOEAD | 166 | 42 | Python | 2 | MOEAD: a study of the multi-objective differential evolution algorithm, with a Python implementation and dynamic visualization of the process | 2022-06-22T02:07:23Z |
| 83 | ghost-theme-Moegi | 164 | 26 | Handlebars | 3 | An elegant & fresh Ghost theme. | 2023-10-16T16:09:28Z |
| 84 | MOELoRA-peft | 160 | 19 | Python | 4 | [SIGIR'24] The official implementation of MOELoRA. | 2024-07-22T07:32:43Z |
| 85 | CoE | 158 | 21 | Python | 0 | Chain of Experts (CoE) enables communication between experts within Mixture-of-Experts (MoE) models | 2025-04-24T13:20:30Z |
| 86 | MixLoRA | 157 | 16 | Python | 3 | State-of-the-art parameter-efficient MoE fine-tuning method | 2024-08-22T08:02:04Z |
| 87 | Frequency_Aug_VAE_MoESR | 155 | 4 | Python | 12 | Latent-based super-resolution using MoE and a frequency-augmented VAE decoder | 2023-11-26T10:33:36Z |
| 88 | moedict-desktop | 153 | 15 | C++ | 6 | MoeDict Desktop for macOS / Linux / Windows | 2016-10-14T06:49:17Z |
| 89 | guide.encode.moe | 151 | 20 | Markdown | 9 | A guide for fansubbing | 2024-04-14T10:27:37Z |
| 90 | awesome-moe-inference | 151 | 6 | None | 0 | A curated collection of papers on MoE model inference | 2025-02-19T07:45:29Z |
| 91 | Teyvat.moe | 150 | 42 | TypeScript | 102 | A flexible, community-driven interactive website for Genshin Impact. | 2021-08-02T01:43:13Z |
| 92 | moeda | 143 | 22 | JavaScript | 5 | :moneybag: :chart_with_upwards_trend: A CLI for foreign exchange rates and currency conversion | 2023-06-25T15:30:33Z |
| 93 | Ling | 143 | 14 | Python | 2 | Ling is a MoE LLM provided and open-sourced by InclusionAI. | 2025-04-18T23:15:00Z |
| 94 | awesome-adaptive-computation | 143 | 9 | None | 0 | A curated reading list of research in Adaptive Computation, Inference-Time Computation & Mixture of Experts (MoE). | 2025-01-01T13:49:09Z |
| 95 | MoeMusic | 142 | 52 | Java | 2 | A music management app based on the 萌否 (Moefou) website API | 2017-01-22T06:29:59Z |
| 96 | Parameter-Efficient-MoE | 142 | 18 | Python | 3 | Parameter-Efficient Sparsity Crafting From Dense to Mixture-of-Experts for Instruction Tuning on General Tasks | 2024-09-20T02:18:30Z |
| 97 | makegirlsmoe.github.io | 141 | 35 | CSS | 0 | MakeGirls.moe official blog | 2017-08-21T15:06:57Z |
| 98 | moedict-data | 141 | 28 | None | 0 | Data files for the Ministry of Education's Revised Mandarin Dictionary (重編國語辭典); please report suggestions or bugs in moedict-process | 2023-02-17T00:42:39Z |
| 99 | MoEx | 141 | 19 | Python | 1 | MoEx (Moment Exchange) | 2021-06-24T02:52:22Z |
| 100 | LLaVA-MoD | 140 | 10 | Python | 14 | [ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE Knowledge Distillation | 2025-03-31T09:41:38Z |