DEV Community: posts tagged #moe
Mixture of Experts (MoE)
Gideon Onyewuenyi · Jan 5 · 2 min read
Tags: #ai #agents #machinelearning #moe
Edge computing and AMD MI300X for training and inference: an alternative to NVIDIA
Camille Vingere · Nov 26 '25 · 7 min read
Tags: #amd #moe #ibm #tpu
Mixture of Experts Implementation using Granite4: Harnessing Specialization with the Latest Granite Family Model
Alain Airom · Oct 5 '25 · 11 min read · 5 reactions
Tags: #granite #ollama #llm #moe
🚀 LLMs are getting huge. But do we need all that firepower all the time?
Aleksei Aleinikov · Apr 11 '25 · 1 min read · 1 reaction
Tags: #ai #llm #machinelearning #moe
DBRX, Grok, Mixtral: Mixture-of-Experts is a trending architecture for LLMs
AI/ML API · Apr 11 '24 · 7 min read
Tags: #llm #moe #ai
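
The posts listed above all revolve around the Mixture-of-Experts architecture: a router sends each token to a small subset of "expert" sub-networks, so only a fraction of the model's parameters run per token. As a minimal illustrative sketch (plain NumPy, not taken from any of the listed posts; all names and sizes here are invented for illustration), a top-k MoE layer can be outlined as:

# Toy sketch of Mixture-of-Experts top-k routing.
# Illustrative only; sizes, names, and weights are made up, not from any listed article.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" is just a small weight matrix in this sketch.
experts = [rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(n_experts)]
# The router scores every token against every expert.
router_w = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_layer(tokens: np.ndarray) -> np.ndarray:
    """Route each token to its top_k experts and mix their outputs."""
    logits = tokens @ router_w                          # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)          # softmax over experts
    out = np.zeros_like(tokens)
    for i, token in enumerate(tokens):
        top = np.argsort(probs[i])[-top_k:]             # indices of the top_k experts
        weights = probs[i][top] / probs[i][top].sum()   # renormalize their scores
        for w, e in zip(weights, top):
            out[i] += w * (token @ experts[e])          # only top_k experts run per token
    return out

tokens = rng.standard_normal((5, d_model))
print(moe_layer(tokens).shape)   # (5, 8): same output shape, sparse compute per token

Real MoE LLMs such as Mixtral, DBRX, or Grok apply this idea inside each transformer block, with learned experts and load-balancing objectives; the sketch only shows the routing step that gives the architecture its sparsity.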