
mixtral

A collection of 1 post

nlp, mixtral

What is MoE? (Mixture of Experts)

What is MoE? (Mixture of Experts) GPT-4, currently the most powerful LLM, is reported to use the "MoE (Mixture of Experts)" approach, and recently Mistral AI, the hot topic of the AI world…