MiniMax to Release China's First MoE-Based Large Model
【数据猿 Digest】 MiniMax to release China's first large model built on the MoE architecture

On December 28, Wei Wei, vice president of MiniMax, a Chinese large-model startup, revealed at a sub-forum of the Digital China Forum and Digital Development Forum that the company will soon release China's first large model built on the MoE (Mixture of Experts) architecture, benchmarked against OpenAI's GPT-4. MoE, short for Mixture of Experts, is a deep learning technique that combines multiple expert sub-models into a single network in order to speed up training and achieve better predictive performance. A recent paper by researchers from Google, UC Berkeley, MIT, and other institutions demonstrates that combining MoE with instruction tuning can significantly improve the performance of large language models.
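To illustrate the idea behind MoE, the following is a minimal, illustrative sketch of a Mixture-of-Experts layer in PyTorch. It is not MiniMax's implementation or any specific published architecture; the class name, parameters, and top-k routing choice are assumptions made for the example. A gating network scores the experts for each token, only the top-k experts are run, and their outputs are combined using the gating weights.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    """Minimal Mixture-of-Experts layer (illustrative sketch, not a
    specific production design): a gating network routes each token to
    its top-k expert feed-forward networks and mixes their outputs."""

    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 4, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Each expert is an independent feed-forward sub-network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )
        # The gate produces a score for every expert, per token.
        self.gate = nn.Linear(d_model, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten tokens for routing.
        tokens = x.reshape(-1, x.shape[-1])
        scores = self.gate(tokens)                          # (n_tokens, n_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)  # keep top-k experts per token
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(tokens)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e                   # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)


if __name__ == "__main__":
    layer = MoELayer(d_model=64, d_hidden=256, n_experts=4, top_k=2)
    x = torch.randn(2, 10, 64)   # (batch, seq_len, d_model)
    print(layer(x).shape)        # torch.Size([2, 10, 64])
```

Because only the selected experts are evaluated for each token, total parameter count can grow with the number of experts while the compute per token stays roughly constant, which is the property that makes the approach attractive for large language models.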
Source: DIYuan
