Kunlun Wanwei open-sources Skywork-MoE, a 200-billion-parameter sparse model and the first that can run inference on a single RTX 4090 server
DIYuan | 2024-06-04 17:14
[DataYuan Digest] Kunlun Wanwei open-sources Skywork-MoE, a 200-billion-parameter sparse model and the first that can run inference on a single RTX 4090 server

June 3 news: according to Kunlun Wanwei's official Weibo account, the company has open-sourced Skywork-MoE, a 200-billion-parameter sparse mixture-of-experts (MoE) model that combines strong performance with lower inference cost. Skywork-MoE was expanded from an intermediate checkpoint of Kunlun's open-source Skywork-13B model and is the first open-source hundred-billion-parameter MoE model to fully apply MoE Upcycling in practice. It is also the first open-source hundred-billion-parameter MoE model that supports inference on a single server equipped with RTX 4090 GPUs.
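The MoE Upcycling mentioned above refers to initializing a sparse MoE model from a trained dense checkpoint: each expert in an MoE layer starts as a copy of the dense feed-forward weights, and a router then learns to send each token to a few experts. The announcement does not describe Skywork-MoE's internals, so the following is only a minimal, dependency-free sketch of the general idea; the expert count and top-2 routing are illustrative assumptions, not details from the release.

```python
import copy
import math

def upcycle_dense_to_moe(dense_ffn_weights, num_experts):
    """Initialize an MoE layer's experts as copies of a dense FFN's weights.

    dense_ffn_weights: any structure holding the dense checkpoint's FFN
    parameters (here a plain dict, for illustration).
    Each expert gets an independent deep copy so later training can
    differentiate the experts.
    """
    return [copy.deepcopy(dense_ffn_weights) for _ in range(num_experts)]

def top2_route(gate_logits):
    """Pick the two highest-scoring experts for one token and
    softmax-normalize their gate scores (common MoE routing scheme)."""
    idx = sorted(range(len(gate_logits)),
                 key=lambda i: gate_logits[i], reverse=True)[:2]
    exps = [math.exp(gate_logits[i]) for i in idx]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(idx, exps)]
```

At inference time, only the routed experts' weights are used for a given token, which is why a sparse model of this total size can have a far lower per-token compute cost than a dense model of equal parameter count.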
Source: DataYuan
