Tongyi Qianwen Open-Sources 100-Billion-Parameter-Class Model
【DataYuan Digest】 Tongyi Qianwen open-sources its 100-billion-parameter-class model

April 28 news: Tongyi Qianwen has open-sourced Qwen1.5-110B, a 110-billion-parameter model and the first open-source model in the series at the 100-billion-parameter scale. The 110B model retains the Transformer decoder architecture of the Qwen1.5 series and adopts grouped query attention (GQA), making inference more efficient. It supports a 32K context length and offers strong multilingual capability, covering Chinese, English, French, German, Spanish, Russian, Japanese, Korean, Vietnamese, Arabic, and other languages.
Source: DataYuan
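For readers who want to try the release, below is a minimal sketch (not taken from the announcement) of loading the model with the Hugging Face transformers library. The repository id Qwen/Qwen1.5-110B-Chat, the prompt, and the generation settings are assumptions for illustration only, and a model of this size must be sharded across multiple GPUs.

# Minimal sketch: load the released checkpoint with Hugging Face transformers.
# "Qwen/Qwen1.5-110B-Chat" is assumed to be the published chat variant;
# dtype and device settings are illustrative, not prescribed by the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen1.5-110B-Chat"  # assumed Hub id for the 110B chat model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # let transformers pick a suitable dtype
    device_map="auto",    # shard across available GPUs; 110B parameters need several
)

# Build a chat prompt with the model's chat template and generate a reply.
messages = [{"role": "user", "content": "Introduce yourself in French."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))

The chat-template call simply formats the conversation the way the model was trained to expect; the multilingual prompt here only echoes the announcement's claim of broad language coverage.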
