70-Billion-Parameter APUS Large Model 3.0 Lingli Officially Open-Sourced
【DataYuan Digest】The 70-billion-parameter APUS Large Model 3.0 Lingli has been officially open-sourced.

On February 7, APUS and the National Engineering Laboratory of Big Data System Computing Technology at Shenzhen University (hereinafter the "National Engineering Laboratory of Big Data") released the jointly trained Lingli (Linly-70B) Chinese large language model, officially open-sourcing it on GitHub. It is the first open-source model of APUS Large Model 3.0. APUS Large Model 3.0 Lingli scored 80.6 points on the C-Eval Chinese benchmark, surpassing GPT-4 in Chinese proficiency and ranking third among all participating models, a significant improvement over its open-source base model, LLaMA2-70B. According to the announcement, training took three months on the computing power of the APUS Zhengzhou Intelligent Computing Center, and the model's context length is currently set to 4K (roughly 8,000 to 10,000 Chinese characters).
Source: DataYuan (数据猿)