China Telecom to open-source a 100-billion-parameter large model within the year
【數(shù)據(jù)猿 Digest】China Telecom plans to open-source a large model at the 100-billion-parameter scale within the year.

On April 16, China Telecom open-sourced TeleChat-12B, the 12-billion-parameter version of its Xingchen ("Star") semantic large model. Compared with the 7B version open-sourced in January, the new release improves overall performance in content, capability, and application by about 30%, with gains of more than 40% in multi-turn reasoning and on safety questions. In addition, China Telecom plans to open-source a model at the 100-billion-parameter scale within the year.
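For readers who want to try the released checkpoint, below is a minimal sketch of how an open-weight model of this kind is typically loaded with the Hugging Face transformers library. The repository identifier "Tele-AI/TeleChat-12B" and the need for trust_remote_code are assumptions for illustration, not details confirmed by this article; consult China Telecom's official release page for the exact names.

# Minimal sketch: loading an open-sourced causal LM with Hugging Face transformers.
# The repo id below is an assumed identifier, not confirmed by the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Tele-AI/TeleChat-12B"  # assumption; check the official release for the real id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    trust_remote_code=True,   # assumed: many Chinese LLM releases ship custom modeling code
    device_map="auto",        # spread the 12B weights across available devices
)

prompt = "Introduce the TeleChat large language model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))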
Source: DIYuan