福模

Free Open-Source AI Model Downloads - Local AI Tool Resource Platform

Large Language Models

ERNIE Bot 4.5 Semantic Understanding AI Model - Chinese NLP Optimized

ERNIE Bot 4.5 is a semantic understanding model optimized specifically for Chinese NLP. It excels at Chinese language understanding, generation, and reasoning, and supports a wide range of application scenarios (a minimal usage sketch follows the download section below).

Tags: ERNIE Bot, Chinese NLP, Semantic Understanding, Language Model

File Size: 7.8 GB
Upload Date: 2025-02-13
Downloads: 18,900
Rating: 4.6/5.0

Download Resources

By downloading this resource, you agree to our Terms of Service and Privacy Policy
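
The listing above describes ERNIE Bot 4.5 as a locally downloadable model for Chinese understanding, generation, and reasoning. The snippet below is a minimal sketch of how such a checkpoint might be run after download, using the Hugging Face Transformers library; the local directory name, the assumption that the weights ship in a Transformers-compatible format, and the generation settings are all illustrative and not part of this listing.

    # Minimal sketch: running a locally downloaded Chinese LLM checkpoint with
    # Hugging Face Transformers. The path is hypothetical; point it at the
    # directory where the downloaded weights were extracted.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_DIR = "./models/ernie-bot-4.5"  # hypothetical local path

    tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_DIR,
        trust_remote_code=True,
        device_map="auto",  # place layers on available GPUs, fall back to CPU
    )

    prompt = "请用一句话说明语义理解在中文NLP中的作用。"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))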

Related Resources

PaLM-E Embodied AI Model - Multimodal Large Language Model

PaLM-E is a multimodal large language model that combines vision and language capabilities. It can carry out tasks in the physical world by coupling language understanding with perception.

Tags: PaLM-E, Embodied AI, Multimodal
22.3 GB | 2025-02-05
LLaMA 3 Chinese-Optimized Model - Multi-Turn Conversations and Code Generation

LLaMA 3 Chinese-optimized model supporting multi-turn conversations and code generation. Trained on augmented Chinese corpora, it offers strong conversational coherence and high code-generation accuracy, and is available in several parameter sizes to suit different needs (see the usage sketch after this entry).

Tags: LLaMA 3, Chinese Optimized, Conversation Model
32.4 GB | 2024-01-13
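
A rough sketch of the multi-turn conversation usage mentioned in the entry above, assuming the downloaded weights are in a Hugging Face Transformers format that provides a chat template; the local path and prompts are illustrative.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_DIR = "./models/llama-3-chinese"  # hypothetical local path

    tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
    model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, device_map="auto")

    # Multi-turn history: each completed turn is appended before the next call.
    messages = [{"role": "user", "content": "用Python写一个快速排序函数。"}]

    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=256)
    reply = tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)

    # Carry the assistant reply forward so the next turn sees the full context.
    messages.append({"role": "assistant", "content": reply})
    messages.append({"role": "user", "content": "请给这个函数加上类型注解。"})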

Chinchilla Large Language AI Model - Training Efficiency Optimization

Chinchilla is a language model optimized for training efficiency. It demonstrates compute-optimal scaling laws, showing how parameter count should be balanced against training-data size to train more efficiently (a worked example follows this entry).

Tags: Chinchilla, Large Language Model, Training Efficiency
25.6 GB | 2025-02-11
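
The entry above refers to the Chinchilla finding that, for a fixed compute budget, model size and training-data size should be scaled together. Below is a small illustrative calculation assuming the commonly cited rule of thumb of roughly 20 training tokens per parameter; the ratio is an approximation, not an exact result.

    # The ~20 training tokens per parameter ratio is a widely cited rule of thumb
    # derived from the Chinchilla work; it is used here only as an approximation.
    TOKENS_PER_PARAM = 20

    def chinchilla_optimal_tokens(num_params: float) -> float:
        """Approximate compute-optimal training-token count for a given model size."""
        return num_params * TOKENS_PER_PARAM

    for params in (7e9, 70e9):
        tokens = chinchilla_optimal_tokens(params)
        print(f"{params / 1e9:.0f}B parameters -> ~{tokens / 1e9:.0f}B training tokens")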