Qwen2.5 fails to run #71
Comments
Running ./run.sh --model qwen2.5-7b externally gives the same error.
The earlier problem was that the project was missing the cnpy library. The fix was to clone the cnpy repository (git clone https://github.com/rogersce/cnpy.git), clean out any previous build (rm -rf build), and then rebuild with CMake (mkdir build, cmake ..); see the sketch below.
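A minimal sketch of the cnpy build described above, assuming a standard out-of-source CMake build; the install prefix and the use of sudo are assumptions, not from the original report:

```bash
# Clone the cnpy repository (URL from the comment above)
git clone https://github.com/rogersce/cnpy.git
cd cnpy

# Clean any previous build, then configure and build out-of-source
rm -rf build
mkdir build
cd build
cmake ..            # optionally pass -DCMAKE_INSTALL_PREFIX=... for a custom prefix (assumption)
make
sudo make install   # system-wide install; adjust to your environment
```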
After the rebuild, a new error appeared:
[BMRT][get_bdc_cmd_len:249] FATAL:BMRT_ASSERT: 0
This error is strange; I have not run into it before. I checked the bm-smi version: it is 0.4.9 LTS, which is rather old, so switch to 0.5.0.
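A quick way to confirm the driver/runtime version before and after the upgrade; this is a sketch assuming a Debian-based SoC image where libsophon is installed as .deb packages (package names may differ on other images):

```bash
# Show board status and the SDK/driver version reported by the monitoring tool
bm-smi

# Cross-check the installed Sophon packages (assumes a dpkg-based image)
dpkg -l | grep -i sophon
```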
Standard example:
Environment:
SoC environment
transformers: 4.45.2
torch: 2.5.1
LLM-TPU: commit db27775 (HEAD -> main, origin/main, origin/HEAD)
Author: yi.chu <[email protected]>
Date: Mon Jan 13 20:50:07 2025 +0800
[language_model] run bmodel successfully in pcie
tpu-mlir: ? (1684x default; project pulled on December 14)
Driver version: 0.4.9 LTS
libsophon: #1 SMP Wed May 22 10:11:21 CST 2024
Path:
/workspace/LLM-TPU/models/Qwen2_5/python_demo
Steps:
Problem:
The model used is the official one, downloaded to another directory.
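For reference, a hedged sketch of pointing the demo at a bmodel stored in a different directory; the script name and flag names (pipeline.py, --model_path, --tokenizer_path, --devid) follow the usual LLM-TPU python_demo convention and are assumptions here, so verify them against the actual demo in /workspace/LLM-TPU/models/Qwen2_5/python_demo:

```bash
cd /workspace/LLM-TPU/models/Qwen2_5/python_demo

# Path to the officially released bmodel kept in another directory (placeholder path, not from the report)
MODEL=/path/to/models/qwen2.5-7b.bmodel

# Script and flag names are assumptions based on typical LLM-TPU python demos; check with --help
python3 pipeline.py --model_path "$MODEL" --tokenizer_path ./token_config --devid 0
```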