
build: Downgrade ORT to 1.19.2 until OpenVINO NPU EP issue is resolved #7945

Open · wants to merge 1 commit into main
Conversation

@rmccorm4 (Contributor) commented Jan 16, 2025

Downgrade to the working version (1.19.2, used in r24.12) for stable support of the ONNX Runtime OpenVINO Execution Provider until we address the issues with NPU support.

@rmccorm4 rmccorm4 requested review from mc-nv and yinggeh January 16, 2025 06:13
@yinggeh (Contributor) commented Jan 16, 2025

I am fine with downgrading the ORT version, but need @mc-nv to confirm.

@mc-nv (Contributor) commented Jan 16, 2025

I'm afraid that in r24.12 we have been using version 1.20.1:
https://github.com/triton-inference-server/server/blob/r24.12/build.py#L77
Container data:

$ docker run --rm -it nvcr.io/nvidia/tritonserver:24.12-py3 readelf -V /opt/tritonserver/backends/onnxruntime/libonnxruntime.so  | grep gnu.version_d -A3
Version definition section '.gnu.version_d' contains 2 entries:
 Addr: 0x00000000000058f0  Offset: 0x000058f0  Link: 4 (.dynstr)
  000000: Rev: 1  Flags: BASE  Index: 1  Cnt: 1  Name: libonnxruntime.so.1
  0x001c: Rev: 1  Flags: none  Index: 2  Cnt: 1  Name: VERS_1.20.1

I believe our fix for this issue was using the previous version of the models, based on OpenVINO 2023.3:
https://github.com/triton-inference-server/server/blob/r24.12/qa/common/gen_qa_model_repository#L58
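The readelf output above can be grepped to pull out just the linked ORT version. A minimal shell sketch (the sample line is copied from the container output above; the parsing itself is only an illustration, and in practice you would pipe the real readelf output in instead):

```shell
# Sketch: extract the ORT version from readelf's .gnu.version_d output.
# The sample line below is copied from the 24.12 container output above.
readelf_line='  0x001c: Rev: 1  Flags: none  Index: 2  Cnt: 1  Name: VERS_1.20.1'
version=$(printf '%s\n' "$readelf_line" | grep -o 'VERS_[0-9.]*' | sed 's/^VERS_//')
echo "$version"   # prints: 1.20.1
```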

@rmccorm4 (Contributor, Author) commented Jan 16, 2025

> I believe our fix for this issue was using the previous version of the models, based on OpenVINO 2023.3:
> https://github.com/triton-inference-server/server/blob/r24.12/qa/common/gen_qa_model_repository#L58

This is a runtime issue, not a model generation issue. It passed with the previous versions in build.py for r24.11 (also confirmed with a new pipeline, 22573279, for this PR, where all ONNX jobs pass), but failed when we updated ORT to 1.20 in r24.12.

I'm happy to keep/upgrade the ORT version if we can fix the OpenVINO EP issues, but I don't think we should keep it broken until we have clear bandwidth or steps to fix it.
