I followed the instructions with nvcr.io/nvidia/deepstream:5.1-21.02-triton, ran the change-dim step, and then ran the app, but I got this error:
2021-07-06 08:31:11.413549: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.2
I0706 08:31:12.450158 4142 metrics.cc:164] found 1 GPUs supporting NVML metrics
I0706 08:31:12.455850 4142 metrics.cc:173] GPU 0: NVIDIA Quadro P4000
I0706 08:31:12.456117 4142 server.cc:120] Initializing Triton Inference Server
I0706 08:31:12.534433 4142 server_status.cc:55] New status tracking for model 'centerface'
I0706 08:31:12.534802 4142 model_repository_manager.cc:680] loading: centerface:1
I0706 08:31:12.538328 4142 onnx_backend.cc:203] Creating instance centerface_0_0_gpu0 on GPU 0 (6.1) using model.onnx
I0706 08:31:13.018687 4142 model_repository_manager.cc:837] successfully loaded 'centerface' version 1
INFO: infer_trtis_backend.cpp:206 TrtISBackend id:1 initialized model: centerface
0:00:01.979633096 4142 0x562898ea3ef0 ERROR nvinferserver gstnvinferserver.cpp:362:gst_nvinfer_server_logger:<primary_gie> nvinferserver[UID 1]: Error in specifyBackendDims() <infer_trtis_context.cpp:124> [UID = 1]: failed to create trtis backend on model:centerface because tensor:input.1 input-dims is not correct
0:00:01.979667124 4142 0x562898ea3ef0 ERROR nvinferserver gstnvinferserver.cpp:362:gst_nvinfer_server_logger:<primary_gie> nvinferserver[UID 1]: Error in createNNBackend() <infer_trtis_context.cpp:228> [UID = 1]: failed to specify trtis backend input dims for model:centerface, nvinfer error:NVDSINFER_CONFIG_FAILED
I0706 08:31:13.020304 4142 model_repository_manager.cc:708] unloading: centerface:1
I0706 08:31:13.020987 4142 model_repository_manager.cc:816] successfully unloaded 'centerface' version 1
I0706 08:31:13.021180 4142 server.cc:179] Waiting for in-flight inferences to complete.
I0706 08:31:13.021192 4142 server.cc:194] Timeout 30: Found 0 live models and 0 in-flight requests
0:00:01.980676621 4142 0x562898ea3ef0 ERROR nvinferserver gstnvinferserver.cpp:362:gst_nvinfer_server_logger:<primary_gie> nvinferserver[UID 1]: Error in initialize() <infer_base_context.cpp:78> [UID = 1]: create nn-backend failed, check config file settings, nvinfer error:NVDSINFER_CONFIG_FAILED
0:00:01.980687041 4142 0x562898ea3ef0 WARN nvinferserver gstnvinferserver_impl.cpp:439:start:<primary_gie> error: Failed to initialize InferTrtIsContext
0:00:01.980691218 4142 0x562898ea3ef0 WARN nvinferserver gstnvinferserver_impl.cpp:439:start:<primary_gie> error: Config file path: /opt/nvidia/deepstream/deepstream-5.0/samples/trtis_model_repo/deepstream_triton_model_deploy/centerface/config/centerface.txt
0:00:01.980768517 4142 0x562898ea3ef0 WARN nvinferserver gstnvinferserver.cpp:460:gst_nvinfer_server_start:<primary_gie> error: gstnvinferserver_impl start failed
** ERROR: main:655: Failed to set pipeline to PAUSED
Quitting
ERROR from primary_gie: Failed to initialize InferTrtIsContext
Debug info: gstnvinferserver_impl.cpp(439): start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInferServer:primary_gie:
Config file path: /opt/nvidia/deepstream/deepstream-5.0/samples/trtis_model_repo/deepstream_triton_model_deploy/centerface/config/centerface.txt
ERROR from primary_gie: gstnvinferserver_impl start failed
Debug info: gstnvinferserver.cpp(460): gst_nvinfer_server_start (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInferServer:primary_gie
App run failed
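From the log, specifyBackendDims() rejects the dims declared for the input tensor input.1, so I assume the dimensions in the Triton config.pbtxt do not match (or are still dynamic relative to) the static shape baked into model.onnx after the change-dim step. For reference, here is a minimal sketch of the kind of input block such a config.pbtxt typically contains; the tensor name input.1 comes from the log above, but the 3x480x640 dims are only an example I assumed, and the output entries are omitted:

# Hypothetical sketch of centerface config.pbtxt (not the actual sample file).
# The dims are assumed example values; they must match the static shape
# exported into model.onnx by the change-dim step.
name: "centerface"
platform: "onnxruntime_onnx"
max_batch_size: 1
input [
  {
    name: "input.1"          # tensor name taken from the error log above
    data_type: TYPE_FP32
    dims: [ 3, 480, 640 ]    # assumed CHW example; dynamic (-1) dims here would trigger the dims error
  }
]
# output [...] entries omitted; they must match the ONNX graph as well.

If the dims in the actual config are still dynamic (-1) or disagree with the ONNX graph, that would be consistent with the specifyBackendDims failure shown above.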