Running ONNX on Jetson Nano

Good morning,
I am trying to run a program that uses an ONNX model on my Jetson Nano, but inference only runs on the CPU; the model never runs on the GPU.
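A quick way to confirm this is to ask ONNX Runtime which execution providers it can actually see (a minimal sketch; it assumes only that the standard onnxruntime Python package is importable):

```python
# Minimal check: ask ONNX Runtime which execution providers are available.
# Assumes only that the standard onnxruntime Python package is installed;
# the try/except lets the snippet degrade gracefully if it is not.
providers = []
try:
    import onnxruntime as ort
    providers = ort.get_available_providers()
    print("Available providers:", providers)
    if "CUDAExecutionProvider" not in providers:
        print("CUDAExecutionProvider is missing: this is a CPU-only build "
              "or the CUDA provider failed to load.")
except ImportError:
    print("onnxruntime is not installed in this environment.")
```

If only CPUExecutionProvider is listed, the GPU build either is not installed or cannot load its CUDA/cuDNN dependencies.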

I have followed the steps in the thread "Run onnx model on jetson nano", installing the different versions listed there, but now I am facing the problem below.

2025-06-18 17:45:31.919933066 [E:onnxruntime:Default, provider_bridge_ort.cc:1992 TryGetProviderInfo_CUDA] /home/yifanl/Documents/onnxruntime/onnxruntime/core/session/provider_bridge_ort.cc:1637 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: /usr/lib/aarch64-linux-gnu/libcudnn.so.8: version `libcudnn.so.8' not found (required by /home/user/.local/lib/python3.10/site-packages/onnxruntime/capi/libonnxruntime_providers_cuda.so)
2025-06-18 17:45:31.919989418 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:965 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Require cuDNN 8.* and CUDA 12.*. Please install all dependencies as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
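The error says libonnxruntime_providers_cuda.so was built against a cuDNN 8 that the dynamic loader cannot resolve. These commands show what is actually installed (the paths are typical for JetPack/aarch64 Ubuntu and may differ on your image):

```shell
# List the cuDNN libraries the dynamic loader knows about.
ldconfig -p | grep libcudnn || echo "no libcudnn registered with ldconfig"
# Report the CUDA toolkit version (location varies by JetPack release).
nvcc --version 2>/dev/null \
  || cat /usr/local/cuda/version.json 2>/dev/null \
  || echo "CUDA toolkit not found"
```

Comparing this output against the requirements page linked in the warning shows whether the installed cuDNN/CUDA pair matches what the onnxruntime wheel was built for.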

Can someone guide me on how to run the model on the GPU?

Hi,

Are you using an Orin Nano with JetPack 6.2?
If yes, please try installing the ONNX Runtime package shared in the link below:
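Once a matching build is installed, you can confirm that a session actually runs on the GPU with something like this (a sketch; "model.onnx" is a placeholder path, and the provider names are the standard ONNX Runtime ones):

```python
# Sketch: request the CUDA provider explicitly with a CPU fallback, then check
# which providers the session actually loaded. "model.onnx" is a placeholder
# path, not a file from this thread; the snippet skips cleanly if it is absent.
import os

session_providers = None
try:
    import onnxruntime as ort
    if os.path.exists("model.onnx"):
        sess = ort.InferenceSession(
            "model.onnx",
            providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
        )
        # get_providers() reports the providers the session actually loaded.
        session_providers = sess.get_providers()
        print("Session is running on:", session_providers)
    else:
        print("model.onnx not found; skipping session creation.")
except ImportError:
    print("onnxruntime is not installed in this environment.")
```

If "CUDAExecutionProvider" appears first in the session's provider list, the model is being placed on the GPU.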

Thanks.
