FileNotFoundError, although the file exists at the path mentioned
Here are the details.
I created a Docker container named taoyolov4 using the following command:
```
docker run --runtime=nvidia -it -p 7001:8001 -v /var/run/docker.sock:/var/run/docker.sock nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.5-py3 /bin/bash
```
I log in to the container using:
```
docker exec -it taoyolov4 bash
```
Following is the directory structure of my workspace:
```
/workspace/cv_samples_v1.3.0/yolo_v4/
    __init__.py
    specs
    tao-experiments
    yolo_v4.ipynb
```
**Set up env variables and map drives**

Setting up env variables for cleaner command line commands:

```python
import os

print("Please replace the variable with your key.")
%env KEY=ZGNpYXZ0NHE1czFmbDBlcGR0Z2RzOHJqcWw6NGZjMjUwMDMtN2QyNC00MzYzLTlhZDctOTA1MDM3YTUwYTMy
%env USER_EXPERIMENT_DIR=/workspace/cv_samples_v1.3.0/yolo_v4/tao-experiments/yolo_v4
%env DATA_DOWNLOAD_DIR=/workspace/cv_samples_v1.3.0/yolo_v4/tao-experiments/data

# Set this path if you don't run the notebook from the samples directory.
%env NOTEBOOK_ROOT=/workspace/cv_samples_v1.3.0/yolo_v4

# Please define this local project directory that needs to be mapped to the TAO docker session.
# The dataset is expected to be present in $LOCAL_PROJECT_DIR/data, while the results for the steps
# in this notebook will be stored at $LOCAL_PROJECT_DIR/yolo_v4
%env LOCAL_PROJECT_DIR=tao-experiments
os.environ["LOCAL_DATA_DIR"] = os.path.join(
    os.getenv("LOCAL_PROJECT_DIR", os.getcwd()), "data"
)
os.environ["LOCAL_EXPERIMENT_DIR"] = os.path.join(
    os.getenv("LOCAL_PROJECT_DIR", os.getcwd()), "yolo_v4"
)

# The sample spec files are present in the same path as the downloaded samples.
os.environ["LOCAL_SPECS_DIR"] = os.path.join(
    os.getenv("NOTEBOOK_ROOT", os.getcwd()), "specs"
)
%env SPECS_DIR=/workspace/cv_samples_v1.3.0/yolo_v4/specs

# Showing list of specification files.
!ls -rlt $LOCAL_SPECS_DIR
```
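To rule out a path problem on my side, I reasoned about where those variables actually point. Note that `LOCAL_PROJECT_DIR=tao-experiments` is a relative path, so it resolves against the notebook's working directory. A minimal sketch of that resolution (the helper `resolve_mounts` and its arguments are mine, not part of the notebook):

```python
import os

def resolve_mounts(local_project_dir: str, notebook_root: str) -> dict:
    """Resolve the host-side mount sources the way the notebook cell does.

    A relative LOCAL_PROJECT_DIR (e.g. "tao-experiments") resolves against
    the current working directory, which can silently point the mounts at
    the wrong place if the notebook is launched from elsewhere.
    """
    return {
        "project": os.path.abspath(local_project_dir),
        "data": os.path.abspath(os.path.join(local_project_dir, "data")),
        "specs": os.path.abspath(os.path.join(notebook_root, "specs")),
    }

# Example: show where the paths from the cell above actually land.
paths = resolve_mounts("tao-experiments", "/workspace/cv_samples_v1.3.0/yolo_v4")
for name, path in paths.items():
    print(f"{name}: {path} (exists: {os.path.isdir(path)})")
```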
Mapping the local directories to the TAO docker:

```python
import json

mounts_file = os.path.expanduser("~/.tao_mounts.json")

# Define the dictionary with the mapped drives
drive_map = {
    "Mounts": [
        # Mapping the data directory
        {
            "source": os.environ["LOCAL_PROJECT_DIR"],
            "destination": "/workspace/cv_samples_v1.3.0/yolo_v4/tao-experiments"
        },
        # Mapping the specs directory.
        {
            "source": os.environ["LOCAL_SPECS_DIR"],
            "destination": os.environ["SPECS_DIR"]
        },
    ]
}
```
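Presumably the notebook's next step writes that dictionary out so the `tao` launcher can read it. A minimal, self-contained sketch of that write (the absolute source paths here are hypothetical placeholders, not my real paths):

```python
import json
import os

# Where the `tao` launcher looks for its drive mappings.
mounts_file = os.path.expanduser("~/.tao_mounts.json")

# Stand-in for the drive_map dict defined in the cell above,
# with hypothetical absolute host paths as the sources.
drive_map = {
    "Mounts": [
        {"source": "/home/user/tao-experiments",
         "destination": "/workspace/cv_samples_v1.3.0/yolo_v4/tao-experiments"},
        {"source": "/home/user/cv_samples_v1.3.0/yolo_v4/specs",
         "destination": "/workspace/cv_samples_v1.3.0/yolo_v4/specs"},
    ]
}

# Persist the mapping; without this file the launcher container
# has no view of the host directories at all.
with open(mounts_file, "w") as mfile:
    json.dump(drive_map, mfile, indent=4)
```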
```
!tao yolo_v4 dataset_convert -d $SPECS_DIR/yolo_v4_tfrecords_kitti_train.txt \
                             -o $DATA_DOWNLOAD_DIR/training/tfrecords/train
```
**Error:**

```
2022-04-26 11:41:32,999 [INFO] root: Registry: ['nvcr.io']
2022-04-26 11:41:33,232 [INFO] tlt.components.instance_handler.local_instance: Running command in container: nvcr.io/nvidia/tao/tao-toolkit-tf:v3.21.11-tf1.15.5-py3
2022-04-26 11:41:33,467 [WARNING] tlt.components.docker_handler.docker_handler: Docker will run the commands as root. If you would like to retain your local host permissions, please add the "user":"UID:GID" in the DockerOptions portion of the "/root/.tao_mounts.json" file. You can obtain your users UID and GID by using the "id -u" and "id -g" commands on the terminal.
Using TensorFlow backend.
WARNING:tensorflow:Deprecation warnings have been disabled. Set TF_ENABLE_DEPRECATION_WARNINGS=1 to re-enable them.
Using TensorFlow backend.
Traceback (most recent call last):
  File "/root/.cache/bazel/_bazel_root/ed34e6d125608f91724fda23656f1726/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/yolo_v4/scripts/dataset_convert.py", line 18, in <module>
  File "/root/.cache/bazel/_bazel_root/ed34e6d125608f91724fda23656f1726/execroot/ai_infra/bazel-out/k8-fastbuild/bin/magnet/packages/iva/build_wheel.runfiles/ai_infra/iva/detectnet_v2/scripts/dataset_convert.py", line 110, in main
FileNotFoundError: [Errno 2] No such file or directory: '/workspace/cv_samples_v1.3.0/yolo_v4/specs/yolo_v4_tfrecords_kitti_train.txt'
2022-04-26 11:41:51,298 [INFO] tlt.components.docker_handler.docker_handler: Stopping container.
```
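My understanding is that the `tao` launcher starts its own container, so the spec path in the error only exists there if a mount in `~/.tao_mounts.json` covers it. To check my reasoning, I sketched the reverse mapping from the failing container path to the host path it would come from (the helper `host_path_for` and the absolute source paths are mine, for illustration only):

```python
def host_path_for(mounts: dict, container_path: str):
    """Map a container path back to the host path via the Mounts list.

    Returns None when no mount covers the path, in which case the file
    cannot be visible inside the launcher container at all.
    """
    for m in mounts.get("Mounts", []):
        dest = m["destination"].rstrip("/")
        if container_path == dest or container_path.startswith(dest + "/"):
            return m["source"].rstrip("/") + container_path[len(dest):]
    return None

# Mount map mirroring the cell above, with hypothetical host-side sources.
mounts = {
    "Mounts": [
        {"source": "/home/user/tao-experiments",
         "destination": "/workspace/cv_samples_v1.3.0/yolo_v4/tao-experiments"},
        {"source": "/home/user/cv_samples_v1.3.0/yolo_v4/specs",
         "destination": "/workspace/cv_samples_v1.3.0/yolo_v4/specs"},
    ]
}
spec = "/workspace/cv_samples_v1.3.0/yolo_v4/specs/yolo_v4_tfrecords_kitti_train.txt"
print(host_path_for(mounts, spec))
```

So the spec file should be found on the host at whatever `host_path_for` returns; if that host file is missing, or the mounts file was never written, the error above is what I would expect to see.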