This repository contains a minimal setup for an Ackermann drive robot using the nav2 stack on ROS 2 Foxy. It provides launch files and parameters to simulate the vehicle in Gazebo and autonomously navigate around a map.
The project aims to demonstrate the capabilities of the ROS 2 navigation stack with an Ackermann steering model, which is commonly used in autonomous vehicles.
At first, we considered using the AWS DeepRacer platform, but it was not compatible with ROS 2 Foxy, so we built a custom Ackermann robot model that works with the nav2 stack.
We then looked for an Ackermann steering controller in the official ROS 2 Gazebo / Nav2 packages, but there was no out-of-the-box solution for Ackermann steering in ROS 2 Foxy.
We therefore implemented our own odometry publisher and control bridge to handle the Ackermann steering kinematics, allowing the robot to navigate effectively in simulated environments.
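The math behind such an odometry publisher is the standard bicycle model. The sketch below is illustrative only (the function name and the 0.3 m wheelbase are assumptions, not values taken from this repository):

```python
import math

def integrate_ackermann_odometry(x, y, yaw, v, steering_angle, dt, wheelbase=0.3):
    """Advance a bicycle-model pose estimate by one time step.

    x, y, yaw      -- current pose in the odom frame
    v              -- longitudinal speed of the rear axle (m/s)
    steering_angle -- front-wheel steering angle (rad)
    dt             -- time step (s)
    wheelbase      -- distance between axles (m); illustrative value
    """
    # Bicycle-model yaw rate: omega = v * tan(delta) / L
    omega = v * math.tan(steering_angle) / wheelbase
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += omega * dt
    return x, y, yaw

# Driving straight for one second at 1 m/s moves the robot about 1 m along x.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_ackermann_odometry(*pose, v=1.0, steering_angle=0.0, dt=0.01)
print(pose)  # roughly (1.0, 0.0, 0.0)
```

A real publisher would run this update at the control rate and publish the result as `nav_msgs/Odometry` plus the `odom` → `base_link` TF.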
- `src/ackermann_robot_description` – URDF model and Gazebo launch files for the robot
- `src/ackermann_control_bridge` – node that converts `/cmd_vel` into steering and wheel commands
- `src/ackermann_robot_navigation` – navigation configuration and launch files

The robot description now includes a 2D LIDAR publishing `/scan` and a depth camera publishing `/depth/image_raw` in Gazebo.
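The conversion performed by the control bridge can be approximated with bicycle-model inverse kinematics. The following is a plain-Python sketch, not the repository's actual code (the function name, wheelbase, and steering limit are assumptions):

```python
import math

def cmd_vel_to_ackermann(linear_x, angular_z, wheelbase=0.3, max_steer=0.6):
    """Convert a Twist-style command into (speed, steering angle).

    For a bicycle model, angular_z = linear_x * tan(delta) / wheelbase,
    so the steering angle is delta = atan(wheelbase * angular_z / linear_x).
    """
    if abs(linear_x) < 1e-6:
        # An Ackermann vehicle cannot turn in place; stop the wheels.
        return 0.0, 0.0
    steer = math.atan(wheelbase * angular_z / linear_x)
    # Respect the mechanical steering limit.
    steer = max(-max_steer, min(max_steer, steer))
    return linear_x, steer
```

The zero-speed branch matters in practice: nav2's default controllers can emit pure rotations, which an Ackermann platform has to either reject or approximate with forward/backward arcs.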
You can find detailed documentation for each package in the `docs` directory:
- ackermann_robot_description – Overview of the Ackermann robot structure and URDF
- ackermann_bridge – Overview of the control bridge node and its functionality
- ackermann_control_bridge – Custom odometry publisher for Ackermann steering
- ackermann_robot_navigation – Instructions for mapping and navigation setup
The nav2 stack is a collection of ROS 2 nodes that provide navigation capabilities:
- map_server – loads an occupancy grid map for global planning.
- amcl – localizes the robot using Adaptive Monte Carlo Localization.
- planner_server – computes paths from the current pose to a goal pose.
- controller_server – follows the global plan by generating velocity commands.
- bt_navigator – runs a Behavior Tree (BT) that sequences high‑level navigation behaviors.
- recoveries_server – executes recovery behaviors when navigation fails.
- nav2_lifecycle_manager – manages node startup and shutdown.
The behavior tree used here is `navigate_w_replanning_and_recovery.xml`, which replans the path at 1 Hz and performs clearing, spinning, waiting, and backing up when navigation fails.
The entire navigation stack, along with the robot simulation and visualization, can be launched using the `bringup_all.launch.py` file:

```bash
ros2 launch ackermann_robot_navigation bringup_all.launch.py
```
This will:
- Launch the Gazebo simulation with the Ackermann robot.
- Start the control bridge and odometry nodes.
- Bring up the nav2 stack (map server, AMCL, planner, controller, etc.).
- Open RViz2 for visualization.
Before building the workspace, install all required apt packages:

```bash
bash install_ros2_foxy_dependencies.sh
```
From the repository root:

```bash
source /opt/ros/foxy/setup.bash  # make sure ROS 2 Foxy is installed
colcon build --symlink-install
```
- Start Gazebo and the robot:

  ```bash
  ros2 launch ackermann_robot_navigation bringup_all.launch.py
  ```

- Start the nav2 stack (in another terminal):

  ```bash
  ros2 launch ackermann_robot_navigation nav2_bringup.launch.py
  ```
The robot will read the map in `src/ackermann_robot_navigation/maps/ackermann_steering_map.yaml`, and you can send goals using RViz2 or the `/navigate_to_pose` action.
The LiDAR and depth camera data are available on `/scan` and `/depth/image_raw`.
You can send navigation goals using RViz2 or directly via the command line (note that the goal wraps a `PoseStamped`, so the pose must be nested inside a `header`/`pose` pair):

```bash
ros2 action send_goal /navigate_to_pose nav2_msgs/action/NavigateToPose "{pose: {header: {frame_id: map}, pose: {position: {x: 1.0, y: 1.0, z: 0.0}, orientation: {w: 1.0}}}}"
```
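The `orientation` field is a quaternion; `{w: 1.0}` means a yaw of zero. For a goal with a different heading, the quaternion for a pure yaw rotation can be computed with a small helper (plain Python, not part of this repository):

```python
import math

def yaw_to_quaternion(yaw):
    """Return (x, y, z, w) for a rotation of `yaw` radians about the z axis."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# A goal facing 90 degrees to the left:
x, y, z, w = yaw_to_quaternion(math.pi / 2)
print(f"orientation: {{z: {z:.3f}, w: {w:.3f}}}")  # orientation: {z: 0.707, w: 0.707}
```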
- Visualize in RViz2 (in another terminal):

  ```bash
  ros2 run rviz2 rviz2 -d src/ackermann_robot_navigation/rviz/nav2_default_view.rviz
  ```
- Send goals using the RViz2 interface or command line as shown above.
- Monitor the robot's state in RViz2, where you can see the robot's position, planned path, and sensor data.