A simulated mecanum robot with a 3D LiDAR sensor for testing, recording and processing LiDAR point cloud data in ROS2.
A 3D LiDAR sensor attached to an initial robot simulation from a Udemy tutorial produced blurred and distorted point clouds in RViz, even after fusing IMU and odometry with an EKF from robot_localization. To get a cleaner baseline, the simulation was rebuilt around a Yahboom mecanum robot from another tutorial, integrating Nav2 for navigation and robot_localization for EKF‑based sensor fusion.
Several approaches were tried to minimise point cloud distortion in RViz:
Calibrating the EKF: tuning noise covariances and adjusting how much odom vs IMU contributed to the robot's yaw estimate (a sample configuration is sketched after this list).
Switching between odom and the Nav2 map as the world reference, and later using Nav2's amcl_pose for localisation while still relying on odom.
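To give a feel for the kind of EKF tuning involved, here is a hypothetical excerpt of a robot_localization ekf.yaml. The parameter names follow that package's documented schema, but the topic names, frame names, and the choice of which states to fuse from each sensor are illustrative assumptions, not this project's exact configuration:

```yaml
# Hypothetical ekf.yaml excerpt for robot_localization (values illustrative only)
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true           # planar robot: ignore z, roll, pitch
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom          # fuse in odom; Nav2/AMCL supplies map -> odom

    # Each *_config array selects which states to fuse, in this order:
    # x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
    odom0: /odom
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    imu0: /imu/data
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  false, false, false]
```

Shifting the balance between odom and IMU yaw amounts to toggling the yaw and yaw-velocity entries in these arrays and adjusting the sensors' covariances.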
Despite these changes, point clouds consistently fell out of sync whenever the Yahboom robot turned, then snapped back into place once it translated. That observation suggested filtering out point clouds during rotation and recording only during translation (a rough sketch of that idea follows), but a better systemic fix was still needed.
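A minimal sketch of that rotation-gating idea, assuming clouds arrive on a `points` topic and yaw rate is read from `odom` (both topic names and the threshold are hypothetical): the node simply drops clouds while the robot is turning faster than a configured rate.

```cpp
// Hypothetical sketch: forward LiDAR clouds only while the robot is not rotating.
#include <cmath>
#include "rclcpp/rclcpp.hpp"
#include "sensor_msgs/msg/point_cloud2.hpp"
#include "nav_msgs/msg/odometry.hpp"

class RotationGate : public rclcpp::Node
{
public:
  RotationGate() : Node("rotation_gate")
  {
    // Yaw rate (rad/s) above which incoming clouds are discarded (assumed value).
    declare_parameter("max_yaw_rate", 0.05);
    cloud_pub_ = create_publisher<sensor_msgs::msg::PointCloud2>("points_filtered", 10);
    odom_sub_ = create_subscription<nav_msgs::msg::Odometry>(
      "odom", 10,
      [this](nav_msgs::msg::Odometry::ConstSharedPtr msg) {
        yaw_rate_ = msg->twist.twist.angular.z;  // track the latest yaw rate
      });
    cloud_sub_ = create_subscription<sensor_msgs::msg::PointCloud2>(
      "points", rclcpp::SensorDataQoS(),
      [this](sensor_msgs::msg::PointCloud2::ConstSharedPtr msg) {
        // Republish the cloud only while the robot is (nearly) not turning.
        if (std::abs(yaw_rate_) < get_parameter("max_yaw_rate").as_double()) {
          cloud_pub_->publish(*msg);
        }
      });
  }

private:
  double yaw_rate_{0.0};
  rclcpp::Publisher<sensor_msgs::msg::PointCloud2>::SharedPtr cloud_pub_;
  rclcpp::Subscription<nav_msgs::msg::Odometry>::SharedPtr odom_sub_;
  rclcpp::Subscription<sensor_msgs::msg::PointCloud2>::SharedPtr cloud_sub_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<RotationGate>());
  rclcpp::shutdown();
  return 0;
}
```

This treats the symptom rather than the cause, which is why the control-side fix described below ended up being the real solution.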
Experimenting with SLAM Toolbox for real‑time mapping revealed that the control profile was a major source of distortion: with the default ROS2 teleop keyboard controls, the robot's sudden, jerky motions produced distorted point clouds, while a separate script that applied acceleration and deceleration generated noticeably cleaner maps.
The rotation distortion issue was ultimately resolved by building SLAM‑friendly controls:
Instead of step‑changes in velocity, the custom controller script applies acceleration and deceleration to all movement, including rotation (see the sketch after this list).
This produces smoother, more predictable trajectories that SLAM can track reliably, resulting in consistent point cloud alignment during both translation and rotation.
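A minimal sketch of such an acceleration-limited controller, assuming raw teleop commands arrive on `cmd_vel_raw` and the smoothed output goes to `cmd_vel` (the topic names and acceleration limits are illustrative, not the project's actual values):

```cpp
// Hypothetical sketch: ramp commanded velocities instead of step-changing them.
#include <algorithm>
#include <chrono>
#include "rclcpp/rclcpp.hpp"
#include "geometry_msgs/msg/twist.hpp"

using namespace std::chrono_literals;

class SmoothController : public rclcpp::Node
{
public:
  SmoothController() : Node("smooth_controller")
  {
    target_sub_ = create_subscription<geometry_msgs::msg::Twist>(
      "cmd_vel_raw", 10,
      [this](geometry_msgs::msg::Twist::ConstSharedPtr msg) { target_ = *msg; });
    cmd_pub_ = create_publisher<geometry_msgs::msg::Twist>("cmd_vel", 10);
    // 50 Hz update loop; each tick moves the output toward the target
    // by at most accel * dt, so velocity never jumps.
    timer_ = create_wall_timer(20ms, [this]() { update(); });
  }

private:
  // Move `current` toward `goal` by at most `step` per tick.
  static double ramp(double current, double goal, double step)
  {
    return current + std::clamp(goal - current, -step, step);
  }

  void update()
  {
    const double dt = 0.02;        // timer period in seconds
    const double lin_accel = 0.5;  // m/s^2, assumed limit
    const double ang_accel = 1.0;  // rad/s^2, assumed limit
    cmd_.linear.x  = ramp(cmd_.linear.x,  target_.linear.x,  lin_accel * dt);
    cmd_.linear.y  = ramp(cmd_.linear.y,  target_.linear.y,  lin_accel * dt);  // mecanum strafing
    cmd_.angular.z = ramp(cmd_.angular.z, target_.angular.z, ang_accel * dt);
    cmd_pub_->publish(cmd_);
  }

  geometry_msgs::msg::Twist target_;
  geometry_msgs::msg::Twist cmd_;
  rclcpp::Subscription<geometry_msgs::msg::Twist>::SharedPtr target_sub_;
  rclcpp::Publisher<geometry_msgs::msg::Twist>::SharedPtr cmd_pub_;
  rclcpp::TimerBase::SharedPtr timer_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<SmoothController>());
  rclcpp::shutdown();
  return 0;
}
```

Because the per-tick velocity change is bounded, heading changes become gradual enough for scan matching to keep up, which is what restores point cloud alignment during rotation.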
point cloud lab packages this setup into a reusable ROS2 project:
3D LiDAR scanner integration on a simulated Yahboom mecanum robot.
RViz configuration to visualise both point clouds and SLAM output.
An optimised EKF configuration for fusing IMU and odometry.
A Gazebo world with perimeter walls and obstacles for navigation and mapping tests.
A custom controller script with acceleration and velocity profiles suitable for localisation and SLAM.
Docker containerisation with GUI support, plus a headless mode for running Gazebo without a full desktop session.
This project incorporates elements from the yahboom_rosmaster project by Automatic Addison, licensed under the BSD 3‑Clause License, with additional integration, configuration and control logic layered on top.
Tech stack: C++, ROS2, Gazebo, RViz2, SLAM Toolbox, robot_localization (EKF), Docker