Sensor Fusion Localization
Precise trash positioning by fusing computer vision detections with LiDAR depth cloud data.
How it works
Our sensor fusion algorithm combines 2D object detection from the camera with the 2D point cloud produced by the LiDAR. By projecting each 2D bounding box's angular extent into the LiDAR's 2D scan plane, we can accurately determine the distance and position of waste objects relative to the vehicle, even in challenging lighting conditions or complex environments.
This approach eliminates the ambiguity of single-sensor systems and provides robust localization data for the navigation stack and the robotic arm.
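As a rough illustration of the first step of such a projection, the sketch below maps a detection's horizontal pixel extent to bearing angles in the LiDAR's scan plane using a pinhole camera model. The function name and the intrinsics (`fx`, `cx`) are hypothetical placeholders, not values from the project's actual calibration.

```python
import math

def bbox_to_bearings(x_min: float, x_max: float, fx: float, cx: float):
    """Convert a bounding box's horizontal pixel extent to bearing angles (rad).

    Assumes a pinhole camera with focal length fx and principal point cx,
    and a camera whose optical axis is aligned with the LiDAR's zero-angle ray.
    """
    # atan2(pixel offset, focal length) gives the bearing of each box edge
    ang_left = math.atan2(x_min - cx, fx)
    ang_right = math.atan2(x_max - cx, fx)
    return ang_left, ang_right
```

A box centered on the principal point yields a symmetric angular window around zero, which can then be matched against the LiDAR scan's angle range.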
Technical Implementation
- Sensors: LD06 LiDAR & RGB Camera
- Algorithm: Projection of 2D bounding boxes onto the 2D LiDAR point cloud
- Output: 2D Coordinates (X, Y) of the target object
- Framework: ROS2 Node with custom fusion logic
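The fusion step itself can be sketched as follows: keep only the LiDAR returns whose bearing falls inside the bounding box's angular window, then reduce them to a single (X, Y) estimate. This is a minimal standalone sketch, not the project's actual ROS2 node; the function name, the median-based range estimate, and the `(angle, range)` scan format are assumptions for illustration.

```python
import math
from statistics import median

def localize_target(scan, ang_min, ang_max):
    """Estimate the target's (X, Y) position in the LiDAR frame.

    scan: iterable of (angle_rad, range_m) pairs, e.g. unpacked from a
    LaserScan message. Returns None if no valid return lies in the window.
    """
    hits = [(a, r) for a, r in scan if ang_min <= a <= ang_max and r > 0.0]
    if not hits:
        return None
    # The median range is robust to stray returns from the background
    r_med = median(r for _, r in hits)
    a_mid = 0.5 * (ang_min + ang_max)
    return r_med * math.cos(a_mid), r_med * math.sin(a_mid)
```

In a ROS2 node this logic would sit in the scan callback, with the angular window supplied by the camera detection pipeline and the resulting coordinates published for the navigation stack and the robotic arm.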
Explore the Code
Check out the implementation details, ROS2 nodes, and configuration files on our GitHub repository.
View on GitHub