Duc Anh V. Trinh1 and Nguyen Truong Thinh2, 1University of South Florida, USA, 2Institute of Intelligent and Interactive Technologies, Vietnam
In this paper, we develop an algorithm that combines odometry with LiDAR (Light Detection and Ranging) input to perform localization and 3D mapping inside a swiftlet house model. The positions of the walls in the swiftlet house, used to calibrate the LiDAR data, are obtained beforehand, and the robot superimposes the LiDAR map and the detected swiftlet nests onto the provided global map of the house. The LiDAR generates a 2D map from point clouds using its 360-degree scan angle; it is mounted on a 1-DOF arm driven by a stepper motor, so that varying the scan height yields a 3D map built from stacked 2D layers. Swiftlet nests are detected by distinguishing their distinctive shape from the planar concrete walls, recorded by the robot, and monitored until they are harvested. On power-up, the robot can localize itself in the global map as long as the calibrating wall appears in a single scan. Building on this, we propose a bird-nest-oriented Simultaneous Localization and Mapping (SLAM) system that builds a map of the nests on the wooden frames of swiftlet houses: the system takes 3D point clouds reconstructed by a feature-based SLAM system and detects nests through segmentation and shape estimation. We evaluate the robot's functionality in a swiftlet-cell model with scanned nests; experiments show that the system reproduces the shape and size of the nests with high accuracy.
Intelligent systems, Recognition, LiDAR, Bird's nest, Monitoring system, SLAM, Identification system
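The 3D mapping step described in the abstract, stacking 2D LiDAR layers captured at different arm heights into one point cloud, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names, the fixed angular resolution, and the assumption that the stepper-set height is known exactly are all simplifications introduced here.

```python
import math

def scan_to_points(ranges, z, angle_step_deg=1.0):
    """Convert one 360-degree 2D LiDAR scan (range readings in metres,
    indexed by beam angle) taken at arm height z into 3D points (x, y, z).
    Invalid returns (non-positive ranges) are dropped."""
    points = []
    for i, r in enumerate(ranges):
        if r is None or r <= 0:
            continue
        theta = math.radians(i * angle_step_deg)
        points.append((r * math.cos(theta), r * math.sin(theta), z))
    return points

def build_3d_map(scans_by_height):
    """Stack the 2D layers, one scan per stepper-motor height,
    into a single 3D point cloud."""
    cloud = []
    for z, ranges in sorted(scans_by_height.items()):
        cloud.extend(scan_to_points(ranges, z))
    return cloud
```

In a real system the robot pose from odometry/SLAM would also be applied to transform each layer into the global map frame before stacking; that transform is omitted here for brevity.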