Stereo-camera Occupancy Grid Mapping

Open Access
- Author:
- Chien, Wen Yu
- Graduate Program:
- Aerospace Engineering
- Degree:
- Master of Science
- Document Type:
- Master Thesis
- Date of Defense:
- June 29, 2020
- Committee Members:
- Eric Norman Johnson, Thesis Advisor/Co-Advisor
- Jacob Willem Langelaan, Committee Member
- Amy Ruth Pritchett, Program Head/Chair
- Keywords:
- 3D occupancy grid mapping
- stereo camera
- RealSense
- SLAM
- UAV
- inverse sensor model
- Abstract:
- Unmanned aerial vehicles (UAVs) are widely used to explore dangerous or unknown areas, e.g., battlefields and collapsed buildings. The risk of operating in these complex environments is high due to the difficulty of performing precision mapping for collision prevention. The primary reason is vehicle hardware limitations: the performance of the onboard computer and sensors is usually restricted by the maximum vehicle weight. Recently, the rise of mobile devices has made miniature sensors, e.g., cameras, Global Positioning System (GPS) receivers, and inertial measurement units (IMUs), much cheaper and lighter. This trend also benefits UAVs, making outdoor flight easier and safer. Nevertheless, in GPS-denied environments such as forests or indoor spaces, UAVs once again encounter challenges due to weight restrictions, the lack of GPS data, and the complexity of those environments. Unlike large UAVs, micro air vehicles (MAVs), which are more suitable for these complex environments, are restricted by their payload capacity. Laser scanners, which are commonly used on large UAVs, are too heavy for MAVs. Therefore, a lightweight camera system is a better solution to enable these vehicles to fly in GPS-denied, complex areas. This study presents a framework that enables a drone to map an unknown, complex, GPS-denied environment by integrating two Intel RealSense cameras, which provide the vehicle with mapping and localization capability. The specific objectives of this research are as follows: reconstruct a 3D occupancy map by extracting depth data from the RealSense depth camera, obtain pose data from the RealSense tracking camera to localize the vehicle and to reference the extracted point clouds, all while running these processes in real time on the onboard Intel NUC computer. An occupancy grid map is used to store the environment information from the cameras and reduce memory usage. The inverse sensor model of the RealSense depth camera is studied and implemented based on the camera's specification sheets. The performance of two different ray tracing methods, line drawing and voxel ray traversal, is also studied. Furthermore, a map shifting function is developed to allow the 3D map to move as the vehicle moves.
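
The occupancy map described above is commonly stored as per-voxel log-odds values that are updated additively with an inverse sensor model, which keeps each update cheap and numerically stable. The sketch below illustrates that general technique rather than the thesis's implementation: the resolution, hit/miss probabilities, clamping bounds, and the sparse dictionary storage are assumed placeholder choices, not the parameters derived from the RealSense spec sheets.

```python
import numpy as np

# Assumed, illustrative parameters (not values from the thesis):
RES = 0.10                    # grid resolution in meters
L_OCC = np.log(0.7 / 0.3)     # log-odds increment for the cell containing the depth return
L_FREE = np.log(0.3 / 0.7)    # log-odds decrement for cells the ray passes through
L_MIN, L_MAX = -2.0, 3.5      # clamping bounds so stale cells can still be revised later

def world_to_index(point, origin):
    """Map a world-frame point (meters) to integer voxel indices."""
    return tuple(np.floor((np.asarray(point) - np.asarray(origin)) / RES).astype(int))

def update_ray(log_odds, free_cells, hit_cell):
    """Inverse-sensor-model update along one depth ray: cells between the
    camera and the measured depth become freer, the end-point cell becomes
    more occupied. `log_odds` is a dict keyed by voxel index (sparse storage)."""
    for c in free_cells:
        log_odds[c] = np.clip(log_odds.get(c, 0.0) + L_FREE, L_MIN, L_MAX)
    log_odds[hit_cell] = np.clip(log_odds.get(hit_cell, 0.0) + L_OCC, L_MIN, L_MAX)

def occupancy_probability(l):
    """Convert a stored log-odds value back to an occupancy probability."""
    return 1.0 / (1.0 + np.exp(-l))
```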
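
Of the two ray tracing methods compared in the thesis, voxel ray traversal steps through exactly the cells a depth ray crosses instead of sampling points along it. The following is a minimal sketch in the style of the Amanatides and Woo traversal; the cell size, the tie handling at voxel corners, and the fixed step bound are assumptions made for illustration.

```python
import math

def voxel_traversal(start, end, res=0.10):
    """List the grid voxels crossed by the segment from `start` to `end`
    (both 3D points in meters); `res` is an assumed cell size."""
    cur = [int(math.floor(c / res)) for c in start]
    last = [int(math.floor(c / res)) for c in end]
    delta = [e - s for s, e in zip(start, end)]
    step = [1 if d > 0 else -1 if d < 0 else 0 for d in delta]

    # Parametric distance (t in [0, 1] along the segment) to the first voxel
    # boundary on each axis, and the spacing between successive boundaries.
    t_max, t_step = [], []
    for s, d, i, st in zip(start, delta, cur, step):
        if d == 0:
            t_max.append(math.inf)
            t_step.append(math.inf)
        else:
            boundary = (i + (1 if st > 0 else 0)) * res
            t_max.append((boundary - s) / d)
            t_step.append(abs(res / d))

    voxels = [tuple(cur)]
    # Every step changes exactly one index by one, so the total number of
    # steps equals the index distance between the start and end voxels.
    for _ in range(sum(abs(l - c) for l, c in zip(last, cur))):
        axis = t_max.index(min(t_max))   # cross the nearest voxel boundary next
        cur[axis] += step[axis]
        t_max[axis] += t_step[axis]
        voxels.append(tuple(cur))
    return voxels
```

The returned list feeds the update sketch above directly: all voxels except the last are treated as free space, and the last voxel, which contains the depth return, is treated as a hit.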
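
The map shifting function mentioned at the end of the abstract keeps a fixed-size local grid centered near the vehicle. One common way to do this, sketched below under the assumption of a dense NumPy array, is to roll the array by whole cells when the vehicle drifts off center and reset the newly exposed cells to unknown; the thesis's actual shifting scheme may differ.

```python
import numpy as np

def shift_map(grid, origin, vehicle_pos, res=0.10, unknown=0.0):
    """Shift a dense local grid so the vehicle stays near its center.
    `origin` is the world position of grid[0, 0, 0]; returns the new origin.
    Resolution, centering policy, and the 'unknown' value are assumptions."""
    origin = np.asarray(origin, dtype=float)
    center = origin + 0.5 * res * np.array(grid.shape)
    shift_cells = np.round((np.asarray(vehicle_pos) - center) / res).astype(int)
    for axis, n in enumerate(shift_cells):
        if n == 0:
            continue
        # Move the stored data opposite to the vehicle's motion ...
        grid[:] = np.roll(grid, -n, axis=axis)
        # ... and clear the cells that wrapped around, since they hold stale data.
        idx = [slice(None)] * grid.ndim
        idx[axis] = slice(-n, None) if n > 0 else slice(None, -n)
        grid[tuple(idx)] = unknown
    return origin + shift_cells * res
```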