Sensor Fusion in Autonomous Navigation Using Fast SLAM 3.0 – An Improved SLAM Method

Abstract
  • This thesis introduces three important components of autonomous navigation: visual odometry and image fusion; Kalman filtering and its applications; and simultaneous localization and mapping (SLAM). It then presents Fast SLAM 3.0, an improved approach to SLAM relative to Fast SLAM 2.0 and Extended Kalman Filter (EKF) SLAM. Fast SLAM 3.0 models each particle as the mean of a Gaussian distribution over the robot pose, so the error covariance matrix (P) of the pose estimate propagates as in standard EKF SLAM. Uncertainty is therefore remembered over the whole trajectory, which avoids Fast SLAM 2.0's tendency to become over-confident while keeping the best feature of Fast SLAM: it locally avoids linearization of the robot model and provides a high level of robustness to clutter and ambiguous data association. Extensive experiments in randomly generated simulated environments show that Fast SLAM 3.0 significantly outperforms both Fast SLAM 2.0 and EKF SLAM.
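The central idea above is that each particle carries not just a sampled pose but a full Gaussian pose estimate whose covariance is propagated EKF-style. The sketch below illustrates that idea only; it is not the thesis implementation, and the `Particle` class, the unicycle motion model, and all parameter names are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (assumed, not the thesis code): each particle stores a
# pose mean and a 3x3 error covariance P, so pose uncertainty propagates
# as in EKF SLAM instead of collapsing to a point sample at resampling.

class Particle:
    def __init__(self, pose, P):
        self.pose = np.asarray(pose, dtype=float)  # pose mean [x, y, theta]
        self.P = np.asarray(P, dtype=float)        # pose error covariance
        self.landmarks = {}                        # per-landmark EKFs, as in FastSLAM


def predict(particle, v, w, dt, Q):
    """EKF-style time update of one particle's pose mean and covariance."""
    x, y, th = particle.pose
    # Unicycle motion model (an illustrative choice).
    particle.pose = np.array([x + v * dt * np.cos(th),
                              y + v * dt * np.sin(th),
                              th + w * dt])
    # Jacobian of the motion model with respect to the pose.
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    # Covariance propagates as in EKF SLAM, so uncertainty accumulated
    # along the trajectory is remembered rather than discarded.
    particle.P = F @ particle.P @ F.T + Q


if __name__ == "__main__":
    p = Particle(pose=[0.0, 0.0, 0.0], P=np.eye(3) * 1e-3)
    Q = np.diag([0.01, 0.01, 0.005])  # assumed process-noise covariance
    for _ in range(10):
        predict(p, v=1.0, w=0.1, dt=0.1, Q=Q)
    print("pose mean:", p.pose)
    print("pose covariance diagonal:", np.diag(p.P))
```

In a full filter of this kind, the measurement update would correct both the pose Gaussian and the per-particle landmark EKFs before importance weighting and resampling; only the prediction step is shown here.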

Rights Notes
  • Copyright © 2020 the author(s). Theses may be used for non-commercial research, educational, or related academic purposes only. Such uses include personal study, research, scholarship, and teaching. Theses may only be shared by linking to Carleton University Institutional Repository and no part may be used without proper attribution to the author. No part may be used for commercial purposes directly or indirectly via a for-profit platform; no adaptation or derivative works are permitted without consent from the copyright owner.

Date Created
  • 2020
