Removal of space debris requires pose (position and orientation) and motion estimation of targets; real-time detection and tracking of uncooperative targets are therefore critical. To estimate a target's pose and motion, the target must be tracked across all frames. In addition, the harsh space environment demands several different types of sensors, so practical sensor fusion algorithms are essential. However, because little prior information about a target's structure and motion is available, the detection and tracking, pose and motion estimation, and multi-sensor fusion tasks are very challenging. This thesis develops new approaches for target tracking, pose and motion estimation, and multi-sensor fusion. For target tracking, a new adaptation scheme for the Unscented Kalman Filter (UKF) is proposed to manage the difficulties of real-time tracking of an uncooperative space target. After the detection and tracking steps, the relative position, linear and angular velocities, and attitude of the space debris are estimated from visual measurements. The projection of tracked feature points onto two cameras forms the filters' observation model, and the structure of the non-cooperative spacecraft is determined by estimating the feature-point positions. The relative attitude estimate is parameterised using Modified Rodrigues Parameters (MRPs). Furthermore, to minimise estimation errors, data from multiple sensors are integrated. To increase system accuracy and manage the high uncertainty and noise inherent in space navigation, two novel adaptive frameworks for sensor fusion are proposed.