This study examines the feasibility of an integrated motion-capture and haptic system for controlling a humanoid robotic arm. An Oculus head-mounted display was incorporated to determine whether there is an observable difference between third-person and first-person perspective control. We review methods of robotic control for humanoid robots and the precedent for head-mounted displays and motion control in the current literature, and examine vibration as haptic feedback for conveying the robot's physical limitations to the operator. An experiment was conducted with the prototype system in which all 30 participants completed the given tasks successfully. A learning period was observed when comparing completion times of the first task attempted to those of subsequent tasks. The majority of participants found the control method intuitive and the first-person perspective beneficial, but found the vibration feedback either inconsequential or confusing rather than helpful.