The objective of this letter is to improve the accuracy of monocular visual odometry. The proposed approach estimates the camera pose directly, by minimizing a novel photometric residual between the current image and a warped version of the consecutive image. This residual incorporates orientation information available from a complementary sensor to mitigate the inherent nonlinearity of pose estimation, thereby reducing susceptibility to trajectory noise. In addition, this letter introduces a homography-based rotational motion estimate, embedded in the photometric residual, to reduce tracking failures during rotations, which can be attributed to the small number of features with parallax. The proposed methodology also allows a Gaussian-mixture prior to be integrated into the odometry estimation to accommodate changes in motion type, from general motion to pure rotation and back. Comprehensive experiments demonstrate the efficacy of the proposed solution over state-of-the-art visual odometry and visual-inertial systems.
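The direct-alignment idea underlying the approach can be illustrated with a minimal sketch: a photometric residual is formed between a reference image and a warped version of the current image, and the warp parameter minimizing that residual is selected. Here a one-parameter horizontal pixel shift stands in for the full camera pose, and a brute-force search replaces the letter's actual optimization; all names and the warp model are illustrative assumptions, not the letter's formulation.

```python
import numpy as np

def warp_shift(img, dx):
    """Toy warp model: shift the image horizontally by dx pixels (with wraparound)."""
    return np.roll(img, dx, axis=1)

def photometric_residual(i_ref, i_cur, dx):
    """Sum of squared intensity differences between the reference image
    and the current image un-warped by the candidate shift dx."""
    diff = warp_shift(i_cur, -dx) - i_ref
    return float(np.sum(diff ** 2))

def estimate_shift(i_ref, i_cur, search_range=5):
    """Direct alignment: pick the candidate shift minimizing the residual."""
    candidates = range(-search_range, search_range + 1)
    return min(candidates, key=lambda dx: photometric_residual(i_ref, i_cur, dx))

# Synthetic example: the "current" image is the reference shifted by 3 pixels.
rng = np.random.default_rng(0)
i_ref = rng.random((16, 32))
i_cur = warp_shift(i_ref, 3)
print(estimate_shift(i_ref, i_cur))  # → 3
```

In a full direct visual odometry pipeline the scalar shift would be replaced by a 6-DoF pose (or, per the letter, a pose constrained by the orientation prior), and the residual would be minimized with a Gauss-Newton style solver rather than exhaustive search.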