Reduced egomotion estimation drift using omnidirectional views
Author
Abstract
Estimation of camera motion from a given image sequence is a common task for multi-view 3D computer vision applications. Salient features (lines, corners, etc.) in the images are used to estimate the motion of the camera, also called egomotion. This estimation suffers from error build-up as the length of the image sequence increases, which causes a drift in the estimated position. In this letter, this phenomenon is demonstrated and an approach to improve the estimation accuracy is proposed. The main idea of the proposed method is to use an omnidirectional camera (360° horizontal field of view) in addition to a conventional (perspective) camera. Taking advantage of the correspondences between the omnidirectional and perspective images, the accuracy of camera position estimates can be improved. In our work, we adopt the sequential structure-from-motion approach, which starts by estimating the motion between the first two views, after which further views are added one by one. We automatically match points between omnidirectional and perspective views. Point correspondences are used for the estimation of epipolar geometry, followed by the reconstruction of 3D points with iterative linear triangulation. In addition, we calibrate our cameras using the sphere camera model, which covers both omnidirectional and perspective cameras. This enables us to treat the cameras in the same way at every step of structure-from-motion. We performed simulated and real-image experiments to compare the estimation accuracy when only perspective views are used and when an omnidirectional view is added. Results show that the proposed idea of adding omnidirectional views reduces the drift in egomotion estimation.
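The reconstruction step mentioned above can be illustrated with a minimal sketch of iterative linear triangulation in the Hartley–Sturm style: a DLT solve whose equations are reweighted by the inverse projective depth on each pass. This is an illustrative NumPy implementation under assumed 3×4 projection matrices, not the authors' code.

```python
import numpy as np

def triangulate_iterative(P1, P2, x1, x2, n_iter=10):
    """Iteratively reweighted linear (DLT) triangulation of one point.

    P1, P2 : 3x4 camera projection matrices (illustrative assumption).
    x1, x2 : corresponding 2D image points in the two views.
    Returns the estimated 3D point in inhomogeneous coordinates.
    """
    w1 = w2 = 1.0          # initial row weights (plain DLT on the first pass)
    X = np.zeros(4)
    for _ in range(n_iter):
        # Each correspondence contributes two linear equations A @ X = 0.
        A = np.array([
            (x1[0] * P1[2] - P1[0]) / w1,
            (x1[1] * P1[2] - P1[1]) / w1,
            (x2[0] * P2[2] - P2[0]) / w2,
            (x2[1] * P2[2] - P2[1]) / w2,
        ])
        # Homogeneous least-squares solution: right singular vector
        # associated with the smallest singular value.
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1] / Vt[-1, 3]
        # Reweight by the projective depth of X in each view, so the
        # linear residual approximates the image-space reprojection error.
        w1, w2 = P1[2] @ X, P2[2] @ X
    return X[:3]
```

With noise-free correspondences the first pass already lies in the nullspace of A, so the iterations leave the estimate unchanged; with noisy points the reweighting pulls the linear solution toward the reprojection-error minimizer.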
Similar resources
Real-Time Estimation of Fast Egomotion with Feature Classification Using Compound Omnidirectional Vision Sensor
For fast egomotion of a camera, computing feature correspondence and motion parameters by global search becomes highly time-consuming. Therefore, the complexity of the estimation needs to be reduced for real-time applications. In this paper, we propose a compound omnidirectional vision sensor and an algorithm for estimating its fast egomotion. The proposed sensor has both multi-baselines and a l...
Environmental Map Generation and Egomotion Estimation in a Dynamic Environment for an Omnidirectional Image Sensor
Generation of a stationary environmental map is one of the important tasks for vision-based robot navigation. Under the assumption of known robot motion, environmental maps of a real scene can be successfully generated by monitoring azimuth changes in an image. Several researchers have used this property for robot navigation. However, it is difficult to observe the exact motion parameter...
Using Symmetry as a Feature in Panoramic Images for Mobile Robot Applications
We propose to use symmetry as a global feature for mobile robot applications in an indoor environment. Our mobile robot solely uses an omnidirectional vision sensor consisting of a digital colour video camera and a hyperbolic mirror. Thus, robust image feature extraction is required for good performance in each application. The detection of symmetry is an effective natural vision routine result...
Insect-Inspired Estimation of Egomotion
Tangential neurons in the fly brain are sensitive to the typical optic flow patterns generated during egomotion. In this study, we examine whether a simplified linear model based on the organization principles in tangential neurons can be used to estimate egomotion from the optic flow. We present a theory for the construction of an estimator consisting of a linear combination of optic flow vect...
Journal: CoRR
Volume: abs/1307.6962
Pages: -
Publication year: 2013