Visual-Inertial 2D Feature Tracking based on an Affine Photometric Model
Affiliation: South Westphalia University of Applied Sciences; University of Chester; South Westphalia University of Applied Sciences
Abstract: The robust tracking of point features throughout an image sequence is a fundamental stage in many different computer vision algorithms (e.g. visual modelling, object tracking). In most cases, this tracking is realised by a feature detection step followed by re-identification of the same feature point, based on some variant of a template matching algorithm. Without any auxiliary knowledge about the movement of the camera, current tracking techniques are only robust for relatively moderate frame-to-frame feature displacements. This paper presents a framework for a visual-inertial feature tracking scheme, in which images and measurements from an inertial measurement unit (IMU) are fused in order to allow a wider range of camera movements. The inertial measurements are used to predict the visual appearance of a feature's local neighbourhood based on an affine photometric warping model.
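The affine photometric model mentioned in the abstract can be illustrated with a minimal sketch: a template patch is compared against an image patch that has been geometrically warped by an affine map and photometrically adjusted by a gain/bias pair. The function name, the exact parameterisation (warp `x' = A x + d`, intensity `(1 + alpha) I + beta`), and the nearest-neighbour sampling are assumptions for illustration, not the chapter's actual formulation.

```python
import numpy as np

def affine_photometric_residual(image, template, center, A, d, alpha, beta):
    """Residual between a template patch and an affinely warped,
    photometrically adjusted image patch (hypothetical parameterisation):

        geometric:    x' = A @ x + d     (x measured from the patch centre)
        photometric:  I' = (1 + alpha) * I + beta

    Nearest-neighbour sampling is used for simplicity; a real tracker
    would interpolate sub-pixel values.
    """
    h, w = template.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # express patch coordinates relative to the patch centre
    xs -= (w - 1) / 2.0
    ys -= (h - 1) / 2.0
    coords = np.stack([xs.ravel(), ys.ravel()])          # 2 x N
    warped = A @ coords + d[:, None]                     # affine geometric warp
    # map back into absolute image coordinates and clamp to the image bounds
    px = np.clip(np.round(warped[0] + center[0]).astype(int), 0, image.shape[1] - 1)
    py = np.clip(np.round(warped[1] + center[1]).astype(int), 0, image.shape[0] - 1)
    sampled = image[py, px].reshape(h, w)
    # photometric gain/bias model applied to the sampled intensities
    return (1.0 + alpha) * sampled + beta - template
```

With an identity warp (`A = I`, `d = 0`, `alpha = beta = 0`) and the template cut from the image itself, the residual is zero; an IMU-driven prediction would instead supply `A` and `d` from the estimated inter-frame camera motion before a photometric refinement step.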
Citation: Aufderheide, D., Edwards, G., & Krybus, W. (2015). Visual-inertial 2D feature tracking based on an affine photometric model. In J. M. R. S. Tavares & R. N. Jorge (Eds.), Developments in Medical Image Processing and Computational Vision, volume 19 of the series Lecture Notes in Computational Vision and Biomechanics (pp. 297-317). Switzerland: Springer International Publishing.
The following license files are associated with this item:
- Creative Commons
Except where otherwise noted, this item's license is described at http://creativecommons.org/licenses/by-nc-nd/4.0/