Visual-Inertial 2D Feature Tracking based on an Affine Photometric Model
dc.contributor.author | Aufderheide, Dominik | * |
dc.contributor.author | Edwards, Gerard | * |
dc.contributor.author | Krybus, Werner | * |
dc.date.accessioned | 2016-11-01T10:02:39Z | |
dc.date.available | 2016-11-01T10:02:39Z | |
dc.date.issued | 2015-04-08 | |
dc.identifier.citation | Aufderheide, D., Edwards, G., & Krybus, W. (2015). Visual-inertial 2D feature tracking based on an affine photometric model. In J. M. R. S. Tavares & R. N. Jorge (Eds.), Developments in Medical Image Processing and Computational Vision (Lecture Notes in Computational Vision and Biomechanics, Vol. 19, pp. 297-317). Switzerland: Springer International Publishing. | en |
dc.identifier.isbn | 9783319134062 | en |
dc.identifier.isbn | 9783319134079 | en |
dc.identifier.doi | 10.1007/978-3-319-13407-9_18 | |
dc.identifier.uri | http://hdl.handle.net/10034/620235 | |
dc.description.abstract | The robust tracking of point features throughout an image sequence is a fundamental stage in many different computer vision algorithms (e.g. visual modelling, object tracking, etc.). In most cases, this tracking is realised by means of a feature detection step followed by a re-identification of the same feature point, based on some variant of a template matching algorithm. Without any auxiliary knowledge about the movement of the camera, current tracking techniques are only robust for relatively moderate frame-to-frame feature displacements. This paper presents a framework for a visual-inertial feature tracking scheme, in which images and measurements from an inertial measurement unit (IMU) are fused in order to allow a wider range of camera movements. The inertial measurements are used to estimate the visual appearance of a feature’s local neighbourhood based on an affine photometric warping model. | |
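For context, the affine photometric model named in the abstract is commonly written as below. This is a minimal, generic sketch of that model, not the chapter's own notation: the warp parameters A, d and the photometric gain and bias lambda, delta are illustrative symbols introduced here.

```latex
% Generic affine photometric warping model (illustrative sketch):
% a stored template patch T is matched against the current image I under an
% affine geometric warp (matrix A, offset d) combined with an affine
% photometric change (gain \lambda, bias \delta).
\begin{equation}
  I(\mathbf{x}) \;\approx\; \lambda \, T(\mathbf{A}\mathbf{x} + \mathbf{d}) + \delta
\end{equation}
% In the scheme described in the abstract, the IMU measurements are used to
% predict the expected warp of the feature's local neighbourhood before the
% template-matching refinement, which tolerates larger frame-to-frame motion.
```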
dc.language.iso | en | en |
dc.publisher | Springer | en |
dc.relation.url | http://link.springer.com/chapter/10.1007/978-3-319-13407-9_18 | en |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/4.0/ | * |
dc.subject | Computer Vision | en |
dc.subject | Feature Detection | en |
dc.subject | Data Fusion | en |
dc.subject | Inertial Measurements | en |
dc.subject | Affine photometric warping model | en |
dc.title | Visual-Inertial 2D Feature Tracking based on an Affine Photometric Model | en |
dc.type | Book chapter | en |
dc.contributor.department | South Westphalia University of Applied Sciences, University of Chester, South Westphalia University of Applied Sciences | en |
or.grant.openaccess | Yes | en |
rioxxterms.funder | Unfunded | en |
rioxxterms.identifier.project | Unfunded | en |
rioxxterms.version | AM | en |
rioxxterms.versionofrecord | https://doi.org/10.1007/978-3-319-13407-9_18 | |
rioxxterms.licenseref.startdate | 2015-04-08 | |
html.description.abstract | The robust tracking of point features throughout an image sequence is a fundamental stage in many different computer vision algorithms (e.g. visual modelling, object tracking, etc.). In most cases, this tracking is realised by means of a feature detection step followed by a re-identification of the same feature point, based on some variant of a template matching algorithm. Without any auxiliary knowledge about the movement of the camera, current tracking techniques are only robust for relatively moderate frame-to-frame feature displacements. This paper presents a framework for a visual-inertial feature tracking scheme, in which images and measurements from an inertial measurement unit (IMU) are fused in order to allow a wider range of camera movements. The inertial measurements are used to estimate the visual appearance of a feature’s local neighbourhood based on an affine photometric warping model. | |
rioxxterms.publicationdate | 2015-04-08 | |
dc.dateAccepted | 2015-04-08 | |
dc.date.deposited | 2016-11-01 |