Show simple item record

dc.contributor.author: Aufderheide, Dominik
dc.contributor.author: Edwards, Gerard
dc.contributor.author: Krybus, Werner
dc.date.accessioned: 2016-11-01T10:02:39Z
dc.date.available: 2016-11-01T10:02:39Z
dc.date.issued: 2015-04-08
dc.identifier.citation: Aufderheide, D., Edwards, G., & Krybus, W. (2015). Visual-inertial 2D feature tracking based on an affine photometric model. In J. M. R. S. Tavares & R. N. Jorge (Eds.), Developments in Medical Image Processing and Computational Vision (Lecture Notes in Computational Vision and Biomechanics, Vol. 19, pp. 297-317). Switzerland: Springer International Publishing.
dc.identifier.isbn: 9783319134062
dc.identifier.isbn: 9783319134079
dc.identifier.doi: 10.1007/978-3-319-13407-9_18
dc.identifier.uri: http://hdl.handle.net/10034/620235
dc.description.abstract: The robust tracking of point features throughout an image sequence is a fundamental stage in many different computer vision algorithms (e.g. visual modelling, object tracking, etc.). In most cases, this tracking is realised by means of a feature detection step followed by re-identification of the same feature point, based on some variant of a template matching algorithm. Without any auxiliary knowledge about the movement of the camera, existing tracking techniques are only robust for relatively moderate frame-to-frame feature displacements. This paper presents a framework for a visual-inertial feature tracking scheme, where images and measurements of an inertial measurement unit (IMU) are fused in order to allow a wider range of camera movements. The inertial measurements are used to estimate the visual appearance of a feature's local neighbourhood based on an affine photometric warping model.
dc.language.iso: en
dc.publisher: Springer
dc.relation.url: http://link.springer.com/chapter/10.1007/978-3-319-13407-9_18
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Computer Vision
dc.subject: Feature Detection
dc.subject: Data Fusion
dc.subject: Inertial Measurements
dc.subject: Affine photometric warping model
dc.title: Visual-Inertial 2D Feature Tracking based on an Affine Photometric Model
dc.type: Book chapter
dc.contributor.department: South Westphalia University of Applied Sciences; University of Chester; South Westphalia University of Applied Sciences
or.grant.openaccess: Yes
rioxxterms.funder: Unfunded
rioxxterms.identifier.project: Unfunded
rioxxterms.version: AM
rioxxterms.versionofrecord: https://doi.org/10.1007/978-3-319-13407-9_18
rioxxterms.licenseref.startdate: 2215-04-08
rioxxterms.publicationdate: 2015-04-08
dc.dateAccepted: 2015-04-08
dc.date.deposited: 2016-11-01
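The abstract above describes predicting the appearance of a feature's local neighbourhood from inertial measurements via an affine photometric warping model. As an informal illustration only, and not the authors' implementation, the NumPy sketch below shows one common form of such a model: the patch around a feature is resampled under a 2x2 affine matrix A and a translation d, then adjusted by a scalar gain and bias. The function name, parameter values, and the idea of supplying IMU-derived warp predictions are assumptions made here for illustration.

# Illustrative sketch only (not the chapter's code): predicting a feature patch's
# appearance under an affine photometric model. In the scheme described in the
# abstract, the warp parameters would be predicted from IMU measurements; here
# they are hypothetical placeholder values.
import numpy as np

def warp_patch_affine_photometric(image, center, A, d, gain, bias, radius=7):
    """Sample a (2*radius+1)^2 patch around `center` after applying the
    geometric affine warp (A, d) and a photometric gain/bias:
        I_pred(x) = (1 + gain) * I(A @ x + d + center) + bias
    Bilinear interpolation is used for sub-pixel sampling."""
    d = np.asarray(d, float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1].astype(float)
    coords = np.stack([xs.ravel(), ys.ravel()])          # 2 x N patch offsets
    warped = A @ coords + d[:, None] + np.asarray(center, float)[:, None]

    x, y = warped
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x0 = np.clip(x0, 0, image.shape[1] - 2)
    y0 = np.clip(y0, 0, image.shape[0] - 2)
    fx, fy = x - x0, y - y0

    # Bilinear interpolation of the four neighbouring pixels.
    top = (1 - fx) * image[y0, x0] + fx * image[y0, x0 + 1]
    bot = (1 - fx) * image[y0 + 1, x0] + fx * image[y0 + 1, x0 + 1]
    patch = ((1 - fy) * top + fy * bot).reshape(2 * radius + 1, 2 * radius + 1)

    return (1.0 + gain) * patch + bias

# Hypothetical usage: an IMU-derived prediction would supply A, d, gain and bias.
if __name__ == "__main__":
    frame = np.random.rand(480, 640)
    A = np.array([[1.02, -0.05], [0.05, 1.02]])   # small rotation/scale (assumed)
    d = np.array([3.0, -1.5])                     # predicted translation (assumed)
    predicted = warp_patch_affine_photometric(frame, center=(320.0, 240.0),
                                              A=A, d=d, gain=0.1, bias=5.0)
    print(predicted.shape)  # (15, 15)

In the chapter's scheme the warp and photometric parameters would come from the sensed camera motion rather than being chosen by hand; the sketch only shows how such a predicted appearance could be evaluated against a new frame.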


Files in this item

Name: D. Aufderheide et al. Volume 19 ...
Embargo: 2215-12-31
Size: 820.5 KB
Format: PDF
Request: Main Article


Except where otherwise noted, this item's license is described as http://creativecommons.org/licenses/by-nc-nd/4.0/