• Alternative Representations of 3D-Reconstructed Heritage Data

      Miles, Helen C.; Wilson, Andrew T.; Labrosse, Frédéric; Tiddeman, Bernard; Griffiths, Seren; Edwards, Ben; Ritsos, Panagiotis D.; Mearman, Joseph W.; Moller, Katharina; Karl, Raimund; et al. (ACM, 2016-02-20)
      By collecting images of heritage assets from members of the public and processing them to create 3D-reconstructed models, the HeritageTogether project has accomplished the digital recording of nearly 80 sites across Wales, UK. A large amount of data has been collected and produced in the form of photographs, 3D models, maps, condition reports, and more. Here we discuss some of the different methods used to realize the potential of this data in different formats and for different purposes. The data are explored in both virtual and tangible settings, and—with the use of a touch table—a combination of both. We examine some alternative representations of this community-produced heritage data for educational, research, and public engagement applications.
    • An Augmented Reality Tool to aid Radiotherapy Set Up implemented on a Tablet Device

      Cosentino, Francesco; Vaarkamp, Jaap; John, Nigel W.; University of Chester, North Wales Cancer Treatment Centre (International Conference on the use of Computers in Radiation Therapy, 2016-06)
      The accurate daily set up of patients for radiotherapy treatment remains a challenge for which the development of new strategies and solutions continues to be an area of active research. We have developed an augmented reality tool to view the real world scene, i.e. the patient on a treatment couch, combined with computer graphics content, such as planning image data and any defined outlines of organ structures. We have built this on widely available hand-held consumer tablet devices and describe here the implementation and initial experience. We suggest that, in contrast to other augmented reality tools explored for radiotherapy [1], due to the wide availability and low cost of the hardware platform, the application has further potential as a tool for patients to visualize their treatment, and to demonstrate to patients, for example, the importance of compliance with instructions around bladder filling and rectal suppositories.
    • Real-time Geometry-Aware Augmented Reality in Minimally Invasive Surgery

      Chen, Long; Tang, Wen; John, Nigel W.; Bournemouth University; University of Chester (IET, 2017-10-27)
      The potential of Augmented Reality (AR) technology to assist minimally invasive surgeries (MIS) lies in its computational performance and accuracy in dealing with challenging MIS scenes. Even with the latest hardware and software technologies, achieving both real-time and accurate augmented information overlay in MIS is still a formidable task. In this paper, we present a novel real-time AR framework for MIS that achieves interactive geometry-aware augmented reality in endoscopic surgery with stereo views. Our framework tracks the movement of the endoscopic camera and simultaneously reconstructs a dense geometric mesh of the MIS scene. The movement of the camera is predicted by minimising the re-projection error to achieve fast tracking performance, while the 3D mesh is incrementally built by a dense zero-mean normalised cross-correlation stereo matching method to improve the accuracy of the surface reconstruction. Our proposed system does not require any prior template or pre-operative scan and can infer the geometric information intra-operatively in real time. With the geometric information available, our proposed AR framework is able to interactively add annotations, localise tumours and vessels, and apply measurement labels with greater precision and accuracy than state-of-the-art approaches.
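The stereo matching step in the abstract above scores candidate correspondences with zero-mean normalised cross-correlation (ZNCC). As an illustrative sketch only (not the authors' implementation; the function names and the simple winner-takes-all disparity search along a rectified epipolar line are our own assumptions), the core of such a matcher might look like this in Python/NumPy:

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-mean normalised cross-correlation between two equally sized
    patches. Subtracting each patch's mean makes the score robust to
    brightness offsets; the result lies in [-1, 1], higher is better."""
    a = patch_a.astype(np.float64) - patch_a.mean()
    b = patch_b.astype(np.float64) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:  # flat, textureless patches carry no match information
        return 0.0
    return float((a * b).sum() / denom)

def best_disparity(left, right, row, col, half, max_disp):
    """For a pixel in the rectified left image, scan candidate disparities
    along the same row of the right image and keep the highest-scoring one
    (winner-takes-all)."""
    ref = left[row - half:row + half + 1, col - half:col + half + 1]
    scores = []
    for d in range(max_disp + 1):
        c = col - d
        if c - half < 0:  # candidate window would leave the image
            break
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        scores.append(zncc(ref, cand))
    return int(np.argmax(scores)), scores
```

A dense reconstruction would repeat this search for every pixel and then triangulate depths from the disparities using the calibrated stereo baseline; the paper's pipeline additionally fuses the results incrementally into a mesh, which is beyond this sketch.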
    • Real-Time Guidance and Anatomical Information by Image Projection onto Patients

      Edwards, Marc R.; Pop, Serban R.; John, Nigel W.; Ritsos, Panagiotis D.; Avis, Nick J.; University of Chester (Eurographics Association, 2016-09)
      The Image Projection onto Patients (IPoP) system is work in progress intended to assist medical practitioners in performing procedures such as biopsies, or to provide a novel anatomical education tool, by projecting anatomy and other relevant information from the operating room directly onto a patient’s skin. This approach is not currently used widely in hospitals, but has the benefit of providing effective procedure guidance without the practitioner having to look away from the patient. Developmental work towards the alpha phase of IPoP is presented, including tracking methods for tools such as biopsy needles, patient tracking, image registration, and problems encountered with the multi-mirror effect.
    • SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality

      Chen, Long; Tang, Wen; John, Nigel W.; Wan, Tao R.; Zhang, Jian Jun; Bournemouth University; University of Chester; University of Bradford (Elsevier, 2018-02-08)
      Background and Objective: While Minimally Invasive Surgery (MIS) offers considerable benefits to patients, it also imposes significant challenges on a surgeon's performance due to well-known issues and restrictions associated with the field of view (FOV), hand-eye misalignment and disorientation, as well as the lack of stereoscopic depth perception in monocular endoscopy. Augmented Reality (AR) technology can help to overcome these limitations by augmenting the real scene with annotations, labels, tumour measurements or even a 3D reconstruction of anatomical structures at the target surgical locations. However, previous attempts to use AR technology in monocular MIS scenes have focused mainly on information overlay without addressing correct spatial calibration, which can lead to incorrect localization of annotations and labels, and inaccurate depth cues and tumour measurements. In this paper, we present a novel intra-operative dense surface reconstruction framework that is capable of providing geometry information from only monocular MIS videos for geometry-aware AR applications such as site measurements and depth cues. We address a number of compelling issues in augmenting a scene for a monocular MIS environment, such as drifting and inaccurate planar mapping. Methods: A state-of-the-art Simultaneous Localization And Mapping (SLAM) algorithm used in robotics has been extended to deal with monocular MIS surgical scenes for reliable endoscopic camera tracking and salient point mapping. A robust global 3D surface reconstruction framework has been developed for building a dense surface using only the unorganized sparse point clouds extracted from the SLAM. The 3D surface reconstruction framework employs the Moving Least Squares (MLS) smoothing algorithm and Poisson surface reconstruction for real-time processing of the point-cloud data set.
Finally, the 3D geometric information of the surgical scene allows better understanding and accurate placement of AR augmentations based on a robust 3D calibration. Results: We demonstrate the clinical relevance of our proposed system through two examples: (a) surface measurement; (b) depth cues in monocular endoscopy. The performance and accuracy evaluations of the proposed framework consist of two steps. First, we have created a computer-generated endoscopy simulation video to quantify the accuracy of the camera tracking by comparing the results of the video camera tracking with the recorded ground-truth camera trajectories. The accuracy of the surface reconstruction is assessed by evaluating the Root Mean Square Distance (RMSD) of the surface vertices of the reconstructed mesh against those of the ground-truth 3D models. An error of 1.24 mm for the camera trajectories has been obtained, and the RMSD for surface reconstruction is 2.54 mm, which compare favourably with previous approaches. Second, in vivo laparoscopic videos are used to examine the quality of accurate AR-based annotation and measurement, and the creation of depth cues. These results show the promise of our geometry-aware AR technology for use in MIS surgical scenes. Conclusions: The results show that the new framework is robust and accurate in dealing with challenging situations such as rapid endoscopic camera movements in monocular MIS scenes. Both camera tracking and surface reconstruction based on a sparse point cloud are effective and operate in real time. This demonstrates the potential of our algorithm for accurate AR localization and depth augmentation with geometric cues and correct surface measurements in MIS with monocular endoscopes.
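The surface-accuracy metric quoted in the abstract above, Root Mean Square Distance (RMSD) between reconstructed and ground-truth vertices, is straightforward to state precisely. A minimal sketch (illustrative only; it assumes vertex correspondences are already established, as they are when evaluating against a known ground-truth model):

```python
import numpy as np

def rmsd(vertices, reference):
    """Root Mean Square Distance between corresponding 3D vertices.

    Both inputs are (N, 3) arrays in the same coordinate frame, with row i
    of `vertices` corresponding to row i of `reference`. Returns
    sqrt(mean of squared per-vertex Euclidean distances).
    """
    diff = np.asarray(vertices, dtype=float) - np.asarray(reference, dtype=float)
    per_vertex_sq = (diff ** 2).sum(axis=1)  # squared distance per vertex
    return float(np.sqrt(per_vertex_sq.mean()))
```

When correspondences are not given (e.g. comparing an arbitrary reconstructed mesh to a ground-truth surface), a common variant first pairs each reconstructed vertex with its nearest point on the reference surface before applying the same formula.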