9 research outputs found

    Effect of marker position and size on the registration accuracy of HoloLens in a non-clinical setting with implications for high-precision surgical tasks

    Acknowledgments: We are grateful to Mike Whyment for the purchase of the holographic headset used in this study and to Rute Vieira and Fiona Saunders for their advice on statistics. We would also like to thank Denise Tosh and the Anatomy staff at the University of Aberdeen for their support. This research was funded by The Roland Sutton Academic Trust (RSAT 0053/R/17) and the University of Aberdeen (via an Elphinstone Scholarship, IKEC Award and Medical Sciences Honours project funding).

    How Wrong Can You Be: Perception of Static Orientation Errors in Mixed Reality


    Development of a robust pose estimation algorithm for aeronautical applications

    In its general form, the problem consists of estimating the position and orientation of an object in space by observing it with a single camera, exploiting the positions on the image plane of the projections of luminous markers placed on the object itself. Starting from the OI (Orthogonal Iteration) algorithm for pose estimation, modifications were developed to broaden the range of cases handled and to improve its robustness.
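As a point of reference, the baseline OI (Orthogonal Iteration) algorithm mentioned in the abstract (Lu, Hager and Mjolsness) can be sketched as follows. This is a minimal illustration of the standard algorithm, not the thesis's modified version; the marker layout and pose in the example are invented for the demonstration.

```python
import numpy as np

def oi_pose(object_pts, image_pts, iters=200):
    """Orthogonal Iteration sketch: alternate a closed-form translation
    with an absolute-orientation (SVD) update of the rotation, minimising
    the object-space error  sum_i ||(V_i - I)(R p_i + t)||^2."""
    n = len(object_pts)
    v = np.hstack([image_pts, np.ones((n, 1))])            # lines of sight (x, y, 1)
    V = np.stack([np.outer(a, a) / (a @ a) for a in v])    # projection onto each ray
    S_inv = np.linalg.inv(n * np.eye(3) - V.sum(axis=0))   # for the closed-form t

    def best_t(R):
        rhs = sum((V[i] - np.eye(3)) @ (R @ object_pts[i]) for i in range(n))
        return S_inv @ rhs

    p_c = object_pts - object_pts.mean(axis=0)
    R = np.eye(3)
    for _ in range(iters):
        t = best_t(R)
        # project the transformed points onto their observation rays
        q = np.stack([V[i] @ (R @ object_pts[i] + t) for i in range(n)])
        q_c = q - q.mean(axis=0)
        # Kabsch/absolute-orientation step for the rotation
        U, _, Wt = np.linalg.svd(q_c.T @ p_c)
        R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Wt)]) @ Wt
    return R, best_t(R)

# Synthetic check: recover a known pose from noiseless projections.
rng = np.random.default_rng(2)
pts = rng.uniform(-1.0, 1.0, size=(6, 3))                  # invented marker cloud
axis = np.array([1.0, 2.0, 3.0]) / np.sqrt(14.0)
angle = 0.35
K = np.array([[0.0, -axis[2], axis[1]],
              [axis[2], 0.0, -axis[0]],
              [-axis[1], axis[0], 0.0]])
R_true = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)  # Rodrigues
t_true = np.array([0.1, -0.2, 6.0])                        # object in front of camera
cam = pts @ R_true.T + t_true
img = cam[:, :2] / cam[:, 2:3]                             # normalised image coords
R_est, t_est = oi_pose(pts, img)
```

The closed-form translation follows from setting the gradient of the object-space error in t to zero; the rotation update is the standard SVD solution of the absolute-orientation problem between the model points and their ray projections.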

    Comparing Measured and Theoretical Target Registration Error of an Optical Tracking System

    The goal of this thesis is to experimentally measure the accuracy of an optical tracking system used in commercial surgical navigation systems. We measure accuracy by constructing a mechanism that allows a tracked target to move with spherical motion (i.e., there exists a single point on the mechanism—the center of the sphere—that does not change position when the tracked target is moved). We imagine that the center of the sphere is the tip of a surgical tool rigidly attached to the tracked target. The location of the tool tip cannot be measured directly by the tracking system (because it is impossible to attach a tracking marker to the tool tip) and must be calculated using the measured location and orientation of the tracking target. Any measurement error in the tracking system will cause the calculated position of the tool tip to change as the target is moved; the spread of the calculated tool tip positions is a measurement of tracking error called the target registration error (TRE). The observed TRE will be compared to an analytic model of TRE to assess the predictions of the analytic model
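The tool-tip calculation and the scatter it produces can be sketched numerically. A minimal simulation, assuming an invented tip offset and purely translational Gaussian pose noise; the actual mechanism and noise model in the thesis will differ (rotation noise, for instance, is omitted here for brevity).

```python
import numpy as np

rng = np.random.default_rng(0)

# Tool tip expressed in the tracked target's local frame (invented, mm).
tip_offset = np.array([100.0, 0.0, 50.0])

def rotation_about_z(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Spherical motion: the target moves, but the true tip stays at the origin
# because t is chosen so that R @ tip_offset + t == 0 at every pose.
# Measurement noise makes the *calculated* tip positions scatter.
tips = []
for angle in np.linspace(0.0, 2.0 * np.pi, 200):
    R = rotation_about_z(angle)
    t = -R @ tip_offset
    t_noisy = t + rng.normal(scale=0.1, size=3)   # 0.1 mm translational noise
    tips.append(R @ tip_offset + t_noisy)         # calculated tip position

tips = np.array(tips)
# TRE as the RMS spread of the calculated tip positions about their mean.
tre_rms = np.sqrt(np.mean(np.sum((tips - tips.mean(axis=0)) ** 2, axis=1)))
print(round(float(tre_rms), 3))
```

With isotropic translational noise of standard deviation σ per axis, the expected RMS spread is roughly σ√3, which is what the simulation reproduces.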

    Augmented Reality and Collaborative Environments: A Survey

    Augmented Reality (AR) is generally defined as a branch derived from Virtual Reality. More generally, the concept of augmented reality covers a multidisciplinary approach aiming at a blend of the real and the virtual. The strong potential induced by this connection promises a suitable framework for 3D interaction and for collaborative applications. This article presents a survey of the main work carried out to date on imaging and AR, with particular attention to the collaborative setting.

    Single and multiple stereo view navigation for planetary rovers

    © Cranfield University. This thesis deals with the challenge of autonomous navigation of the ExoMars rover. The absence of global positioning systems (GPS) in space, added to the limitations of wheel odometry, makes autonomous navigation based on these two techniques - as done in the literature - an unviable solution and necessitates the use of other approaches. That, among other reasons, motivates this work to use solely visual data to solve the robot’s Egomotion problem. The homogeneity of Mars’ terrain makes the robustness of the low-level image processing techniques a critical requirement. In the first part of the thesis, novel solutions are presented to tackle this specific problem. Robust feature detection under illumination changes, together with unique matching and association of features, is a sought-after capability. A solution for robustness of features against illumination variation is proposed, combining Harris corner detection with moment image representation. Whereas the first provides a technique for efficient feature detection, the moment images add the necessary brightness invariance. Moreover, a bucketing strategy is used to guarantee that features are homogeneously distributed within the images. Then, the addition of local feature descriptors guarantees the unique identification of image cues. In the second part, reliable and precise motion estimation for the Mars robot is studied. A number of successful approaches are thoroughly analysed. Visual Simultaneous Localisation And Mapping (VSLAM) is investigated, proposing enhancements and integrating it with the robust feature methodology. Then, linear and nonlinear optimisation techniques are explored. Alternative photogrammetry reprojection concepts are tested. Lastly, data fusion techniques are proposed to deal with the integration of multiple stereo view data. Our robust visual scheme allows good feature repeatability.
Because of this, dimensionality reduction of the feature data can be used without compromising the overall performance of the proposed solutions for motion estimation. Also, the developed Egomotion techniques have been extensively validated using both simulated and real data collected at ESA-ESTEC facilities. Multiple stereo view solutions for robot motion estimation are introduced, presenting interesting benefits. The obtained results prove the innovative methods presented here to be accurate and reliable approaches capable to solve the Egomotion problem in a Mars environment
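The bucketing strategy described in the abstract (capping the number of features kept per image cell so that detections stay homogeneously distributed) can be sketched as follows. This is a hypothetical illustration with an invented grid size and random scores, not the thesis code.

```python
import numpy as np

def bucket_features(features, scores, image_shape, grid=(4, 4), per_cell=2):
    """Keep at most `per_cell` strongest features in each grid cell.

    features: (N, 2) array of (x, y) pixel positions
    scores:   (N,) corner-response strengths (e.g. Harris scores)
    Returns the sorted indices of the retained features.
    """
    h, w = image_shape
    rows, cols = grid
    cell_h, cell_w = h / rows, w / cols
    # Map every feature to a flat cell index (clamped at the image border).
    cells = (np.minimum(features[:, 1] // cell_h, rows - 1).astype(int) * cols
             + np.minimum(features[:, 0] // cell_w, cols - 1).astype(int))
    kept = []
    for c in np.unique(cells):
        idx = np.flatnonzero(cells == c)
        idx = idx[np.argsort(scores[idx])[::-1]]   # strongest first in this cell
        kept.extend(idx[:per_cell].tolist())
    return sorted(kept)

# Example: 100 random features in a 480x640 image, 4x4 grid, 2 per cell.
rng = np.random.default_rng(1)
feats = rng.uniform([0.0, 0.0], [640.0, 480.0], size=(100, 2))
strengths = rng.uniform(size=100)
selected = bucket_features(feats, strengths, (480, 640))
print(len(selected))
```

Capping per cell rather than taking the globally strongest responses is what prevents features from clustering in a few highly textured regions, which is the stated purpose of bucketing in the abstract.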

    Aspects of User Experience in Augmented Reality


    A modular optical tracking system for medical applications: an integrated data-flow architecture in hardware and software and an application framework

    This thesis describes the development of a modular optical tracking system geared to the specific requirements of the medical-technology domain. The presented applications of the system range from capturing user interaction in various medical simulators (e.g. for ophthalmic surgery, ophthalmoscopy and neurosurgery) to tracking the position of a hand-held surgical robot. In contrast to available commercial tracking systems with their narrowly defined fields of application, a universally designed modular kit is presented that can be adapted to the specific requirements of each application with little development effort (including very small geometries, deformable objects, the use of original instruments, and limited resource availability on the simulator PC). To this end, a modular system concept is developed that abstracts from the specialised data processing of common tracking systems and builds on a generalised, modular system architecture supporting all kinds of markers with three degrees of freedom. Besides the widespread infrared-based signalling techniques, passive colour markers for object signalling are also supported. Implementing image-processing tasks in specialised hardware (FPGAs) directly on the camera data stream enables early data reduction and thus low latencies. The development process for novel tracking solutions is simplified by the tight integration of the hardware and software modules in a single consistent data-flow architecture that can be flexibly adapted to the task at hand. Finally, an extensible graphical front end supports operation and configuration and also allows whole systems to be simulated during development