
    Integrating images from a moveable tracked display of three-dimensional data

    This paper describes a novel method for displaying data obtained by three-dimensional medical imaging, in which the position and orientation of a freely movable screen are optically tracked and used in real time to select the current slice from the data set for presentation. With this method, which we call a “freely moving in-situ medical image”, the screen and imaged data are registered to a common coordinate system in space external to the user, at adjustable scale, and are available for free exploration. The three-dimensional image data occupy empty space, as if an invisible patient is being sliced by the moving screen. A behavioral study using real computed tomography lung vessel data established the superiority of the in situ display over a control condition with the same free exploration but with data displayed on a fixed screen (ex situ), with respect to accuracy in the task of tracing along a vessel and reporting spatial relations between vessel structures. A “freely moving in-situ medical image” display appears from these measures to promote spatial navigation and understanding of medical data. The electronic version of this article is the complete one and can be found online at: http://cognitiveresearchjournal.springeropen.com/articles/10.1186/s41235-017-0069-
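    The core operation this abstract describes, selecting the current slice from the volume given the tracked screen's pose, amounts to resampling the volume along a plane. The sketch below is an illustrative assumption, not the paper's implementation: the function name, nearest-neighbour sampling, and the plane parameterization by an origin plus two in-plane unit vectors are all choices made here for brevity.

    ```python
    import numpy as np

    def extract_slice(volume, origin, u, v, size=64):
        """Resample a 2D slice from a 3D volume along the plane spanned
        by in-plane direction vectors u and v, anchored at origin.
        Nearest-neighbour sampling; out-of-bounds voxels read as 0."""
        origin = np.asarray(origin, dtype=float)
        u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
        sl = np.zeros((size, size), dtype=volume.dtype)
        for i in range(size):
            for j in range(size):
                # Map slice pixel (i, j) to a voxel coordinate and round.
                p = np.rint(origin + i * u + j * v).astype(int)
                if all(0 <= p[k] < volume.shape[k] for k in range(3)):
                    sl[i, j] = volume[tuple(p)]
        return sl

    # Axis-aligned sanity check: the plane z = 5 recovers volume[:, :, 5].
    vol = np.arange(10 * 10 * 10).reshape(10, 10, 10)
    s = extract_slice(vol, origin=(0, 0, 5), u=(1, 0, 0), v=(0, 1, 0), size=10)
    ```

    In the actual display, `origin`, `u`, and `v` would be updated every frame from the optical tracker's pose estimate of the screen.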

    A Framework to Visualize 3D Breast Tumor Using X-Ray Vision Technique in Mobile Augmented Reality

    Breast cancer patients requiring breast biopsy have increased over the past years, and stereotactic biopsy uses a series of images to carefully position the imaging equipment and target the area of concern. However, it is constrained by the lack of accurate 3D tumor visualization. An Augmented Reality (AR) guided breast biopsy system has become the method of choice for researchers, yet existing AR tumor visualization is limited to superimposing the 3D imaging data only. In this paper, a framework is introduced to accurately visualize a 3D breast tumor through the skin of a US-9 opaque breast phantom on a mobile display. This mobile AR visualization technique consists of four phases: first, it acquires Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) images and processes the medical images into 3D slices; second, it refines these 3D grayscale slices into a 3D breast tumor model using a 3D modeling reconstruction technique. In the visualization-processing phase, this virtual 3D breast tumor model is enhanced using an X-ray visualization technique to see through the skin of the phantom for better visualization. Finally, the composition is displayed on a smartphone with optimized accuracy of the 3D tumor visualization in six degrees of freedom (6DOF). An experiment tested the visualization accuracy on the US-9 breast phantom, which has 12 tumors of different sizes categorized in three levels. Our framework demonstrates 3D tumor visualization accuracy, although a comparison of accuracy is still pending. Two radiologists from Hospital Serdang performed successful visualization of a 3D tumor in X-ray vision. The framework is perceived as an improved visualization experience because the AR X-ray visualization allowed direct understanding of the breast tumor beyond the visible surface towards accurate biopsy targets.
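    The second phase described above, turning grayscale slices into a tumor model, can be caricatured as stacking the slices into a volume and keeping the voxels that exceed an intensity threshold. This is a stand-in sketch only: the paper's actual 3D modeling reconstruction technique is not specified in the abstract, and the function name and threshold value below are invented for illustration.

    ```python
    import numpy as np

    def reconstruct_tumor_mask(slices, threshold):
        """Stack 2D grayscale slices into a 3D volume and keep voxels
        whose intensity exceeds the threshold, yielding a binary mask
        that a surface-extraction step could then turn into a mesh."""
        volume = np.stack(slices, axis=0)  # shape: (num_slices, H, W)
        return volume > threshold

    # Two tiny 2x2 slices with intensities in [0, 1].
    slices = [np.array([[0.1, 0.9], [0.8, 0.2]]),
              np.array([[0.95, 0.1], [0.1, 0.85]])]
    mask = reconstruct_tumor_mask(slices, threshold=0.5)
    ```

    A real pipeline would follow the mask with surface extraction (e.g. marching cubes) to produce the renderable 3D model that the X-ray visualization phase then composites over the camera image.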

    Importance-Driven Composition of Multiple Rendering Styles

    We introduce a non-uniform composition that integrates multiple rendering styles in a picture driven by an importance map. This map, either issued from saliency estimation or designed by a user, is introduced both in the creation of the multiple styles and in the final composition. Our approach accommodates a variety of stylization techniques, such as color desaturation, line drawing, blurring, edge-preserving smoothing and enhancement. We illustrate the versatility of the proposed approach and the variety of rendering styles on different applications such as images, videos, 3D scenes and even mixed reality. We also demonstrate that such an approach may help in directing user attention.
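    In its simplest form, the final composition step described here is a per-pixel blend of two stylized renderings weighted by the importance map. The linear blend below is a minimal sketch under that assumption; the paper's method also injects the importance map into the creation of the styles themselves, which this example omits.

    ```python
    import numpy as np

    def compose_styles(style_a, style_b, importance):
        """Blend two stylized renderings per pixel: where the importance
        map is high, show style_a (e.g. detailed line drawing); where it
        is low, fall back to style_b (e.g. desaturated or blurred)."""
        w = np.clip(importance, 0.0, 1.0)[..., None]  # broadcast over RGB
        return w * style_a + (1.0 - w) * style_b

    h, wd = 4, 4
    detail = np.ones((h, wd, 3))      # "important" style: white
    context = np.zeros((h, wd, 3))    # "unimportant" style: black
    imp = np.zeros((h, wd))
    imp[1:3, 1:3] = 1.0               # a salient region in the center
    out = compose_styles(detail, context, imp)
    ```

    With a continuous importance map (e.g. from a saliency estimator) the same blend yields smooth transitions between styles rather than a hard cut.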

    Mobile communication systems and applications of importance for integrated disaster management

    Modern disaster management calls for optimal solutions regarding mobile communication systems and applications that can be used to improve the efficiency of protection and rescue systems. The multidisciplinarity and inherent complexity of the disaster risk management process require the use of various logistical tools and equipment. In this respect, information and communication technologies play a significant role because they, in their own way, raise people's capacity for rapid decision-making and reduce the likelihood of various errors. The paper describes the characteristics and uses of the best-known mobile applications employed worldwide in integrated disaster management, with the aim of providing help and support to members of emergency rescue units and other endangered citizens. In addition, existing and expected challenges and problems in the normal functioning of mobile communication systems and applications under disaster conditions are comprehensively examined.

    Collaborative Augmented Reality

    Over the past number of years augmented reality (AR) has become increasingly pervasive as a consumer-level technology. The principal drivers of its recent development have been the evolution of mobile and handheld devices, in conjunction with algorithms and techniques from fields such as 3D computer vision. Various commercial platforms and SDKs are now available that allow developers to quickly develop mobile AR apps requiring minimal understanding of the underlying technology. Much of the focus to date, both in the research and commercial environment, has been on single-user AR applications. Just as collaborative mobile applications have a demonstrated role in the increasing popularity of mobile devices, we believe collaborative AR systems present a compelling use-case for AR technology. The aim of this thesis is the development of a mobile collaborative augmented reality framework. We identify the elements required in the design and implementation stages of collaborative AR applications. Our solution enables developers to easily create multi-user mobile AR applications in which the users can cooperatively interact with the real environment in real time. It increases the sense of collaborative spatial interaction without requiring complex infrastructure. Assuming the given low-level communication and AR libraries have modular structures, the proposed approach is also modular and flexible enough to adapt to their requirements without requiring any major changes.

    Evaluation of graphical user interfaces for augmented reality based manual assembly support

    Augmented reality (AR) technology is advancing rapidly and promises benefits to a wide variety of applications, including manual assembly and maintenance tasks. This thesis addresses the design of user interfaces for AR applications, focusing specifically on information-presentation interface elements for assembly tasks. A framework was developed and utilized to understand and classify these elements, as well as to evaluate numerous existing AR assembly interfaces from the literature. Furthermore, a user study was conducted to investigate the strengths and weaknesses of concrete and abstract AR interface elements in an assembly scenario, as well as to compare AR assembly instructions against common paper-based assembly instructions. The results of this study supported, at least partially, the three hypotheses that concrete AR elements are more suitable than abstract AR elements for conveying part-manipulation information, that concrete AR and paper-based instructions lead to faster assembly times than abstract AR instructions alone, and that concrete AR instructions lead to greater increases in user confidence than paper-based instructions. The study failed to support the hypothesis that abstract AR elements are more suitable for part identification than concrete AR elements. Finally, the study results and hypothesis conclusions are used to suggest future work regarding interface element design for AR assembly applications.