
    Real-time augmented face

    This real-time augmented reality demonstration relies on our tracking algorithm described in V. Lepetit et al. (2003). The algorithm uses natural feature points and therefore does not require engineering of the environment. It merges the information from preceding frames, in traditional recursive-tracking fashion, with that provided by a very limited number of reference frames. This combination yields a system that does not suffer from jitter or drift and can deal with drastic changes. The tracker recovers the full 3D pose of the tracked object, allowing insertion of 3D virtual objects for augmented reality applications
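    The combination the abstract describes can be illustrated with a minimal sketch: a recursive frame-to-frame estimate (smooth but drifting) is blended with a reference-frame estimate (drift-free but jittery). The function names, the 6-DoF pose representation, and the blending weight below are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch of blending a frame-to-frame (recursive) pose estimate
# with a reference-frame estimate. All names and weights are assumptions.

def blend_pose(recursive_pose, reference_pose, alpha=0.7):
    """Weighted blend of two 6-DoF pose vectors (tx, ty, tz, rx, ry, rz).

    The recursive estimate is smooth but drifts; the reference-frame
    estimate is drift-free but jittery. Mixing them suppresses both.
    """
    return [alpha * r + (1.0 - alpha) * k
            for r, k in zip(recursive_pose, reference_pose)]

def track(frames, estimate_relative, estimate_from_reference, initial_pose):
    """Toy tracking loop: propagate the previous pose with the inter-frame
    motion, then correct it toward the reference-frame estimate."""
    pose = initial_pose
    trajectory = []
    for frame in frames:
        predicted = estimate_relative(pose, frame)   # frame-to-frame motion
        corrected = estimate_from_reference(frame)   # match against keyframes
        pose = blend_pose(predicted, corrected)
        trajectory.append(pose)
    return trajectory
```

    The real system estimates both terms from natural feature correspondences; here they are stand-in callables so the blending structure is visible.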

    Sensing and mapping for interactive performance

    This paper describes a trans-domain mapping (TDM) framework for translating meaningful activities from one creative domain onto another. The multi-disciplinary framework is designed to facilitate an intuitive and non-intrusive interactive multimedia performance interface that offers users or performers real-time control of multimedia events using their physical movements. It is intended to be a highly dynamic real-time performance tool, sensing and tracking activities and changes in order to provide interactive multimedia performances. Starting from a straightforward definition of the TDM framework, this paper reports several implementations and multi-disciplinary collaborative projects using the proposed framework, including a motion- and colour-sensitive system, a sensor-based system for triggering musical events, and a distributed multimedia server for audio mapping of a real-time face tracker, and discusses different aspects of mapping strategies in their context. Plausible future directions and developments for the proposed framework, including stage augmentation and virtual and augmented reality, which involve sensing and mapping physical and non-physical changes onto multimedia control events, are also discussed
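    The core of such a mapping, translating a physical quantity in one domain into discrete events in another, can be sketched as a set of ordered threshold rules. The event names and threshold values below are illustrative assumptions, not the paper's actual mapping strategy.

```python
# Minimal sketch of a trans-domain mapping rule: physical motion intensity
# is translated into named multimedia events. Rules must be listed in
# ascending threshold order; names and numbers are illustrative.

def map_motion_to_events(motion_levels, rules):
    """Map a stream of motion magnitudes (0..1) to named events.

    Each rule is (threshold, event); the highest matching rule wins,
    and a level below every threshold fires no event (None).
    """
    events = []
    for level in motion_levels:
        fired = None
        for threshold, event in rules:
            if level >= threshold:
                fired = event  # later (higher) rules override earlier ones
        events.append(fired)
    return events

rules = [(0.2, "fade_light"), (0.5, "trigger_sample"), (0.8, "scene_change")]
print(map_motion_to_events([0.1, 0.3, 0.9], rules))
# -> [None, 'fade_light', 'scene_change']
```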

    Utilization Augmented Reality Technology on Purchase Products in E-commerce

    E-commerce is electronic trading; in its simplest form, it is an online catalogue of the products being traded. A We Are Social survey conducted in April 2021 shows that 88.1% of internet users in Indonesia use e-commerce services to buy particular products, one of which is furniture. The aim of this study is to design an Android-based mobile application that educates the public about applying Augmented Reality technology to furniture purchases in e-commerce; the research method is descriptive analysis with a qualitative approach. The Augmented Reality method used is multi-marker, whereby the user can scan several markers together to display virtual objects in the real world in real time. This application can give e-commerce users who intend to buy furniture a new picture of the product: they can see it in 3D and obtain more useful information than from pictures or written descriptions of the furniture alone. The application aims to educate the public that Augmented Reality is a technology that can become a new solution to the challenges of purchasing furniture products in e-commerce
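    The multi-marker idea, several recognized markers jointly determining which virtual objects to render, reduces to a lookup from detected marker IDs to catalogue models. The marker IDs and model names below are illustrative assumptions; a real implementation would detect the markers in camera frames.

```python
# Hypothetical sketch of multi-marker AR dispatch: each physical marker ID
# maps to one 3D furniture model; scanning several markers together shows
# several models at once. IDs and names are illustrative assumptions.

MARKER_TO_MODEL = {
    7: "sofa_3d",
    12: "table_3d",
    23: "wardrobe_3d",
}

def models_for_markers(detected_ids):
    """Return the 3D furniture models for every recognized marker,
    ignoring IDs the catalogue does not know."""
    return sorted(MARKER_TO_MODEL[i] for i in detected_ids if i in MARKER_TO_MODEL)

print(models_for_markers([12, 7, 99]))  # -> ['sofa_3d', 'table_3d']
```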

    2D Virtual Trial Room using Augmented Reality

    A real-time Virtual Trial Room application using Augmented Reality can be used in shopping centres, malls, and shops. It allows a user to try on virtual clothes. Trying on clothes in a mall is usually time-consuming. Our aim is to build an interactive and highly realistic virtual system where customers can choose from many different clothes and see them simulated on themselves. This paper presents a user-friendly interface that automatically detects the human face and merges the chosen clothes onto the user, using a webcam as the input device and displaying the result on screen. Our motivation is to increase time efficiency and improve the accessibility of clothes try-on by creating a virtual dressing room environment
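    Anchoring the chosen garment to the auto-detected face comes down to simple geometry: scale and position the clothing overlay relative to the face bounding box. The scale factors below are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of anchoring a clothing overlay to a detected face:
# given the face bounding box from a webcam frame, place the garment image
# below it, scaled to shoulder width. Scale factors are assumptions.

def garment_box(face, width_scale=3.0, height_scale=4.0):
    """face = (x, y, w, h) in pixels; returns (x, y, w, h) for the garment."""
    x, y, w, h = face
    gw = int(w * width_scale)    # shoulders are wider than the face
    gh = int(h * height_scale)   # torso extends below the chin
    gx = x + w // 2 - gw // 2    # centred under the face
    gy = y + h                   # starts at the chin line
    return (gx, gy, gw, gh)

print(garment_box((100, 50, 60, 60)))  # -> (40, 110, 180, 240)
```

    In a full system the face box would come from a detector such as a Haar cascade, and the garment image would be alpha-blended into the returned rectangle each frame.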

    The Implementation of Augmented Reality Hairstyles at Beauty Salons Using the Viola-Jones Method (Case Study: Eka Salon)

    Augmented reality is a technology that superimposes computer-generated digital content on a user's view of the real world in real time, so users can experience virtual objects as if they were real. The use of augmented reality has spread into various industries, for example the fashion industry, one segment of which is hairstyling. Eka Salon is a beauty salon that provides treatments for women's hair care. The salon has a problem: customers are not satisfied with their new haircuts because the results do not match their expectations. This can be seen from observations at Eka Salon, where 8 out of 15 interviewed customers were not satisfied with their new haircuts because they did not match the appearance in the catalogue. In this research, an augmented reality hairstyle application is built that visualizes the shape of the hairstyle selected by the customer without the customer having to cut their hair first. The Viola-Jones method was chosen for this study because of its high accuracy of 90% in face detection. The result of this research is that the Viola-Jones method can detect facial surfaces and render a 3D hairstyle model properly at distances of up to 100 cm. Acceptance testing of the application by Eka Salon customers yielded an average score of 84.3%
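    At the heart of the Viola-Jones detector is the integral image (summed-area table), which makes the sum of any pixel rectangle an O(1) lookup and thereby lets Haar-like features be evaluated fast enough for real-time face detection. A minimal sketch of that data structure:

```python
# Sketch of the core Viola-Jones building block: the integral image.
# ii[y][x] holds the sum of all pixels at or above-left of (x, y), so any
# rectangular sum needs only four lookups -- the key to fast Haar features.

def integral_image(img):
    """img: 2D list of grayscale values; returns the summed-area table."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w*h pixel rectangle whose top-left corner is (x, y)."""
    a = ii[y - 1][x - 1] if x > 0 and y > 0 else 0
    b = ii[y - 1][x + w - 1] if y > 0 else 0
    c = ii[y + h - 1][x - 1] if x > 0 else 0
    d = ii[y + h - 1][x + w - 1]
    return d - b - c + a

ii = integral_image([[1, 2], [3, 4]])
print(rect_sum(ii, 0, 0, 2, 2))  # -> 10
```

    A two-rectangle Haar feature is then just the difference of two such `rect_sum` calls; in practice one would use a trained cascade (e.g. OpenCV's `CascadeClassifier`) rather than hand-built features.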

    SpecTracle: Wearable Facial Motion Tracking from Unobtrusive Peripheral Cameras

    Facial motion tracking in head-mounted displays (HMDs) has the potential to enable immersive "face-to-face" interaction in a virtual environment. However, current work on facial tracking is either not suitable for unobtrusive augmented reality (AR) glasses or lacks the ability to track arbitrary facial movements. In this work, we demonstrate a novel system called SpecTracle that tracks a user's facial motions using two wide-angle cameras mounted right next to the visor of a HoloLens. By avoiding cameras extended in front of the face, our system greatly improves the feasibility of integrating full-face tracking into a low-profile form factor. We also demonstrate that a neural network-based model processing the wide-angle cameras can run in real time at 24 frames per second (fps) on a mobile GPU and track independent facial movement for different parts of the face with a user-independent model. Using a short personalized calibration, the system improves its tracking performance by 42.3% compared to the user-independent model
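    One common form such a short personalized calibration can take is a per-user linear correction fitted to a few (predicted, ground-truth) expression-intensity pairs; the abstract does not specify SpecTracle's actual procedure, so the approach and numbers below are purely illustrative assumptions.

```python
# Hypothetical sketch of a per-user calibration step: fit y = a*x + b by
# closed-form least squares from a few calibration pairs, then apply the
# correction to later predictions. All numbers are illustrative.

def fit_linear(pred, true):
    """Closed-form least squares for y = a*x + b."""
    n = len(pred)
    mx = sum(pred) / n
    my = sum(true) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(pred, true))
    var = sum((x - mx) ** 2 for x in pred)
    a = cov / var
    b = my - a * mx
    return a, b

# Calibration pairs: suppose the generic model under-predicts this user's
# expression intensity by a factor of two.
a, b = fit_linear([0.1, 0.2, 0.4], [0.2, 0.4, 0.8])
calibrated = a * 0.3 + b
print(round(calibrated, 3))  # -> 0.6
```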