5 research outputs found

    Hands-free wearable system for helping in assembly tasks in aerospace

    Maintenance operations have a great impact on the safety and life expectancy of any product. This is especially true for certain applications within the aerospace industry, which must pass rigorous safety checking procedures. Wearable assistance systems can help to reduce costs and working time by guiding workers through specific and difficult tasks. The purpose of this work is to develop a hands-free, wearable guidance system that supports and helps workers in assembly and verification tasks within the aeronautic field. The worker is able to request information for a specific task in a non-invasive way and also ask the Team Leader for real-time technical support and assistance. The system developed has been tested at an aeronautic company (Airbus Military) and its implementation in specific assembly tasks assessed. It was found that the proposed system can help workers perform their tasks faster, more accurately and more safely.

    Marker-less Vision Based Tracking for Mobile Augmented Reality

    In this paper, an object recognition and tracking approach for AR-PDA, a mobile, marker-less, PDA-based augmented reality system, is described. For object recognition and localization, 2D features are extracted from images and compared with a priori known 3D models. The system consists of 2D graph matching, 3D hypothesis generation and validation, and an additional texture-based validation step.
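The hypothesis-validation step described above can be illustrated with a minimal sketch: project the known 3D model points into the image under a candidate pose and measure how many land near an extracted 2D feature. All function names, the pinhole-projection simplification, and the inlier-ratio acceptance criterion are assumptions for illustration, not the paper's actual algorithm.

```python
import math

def project(point3d, pose, focal=500.0):
    # Pinhole projection of a 3D model point under a hypothesised pose.
    # pose = (3x3 rotation as row tuples, translation (tx, ty, tz)).
    R, t = pose
    x = sum(R[0][i] * point3d[i] for i in range(3)) + t[0]
    y = sum(R[1][i] * point3d[i] for i in range(3)) + t[1]
    z = sum(R[2][i] * point3d[i] for i in range(3)) + t[2]
    return (focal * x / z, focal * y / z)

def validate_hypothesis(features2d, model3d, pose, max_dist=5.0):
    # Fraction of projected model points that fall within max_dist pixels
    # of some extracted 2D feature; high ratios support the hypothesis.
    inliers = 0
    for p in model3d:
        u, v = project(p, pose)
        if any(math.hypot(u - fu, v - fv) <= max_dist for fu, fv in features2d):
            inliers += 1
    return inliers / len(model3d)

# Identity rotation, model 10 units in front of the camera.
pose = (((1, 0, 0), (0, 1, 0), (0, 0, 1)), (0.0, 0.0, 10.0))
model = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0)]
feats = [project(p, pose) for p in model]       # perfectly matching features
print(validate_hypothesis(feats, model, pose))  # -> 1.0
```

A real system would follow this geometric check with the texture-based validation the abstract mentions, since geometry alone cannot distinguish similarly shaped objects.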

    Visual based finger interactions for mobile phones

    Vision-based technology such as motion detection has long been limited to the domain of powerful, processor-intensive systems such as desktop PCs and specialist hardware solutions. With the advent of much faster mobile phone processors and memory, a plethora of feature-rich software and hardware is being deployed onto the mobile platform, most notably onto high-powered devices called smart phones. Interaction interfaces such as touchscreens allow for improved usability but obscure the phone's screen. Since the majority of smart phones are equipped with cameras, it has become feasible to combine their powerful processors, large memory capacity and the camera to support new ways of interacting with the phone which do not obscure the screen. However, it is not clear whether these processor-intensive visual interactions can in fact run at an acceptable speed on current mobile handsets, or whether they will offer the user a better experience than the number pad and direction keys present on the majority of mobile phones. A vision-based finger interaction technique is proposed which uses the back-of-device camera to track the user's finger. This allows the user to interact with the mobile phone through mouse-based movements, gestures and steering-based interactions. A simple colour thresholding algorithm was implemented in Java, Python and C++. Various benchmarks and tests conducted on a Nokia N95 smart phone revealed that, on current hardware and with current programming environments, only native C++ yields results plausible for real-time interaction (a key requirement for vision-based interaction). It is also shown that different lighting levels and background environments affect the accuracy of the system, with background-to-finger contrast playing a large role. Finally, a user study was conducted to compare overall user satisfaction between keypad interaction and the finger interaction techniques. It concluded that the new finger interaction technique is well suited to steering-based interactions and, in time, mouse-style movements, while simple navigation remains better suited to the directional keypad.
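The colour thresholding approach the abstract mentions can be sketched in a few lines: classify each pixel by whether its colour falls inside a fixed RGB box, then take the centroid of the resulting blob as the finger position. The threshold values, function names, and the use of a plain nested-list frame (rather than a camera buffer) are illustrative assumptions.

```python
def threshold_skin(frame, lower, upper):
    # Binary mask: True where a pixel's RGB lies inside the colour box.
    # frame is a list of rows of (r, g, b) tuples.
    return [[all(lo <= c <= hi for c, lo, hi in zip(px, lower, upper))
             for px in row] for row in frame]

def centroid(mask):
    # Centroid of the thresholded blob: a simple fingertip estimate.
    on_pixels = [(x, y) for y, row in enumerate(mask)
                        for x, on in enumerate(row) if on]
    if not on_pixels:
        return None  # finger not visible in this frame
    return (sum(x for x, _ in on_pixels) / len(on_pixels),
            sum(y for _, y in on_pixels) / len(on_pixels))

# 3x3 test frame with one "finger-coloured" pixel in the middle.
skin, bg = (210, 150, 120), (20, 20, 20)
frame = [[bg, bg, bg], [bg, skin, bg], [bg, bg, bg]]
mask = threshold_skin(frame, lower=(180, 120, 90), upper=(255, 180, 150))
print(centroid(mask))  # -> (1.0, 1.0)
```

Tracking the centroid across frames yields the motion vector used for mouse-style or steering interactions; the per-pixel loop is also why the paper found only native C++ fast enough on the N95.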

    Towards exploring future landscapes using augmented reality

    With increasing pressure to better manage the environment, many government and private organisations are studying the relationships between social, economic and environmental factors to determine how they can best be optimised for increased sustainability. The analysis of such relationships is undertaken using computer-based Integrated Catchment Models (ICM). These models are capable of generating multiple scenarios depicting alternative land uses at a variety of temporal and spatial scales, which present (potentially) better Triple-Bottom-Line (TBL) outcomes than the prevailing situation. Dissemination of this data is (for the most part) reliant on traditional, static map products; however, the ability of such products to display the complexity and temporal aspects is limited, and ultimately undervalues both the knowledge incorporated in the models and the capacity of stakeholders to explore the complexities through other means. Geovisualization provides tools and methods for disseminating large volumes of spatial (and associated non-spatial) data. Virtual Environments (VE) have been utilised for various aspects of landscape planning for more than a decade. While such systems are capable of visualizing large volumes of data at ever-increasing levels of realism, they restrict the user's ability to accurately perceive the (virtual) space. Augmented Reality (AR) is a visualization technique which gives users the freedom to explore a physical space and have that space augmented with additional, spatially referenced information. A review of existing mobile AR systems forms the basis of this research. A theoretical mobile outdoor AR system using Commercial Off-The-Shelf (COTS) hardware and open-source software is developed. The specific requirements for visualizing land use scenarios in a mobile AR system were derived using a usability engineering approach known as Scenario-Based Design (SBD). This determined the elements required in the user interfaces, resulting in the development of a low-fidelity, computer-based prototype. The prototype user interfaces were evaluated by participants from two targeted stakeholder groups undertaking hypothetical use scenarios. Feedback from participants was collected using the cognitive walk-through technique and supplemented by evaluator observations of participants' physical actions. Results from this research suggest that the prototype user interfaces did provide the necessary functionality for interacting with land use scenarios. While there were some concerns about the potential implementation of "yet another" system, participants were able to envisage the benefits of visualizing land use scenario data in the physical environment.
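The core operation of an outdoor AR overlay like the one described above is placing a spatially referenced point in the camera view from the device's position and compass heading. The following minimal sketch computes the bearing to a point of interest and maps it to a horizontal screen position; the field-of-view value, screen width, and all function names are illustrative assumptions, not details from the thesis.

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    # Initial compass bearing (degrees) from the device to a point of interest.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, heading, fov=60.0, width=800):
    # Horizontal pixel position of the POI, or None if outside the camera FOV.
    delta = (poi_bearing - heading + 180) % 360 - 180  # signed angle to POI
    if abs(delta) > fov / 2:
        return None
    return width / 2 + delta / fov * width

# Device at the origin facing due east (90 deg); a POI due east
# should land at the centre of an 800-pixel-wide screen.
b = bearing(0.0, 0.0, 0.0, 0.001)
print(round(b), round(screen_x(b, heading=90.0)))  # -> 90 400
```

A full system would add the vertical axis from device pitch and distance-based scaling, but this bearing-to-screen mapping is the step that anchors scenario data to the physical landscape the user is standing in.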