
    Vision technology/algorithms for space robotics applications

    Automation and robotics for space applications have been pursued to increase productivity, reliability, flexibility, and safety; to automate time-consuming tasks; to improve the performance of crew-accomplished tasks; and to perform tasks beyond the capability of the crew. This paper reviews efforts currently in progress in robotic vision, covering both systems and algorithms. Future vision/sensing is projected to evolve toward the fusion of multiple sensors, ranging from microwave to optical, with multimode capability covering position, attitude, recognition, and motion parameters. The key features of the overall system design will be small size and weight, fast signal processing, robust algorithms, and accurate parameter determination. These aspects of vision/sensing are also discussed.

    Learning through play: an educational computer game to introduce radar fundamentals

    Information exchange has evolved from traditional books to computers and the Internet in only a few years. Our current university students were born into this age: they learn and have fun in different ways than previous generations did. These digital natives enjoy computer games, so designing games around selected topics can be a good teaching strategy for this group, and in particular for undergraduate university students. This paper describes the development and testing of an educational computer game revolving around radar. The objective of the game, RADAR Technology, is to teach students the fundamentals of radar while they have fun during the learning experience. Based on the principle that you learn best what you practice, the authors aim to lead students to discover a difficult topic by offering them a different kind of experience, in a format better adapted to the skills of their generation. The computer game has been tested with actual students, and the results obtained seem very promising.

    Ultra-Wideband Radar-Based Activity Recognition Using Deep Learning

    With recent advances in the field of sensing, it has become possible to build better assistive technologies. This enables the strengthening of eldercare with regard to daily routines and the provision of personalised care to users. For instance, it is possible to detect a person’s behaviour based on wearable or ambient sensors; however, it is difficult for users to wear devices 24/7, as they would have to be recharged regularly because of their energy consumption. Similarly, although cameras have been widely used as ambient sensors, they carry the risk of breaching users’ privacy. This paper presents a novel sensing approach based on deep learning for human activity recognition using a non-wearable ultra-wideband (UWB) radar sensor. UWB sensors protect privacy better than RGB cameras because they do not collect visual data. In this study, UWB sensors were mounted on a mobile robot to monitor and observe subjects from a specific distance (namely, 1.5–2.0 m). Initially, data were collected in a lab environment for five different human activities. Subsequently, the data were used to train a model using a state-of-the-art deep learning approach, namely long short-term memory (LSTM). Conventional training approaches were also tested to validate the superiority of LSTM. As a UWB sensor collects many data points in a single frame, enhanced discriminant analysis was used to reduce the dimensionality of the features by applying principal component analysis to the raw dataset, followed by linear discriminant analysis. The enhanced discriminant features were fed into the LSTMs. Finally, the trained model was tested on new inputs. The proposed LSTM-based activity recognition approach performed better than conventional approaches, with an accuracy of 99.6%. We applied 5-fold cross-validation to test our approach and also validated it on a publicly available dataset. The proposed method can be applied in many prominent fields, including human–robot interaction, and in practical applications such as mobile robots for eldercare.
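
    The pipeline described above (principal component analysis followed by linear discriminant analysis on raw radar frames, with the reduced features fed to an LSTM) can be sketched in a few lines of Python. This is an illustrative sketch only: the array shapes, component counts, network size, and synthetic data are assumptions, not the authors' actual configuration.

    # Sketch of the PCA -> LDA -> LSTM pipeline described above.
    # Shapes, hyperparameters, and the random data are illustrative assumptions.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    import tensorflow as tf

    n_sequences, seq_len, n_range_bins, n_classes = 200, 30, 256, 5

    # Synthetic stand-in for UWB radar data: each sequence is seq_len frames
    # of n_range_bins samples, labelled with one of n_classes activities.
    X = np.random.randn(n_sequences, seq_len, n_range_bins).astype("float32")
    y = np.random.randint(0, n_classes, size=n_sequences)

    # "Enhanced discriminant analysis": PCA then LDA, fitted on individual
    # frames (each frame inherits its sequence's activity label).
    frames = X.reshape(-1, n_range_bins)
    frame_labels = np.repeat(y, seq_len)
    pca = PCA(n_components=20).fit(frames)
    lda = LinearDiscriminantAnalysis(n_components=n_classes - 1).fit(
        pca.transform(frames), frame_labels)
    X_feat = lda.transform(pca.transform(frames)).reshape(
        n_sequences, seq_len, n_classes - 1)

    # A small LSTM classifier over the per-frame discriminant features.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(seq_len, n_classes - 1)),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X_feat, y, epochs=5, batch_size=16, verbose=0)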

    Aerospace Medicine and Biology: A continuing bibliography with indexes, supplement 182, July 1978

    This bibliography lists 165 reports, articles, and other documents introduced into the NASA scientific and technical information system in June 1978.

    Adaptive Perception, State Estimation, and Navigation Methods for Mobile Robots

    In this cumulative habilitation, publications focusing on robotic perception, self-localization, tracking, navigation, and human-machine interfaces have been selected. While some of the publications present research on vision and machine learning tasks carried out with a PR2 household robot in the Robotics Learning Lab of the University of California, Berkeley, most present results from work at the AutoNOMOS-Labs at Freie Universität Berlin, with a focus on control, planning, and object tracking for the autonomous vehicles "MadeInGermany" and "e-Instein".

    Positioning and Sensing System Based on Impulse Radio Ultra-Wideband Technology

    Impulse Radio Ultra-Wideband (IR-UWB) is a wireless carrier communication technology that uses nanosecond, non-sinusoidal narrow pulses to transmit data. The IR-UWB signal therefore has high resolution in the time domain and is suitable for high-precision positioning and sensing systems in IIoT scenarios. This thesis designs and implements a high-precision positioning system and a contactless sensing system based on the high temporal resolution of IR-UWB technology. The feasibility of the two applications in the IIoT is evaluated, providing a reference for human-machine-thing positioning and human-machine interaction sensing in large smart factories. By analyzing the positioning algorithms commonly used in IR-UWB systems, this thesis designs an IR-UWB relative positioning system based on the time-of-flight algorithm. The system uses IR-UWB transceiver modules to obtain distance data and calculates the relative position between two individuals through the proposed relative positioning algorithm. An improved algorithm is proposed to simplify the system hardware, reducing the three serial port modules used in the positioning system to one. Based on the time-of-flight algorithm, this thesis also implements a contactless gesture sensing system with IR-UWB: the IR-UWB signal is sparsified by downsampling, the feature information of the signal is obtained by level-crossing sampling, and a spiking neural network is used as the recognition algorithm to classify hand gestures.
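
    As a rough illustration of the time-of-flight principle the positioning system builds on, the sketch below computes a distance from one two-way exchange between an initiator and a responder. The function name, timestamp values, and reply delay are hypothetical; a real system would take hardware timestamps from the IR-UWB transceiver modules and combine several such ranges for relative positioning.

    # Minimal single-sided two-way ranging sketch (illustrative values only).
    C = 299_792_458.0  # speed of light in m/s

    def tof_distance(t_send: float, t_recv: float, t_reply: float) -> float:
        """Estimate the initiator-responder distance from one round trip.

        t_send  -- initiator timestamp when the poll frame was sent (s)
        t_recv  -- initiator timestamp when the response arrived (s)
        t_reply -- responder's known turnaround delay (s)
        """
        round_trip = t_recv - t_send
        time_of_flight = (round_trip - t_reply) / 2.0
        return C * time_of_flight

    # Example: a 67 ns one-way flight time corresponds to roughly 20 m.
    print(tof_distance(t_send=0.0, t_recv=334e-9, t_reply=200e-9))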

    Indoor Localization Techniques Based on Wireless Sensor Networks
