
    A 10-17 DOF Sensory Gloves with Harvesting Capability for Smart Healthcare

    We present 10-17 Degrees of Freedom (DoF) sensory gloves for Smart Healthcare implementing an energy harvesting architecture, aimed at extending battery life when powering the electronics of the two different types of gloves used to sense finger movements. In particular, we compared the two sensory gloves, implemented with different technologies, in terms of measurement repeatability and reliability, as well as power consumption and battery life. The first is a 3D-printed glove with 10 DoF, featuring low cost, low-effort fabrication and low power consumption. The second is a classical Lycra® glove with 14 DoF, suitable for a more detailed assessment of hand postures, featuring a relatively higher cost and power consumption. Electronic circuitry was designed to gather and process data from both types of sensory gloves, differing only in the number of inputs. Both gloves are equipped with flex sensors and, together with the electronics (including a microcontroller and a transmitter), allow the control of virtual hands or mechanical arms in surgical, military, space and civil applications. Six healthy subjects were involved in tests to evaluate the performance of the proposed gloves in terms of repeatability, reproducibility and reliability. Particular effort was devoted to extending battery life for both glove-based systems, with the electronics relying on Radio Frequency, Piezoelectric and Thermoelectric harvesters. The harvesting section was built and tested as a prototype discrete-element board, interfaced with an external microcontroller and a radio-frequency transmitter board. Measurement results demonstrated a meaningful improvement in battery operation time of up to 25%, considering different operating scenarios.
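The flex sensors described above are typically read as one arm of a resistive voltage divider sampled by the microcontroller's ADC. The following is a minimal sketch of that conversion chain; all constants (supply voltage, divider resistor, ADC resolution, two-point calibration values) are assumed figures for illustration, not values taken from the paper.

```python
# Hypothetical flex-sensor readout chain: ADC count -> resistance -> bend angle.
# Every constant below is an assumed example value, not from the paper.

V_SUPPLY = 3.3        # divider supply voltage (V), assumed
R_FIXED = 47_000.0    # fixed divider resistor (ohms), assumed
ADC_MAX = 1023        # 10-bit ADC full scale, assumed

# Assumed two-point calibration: sensor resistance when flat (0 deg)
# and when bent to 90 deg.
R_FLAT = 25_000.0
R_BENT = 100_000.0

def adc_to_resistance(adc: int) -> float:
    """Infer flex-sensor resistance from the divider's ADC reading.

    The divider is wired so that V_out = V_supply * R_fixed / (R_fixed + R_flex):
    bending the finger raises R_flex and lowers the measured voltage.
    """
    v_out = V_SUPPLY * adc / ADC_MAX
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def resistance_to_angle(r: float) -> float:
    """Linearly interpolate the joint angle (degrees) between calibration points."""
    return 90.0 * (r - R_FLAT) / (R_BENT - R_FLAT)
```

One such reading per instrumented joint, packed and sent over the radio link, yields the 10- or 14-channel posture vector the gloves provide.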

    Recognition of arm-and-hand visual signals by means of SVM to increase aircraft security

    In aircraft scenarios, the correct interpretation of communications is mandatory for security reasons. In particular, some communications between the signalman and the pilot rely on arm-and-hand visual signals, which can be prone to misunderstanding in some circumstances, for instance because of low visibility. This work equips the signalman with wearable sensors to collect data related to the signals, and interprets such data by means of SVM classification. In this way, the pilot can count on both his/her own evaluation and the automatic interpretation of the visual signal (redundancy increases safety), and all communications can be stored for later querying if necessary. Results indicate that the system achieves a classification accuracy ranging from 94.11 ± 5.54% to 97.67 ± 3.53%, depending on the type of gesture examined.
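The classification step described in this abstract can be sketched as a standard SVM pipeline over per-gesture feature vectors. The data layout, number of gesture classes, feature count and kernel choice below are all assumptions for illustration; synthetic data stands in for the wearable-sensor recordings.

```python
# Hypothetical sketch of SVM gesture classification on wearable-sensor features.
# Class count, feature count and kernel are assumptions; data is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in: 3 gesture classes, 12 sensor features per sample
# (e.g. flex-sensor and inertial readings), clustered around class means.
n_per_class, n_features = 60, 12
means = rng.normal(0.0, 3.0, size=(3, n_features))
X = np.vstack([m + rng.normal(0.0, 0.5, size=(n_per_class, n_features))
               for m in means])
y = np.repeat([0, 1, 2], n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Scale features, then fit an RBF-kernel SVM (kernel choice is an assumption).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In a real deployment, per-class accuracy would be reported separately, matching the paper's observation that performance depends on the type of gesture examined.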