
    The Evolution of First Person Vision Methods: A Survey

    The emergence of new wearable technologies such as action cameras and smart glasses has increased the interest of computer vision scientists in the First Person perspective. Nowadays, this field is attracting the attention and investment of companies aiming to develop commercial devices with First Person Vision recording capabilities. Given this interest, an increasing demand for methods to process these videos, possibly in real time, is expected. Current approaches combine particular image features with quantitative methods to accomplish specific objectives such as object detection, activity recognition, and user-machine interaction. This paper summarizes the evolution of the state of the art in First Person Vision video analysis between 1997 and 2014, highlighting, among others, the most commonly used features, methods, challenges, and opportunities within the field. Comment: First Person Vision, Egocentric Vision, Wearable Devices, Smart Glasses, Computer Vision, Video Analytics, Human-machine Interaction

    IoT data processing pipeline in FoF perspective

    With developments in contemporary industry, the concepts of ICT and IoT are gaining importance, as they are the foundation of the systems of the future. Most current solutions converge on transforming traditional industry into new smart interconnected factories that are aware of their context, adaptable to different environments, and capable of fully using their resources. However, the full potential of ICT manufacturing has not been achieved, since there is no universal or standard architecture or model that can be applied to all existing systems to tackle the heterogeneity of the existing devices. In a typical factory, a large amount of information must be processed by the system in order to define event rules according to the related contextual knowledge and later execute the needed actions. However, this information is often heterogeneous, meaning that it cannot be accessed or understood by all components of the system. This dissertation analyses the existing theories and models that may lead to seamless and homogeneous data exchange and contextual interpretation. Based on these theories, a framework is proposed that aims to formalize the situational context in order to provide appropriate actions.
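The data-processing idea in this abstract, mapping heterogeneous device data onto a common model and then evaluating context-dependent event rules, can be illustrated with a minimal sketch. All names here (`normalise`, `RULES`, the field aliases, and the context thresholds) are illustrative assumptions, not part of the dissertation's actual framework.

```python
def normalise(raw: dict) -> dict:
    """Map vendor-specific field names onto one canonical schema."""
    aliases = {"temp": "temperature_c", "tmp_c": "temperature_c",
               "hum": "humidity_pct", "rh": "humidity_pct"}
    return {aliases.get(k, k): v for k, v in raw.items()}

RULES = [
    # (condition over a canonical reading plus context, action name)
    (lambda r, ctx: r.get("temperature_c", 0) > ctx["max_temp"], "cool_down"),
    (lambda r, ctx: r.get("humidity_pct", 0) > ctx["max_humidity"], "dehumidify"),
]

def actions_for(raw_reading: dict, context: dict) -> list:
    """Normalise a heterogeneous reading, then collect the triggered actions."""
    reading = normalise(raw_reading)
    return [action for cond, action in RULES if cond(reading, context)]

context = {"max_temp": 35.0, "max_humidity": 60.0}
print(actions_for({"tmp_c": 40.2, "rh": 55.0}, context))  # ['cool_down']
```

The normalisation step stands in for the "seamless and homogeneous data exchange" the dissertation targets: once every device speaks the canonical schema, one rule set can serve them all.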

    Development of Position-Dependent Luminescent Sensors: Spectral Rulers and Chemical Sensing Through Tissue

    Assessing the performance of medical devices is critical for understanding device function and monitoring pathologies. With the use of a smart device, clinically relevant chemical and mechanical information regarding fracture healing may be deduced. For example, strain on the device may be used as a mechanical indicator of weight-bearing capacity. In addition, changes in the chemical environment may indicate the development of implant-associated infections. Although optical methods are widely used for ex vivo strain/motion analysis and for chemical analyses in cells and histological tissue sections, their utility is limited through thick tissue because light scattering reduces spatial resolution. This dissertation presents four novel luminescent sensors that overcome this limitation. The sensors are capable of detecting chemical and physical changes by measuring position- or orientation-dependent color/wavelength changes through tissue. The first three sensors are spectral rulers comprised of two patterned thin films: an encoder strip and an analyzer mask. The encoder strip is either a thin film patterned with stripes of alternating luminescent materials (quantum dots, particles, or dyes) or a film containing alternating stripes of a dye that absorbs luminescence from a particle film placed below. The analyzer mask is patterned with a series of alternating transparent windows and opaque stripes equal in width to the encoder lines. The analyzer is overlaid upon the encoder strip such that displacement of the encoder relative to the analyzer modulates the color/spectrum visible through the windows. Relative displacement of the sensor layers is mechanically confined to a single axis. When the substrates are overlaid in the “home position,” one line spectrum is observed, and in the “end position,” another line spectrum is observed. At intermediate positions, spectra are a linear combination of the “home” and “end” spectra.
The position-modulated signal is collected by a spectrometer, and a spectral intensity ratio is calculated from closely spaced emission peaks. By collecting luminescent spectra, rather than imaging the device surface, the sensors eliminate the need to spatially resolve small features through tissue, measuring displacement as a function of color instead. We measured micron-scale displacements through at least 6 mm of tissue using three types of spectral ruler based upon 1) fluorescence, 2) x-ray excited optical luminescence (XEOL), and 3) near-infrared upconversion luminescence. The sensors may be used to investigate strain on orthopedic implants, study interfragmentary motion, or assess tendon/ligament tears. In addition to monitoring mechanical strain, it is important to investigate clinically relevant implant pathologies such as infection. To address this application, we have developed a fourth type of sensor. The sensor monitors changes in local pH, an indicator of biofilm formation, and uses magnetic fields to modulate position- and orientation-dependent luminescence. This modulation allows the sensor signal to be separated from background tissue autofluorescence for spectrochemical sensing. This final sensor variation contains a cylindrical magnet with a fluorescent pH-indicating surface on one side and a mask on the other. When the pH-indicating surface is oriented towards the collection optics, the spectrum generated contains both the sensor and autofluorescence signals. Conversely, when the pH sensor is oriented away, the collected signal is composed solely of background signals. All four of the sensors described can be used to build smart devices for monitoring pathologies through tissue. Future work will include the application of the strain and chemical sensors in vivo and ex vivo in animal and cadaveric models.
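The readout principle described above, an intermediate spectrum being a linear combination of the "home" and "end" spectra, lends itself to a simple least-squares recovery of the mixing fraction and hence the displacement. The sketch below assumes Gaussian peak shapes, arbitrary wavelengths, and a 100 µm travel range purely for illustration; none of these values come from the dissertation.

```python
import numpy as np

wavelengths = np.linspace(500, 700, 200)          # nm (illustrative)
home = np.exp(-((wavelengths - 550) / 10) ** 2)   # "home position" spectrum
end = np.exp(-((wavelengths - 650) / 10) ** 2)    # "end position" spectrum
FULL_TRAVEL_UM = 100.0                            # assumed encoder travel

def displacement_um(measured: np.ndarray) -> float:
    """Fit measured = (1 - f)*home + f*end by least squares, return microns."""
    # Rearranged: measured - home = f * (end - home), so f is a 1-D projection.
    diff = end - home
    f = np.dot(diff, measured - home) / np.dot(diff, diff)
    return float(np.clip(f, 0.0, 1.0)) * FULL_TRAVEL_UM

# A spectrum recorded 30 % of the way along the travel:
measured = 0.7 * home + 0.3 * end
print(round(displacement_um(measured), 1))  # 30.0
```

In practice the dissertation's approach uses an intensity ratio of closely spaced emission peaks, which is less sensitive to wavelength-dependent tissue attenuation than a full-spectrum fit; this sketch only shows why a linear mixture makes the position recoverable at all.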

    A robotic arm for safe and accurate control of biomedical equipment during COVID-19

    Purpose: Hospital facilities and social life, along with the global economy, have been severely challenged by COVID-19 since the World Health Organization (WHO) declared it a pandemic in March 2020. Since then, countless ordinary citizens, as well as healthcare workers, have contracted the virus simply by coming into contact with infected surfaces. In order to minimise the risk of infection by contact with such surfaces, our study aims to design, prototype, and test a new device able to connect users, such as ordinary citizens, doctors, or paramedics, with either common-use interfaces (e.g., lift and snack-machine keyboards, traffic-light push-buttons) or medical-use interfaces (e.g., any medical equipment keypad). Method: To this purpose, the device was designed with the help of Unified Modelling Language (UML) schemes and was informed by a risk analysis that highlighted some of its essential requirements and specifications. Consequently, the chosen constructive solution of the robotic system, i.e., a robotic-arm structure, was designed and manufactured using computer-aided design and 3D printing. Result: The final prototype included a properly programmed micro-controller, linked via Bluetooth to a multi-platform mobile phone app, which represents the user interface. The system was then successfully tested on different physical keypads and touch screens. Better performance of the system can be foreseen by introducing improvements in the industrial production phase. Conclusion: This first prototype paves the way for further research in this area, allowing for better management of and preparedness for future pandemic emergencies. © 2023, The Author(s)
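The abstract describes an app sending commands over Bluetooth to the arm's micro-controller, which then presses the selected key. The paper does not specify the message format, so the sketch below invents one: a `PRESS:<key>` string mapped to target coordinates on an assumed 3×3 keypad. Both the protocol and the keypad geometry are hypothetical.

```python
KEYPAD = {  # key label -> (x_mm, y_mm) on an assumed 3x3 keypad, 20 mm pitch
    str(k): ((k - 1) % 3 * 20.0, (k - 1) // 3 * 20.0) for k in range(1, 10)
}

def parse_command(msg: str):
    """Parse a 'PRESS:<key>' message into a target (x, y) in millimetres."""
    verb, _, arg = msg.strip().partition(":")
    if verb != "PRESS" or arg not in KEYPAD:
        raise ValueError(f"unsupported command: {msg!r}")
    return KEYPAD[arg]

print(parse_command("PRESS:5"))  # (20.0, 20.0)
```

A plain-text, validated protocol like this keeps the firmware side trivial to parse on a micro-controller while rejecting malformed input before any actuation happens.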