    Mirror mirror on the wall... an unobtrusive intelligent multisensory mirror for well-being status self-assessment and visualization

    A person's well-being status is reflected in their face through a combination of facial expressions and physical signs. The SEMEOTICONS project translates the semeiotic code of the human face into measurements and computational descriptors that are automatically extracted from images, videos and 3D scans of the face. SEMEOTICONS developed a multisensory platform, in the form of a smart mirror, to identify signs related to cardio-metabolic risk. The aim was to enable users to self-monitor their well-being status over time and to guide them towards improving their lifestyle. Significant scientific and technological challenges were addressed in building the multisensory mirror, from touchless data acquisition to the real-time processing and integration of multimodal data.

    Face morphology: Can it tell us something about body weight and fat?

    This paper proposes a method for the automatic extraction of geometric features, related to weight parameters, from 3D facial data acquired with low-cost depth scanners. The novelty of the method lies both in the processing of the 3D facial data and in the definition of the geometric features, which are conceptually simple, robust against noise and pose-estimation errors, computationally efficient, and invariant with respect to rotation, translation, and scale changes. Experimental results show that these measurements are highly correlated with weight, BMI, and neck circumference, and well correlated with waist and hip circumference, which are markers of central obesity. Therefore, the proposed method strongly supports the development of interactive, non-obtrusive systems able to support the detection of weight-related problems.
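The invariance properties claimed above can be illustrated with a minimal sketch. The landmark set and feature definition below are purely hypothetical (the paper's actual features are not given in the abstract): the idea shown is that ratios of inter-landmark distances are unchanged by rotation, translation, and uniform scaling of the face data.

```python
import numpy as np

def distance_ratios(landmarks):
    """Hypothetical geometric features: ratios of pairwise Euclidean
    distances between 3D landmarks. Distances are preserved by rigid
    transforms and scale uniformly, so their ratios are invariant to
    rotation, translation, and uniform scale changes."""
    n = len(landmarks)
    d = np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                  for i in range(n) for j in range(i + 1, n)])
    # normalize by the first distance to cancel the scale factor
    return d / d[0]

# toy landmarks (e.g. nose tip, chin, cheeks) -- purely illustrative
pts = np.array([[0.0,  0.0, 0.0],
                [0.0, -7.0, 1.0],
                [-5.0, -2.0, 0.5],
                [5.0, -2.0, 0.5]])

f1 = distance_ratios(pts)

# apply a rotation about z, a uniform scale, and a translation
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
pts2 = 2.5 * (pts @ R.T) + np.array([10.0, -3.0, 4.0])

f2 = distance_ratios(pts2)
print(np.allclose(f1, f2))  # True: features unchanged by the transform
```

Such invariant features could then be correlated against anthropometric measurements (weight, BMI, neck circumference) with standard regression or correlation analysis; the abstract does not specify which statistical method the authors used.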