This paper presents a complete machine vision system for automatic descriptive sensory evaluation of meals. A human sensory panel first developed a set of 72 sensory attributes describing the appearance of a prototypical meal, and then evaluated the intensities of those attributes on a data set of 58 images of example meals. These data were then used both to train and to validate the artificial system. The system covers all stages of image analysis from pre-processing to pattern recognition, including novel techniques for enhancing the segmentation of meal components and for extracting image features that mimic the attributes developed by the panel. Artificial neural networks were used to learn the mapping from image features to attribute intensity values. The results showed that the new system learned and reproduced the opinion of the human sensory experts remarkably well, achieving almost the same performance as the panel members themselves.