    Integrating user preference to similarity queries over medical images datasets

    Large amounts of images from medical exams are being stored in databases, so developing retrieval techniques is an important research problem. Retrieval based on the image visual content is usually better than retrieval using textual descriptions, as these seldom capture every nuance the user may be interested in. Content-based image retrieval employs the similarity among images for retrieval. However, similarity is evaluated by numeric methods, which often order the images in a way rather distinct from the user's intention. In this paper, we propose a technique that allows expressing the user's preferences over attributes associated with the images, so that similarity queries can be refined by preference rules. Experiments performed over a dataset of computed tomography lung images show that, by correctly expressing the user's preferences, the precision of similarity queries can increase from an average of 60% to close to 100%, when enough interesting images exist in the database.
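
    To make the idea concrete, below is a minimal sketch of a similarity query refined by a user preference rule: images are ranked by distance to the query in feature space, but images whose associated attributes satisfy a preference predicate are ranked ahead of the rest. The function and attribute names, the Euclidean distance measure, and the tie-breaking scheme are illustrative assumptions, not the technique defined in the paper.

    import numpy as np

    # Hypothetical sketch of a preference-refined similarity query.
    # The attribute names and the preference scheme are illustrative
    # assumptions, not the method proposed in the paper.

    def preference_refined_query(features, attributes, query, k, prefers):
        """Return indices of the k best images for `query`.

        Images whose attributes satisfy the `prefers` predicate are ranked
        ahead of the others; ties are broken by visual similarity
        (Euclidean distance in feature space).
        """
        dists = np.linalg.norm(features - query, axis=1)
        order = sorted(range(len(features)),
                       key=lambda i: (not prefers(attributes[i]), dists[i]))
        return order[:k]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        feats = rng.random((100, 32))                      # toy feature vectors
        attrs = [{"slice_thickness": float(rng.choice([1.0, 5.0]))}
                 for _ in range(100)]                      # toy per-image metadata
        q = rng.random(32)
        # Illustrative preference rule: prefer thin-slice CT exams.
        top = preference_refined_query(feats, attrs, q, k=10,
                                       prefers=lambda a: a["slice_thickness"] == 1.0)
        print(top)

    In this sketch the preference acts as a hard re-ranking criterion layered on top of the similarity ordering; a real system could instead weight or combine preference and similarity scores.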