
    Modelling the human perception of shape-from-shading

    Shading conveys information about 3-D shape, and the process of recovering this information is called shape-from-shading (SFS). This thesis divides human SFS into two functional sub-units (luminance disambiguation and shape computation) and studies them individually. Based on the results of a series of psychophysical experiments, it is proposed that the interaction between first- and second-order channels plays an important role in disambiguating luminance. Building on this idea, two versions of a biologically plausible model are developed to explain the human performance observed here and elsewhere. An algorithm sharing the same idea is also developed as a solution to the problem of intrinsic image decomposition in the field of image processing. With regard to the shape computation unit, a link between luminance variations and estimated surface normals is identified by testing participants on simple gratings with several different luminance profiles. This methodology is unconventional but can be justified in the light of past studies of human SFS. Finally, a computational algorithm for SFS containing two distinct operating modes is proposed. This algorithm is broadly consistent with the known psychophysics of human SFS.
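    The abstract does not reproduce the thesis's model, but the forward relation that any SFS algorithm must invert is the standard Lambertian shading equation. A minimal sketch (function names are illustrative, not from the thesis):

```python
import math

def lambertian_shading(normal, light, albedo=1.0):
    """Image intensity under the Lambertian model: I = albedo * max(0, n . s).

    Shape-from-shading inverts this relation: given the intensity I and
    the light direction s, recover the surface normal n (up to the
    well-known ambiguities that luminance disambiguation must resolve).
    """
    nx, ny, nz = normal
    sx, sy, sz = light
    # Normalise both vectors before taking the dot product.
    nn = math.sqrt(nx * nx + ny * ny + nz * nz)
    sn = math.sqrt(sx * sx + sy * sy + sz * sz)
    dot = (nx * sx + ny * sy + nz * sz) / (nn * sn)
    # Surfaces facing away from the light receive no direct illumination.
    return albedo * max(0.0, dot)

# A surface facing the light is brightest; an oblique surface is darker.
print(lambertian_shading((0, 0, 1), (0, 0, 1)))   # 1.0
print(lambertian_shading((0, 0, 1), (1, 0, 1)))   # ~0.707
```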

    A study of image quality for radar image processing

    Methods developed for image quality metrics are reviewed, with a focus on basic interpretation or recognition elements including tone or color; shape; pattern; size; shadow; texture; site; association or context; and resolution. Seven metrics are believed to show promise as ways of characterizing the quality of an image: (1) the dynamic range of intensities in the displayed image; (2) the system signal-to-noise ratio; (3) the system spatial bandwidth or bandpass; (4) the system resolution or acutance; (5) the normalized mean-square error as a measure of geometric fidelity; (6) the perceptual mean-square error; and (7) the radar threshold quality factor. Selected levels of degradation are applied to simulated synthetic radar images to test the validity of these metrics.
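    Two of the listed metrics, the normalized mean-square error and the signal-to-noise ratio, are simple enough to sketch directly. A minimal illustration (the exact formulations in the study may differ; images are flattened to lists of intensities here):

```python
import math

def nmse(reference, degraded):
    """Normalised mean-square error between two equal-size images
    given as flat lists of intensities; 0 means identical images."""
    num = sum((r - d) ** 2 for r, d in zip(reference, degraded))
    den = sum(r ** 2 for r in reference)
    return num / den

def snr_db(reference, degraded):
    """Signal-to-noise ratio in decibels, treating the reference/degraded
    difference as noise; higher means less degradation."""
    return -10.0 * math.log10(nmse(reference, degraded))

ref = [10.0, 20.0, 30.0, 40.0]
deg = [11.0, 19.0, 31.0, 39.0]   # small +/-1 perturbations
print(round(nmse(ref, deg), 6))  # 0.001333
print(round(snr_db(ref, deg), 2))  # 28.75
```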

    Filling-in the Forms: Surface and Boundary Interactions in Visual Cortex

    Defense Advanced Research Projects Agency and the Office of Naval Research (N00014-95-1-0409); Office of Naval Research (N00014-95-1-0657)

    Uniform color spaces based on CIECAM02 and IPT color difference equations

    Color difference equations based on the CIECAM02 color appearance model and the IPT color space have been developed to fit experimental data. However, there is no color space in which these color difference equations are Euclidean, i.e. in which they describe distances along straight lines. In this thesis, Euclidean color spaces have been derived for the CIECAM02 and IPT color difference equations, respectively, so that color difference can be calculated as a simple color distance. Firstly, the Euclidean line element was established, from which terms were derived for the new coordinates of lightness, chroma, and hue angle. Then the spaces were analyzed using performance factors and statistics to test how well they fit various data sets. The results show that the CIECAM02 Euclidean color space has performance factors similar to those of the optimized CIECAM02 color difference equation, and at a statistically significant level it fit the data better than that equation. Conversely, the IPT Euclidean color space performed worse than the optimized IPT color difference equation, mainly because the line element for the lightness dimension could not be calculated directly and an approximation had to be used. To resolve this problem, a new IPT color difference equation should be designed such that its line elements can be established directly.
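    The practical payoff of a Euclidean color space is that color difference reduces to straight-line distance between rectangular coordinates. A minimal sketch of that computation, assuming generic (lightness, chroma, hue-angle) coordinates rather than the thesis's specific derived ones:

```python
import math

def to_rectangular(lightness, chroma, hue_deg):
    """Convert lightness/chroma/hue-angle coordinates to rectangular
    (lightness, redness-greenness, yellowness-blueness) form."""
    h = math.radians(hue_deg)
    return (lightness, chroma * math.cos(h), chroma * math.sin(h))

def euclidean_delta_e(c1, c2):
    """Color difference as the straight-line (Euclidean) distance
    between two colors in rectangular coordinates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(c1, c2)))

# Two colors identical in chroma and hue, differing by 2 units of lightness:
a = to_rectangular(50.0, 20.0, 30.0)
b = to_rectangular(52.0, 20.0, 30.0)
print(round(euclidean_delta_e(a, b), 4))  # 2.0
```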

    Factors affecting brightness and colour vision under water

    Both theoretical and practical importance can be attached to attempts to model human threshold and supra-threshold visual performance under water. Previously, emphasis has been given to the integration of visual data from experiments conducted in air with data of the physical specification of the underwater light field. However, too few underwater studies have been undertaken for the validity of this approach to be assessed. The present research therefore was concerned with the acquisition of such data. Four experiments were carried out: (a) to compare the predicted and obtained detection thresholds of achromatic targets, (b) to measure the relative recognition thresholds of coloured targets, (c) to compare the predicted and obtained supra-threshold appearance of coloured targets at various viewing distances and under different experimental instructions, (d) to compare the predicted and obtained detection thresholds for achromatic targets under realistic search conditions. Within each experiment, observers were tested on visual tasks in the field and in laboratory simulations. Physical specifications of targets and backgrounds were determined by photometry and spectroradiometry. 
The data confirmed that: (a) erroneous predictions of the detection threshold could occur when the contributions of absorption and scattering to the attenuation of light were not differentiated, (b) the successful replication of previous findings for the relative recognition thresholds of colours depended on the brightness of the targets, (c) the perceived change in target colour with increasing viewing distance was less than that measured physically, implying the presence of a colour constancy mechanism other than chromatic adaptation and simultaneous colour contrast; the degree of colour constancy also varied with the type of target and the experimental instructions, (d) the successful prediction of the effects of target-observer motion and target location uncertainty required more than simple numerical corrections to the basic detection threshold model. It was concluded that further progress in underwater visibility modelling is possible provided that the tendency to oversimplify human visual performance is suppressed.
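    The detection-threshold predictions discussed here rest on the classical hydrologic-optics contrast attenuation law, in which a target's apparent contrast decays exponentially with viewing distance. A minimal sketch of that relation (parameter names are illustrative; the thesis's actual model separates absorption and scattering, which this lumped form does not):

```python
import math

def apparent_contrast(inherent_contrast, beam_attenuation, distance):
    """Apparent contrast of a target viewed horizontally under water:
    C_r = C_0 * exp(-c * r).

    beam_attenuation c (per metre) lumps absorption and scattering
    together; the abstract notes that failing to separate the two
    can produce erroneous detection-threshold predictions.
    """
    return inherent_contrast * math.exp(-beam_attenuation * distance)

# A high-contrast target (C0 = 0.9) in moderately turbid water (c = 0.4/m)
# fades rapidly with range:
for r in (1.0, 5.0, 10.0):
    print(r, round(apparent_contrast(0.9, 0.4, r), 4))
```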

    Particle Filters for Colour-Based Face Tracking Under Varying Illumination

    Automatic human face tracking is the basis of robotic and active vision systems used for facial feature analysis, automatic surveillance, video conferencing, intelligent transportation, human-computer interaction and many other applications. Superior human face tracking will enable future safety surveillance systems that monitor drowsy drivers, or patients and elderly people at risk of seizure or sudden falls, and will perform with a lower risk of failure in unexpected situations. This area has been actively researched in the current literature in an attempt to make automatic face trackers more stable in challenging real-world environments. To detect faces in video sequences, features such as colour, texture, intensity, shape or motion are used. Among these features, colour has been the most popular because of its insensitivity to orientation and size changes and its fast processability. The challenge for colour-based face trackers, however, has been their instability when colours change due to drastic variation in environmental illumination. Meanwhile, probabilistic tracking with particle filters, which are powerful Bayesian stochastic estimators, is increasingly used in the visual tracking field thanks to their ability to handle multi-modal distributions in cluttered scenes. Traditional particle filters utilize the transition prior as the importance sampling function, but this can result in poor posterior sampling. The objective of this research is to investigate and propose a stable face tracker capable of dealing with challenges such as rapid and random head motion, scale changes as people move closer to or further from the camera, motion of multiple people with similar skin tones in the vicinity of the tracked person, presence of clutter, and occlusion of the face. The main focus has been on investigating an efficient method to address the sensitivity of colour-based trackers to gradual or drastic illumination variations.
The particle filter is used to overcome the instability of face trackers due to nonlinear and random head motions. To increase the traditional particle filter's sampling efficiency, an improved version of the particle filter is introduced that takes the latest measurements into account. This improved particle filter employs a new colour-based bottom-up approach that leads particles to generate an effective proposal distribution. The colour-based bottom-up approach is a classification technique for fast skin colour segmentation; it is independent of the distribution shape and does not require excessive memory storage or exhaustive prior training. Finally, to make the colour-based face tracker adaptable to illumination changes, an original likelihood model is proposed based on spatial rank information, which considers both the illumination-invariant colour ordering of a face's pixels in an image or video frame and the spatial interaction between them. The original contribution of this work lies in the unique mixture of existing and proposed components to improve colour-based recognition and tracking of faces in complex scenes, especially where drastic illumination changes occur. Experimental results for the final version of the proposed face tracker, which combines the methods developed, are provided in the last chapter of this manuscript.
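    The predict-weight-resample cycle that the abstract's improvements build on is the standard bootstrap particle filter. A minimal one-dimensional sketch (a scalar position with a Gaussian likelihood stands in for the thesis's colour-based face model; all names and parameters are illustrative):

```python
import math
import random

def particle_filter_step(particles, measurement,
                         motion_noise=1.0, meas_noise=1.0):
    """One predict-weight-resample cycle of a bootstrap particle filter
    tracking a scalar state (e.g. a face's horizontal position)."""
    # Predict: propagate each particle through a random-walk motion model.
    particles = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Weight: Gaussian likelihood of the measurement given each particle.
    weights = [math.exp(-0.5 * ((measurement - p) / meas_noise) ** 2)
               for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set in proportion to the weights.
    particles = random.choices(particles, weights=weights, k=len(particles))
    return particles

random.seed(0)
n = 500
particles = [random.uniform(-10.0, 10.0) for _ in range(n)]
for z in (2.0, 2.5, 3.0):  # noisy measurements of a target near x = 2-3
    particles = particle_filter_step(particles, z)
estimate = sum(particles) / n
print(round(estimate, 2))  # posterior mean, close to the measurements
```

Using the transition prior as the proposal, as above, is exactly what the abstract criticises: when the measurement is sharp, few predicted particles land where the likelihood is high, which motivates proposals that incorporate the latest measurement.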

    Engineering data compendium. Human perception and performance. User's guide

    The concept underlying the Engineering Data Compendium was the product of a research and development program (the Integrated Perceptual Information for Designers project) aimed at facilitating the application of basic research findings in human performance to the design of military crew systems. The principal objective was to develop a workable strategy for: (1) identifying and distilling information of potential value to system design from the existing research literature, and (2) presenting this technical information in a way that would aid its accessibility, interpretability, and applicability by system designers. The present four volumes of the Engineering Data Compendium represent the first implementation of this strategy. This is the first volume, the User's Guide, containing a description of the program and instructions for its use.

    Optical flow estimation using steered-L1 norm

    Motion is a very important part of understanding the visual picture of the surrounding environment. In image processing, motion estimation involves computing displacements for image points in an image sequence. In this context, dense optical flow estimation is concerned with the computation of pixel displacements in a sequence of images, and it has therefore been used widely in the fields of image processing and computer vision. A great deal of research has been dedicated to enabling accurate and fast motion computation in image sequences. Despite recent advances, there is still room for improvement, and optical flow algorithms still suffer from several issues, such as motion discontinuities, occlusion handling, and robustness to illumination changes. This thesis investigates the topic of optical flow and its applications, addressing several issues in the computation of dense optical flow and proposing solutions. Specifically, the thesis is divided into two main parts dedicated to two areas of interest in optical flow. In the first part, image registration using optical flow is investigated, using both local and global methods. An image registration method based on an improved version of the combined local-global optical flow computation is proposed, in which a bilateral filter is used to improve edge-preserving performance. It is shown that image registration via this method gives more robust results than the local and global optical flow methods previously investigated. The second part of this thesis encompasses the main contribution of this research, which is an improved total variation L1 norm. A smoothness term is used in the optical flow energy function to regularise this function.
The L1 norm is a plausible choice for such a term because of its performance in preserving edges; however, it is isotropic and hence decreases the penalisation near motion boundaries in all directions. The proposed improved L1 smoothness term (termed here the steered-L1 norm) demonstrates similar performance across motion boundaries but improves the penalisation performance along such boundaries.
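    The isotropic-versus-steered distinction can be illustrated on a single flow-gradient sample. This is an illustrative sketch of the general idea, not the thesis's exact formulation: the flow gradient is rotated into boundary-aligned coordinates and the across-boundary derivative is down-weighted while the along-boundary derivative keeps full weight.

```python
import math

def isotropic_l1(ux, uy):
    """Isotropic L1 smoothness penalty on a flow gradient (ux, uy):
    changes are penalised equally in every direction."""
    return abs(ux) + abs(uy)

def steered_l1(ux, uy, theta, w_across=0.1, w_along=1.0):
    """Steered L1 penalty (illustrative): rotate the flow gradient into
    coordinates aligned with a motion boundary at angle theta, then
    down-weight the derivative across the boundary while keeping full
    penalisation along it."""
    c, s = math.cos(theta), math.sin(theta)
    d_along = c * ux + s * uy       # derivative along the boundary
    d_across = -s * ux + c * uy     # derivative across the boundary
    return w_along * abs(d_along) + w_across * abs(d_across)

# A flow discontinuity across a vertical boundary (theta = pi/2):
ux, uy = 3.0, 0.0  # flow changes only across the boundary
print(isotropic_l1(ux, uy))                       # 3.0
print(round(steered_l1(ux, uy, math.pi / 2), 2))  # 0.3
```

The steered penalty leaves the genuine discontinuity across the boundary largely unpenalised while still smoothing along it, which is the behaviour the abstract describes.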