4 research outputs found

    Correcting veiling glare of refined CIECAM02 for mobile display

    Small displays are widely used; they are small enough to be carried around and are often viewed under extreme surround conditions. Under bright illumination, mobile display users experience veiling glare caused by the ambient lighting. A refined version of CIECAM02, called Refined CIECAM02, and the original CIECAM02 were tested for their ability to predict visual results in terms of lightness (J), colourfulness (M), and brightness (Q) on a 2-inch mobile phone display under four surround conditions: dark (0 cd/m²), dim (5 cd/m²), average (1,000 cd/m²), and bright (10,000 cd/m²). In addition to the two versions of CIECAM02 applied to the original data, a correction to the models' predicted lightness J and a black correction to the original data were developed. Overall, the refined CIECAM02 plus the J correction performed best for predicting lightness, brightness, and colourfulness under all the viewing conditions, especially the bright surround condition. Furthermore, another experiment was carried out using complex images to verify the different versions of CIECAM02. The images were reproduced using the JMh (lightness, colourfulness, and hue) spaces of the modified CIECAM02 versions. In this experiment, original images viewed under dim, average, or bright surround conditions were compared with the predicted images viewed under a dark surround condition on two identical mobile displays. The different versions of CIECAM02 showed similar results to each other for the dim and average surround conditions but large differences when predicting the images under the bright surround condition. The refined CIECAM02 with the J' formula performed best amongst all four CIECAM02 versions.
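    As a rough illustration of the kind of prediction being evaluated (not the abstract's own refined model or J correction), the sketch below uses the open-source colour-science Python package to compute standard CIECAM02 lightness (J), colourfulness (M), and brightness (Q) for a sample display stimulus at the four adapting luminances listed above. The XYZ values, background luminance factor, the clamping of the dark condition, and the treatment of the surround luminances as adapting luminances L_A are all illustrative assumptions, and the code assumes a recent colour-science release.

    ```python
    # Illustrative sketch only: standard CIECAM02 correlates (J, M, Q) via the
    # colour-science package; it does NOT implement the refined CIECAM02 or the
    # J correction described in the abstract.
    import numpy as np
    import colour

    XYZ = np.array([19.01, 20.00, 21.78])      # assumed display stimulus (XYZ, 0-100 range)
    XYZ_w = np.array([95.05, 100.00, 108.88])  # assumed display white point (D65-like)
    Y_b = 20.0                                 # assumed relative background luminance

    # Surround luminances from the abstract, treated here (as an assumption) as
    # adapting luminances L_A; 0 cd/m2 is clamped to avoid a degenerate model state.
    conditions = {"dark": 0.1, "dim": 5.0, "average": 1000.0, "bright": 10000.0}

    for name, L_A in conditions.items():
        spec = colour.XYZ_to_CIECAM02(
            XYZ, XYZ_w, L_A, Y_b,
            surround=colour.VIEWING_CONDITIONS_CIECAM02["Average"],
        )
        print(f"{name:>7}: J={float(spec.J):6.2f}  "
              f"M={float(spec.M):6.2f}  Q={float(spec.Q):7.2f}")
    ```

    Comparing the printed J, M, and Q values across the four adapting luminances shows how strongly the surround term alone shifts the predicted appearance, which is the effect the abstract's corrections target for the bright condition.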

    Autocalibrating vision guided navigation of unmanned air vehicles via tactical monocular cameras in GPS denied environments

    This thesis presents a novel robotic navigation strategy using a conventional tactical monocular camera, demonstrating the feasibility of using a monocular camera as the sole proximity-sensing, object-avoidance, mapping, and path-planning mechanism to fly and navigate small- to medium-scale unmanned rotary-wing aircraft autonomously. The range-measurement strategy is scalable, self-calibrating, and indoor-outdoor capable; it is biologically inspired by the key adaptive mechanisms for depth perception and pattern recognition found in humans and intelligent animals (particularly bats), and it is designed for operation in previously unknown, GPS-denied environments. The thesis proposes novel electronics, aircraft, aircraft systems, procedures, and algorithms that together form airborne systems which measure absolute range from a monocular camera via passive photometry, mimicking human-pilot-like judgement. The research is intended to bridge the gap between practical GPS coverage and the precision localization and mapping problem for small aircraft. In the context of this study, several robotic platforms, airborne and ground alike, have been developed for experimental validation, some of which have been deployed in real-life field trials. Although the emphasis is on miniature robotic aircraft, the approach has been tested and found compatible with tactical vests and helmets, and it can be used to augment the reliability of many other types of proximity sensors.