
    A validation study of the measurement accuracy of SCENE and SceneVision 3D software programs.

    This descriptive study sought to determine the measurement accuracy of two 3D modeling software programs used in crime scene processing and reconstruction: FARO's SCENE and 3rdTech's SceneVision 3D. The study compared the measurement-difference means against guidelines published by the National Institute of Standards and Technology (NIST). A statistical analysis was performed by subtracting each manual measurement from the corresponding measurement produced by SCENE and by SceneVision 3D, and the resulting differences were used in a paired t-test. The measurement-difference means for both programs fell within the NIST guidelines. The paired t-test showed a statistically, but not practically, significant difference in the measurements. SCENE was found to be slightly more accurate than SceneVision 3D.
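The paired t-test described above works on the per-object differences between the manual and software measurements. A minimal sketch, using hypothetical measurement values rather than the study's data:

```python
import math
import statistics

def paired_t_test(manual, software):
    """Paired t-test on per-object differences between two measurement methods.

    Returns the mean difference and the t statistic (n - 1 degrees of freedom).
    """
    diffs = [s - m for m, s in zip(manual, software)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample standard deviation of the differences
    t = mean_d / (sd_d / math.sqrt(n))      # t statistic for H0: mean difference = 0
    return mean_d, t

# Hypothetical measurements in metres (illustrative only, not the study's data)
manual = [1.002, 2.498, 3.104, 0.751, 4.203]
scene  = [1.001, 2.500, 3.102, 0.753, 4.201]
mean_d, t = paired_t_test(manual, scene)
print(f"mean difference = {mean_d:+.4f} m, t = {t:.3f}")
```

A small |t| relative to the critical value (about 2.78 for n = 5 at the 5% level) would indicate no practically meaningful difference, mirroring the study's conclusion.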

    COMPRESSIVE IMAGING AND DUAL MOIRE´ LASER INTERFEROMETER AS METROLOGY TOOLS

    Metrology is the science of measurement and deals with measuring different physical aspects of objects. This research focuses on two basic problems that metrologists encounter. The first is the trade-off between measurement range and resolution: measuring the physical parameters of a large object or scene is accompanied by a loss of detailed information about small regions of the object. Indeed, instruments and techniques that perform coarse measurements differ from those that make fine measurements. This problem persists in the field of surface metrology, which deals with accurate measurement and detailed analysis of surfaces. For example, laser interferometry is used for fine measurement (at the nanometer scale), while measuring the form of an object, which lies in the domain of coarse measurement, requires a different technique such as the moiré technique. We introduce a new technique that combines measurements from instruments with finer resolution and smaller measurement range with measurements from instruments with coarser resolution and larger measurement range: we first measure the form of the object with a coarse technique and then make fine measurements of features in regions of interest. The second problem concerns measurement conditions that make measurement difficult, including low-light conditions, large ranges of intensity variation, and hyperspectral measurement. Under low-light conditions there is not enough light for the detector to sense the object, which results in poor measurements; a large range of intensity variation produces saturated regions on the camera as well as dark regions. We use imaging systems based on compressive sampling to address these problems. Single-pixel compressive imaging uses a single detector instead of an array of detectors and reconstructs a complete image from a sequence of measurements.
In this research we examined compressive imaging for several applications, including low-light imaging, high-dynamic-range imaging, and hyperspectral imaging.
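The single-pixel measurement model described above can be sketched as follows. This is a simplified illustration with a toy 1-D "scene" and a full orthogonal set of ±1 Hadamard patterns, where reconstruction is just a scaled transpose; a genuinely compressive system would record fewer patterns than pixels and recover the image with a sparse-reconstruction solver instead.

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# A toy 1-D "scene" of 16 pixels (hypothetical data).
rng = np.random.default_rng(0)
x = rng.random(16)

# Single-pixel measurement model: each pattern (matrix row) is displayed on
# the spatial light modulator, and the single detector records one inner
# product of the pattern with the scene per measurement.
H = hadamard(16)
y = H @ x                     # 16 sequential single-pixel measurements

# With a complete orthogonal pattern set, H @ H.T = 16 * I, so reconstruction
# is exact; the compressive case would use fewer rows plus sparse recovery.
x_hat = H.T @ y / 16
print(np.allclose(x_hat, x))  # prints True
```

Real systems typically use 0/1 micromirror patterns rather than ±1 values; the ±1 case can be synthesized by differencing two 0/1 measurements.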

    Handbook of Optical and Laser Scanning

    From its initial publication as Laser Beam Scanning in 1985 to the Handbook of Optical and Laser Scanning, now in its second edition, this reference has kept professionals and students at the forefront of optical scanning technology. Carefully and meticulously updated in each iteration, the book continues to be the most comprehensive scanning resource on the market, examining the breadth and depth of subtopics in the field from a variety of perspectives. The Second Edition covers technologies such as piezoelectric devices; applications of laser scanning such as Ladar (laser radar); underwater scanning; and laser scanning in computer-to-plate (CTP) printing. As laser costs come down and power and availability increase, the potential applications for laser scanning continue to grow. Bringing together the knowledge and experience of 26 authors from England, Japan and the United States, the book provides an excellent resource for understanding the principles of laser scanning. It illustrates the significance of scanning in society today and helps the reader get started in developing system concepts using scanning. It can be used as an introduction to the field and as a reference for anyone involved in optical and laser beam scanning.

    Image Quality Evaluation in Lossy Compressed Images

    This research focuses on the quantification of image quality in lossy compressed images, exploring the impact of digital artefacts and scene characteristics upon image quality evaluation. A subjective paired comparison test was implemented to assess perceived quality of JPEG 2000 against baseline JPEG over a range of different scene types. Interval scales were generated for both algorithms, which indicated a subjective preference for JPEG 2000, particularly at low bit rates, and these were confirmed by an objective distortion measure. The subjective results did not follow this trend for some scenes, however, and both algorithms were found to be scene dependent as a result of the artefacts produced at high compression rates. The scene dependencies were explored from the interval scale results, which allowed scenes to be grouped according to their susceptibilities to each of the algorithms. Groupings were correlated with scene measures applied in a linked study. A pilot study was undertaken to explore perceptibility thresholds for JPEG 2000 compression on the same set of images. This work was extended with a further experiment to investigate the thresholds of perceptibility and acceptability of higher-resolution JPEG 2000 compressed images. A set of images was captured using a professional-level full-frame Digital Single Lens Reflex camera, using a raw workflow and a carefully controlled image-processing pipeline. The scenes were quantified using a set of simple scene metrics to classify them as average, higher than average, or lower than average for a number of scene properties known to affect image compression and perceived image quality; these were used to make a final selection of test images. Image fidelity was investigated using the method of constant stimuli to quantify perceptibility thresholds and just noticeable differences (JNDs) of perceptibility.
Thresholds and JNDs of acceptability were also quantified to explore suprathreshold quality evaluation. The relationships between the two thresholds were examined and correlated with the results from the scene measures, to identify more and less susceptible scenes. It was found that the level of, and difference between, the two thresholds were indicators of scene dependency and could be predicted by certain types of scene characteristics. A third study implemented the soft-copy quality ruler as an alternative psychophysical method, matching the quality of compressed images to a set of images varying in a single attribute, separated by known JND increments of quality. The imaging chain and image-processing workflow were evaluated using objective measures of tone reproduction and spatial frequency response. An alternative approach to the creation of ruler images was implemented and tested, and the resulting quality rulers were used to evaluate a subset of the images from the previous study. The quality ruler was found to be successful in identifying scene susceptibilities and observer sensitivity. The fourth investigation explored the implementation of four different image quality metrics: the Modular Image Difference Metric, the Structural Similarity Metric, the Multi-scale Structural Similarity Metric, and the Weighted Structural Similarity Metric. The metrics were tested against the subjective results, and all were found to correlate linearly with perceived image quality.
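Of the metrics named above, the Structural Similarity Metric compares two images via their luminance, contrast, and structure statistics. A minimal single-window sketch with the standard stabilizing constants is shown below; full SSIM computes these statistics in an 11x11 Gaussian sliding window and averages the resulting local map, so this global version is a simplification, and the test images are hypothetical.

```python
import numpy as np

def global_ssim(a, b, data_range=255.0):
    """Single-window SSIM between two images of equal shape.

    Simplified sketch: statistics are taken over the whole image rather than
    over local sliding windows as in the full metric.
    """
    c1 = (0.01 * data_range) ** 2   # stabilizes the luminance term
    c2 = (0.03 * data_range) ** 2   # stabilizes the contrast/structure term
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2)
    )

# Hypothetical 8-bit images: a reference and a noise-distorted version.
rng = np.random.default_rng(1)
ref = rng.integers(0, 256, size=(64, 64)).astype(float)
noisy = np.clip(ref + rng.normal(0, 20, size=ref.shape), 0, 255)

print(global_ssim(ref, ref))    # identical images score exactly 1.0
print(global_ssim(ref, noisy))  # distortion lowers the score below 1.0
```

The multi-scale and weighted variants named in the abstract build on this same comparison, applied across successive down-sampled scales or with per-component weighting.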

    Microscopy Conference 2017 (MC 2017) - Proceedings

    This document contains the abstracts of the contributions of all participants in the Microscopy Conference "MC 2017", held in Lausanne from 21 to 25 August 2017.
