12 research outputs found

    Camera System Performance Derived from Natural Scenes

    The Modulation Transfer Function (MTF) is a well-established measure of camera system performance, commonly employed to characterize optical and image capture systems. It is a measure based on linear system theory; its use therefore relies on the assumption that the system is linear and stationary. This is not the case with modern camera systems, which incorporate non-linear image signal processing (ISP) to improve the output image. Non-linearities result in variations in camera system performance that depend on the specific input signals. This paper discusses the development of a novel framework designed to acquire MTFs directly from images of natural, complex scenes, making traditional test charts with set patterns redundant. The framework is based on the extraction, characterization and classification of edges found within images of natural scenes. Scene-derived performance measures aim to characterize the non-linear image processes incorporated in modern cameras more faithfully. Further, they can produce 'live' performance measures, acquired directly from camera feeds.
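The edge-based measurement such a framework adapts reduces to the chain edge spread function (ESF) → line spread function (LSF) → normalised FFT magnitude. A minimal sketch, assuming a pre-extracted, super-sampled edge profile; the synthetic Gaussian-blurred edge and the oversampling factor are illustrative, not taken from the paper:

```python
import math
import numpy as np

def mtf_from_esf(esf, oversample=4):
    """Slanted-edge-style MTF estimate from an edge spread function:
    differentiate the ESF to get the line spread function (LSF),
    window it, then take the normalised FFT magnitude."""
    lsf = np.gradient(esf)                        # ESF -> LSF
    lsf = lsf * np.hanning(len(lsf))              # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]                  # normalise so MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=1.0 / oversample)  # cycles/pixel
    return freqs, mtf

# Synthetic Gaussian-blurred step edge as a stand-in for an edge profile
# extracted from a natural scene.
x = np.linspace(-8, 8, 129)
esf = np.array([0.5 * (1 + math.erf(v / (math.sqrt(2) * 1.2))) for v in x])
freqs, mtf = mtf_from_esf(esf)
```

A real pipeline would first locate, straighten and super-sample the edge; this sketch covers only the spectral step.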

    Natural Scene Derived Camera Edge Spatial Frequency Response for Autonomous Vision Systems

    The edge Spatial Frequency Response (eSFR) is an established measure for camera system quality performance, traditionally measured under laboratory conditions. With the increasing use of Deep Neural Networks (DNNs) in autonomous vision systems, the input signal quality becomes crucial for optimal operation. This paper proposes a method to estimate the system eSFR (sys-SFR) from pictorial natural scene derived SFRs (NS-SFRs) as previously presented, laying the foundation for adapting the traditional method to a real-time measure. In this study, the NS-SFR input parameter variations are first investigated to establish suitable ranges that give a stable estimate. Using the NS-SFR framework with the established parameter ranges, the system eSFR, as per ISO 12233, is estimated. Initial validation of results is obtained by implementing the measuring framework with images from a linear and a non-linear camera system. For the linear system, results closely approximate the ISO 12233 eSFR measurement. Non-linear system measurements exhibit scene-dependent characteristics expected from edge-based methods. The requirements for implementing this method in real time for autonomous systems are then discussed.

    Measurements of the modulation transfer function of image displays

    Measurements of the Modulation Transfer Function (MTF) of image displays are often required for objective image quality assessments, but are difficult to carry out due to the need for specialized apparatus. This article presents a simple method for the measurement of the MTF of a sample CRT display system which involves the use of a still digital camera for the acquisition of displayed test targets. Measurements are carried out using, first, the sine wave method, where a number of artificial sine wave images of discrete spatial frequency and constant modulation are captured from a close distance. Fourier techniques are employed to extract the amplitude of the display signal from the resulting macroimages. In a second phase, displayed artificial step edges are captured, and the ISO 12233 SFR (Spatial Frequency Response) Slanted Edge plug-in is used for automatic edge analysis. The display MTF, in both cases, is cascaded from the closed-loop system MTF. The two measuring techniques produced matching results, indicating that under controlled test conditions accurate measurements of the display MTF can be achieved with the use of relatively simple equipment.
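The cascading step can be illustrated numerically. For a cascade of independent stages, the system MTF is the product of the stage MTFs, so the display MTF is recovered from the closed-loop (camera + display) measurement by point-wise division. The Gaussian-shaped curves below are illustrative stand-ins, not measured data:

```python
import numpy as np

# Illustrative (not measured) MTF curves sampled at common frequencies.
freqs = np.linspace(0.0, 0.5, 11)                  # cycles/pixel
mtf_camera = np.exp(-(freqs / 0.45) ** 2)          # capture stage
mtf_display_true = np.exp(-(freqs / 0.30) ** 2)    # stage to be recovered
mtf_closed_loop = mtf_camera * mtf_display_true    # what is actually measured

# System MTF = product of stage MTFs, so the display MTF follows by
# point-wise division (guarding against a vanishing camera response).
mtf_display = mtf_closed_loop / np.maximum(mtf_camera, 1e-6)
```

In practice the division amplifies noise where the camera MTF is small, which is why the camera must out-resolve the display over the frequency range of interest.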

    Testing image quality parameters of digital cameras for photogrammetric surveying with unmanned aerial vehicles

    Nowadays, unmanned aircraft are increasingly used for measurement purposes. The size of an aircraft is often proportional to its price and payload. Aircraft with a payload of 2–3 kg, as required to lift a DSLR camera, lens and gimbal (camera stabilizer), are in a higher price range (>50,000 kn). Such aircraft face restrictions within the law, but also practical limitations because of their size. With the development of small autonomous cameras such as action cameras, it has become possible to use cheaper, smaller unmanned aircraft with lower payload for photogrammetric purposes. Of course, to use such a camera for measurement purposes, an adequate calibration method must first be carried out and the elements of the camera's interior orientation defined. It is important to emphasize that geometric calibration, i.e. the elimination of geometric errors in the mapping, is the key precondition for creating an idealized image, i.e. an image of the actual optical mapping. This paper investigates the quality of the content mapped onto images, with the aim of examining the possibility of using action cameras for measurement purposes. The study is based on objective indicators such as global statistical image quality parameters, the Modulation Transfer Function, and visual analysis of test-field images. For the purposes of the paper, a modified test field based on the ISO 12233 standard was developed and used for the first time.
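As a minimal illustration of the kind of interior-orientation model such a calibration estimates, here is a two-term Brown radial distortion sketch; the coefficients and coordinates are hypothetical, and a full calibration would also estimate focal length, principal point and tangential terms:

```python
def apply_radial_distortion(x, y, k1, k2, cx=0.0, cy=0.0):
    """Two-term Brown radial distortion model: maps ideal image
    coordinates to distorted ones about the principal point (cx, cy).
    Coefficients k1, k2 would come from a calibration; the values used
    below are hypothetical."""
    xn, yn = x - cx, y - cy
    r2 = xn * xn + yn * yn
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return cx + xn * factor, cy + yn * factor

# A point 1 unit from the principal point with k1 = 0.1 moves outward:
xd, yd = apply_radial_distortion(1.0, 0.0, k1=0.1, k2=0.0)  # -> (1.1, 0.0)
```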

    Edge Detection Techniques for Quantifying Spatial Imaging System Performance and Image Quality

    Measuring camera system performance and associating it directly to image quality is very relevant, whether images are aimed for viewing, or as input to machine learning and automated recognition algorithms. The Modulation Transfer Function (MTF) is a well-established measure for evaluating this performance. This study proposes a novel methodology for measuring system MTFs directly from natural scenes, by adapting the standardized Slanted Edge Method (ISO 12233). The method involves edge detection techniques to select and extract suitable step edges from pictorial images. The scene MTF aims to account for camera non-linear, scene-dependent processes. This measure is more relevant to image quality modelling than the traditionally measured MTFs. Preliminary research results indicate that the proposed method can provide reliable MTFs, following the trends of the ISO 12233. Further development and validation are required before it is proposed as a universal camera measuring technique.
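The edge-selection idea can be sketched as a suitability test on a candidate region of interest: every row should contain one sufficiently strong step, and the per-row edge positions should fall on a straight line, as the slanted-edge algorithm requires. The thresholds and the synthetic ROI below are illustrative, not taken from the paper:

```python
import numpy as np

def is_suitable_step_edge(roi, min_contrast=0.2, max_residual=0.5):
    """Crude suitability test for a candidate ROI: every row must contain
    a sufficiently strong step, and the per-row edge positions must lie
    on a straight line, as the slanted-edge algorithm requires.
    Thresholds are illustrative, not taken from the paper."""
    rows = roi.shape[0]
    positions = []
    for r in range(rows):
        row = roi[r]
        if row.max() - row.min() < min_contrast:
            return False                        # too little contrast
        positions.append(np.argmax(np.abs(np.diff(row))))
    positions = np.array(positions)
    idx = np.arange(rows)
    slope, intercept = np.polyfit(idx, positions, 1)   # fit edge line
    residual = np.abs(positions - (slope * idx + intercept)).max()
    return bool(residual <= max_residual)

# Clean synthetic slanted step edge: accepted by the test.
roi = np.array([[0.0 if c < 5 + 0.2 * r else 1.0 for c in range(16)]
                for r in range(10)])
```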

    Camera Spatial Frequency Response Derived from Pictorial Natural Scenes

    Camera system performance is a prominent part of many aspects of imaging science and computer vision. Many aspects of camera performance determine how accurately the image represents the scene, including measurements of colour accuracy, tone reproduction, geometric distortions, and image noise evaluation. The research conducted in this thesis focuses on the Modulation Transfer Function (MTF), a widely used camera performance measurement employed to describe resolution and sharpness. Traditionally measured under controlled conditions with characterised test charts, the MTF is a measurement restricted to laboratory settings. The MTF is based on linear system theory, meaning input and output must follow a straightforward linear relationship. Established methods for measuring the camera system MTF include ISO12233:2017 for measuring the edge-based Spatial Frequency Response (e-SFR), a sister measure of the MTF designed for measuring discrete systems. Many modern camera systems incorporate non-linear, highly adaptive image signal processing (ISP) to improve image quality. As a result, system performance becomes scene- and processing-dependent, adapting to the scene contents captured by the camera. Established test-chart-based MTF/SFR methods do not describe this adaptive nature; they only provide the response of the camera to a test chart signal. Further, with the increased use of Deep Neural Networks (DNN) for image recognition tasks and autonomous vision systems, there is an increased need for monitoring system performance outside laboratory conditions in real time, i.e. a live MTF. Such measurements would assist in monitoring camera systems to ensure they are fully operational for decision-critical tasks. This thesis presents research conducted to develop a novel automated methodology that estimates the standard e-SFR directly from pictorial natural scenes.
This methodology has the potential to produce scene-dependent and real-time camera system performance measurements, opening new possibilities in imaging science and allowing live monitoring/calibration of systems for autonomous computer vision applications. The proposed methodology incorporates many well-established image processes, as well as others developed for specific purposes. It is presented in two parts. First, the Natural Scene derived SFRs (NS-SFRs) are obtained from isolated captured scene step-edges, after verifying that these edges have the correct profile for input to the slanted-edge algorithm. The resulting NS-SFRs are shown to be a function of both camera system performance and scene contents. The second part of the methodology uses a series of derived NS-SFRs to estimate the system e-SFR, as per the ISO12233 standard. This is achieved by applying a sequence of thresholds to segment the data most likely to correspond to the system performance. These thresholds a) group the expected optical performance variation across the imaging circle within radial distance segments, b) obtain the highest-performance NS-SFRs per segment and c) select the NS-SFRs with input edge and region of interest (ROI) parameter ranges shown to introduce minimal e-SFR variation. The selected NS-SFRs are averaged per radial segment to estimate system e-SFRs across the field of view. A weighted average of these estimates provides an overall system performance estimation. This methodology is implemented for e-SFR estimation of three characterised camera systems, two near-linear and one highly non-linear. Investigations are conducted using large, diverse image datasets as well as restricting scene content and the number of images used for the estimation. The resulting estimates are comparable to ISO12233 e-SFRs derived from test chart inputs for the near-linear systems. The overall estimate stays within one standard deviation of the equivalent test chart measurement.
Results from the highly non-linear system indicate scene and processing dependency, potentially leading to a more representative SFR measure than the current chart-based approaches for such systems. These results suggest that the proposed method is a viable alternative to the ISO technique.
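The reduction from many NS-SFRs to a single system estimate can be sketched as a bin-and-weight operation: group curves by radial distance, average within each segment, then form a population-weighted overall estimate. The performance and edge/ROI-parameter thresholds of the full method are omitted for brevity, and all names and shapes below are illustrative:

```python
import numpy as np

def estimate_system_sfr(ns_sfrs, radii, n_segments=3):
    """Bin NS-SFR curves by radial distance from the image centre,
    average within each bin, then form an overall estimate weighted by
    bin population. A simplified sketch of the reduction step only."""
    ns_sfrs = np.asarray(ns_sfrs, dtype=float)   # (n_edges, n_freqs)
    radii = np.asarray(radii, dtype=float)
    bounds = np.linspace(radii.min(), radii.max() + 1e-9, n_segments + 1)
    seg_means, weights = [], []
    for s in range(n_segments):
        mask = (radii >= bounds[s]) & (radii < bounds[s + 1])
        if mask.any():
            seg_means.append(ns_sfrs[mask].mean(axis=0))   # per-segment e-SFR
            weights.append(mask.sum())
    seg_means = np.array(seg_means)
    weights = np.array(weights, dtype=float)
    # Population-weighted overall system estimate
    overall = (seg_means * weights[:, None]).sum(axis=0) / weights.sum()
    return seg_means, overall
```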

    A testing procedure to characterize color and spatial quality of digital cameras used to image cultural heritage

    A testing procedure for characterizing both the color and spatial image quality of trichromatic digital cameras, which are used to photograph paintings in cultural heritage institutions, is described. This testing procedure is target-based, thus providing objective measures of quality. The majority of the testing procedure followed current standards from national and international organizations such as ANSI, ISO, and IEC. The procedure was developed in an academic research laboratory and used to benchmark the digital-camera systems and workflows of four representative American museums. The quality parameters tested included system spatial uniformity, tone reproduction, color reproduction accuracy, noise, dynamic range, spatial cross-talk, spatial frequency response, color-channel registration, and depth of field. In addition, two paintings were imaged and processed through each museum's normal digital workflow. The results of the four case studies showed many dissimilarities among the digital-camera systems and workflows of American museums, which cause a significant range in the archival quality of their digital masters.
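The colour-reproduction-accuracy part of such a target-based procedure typically reduces to colour differences between reference and captured patch values. A minimal sketch using the CIE76 ΔE*ab distance in CIELAB; the patch values are hypothetical, and modern workflows may prefer CIEDE2000:

```python
import numpy as np

def delta_e_ab(lab_ref, lab_meas):
    """CIE76 colour difference: Euclidean distance in CIELAB. A common
    summary statistic for target-based colour-accuracy testing."""
    lab_ref = np.asarray(lab_ref, dtype=float)
    lab_meas = np.asarray(lab_meas, dtype=float)
    return np.sqrt(((lab_ref - lab_meas) ** 2).sum(axis=-1))

# Hypothetical reference vs. captured patch values (L*, a*, b*)
ref = [[50.0, 0.0, 0.0], [60.0, 20.0, -10.0]]
meas = [[51.0, 1.0, 0.0], [58.0, 22.0, -12.0]]
errors = delta_e_ab(ref, meas)   # per-patch delta E
```

Mean and maximum ΔE over all chart patches are then reported as the accuracy summary.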

    Scene-Dependency of Spatial Image Quality Metrics

    This thesis is concerned with the measurement of spatial imaging performance and the modelling of spatial image quality in digital capturing systems. Spatial imaging performance and image quality relate to the objective and subjective reproduction of luminance contrast signals by the system, respectively; they are critical to overall perceived image quality. The Modulation Transfer Function (MTF) and Noise Power Spectrum (NPS) describe the signal (contrast) transfer and noise characteristics of a system, respectively, with respect to spatial frequency. They are both, strictly speaking, only applicable to linear systems since they are founded upon linear system theory. Many contemporary capture systems use adaptive image signal processing, such as denoising and sharpening, to optimise output image quality. These non-linear processes change their behaviour according to characteristics of the input signal (i.e. the scene being captured). This behaviour renders system performance β€œscene-dependent” and difficult to measure accurately. The MTF and NPS are traditionally measured from test charts containing suitable predefined signals (e.g. edges, sinusoidal exposures, noise or uniform luminance patches). These signals trigger adaptive processes at uncharacteristic levels since they are unrepresentative of natural scene content. Thus, for systems using adaptive processes, the resultant MTFs and NPSs are not representative of performance β€œin the field” (i.e. capturing real scenes). Spatial image quality metrics for capturing systems aim to predict the relationship between MTF and NPS measurements and subjective ratings of image quality. They cascade both measures with contrast sensitivity functions that describe human visual sensitivity with respect to spatial frequency. The most recent metrics designed for adaptive systems use MTFs measured using the dead leaves test chart that is more representative of natural scene content than the abovementioned test charts. 
This marks a step toward modelling image quality with respect to real scene signals. This thesis presents novel scene-and-process-dependent MTFs (SPD-MTFs) and NPSs (SPD-NPSs). They are measured from imaged pictorial scene (or dead leaves target) signals to account for system scene-dependency. Further, a number of spatial image quality metrics are revised to account for capture system and visual scene-dependency. Their MTF and NPS parameters were replaced with SPD-MTFs and SPD-NPSs. Likewise, their standard visual functions were replaced with contextual detection (cCSF) or contextual discrimination (cVPF) functions. In addition, two novel spatial image quality metrics are presented (the log Noise Equivalent Quanta (NEQ) and Visual log NEQ) that implement SPD-MTFs and SPD-NPSs. The metrics, SPD-MTFs, and SPD-NPSs were validated by analysing measurements from simulated image capture pipelines that applied either linear or adaptive image signal processing. The SPD-NPS measures displayed little evidence of measurement error, and the metrics performed most accurately when they used SPD-NPSs measured from images of scenes. The benefit of deriving SPD-MTFs from images of scenes was traded off, however, against measurement bias. Most metrics performed most accurately with SPD-MTFs derived from dead leaves signals. Implementing the cCSF or cVPF did not increase metric accuracy. The log NEQ and Visual log NEQ metrics proposed in this thesis were highly competitive, outperforming metrics of the same genre. They were also more consistent than the IEEE P1858 Camera Phone Image Quality (CPIQ) metric when their input parameters were modified. The advantages and limitations of all performance measures and metrics were discussed, as well as their practical implementation and relevant applications.
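Noise Equivalent Quanta combines the two measurements above: NEQ(f) = MTF(f)² / NPS(f). A simplified stand-in for the thesis's log NEQ metric, integrating log NEQ over spatial frequency; the frequency grid and curves are illustrative:

```python
import numpy as np

def log_neq_metric(freqs, mtf, nps, eps=1e-12):
    """Integrate log10 NEQ over frequency, where NEQ(f) = MTF(f)^2 / NPS(f).
    A simplified stand-in for the thesis's log NEQ formulation."""
    neq = (mtf ** 2) / np.maximum(nps, eps)
    y = np.log10(np.maximum(neq, eps))
    # Trapezoid rule over the frequency grid
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(freqs)))

freqs = np.linspace(0.0, 0.5, 6)
score = log_neq_metric(freqs, np.ones(6), np.full(6, 0.1))  # NEQ = 10 everywhere
```

The visual variant would additionally weight the integrand by a contrast sensitivity function before integration.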

    An evaluation of the current state of digital photography

    The field of digital photography is always changing. Due to the rapid pace of new technologies being developed in this area, it is becoming more commonly used by people in all walks of life, particularly consumers. It is therefore important to critically evaluate the current state of this technology to gain a better understanding of how advanced it has become. This research evaluated digital photographic systems from digital camera input to printer output. The metrics used to judge camera and printer performance were the modulation transfer functions (MTFs) of the devices and subjective evaluations of their prints. Previous research of this kind has been done on specific devices, but this project is unique in that it looks at digital photography as a system and incorporates not only the traditional MTF but also the ratings of observers. From the results of this research, several conclusions have been made. The first is that a generic model of digital photography has been obtained: the cameras and printers span a wide range of quality and expense, so it is possible to substitute devices of similar quality and obtain information from the results. The second conclusion is that for low-end cameras, an upgraded printer makes little difference in the output of the system. Finally, the effects of interline vs. frame-transfer CCDs could not be determined by this research due to unexpected differences in the two cameras used for this question.

    Waveguide-based near-eye displays with enhanced functionality using polarization multiplexing

    Doctoral dissertation -- Seoul National University, Graduate School, Department of Electrical and Computer Engineering, February 2021. Supervisor: Byoungho Lee. This dissertation presents studies on an optical design method that enhances the display performance of see-through waveguide-based near-eye displays (WNEDs) using the polarization multiplexing technique. The studies focus on strategies to improve crucial display performance without compromising the small form factor, the most attractive merit of WNEDs. To achieve this goal, thin and lightweight polarization-dependent optical elements are devised and employed in the WNED structure. The polarization-dependent devices allow multiple optical functions or optical paths depending on the polarization state of the input beam, which can break through the limitations of the waveguide system via polarization multiplexing. To realize a function-selective eyepiece for AR applications, the proposed devices should operate as an optically transparent window for the real scene while performing specific optical functions for the virtual image. The proposed devices are manufactured as a combination structure in which polarization-dependent optical elements are stacked. The total thickness of the stacked structure is about 1 mm, and it can be attached to the waveguide surface without conspicuously increasing the form factor of the optical system. Using the proposed polarization-dependent devices, the author proposes three types of novel WNED systems with enhanced performance. First, the author suggests a compact WNED with dual focal planes. Conventional WNEDs have an inherent limitation that the focal plane of the virtual image is at an infinite distance, because they extract a stream of collimated light at the out-coupler. By using the polarization-dependent eyepiece lens, an additional focal plane can be generated with polarization multiplexing, in addition to the infinity depth. The proposed configuration can provide comfortable AR environments by alleviating the visual fatigue caused by the vergence-accommodation conflict. Second, a novel WNED configuration with extended field of view (FOV) is presented. In WNEDs, the maximum allowable FOV is determined by the material properties of the diffraction optics and the substrate. By using the polarization-dependent steering combiner, the FOV can be extended up to two times, providing more immersive AR experiences. In addition, this dissertation demonstrates that the distortion of the real scene caused by the stacked structure does not severely disturb image quality, considering the acuity of human vision. Lastly, the author presents a retinal-projection-based WNED with switchable viewpoints, adopting the polarization-dependent lens and grating simultaneously. The proposed system can convert the viewpoint according to the position of the eye pupil without mechanical movement. The polarization-dependent viewpoint switching can resolve the inherent problem of the narrow eyebox in retinal projection displays without employing bulky optics for mechanical movement. In conclusion, the dissertation presents a practical optical design and detailed analysis of enhanced WNEDs based on the polarization multiplexing technique, through various simulations and experiments. The proposed approaches are expected to be utilized as an innovative solution for compact wearable displays.
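The polarization-multiplexing principle behind such stacked elements can be sketched with Jones calculus: an ideal half-wave geometric-phase film flips circular handedness and imprints a phase of opposite sign on the two handednesses, which is why one thin film can behave as two different lenses or gratings depending on the input polarization. The sign conventions and sample phase value below are illustrative, not taken from the dissertation:

```python
import numpy as np

# Circular-polarization Jones vectors (one common convention)
rcp = np.array([1, -1j]) / np.sqrt(2)
lcp = np.array([1, 1j]) / np.sqrt(2)

def gp_element(theta):
    """Jones matrix of an ideal half-wave geometric-phase element whose
    local fast-axis angle is `theta`. It flips circular handedness and
    imprints a phase of +/-2*theta, opposite in sign for the two
    handednesses."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[c, s], [s, -c]])

J = gp_element(np.pi / 8)
out_r = J @ rcp   # handedness flipped, geometric phase exp(-i*pi/4)
out_l = J @ lcp   # handedness flipped, geometric phase exp(+i*pi/4)
```

Letting `theta` vary spatially (linearly for a grating, quadratically for a lens) turns this single matrix into the polarization-selective combiners the dissertation builds on.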