
    Extended depth-of-field imaging and ranging in a snapshot

    Traditional approaches to imaging require that an increase in depth of field is associated with a reduction in numerical aperture, and hence with a reduction in resolution and optical throughput. In their seminal work, Dowski and Cathey reported how the asymmetric point-spread function generated by a cubic-phase aberration encodes the detected image such that digital recovery can yield images with an extended depth of field without sacrificing resolution [Appl. Opt. 34, 1859 (1995)]. Unfortunately, recovered images are generally visibly degraded by artifacts arising from subtle variations in point-spread functions with defocus. We report a technique that determines the spatially variant translation of image components that accompanies defocus, enabling estimation of spatially variant defocus. This in turn enables recovery of artifact-free, extended depth-of-field images together with a two-dimensional defocus and range map of the imaged scene. We demonstrate the technique for high-quality macroscopic and microscopic imaging of scenes presenting an extended defocus of up to two waves, and for generation of defocus maps with an uncertainty of 0.036 waves.
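    The cubic-phase encoding this abstract builds on can be sketched with a simple Fourier-optics simulation: a pupil carrying the phase φ(x, y) = 2πα(x³ + y³) produces a PSF that varies only weakly with defocus. The mask strength `alpha`, grid size and aperture radius below are illustrative choices, not parameters from the paper.

```python
import numpy as np

def cubic_phase_psf(alpha=20.0, n=256, na_radius=0.5, defocus_waves=0.0):
    """Simulate the PSF of a pupil carrying a cubic phase mask
    phi(x, y) = 2*pi*alpha*(x^3 + y^3), optionally with a quadratic
    defocus term expressed in waves. All parameter values are
    illustrative, not taken from the paper."""
    # Normalised pupil coordinates in [-1, 1]
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    aperture = (X**2 + Y**2) <= na_radius**2
    # Cubic phase mask plus defocus aberration
    phase = 2.0*np.pi*(alpha*(X**3 + Y**3) + defocus_waves*(X**2 + Y**2))
    pupil = aperture*np.exp(1j*phase)
    # The incoherent PSF is |FT(pupil)|^2, normalised to unit energy
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
    return psf/psf.sum()

# The encoded PSF stays nearly invariant between in-focus and
# two-waves-defocused settings, which is what makes a single digital
# deconvolution filter usable across the extended depth of field.
psf_focus = cubic_phase_psf(defocus_waves=0.0)
psf_defocus = cubic_phase_psf(defocus_waves=2.0)
```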

    Multi-Scale 3D Scene Flow from Binocular Stereo Sequences

    Scene flow methods estimate the three-dimensional motion field for points in the world, using multi-camera video data. Such methods combine multi-view reconstruction with motion estimation. This paper describes an alternative formulation for dense scene flow estimation that provides reliable results using only two cameras by fusing stereo and optical flow estimation into a single coherent framework. Internally, the proposed algorithm generates probability distributions for optical flow and disparity. Taking into account the uncertainty in the intermediate stages allows for more reliable estimation of the 3D scene flow than previous methods allow. To handle the aperture problems inherent in the estimation of optical flow and disparity, a multi-scale method along with a novel region-based technique is used within a regularized solution. This combined approach both preserves discontinuities and prevents over-regularization, two problems commonly associated with basic multi-scale approaches. Experiments with synthetic and real test data demonstrate the strength of the proposed approach. (Supported by National Science Foundation grants CNS-0202067 and IIS-0208876, and Office of Naval Research grant N00014-03-1-0108.)
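    The geometry underlying dense scene flow can be illustrated with plain two-view triangulation: back-project a pixel at times t and t+1 using its disparity and optical flow, then difference the two 3D points. This deterministic sketch omits the probabilistic fusion that is the paper's contribution, and the focal length and baseline in the usage line are hypothetical values.

```python
import numpy as np

def scene_flow_point(x, y, d0, u, v, d1, f, B):
    """Recover the 3D motion of one pixel in a rectified stereo pair.
    (x, y): pixel at time t; d0: disparity at t; (u, v): optical flow;
    d1: disparity at t+1 at the flowed pixel; f: focal length (pixels);
    B: stereo baseline (metres). Plain triangulation, not the paper's
    probabilistic formulation."""
    def backproject(px, py, d):
        # Standard pinhole/stereo back-projection: Z = f*B/d
        Z = f*B/d
        return np.array([px*Z/f, py*Z/f, Z])
    P0 = backproject(x, y, d0)
    P1 = backproject(x + u, y + v, d1)
    return P1 - P0  # 3D scene-flow vector in metres

# A point moving 2 px laterally with unchanged disparity yields a
# purely lateral 3D motion (hypothetical f and B).
flow = scene_flow_point(x=10, y=5, d0=20.0, u=2.0, v=0.0,
                        d1=20.0, f=500.0, B=0.1)
```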

    Kinect Range Sensing: Structured-Light versus Time-of-Flight Kinect

    Recently, the new Kinect One has been issued by Microsoft, providing the next generation of real-time range-sensing devices based on the Time-of-Flight (ToF) principle. As the first Kinect version used a structured-light approach, one would expect various differences in the characteristics of the range data delivered by the two devices. This paper presents a detailed and in-depth comparison between both devices. To conduct the comparison, we propose a framework of seven different experimental setups, which forms a generic basis for evaluating range cameras such as the Kinect. The experiments have been designed to capture the individual effects of each Kinect device in as isolated a manner as possible, and in such a way that they can be adapted to any other range-sensing device. The overall goal of this paper is to provide a solid insight into the pros and cons of either device, so that scientists interested in using Kinect range-sensing cameras in their specific application scenario can directly assess the expected benefits and potential problems of either device. (Comment: 58 pages, 23 figures. Accepted for publication in Computer Vision and Image Understanding (CVIU).)
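    One experiment typical of such range-camera evaluations is temporal noise on a static scene: record a stack of depth frames and compute per-pixel statistics. The sketch below is an illustrative evaluation step in that spirit, not a reproduction of any of the paper's seven setups; the synthetic frame values in the usage lines are made up.

```python
import numpy as np

def per_pixel_depth_noise(frames):
    """Characterise a range camera's temporal noise from a stack of depth
    frames of a static scene: per-pixel mean and standard deviation,
    ignoring invalid (zero) returns, which is how Kinect-style sensors
    commonly flag missing measurements."""
    stack = np.array(frames, dtype=float)  # copy so the caller's data is untouched
    stack[stack == 0] = np.nan             # treat 0 mm as "no measurement"
    mean = np.nanmean(stack, axis=0)
    std = np.nanstd(stack, axis=0)
    return mean, std

# Synthetic stand-in for 100 frames of a flat wall at ~2000 mm with
# 5 mm of Gaussian sensor noise (hypothetical numbers).
rng = np.random.default_rng(0)
frames = 2000.0 + rng.normal(0.0, 5.0, size=(100, 4, 4))
mean, std = per_pixel_depth_noise(frames)
```

The same statistics, gathered from both devices on the same scene, give a directly comparable per-pixel noise map.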

    Computational localization microscopy with extended axial range

    A new single-aperture 3D particle-localization and tracking technique is presented that demonstrates an increase in depth range by more than an order of magnitude without compromising optical resolution and throughput. We exploit the extended depth range and depth-dependent translation of an Airy-beam PSF for 3D localization over an extended volume in a single snapshot. The technique is applicable to all bright-field and fluorescence modalities for particle localization and tracking, ranging from super-resolution microscopy through to the tracking of fluorescent beads and endogenous particles within cells. We demonstrate and validate its application to real-time 3D velocity imaging of fluid flow in capillaries using fluorescent tracer beads. An axial localization precision of 50 nm was obtained over a depth range of 120 μm using a 0.4 NA, 20× microscope objective. We believe this to be the highest ratio of axial range-to-precision reported to date.
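    Localization with a depth-dependent PSF translation reduces, after calibration, to inverting a measured shift-versus-depth curve. The linear calibration below is a deliberate simplification (the Airy-beam lateral trajectory is in fact curved with depth), and all numbers in the usage lines are hypothetical.

```python
import numpy as np

def fit_depth_calibration(shifts, depths):
    """Fit a calibration that maps the measured lateral translation of
    the PSF (e.g. in pixels) to axial position z (e.g. in micrometres).
    A linear fit is an illustrative simplification of the depth-dependent
    translation the paper exploits."""
    a, b = np.polyfit(shifts, depths, 1)   # least-squares line z = a*s + b
    return lambda s: a*np.asarray(s) + b

# Hypothetical calibration data: bead imaged at known stage depths,
# PSF centroid shift measured at each depth.
calib = fit_depth_calibration(shifts=[0.0, 1.0, 2.0, 3.0],
                              depths=[0.0, 10.0, 20.0, 30.0])
z = calib(1.5)  # depth estimate for a newly measured shift
```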

    Characterisation of Vertical Upward Gas-Liquid Flow Using a Non-Intrusive Optical Infrared Sensor

    The pursuit of improved accuracy, cost-effectiveness, and safety in the operation of multiphase flow metering motivates this work. Non-intrusive optical infrared sensors (NIOIRS) of 880 nm and 1480 nm wavelengths have been applied for the objective identification of flow regimes, the determination of phase fractions, and ultimately the measurement of phase volumetric flow rates in upward vertical gas-liquid flow (GLF). The sensing method detects flow structures based on the disparity in the optical properties of each fluid. Air and water were used as working fluids to create GLF in vertical test and main rig setups with 0.018 m x 1 m and 0.0273 m x 5 m test sections respectively, under varied fluid flow rate combinations (0–1.0 m/s of water and 0–13 m/s of air). Notable contributions of this work include (i) the derivation of a flow-regime-dependent phase fraction model, which accounts for interfacial scattering and hence improves phase fraction measurement; (ii) a novel application of supervised learning methods to improve objective flow regime identification for a GLF; (iii) the application of a modified calibration model to measure actual liquid velocities and flow rates in the absence of a priori superficial velocity and slip ratio information; and (iv) a scheme to convert the NIOIRS into a GLF meter.
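    Supervised flow-regime identification of the kind described in contribution (ii) can be sketched with a minimal nearest-centroid classifier over signal features; the feature choice and regime labels here are illustrative assumptions, not the features or learning methods used in the thesis.

```python
import numpy as np

def train_regime_classifier(features, labels):
    """Nearest-centroid classifier for flow-regime identification from
    optical-sensor signal features (e.g. mean and variance of the
    attenuated infrared signal — an assumed, illustrative feature set).
    Returns a predict(feature_vector) -> label function."""
    classes = sorted(set(labels))
    centroids = {c: np.mean([f for f, l in zip(features, labels) if l == c], axis=0)
                 for c in classes}
    def predict(f):
        f = np.asarray(f, dtype=float)
        # Assign the regime whose training centroid is closest
        return min(centroids, key=lambda c: np.linalg.norm(f - centroids[c]))
    return predict

# Hypothetical training data: two well-separated regimes in feature space.
feats = [[0.1, 0.2], [0.0, 0.1], [9.8, 10.1], [10.2, 9.9]]
labels = ['bubbly', 'bubbly', 'slug', 'slug']
predict = train_regime_classifier(feats, labels)
```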

    3D Simulation with Virtual Stereo Rig for Optimizing Centrifugal Fertilizer Spreading

    Stereovision can be used to characterize the centrifugal fertilizer spreading process and to control the fertilizer distribution pattern on the ground. Fertilizer grains, however, resemble each other, and the grain images contain little texture information. Therefore, the accuracy of stereo matching algorithms reported in the literature cannot be used as a reference for stereo images of fertilizer grains. In order to evaluate stereo matching algorithms applied to images of grains, a generator of synthetic stereo particle images is presented in this paper. The particle stereo image generator consists of two main parts: the particle 3D position generator and the virtual stereo rig. The particle 3D position generator uses a simple ballistic flight model and the disc characteristics to simulate the ejection and displacement of grains. The virtual stereo rig simulates the stereo acquisition system and generates stereo images, a disparity map and an occlusion map. The results are satisfying and provide an accurate reference for evaluating stereo particle-matching algorithms.