
    AN ADAPTIVE SAR IMAGE DESPECKLING ALGORITHM USING STATIONARY WAVELET TRANSFORM

    In this paper, we present a Stationary Wavelet Transform (SWT) based method for despeckling Synthetic Aperture Radar (SAR) images by applying a maximum a posteriori (MAP) criterion to estimate the noise-free wavelet coefficients. The MAP estimator is derived under the assumption that the wavelet coefficients follow known distributions: a Rayleigh distribution, which describes speckle well, models the noise, and a Laplacian distribution models the statistics of the noise-free wavelet coefficients. The parameters required by the MAP estimator are obtained by a parameter estimation step applied after the SWT. Experimental results show that the proposed despeckling algorithm efficiently removes speckle noise from SAR images.
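
    For illustration, the estimator described above can be approximated with a simple wavelet-shrinkage pipeline. The sketch below is a minimal, assumed version rather than the paper's exact method: it log-transforms the image so the multiplicative speckle becomes approximately additive, decomposes it with PyWavelets' stationary wavelet transform, and shrinks the detail coefficients with the standard Laplacian-prior MAP (soft-threshold) rule instead of the exact Rayleigh/Laplacian estimator derived in the paper; the function name and parameter choices are illustrative.

```python
import numpy as np
import pywt  # PyWavelets: provides the stationary (undecimated) wavelet transform


def despeckle_swt_map(image, wavelet="db4", level=2):
    """Simplified SWT-domain despeckling sketch (image sides must be divisible by 2**level)."""
    log_img = np.log1p(image.astype(float))  # multiplicative speckle -> approximately additive
    coeffs = pywt.swt2(log_img, wavelet, level=level)

    shrunk = []
    for approx, (cH, cV, cD) in coeffs:
        details = []
        for d in (cH, cV, cD):
            sigma_n = np.median(np.abs(d)) / 0.6745               # robust noise estimate
            sigma_x = np.sqrt(max(d.var() - sigma_n**2, 1e-12))   # signal std estimate
            thresh = np.sqrt(2.0) * sigma_n**2 / sigma_x          # Laplacian-prior MAP threshold
            details.append(np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0))
        shrunk.append((approx, tuple(details)))

    return np.expm1(pywt.iswt2(shrunk, wavelet))
```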

    HAND SEGMENTATION AND TRACKING OF CONTINUOUS HAND POSTURE USING MORPHOLOGICAL PROCESSING

    This work reports the design of a continuous hand posture recognition system. Hand tracking and segmentation are the primary steps for any hand gesture recognition system. The aim of this paper is to report a noise-resistant and efficient hand segmentation algorithm in which a new method combining different hand detection schemes with the required morphological processing is utilized. Problems such as skin colour detection, complex background removal and variable lighting conditions are efficiently handled by this system, and noise present in the segmented image due to a dynamic background can be removed with this technique. The proposed approach is found to be effective over a range of conditions.
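
    The abstract does not spell out the detection schemes, so the following is only a sketch of a typical pipeline of this kind: skin-colour thresholding in the YCrCb space followed by morphological opening and closing, keeping the largest connected component as the hand. The colour bounds, kernel size and largest-contour heuristic are assumed values, not the paper's.

```python
import cv2
import numpy as np


def segment_hand(frame_bgr):
    """Illustrative hand segmentation: skin-colour threshold plus morphological clean-up."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))  # commonly cited Cr/Cb skin range

    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove small noise blobs
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes

    # Keep only the largest connected component, assumed to be the hand.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask
    hand = max(contours, key=cv2.contourArea)
    clean = np.zeros_like(mask)
    cv2.drawContours(clean, [hand], -1, 255, thickness=cv2.FILLED)
    return clean
```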

    A COLORED FINGER TIP-BASED TRACKING METHOD FOR CONTINUOUS HAND GESTURE RECOGNITION

    A hand gesture recognition system can be used for human-computer interaction (HCI). Proper segmentation of the hand from the background and other body parts in the video is the primary requirement for the design of a hand-gesture based application. The video frames can be captured with a low-cost webcam for use in a vision-based gesture recognition technique. This paper discusses continuous hand gesture recognition. The aim is to report a robust and efficient hand segmentation algorithm in which a new method, wearing a glove on the hand, is utilized. Building on this, a new idea called “Finger-Pen” is developed by segmenting only one finger of the hand for reliable tracking. In this technique only the fingertip is segmented instead of the full hand, so the rest of the hand can move freely during tracking. Problems such as skin colour detection, the presence of many people in front of the camera, complex background removal and variable lighting conditions are efficiently handled by the system. Noise present in the segmented image due to a dynamic background can be removed with this adaptive technique, which is found to be effective for the application conceived.
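
    Because only the coloured fingertip has to be segmented, the tracking step can reduce to colour thresholding plus a per-frame centroid computation. The sketch below is a hypothetical “Finger-Pen” style tracker under that assumption; the HSV bounds, video file name and morphological clean-up are invented for illustration and are not taken from the paper.

```python
import cv2
import numpy as np

# Assumed HSV range for a red fingertip marker; the actual glove colour is not specified here.
LOWER = np.array([0, 120, 80])
UPPER = np.array([10, 255, 255])


def track_fingertip(video_path="gesture.avi"):
    """Return the per-frame centroid trajectory of a coloured fingertip marker."""
    cap = cv2.VideoCapture(video_path)
    trajectory = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        m = cv2.moments(mask)
        if m["m00"] > 0:  # marker visible in this frame
            trajectory.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    cap.release()
    return trajectory
```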

    [In Press] Key physiological traits for drought tolerance identified through phenotyping a large set of slicing cucumber (Cucumis sativus L.) genotypes under field and water-stress conditions

    Cucumber is one of the important salad vegetables cultivated worldwide and is highly sensitive to water stress. However, to date, no drought-tolerant cucumber genotypes have been identified from a large set of diverse germplasm via high-throughput phenotyping methods. This study screened a large set of Indian-origin cucumber germplasm for drought stress response under water-deficit and polyethylene glycol (PEG)-induced stress conditions in hydroponic solutions. The water-deficit and PEG-induced methods were optimized before the entire set of germplasm was screened for key physiological traits. Pearson’s correlation revealed a non-significant difference between the two methods. Hierarchical cluster analysis and a drought tolerance matrix score (DTMS) based on the key physiological traits were used to rank the genotypes. Furthermore, the entire set of genotypes was exposed to water stress (< 8% soil moisture) for 15 d under field conditions to record yield-related traits and validate the hydroponic-based ranking. Finally, eight tolerant genotypes were identified with high seedling survivability, minimal reduction in root–shoot dry weight, fresh weight and water content percentage, the highest DTMS and minimal yield reduction under field stress conditions, compared with seven identified sensitive genotypes. The optimized rapid phenotyping method and identified drought-tolerant lines will be instrumental in understanding the physiological and molecular basis of drought tolerance and in facilitating the development of climate-resilient improved cucumber genotypes in the future.
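
    The abstract does not give the drought tolerance matrix score (DTMS) formula, so the sketch below only illustrates the general pattern of such a ranking: standardise each physiological trait, flip the sign of traits for which a larger value indicates sensitivity, and aggregate into a single score per genotype. The trait names and numbers are made up for the example and do not come from the study.

```python
import pandas as pd


def drought_tolerance_score(traits, higher_is_better):
    """Hypothetical composite tolerance score (NOT the paper's DTMS formula).

    traits           : DataFrame, rows = genotypes, columns = traits measured under stress
    higher_is_better : dict mapping trait name -> True if larger means more tolerant
    """
    z = (traits - traits.mean()) / traits.std(ddof=0)  # standardise each trait
    signs = pd.Series({t: 1.0 if good else -1.0 for t, good in higher_is_better.items()})
    return (z * signs).sum(axis=1).sort_values(ascending=False)


# Example usage with invented values.
example = pd.DataFrame(
    {"survivability_pct": [90, 60, 75],
     "root_shoot_dw_reduction_pct": [10, 40, 25],
     "water_content_pct": [80, 55, 70]},
    index=["G1", "G2", "G3"])
ranking = drought_tolerance_score(
    example,
    {"survivability_pct": True, "root_shoot_dw_reduction_pct": False, "water_content_pct": True})
print(ranking)  # most drought-tolerant genotype first
```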

    Insights into the Mechanisms of Action of MDA-7/IL-24: A Ubiquitous Cancer-Suppressing Protein

    Melanoma differentiation associated gene-7/interleukin-24 (MDA-7/IL-24), a secreted protein of the IL-10 family, was first identified more than two decades ago as a novel gene differentially expressed in terminally differentiating human metastatic melanoma cells. MDA-7/IL-24 functions as a potent tumor suppressor exerting a diverse array of functions including the inhibition of tumor growth, invasion, angiogenesis, and metastasis, and induction of potent “bystander” antitumor activity and synergy with conventional cancer therapeutics. MDA-7/IL-24 induces cancer-specific cell death through apoptosis or toxic autophagy, which was initially established in vitro and in preclinical animal models in vivo and later in a Phase I clinical trial in patients with advanced cancers. This review summarizes the history and our current understanding of the molecular/biological mechanisms of MDA-7/IL-24 action rendering it a potent cancer suppressor.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
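
    As a toy illustration of the Numba/CUDA pattern the abstract describes (Python functions compiled into GPU kernels, with the per-pixel work parallelized across threads), the sketch below computes a placeholder "induced current" per pixel by summing contributions from drifting charges. The kernel body, array shapes and Gaussian response are invented for illustration and bear no relation to the actual DUNE simulation code.

```python
import math

import numpy as np
from numba import cuda


@cuda.jit
def induced_current_kernel(pixel_x, pixel_y, charge_x, charge_y, charge_q, current):
    """One GPU thread per pixel: sum a toy response over all drifting charges."""
    i = cuda.grid(1)
    if i < pixel_x.size:
        total = 0.0
        for j in range(charge_q.size):
            dx = pixel_x[i] - charge_x[j]
            dy = pixel_y[i] - charge_y[j]
            total += charge_q[j] * math.exp(-(dx * dx + dy * dy) / 2.0)  # placeholder response
        current[i] = total


n_pixels, n_charges = 1000, 5000
rng = np.random.default_rng(0)
px = cuda.to_device(rng.uniform(0, 30, n_pixels).astype(np.float32))
py = cuda.to_device(rng.uniform(0, 30, n_pixels).astype(np.float32))
cx = cuda.to_device(rng.uniform(0, 30, n_charges).astype(np.float32))
cy = cuda.to_device(rng.uniform(0, 30, n_charges).astype(np.float32))
cq = cuda.to_device(rng.uniform(0, 1, n_charges).astype(np.float32))
out = cuda.device_array(n_pixels, dtype=np.float32)

threads_per_block = 128
blocks = (n_pixels + threads_per_block - 1) // threads_per_block
induced_current_kernel[blocks, threads_per_block](px, py, cx, cy, cq, out)
print(out.copy_to_host()[:5])
```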

    Impact of cross-section uncertainties on supernova neutrino spectral parameter fitting in the Deep Underground Neutrino Experiment

    A primary goal of the upcoming Deep Underground Neutrino Experiment (DUNE) is to measure the O(10) MeV neutrinos produced by a Galactic core-collapse supernova if one should occur during the lifetime of the experiment. The liquid-argon-based detectors planned for DUNE are expected to be uniquely sensitive to the νe component of the supernova flux, enabling a wide variety of physics and astrophysics measurements. A key requirement for a correct interpretation of these measurements is a good understanding of the energy-dependent total cross section σ(Eν) for charged-current νe absorption on argon. In the context of a simulated extraction of supernova νe spectral parameters from a toy analysis, we investigate the impact of σ(Eν) modeling uncertainties on DUNE’s supernova neutrino physics sensitivity for the first time. We find that the currently large theoretical uncertainties on σ(Eν) must be substantially reduced before the νe flux parameters can be extracted reliably; in the absence of external constraints, a measurement of the integrated neutrino luminosity with less than 10% bias with DUNE requires σ(Eν) to be known to about 5%. The neutrino spectral shape parameters can be known to better than 10% for a 20% uncertainty on the cross-section scale, although they will be sensitive to uncertainties on the shape of σ(Eν). A direct measurement of low-energy νe-argon scattering would be invaluable for improving the theoretical precision to the needed level.
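
    A common parameterisation in such toy analyses is the "pinched" thermal spectrum f(E) ∝ (E/⟨E⟩)^α exp[-(α+1)E/⟨E⟩]. The sketch below is a rough illustration of the effect described above, not the paper's analysis: a toy observed spectrum is built as flux times cross section and then fitted with a cross section deliberately scaled by 20%, so the fitted luminosity absorbs the scale error while the shape parameters are recovered. The roughly E^2 cross-section shape and all numbers are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

E = np.linspace(5.0, 60.0, 100)  # neutrino energy grid [MeV]


def pinched_flux(E, lum, e_mean, alpha):
    """Garching-style 'pinched' thermal spectrum, normalised so `lum` sets the total energy."""
    shape = (E / e_mean) ** alpha * np.exp(-(alpha + 1.0) * E / e_mean)
    return lum * shape / np.trapz(shape * E, E)


def sigma_true(E):
    return 1e-4 * E ** 2  # toy nu_e-Ar cross section rising roughly as E^2 (assumption)


# "Observed" event spectrum generated with the true cross section.
true_params = (1.0, 11.0, 2.5)  # luminosity, <E> [MeV], pinching alpha
observed = pinched_flux(E, *true_params) * sigma_true(E)


def model_biased(E, lum, e_mean, alpha):
    """Fit model that assumes a cross-section scale 20% too high."""
    return pinched_flux(E, lum, e_mean, alpha) * 1.2 * sigma_true(E)


fit, _ = curve_fit(model_biased, E, observed, p0=(1.0, 10.0, 2.0), bounds=(0, np.inf))
print("fitted (lum, <E>, alpha):", fit)  # luminosity is biased low by ~1/1.2; shape parameters recovered
```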

    DUNE Offline Computing Conceptual Design Report

    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.
