
    Pre-processing techniques to improve HEVC subjective quality

    HEVC is currently the state-of-the-art coding standard and the most efficient solution for the transmission of video content. In this paper, a subjective quality improvement based on pre-processing algorithms for detecting homogeneous and chaotic regions is proposed and evaluated for low bit-rate applications at high resolutions. This goal is achieved by means of a texture classification applied to the input frames. Furthermore, these calculations also help reduce the complexity of the HEVC encoder, so both the subjective quality and the HEVC performance are improved.
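
    As a loose illustration of the kind of texture classification this abstract describes (a minimal sketch, not the authors' actual algorithm), the Python fragment below labels fixed-size luma blocks as homogeneous or chaotic by thresholding their local variance; the block size and threshold are placeholder values.

        import numpy as np

        def classify_blocks(frame, block=16, var_threshold=100.0):
            """Label each block of a luma frame: 0 = homogeneous (candidate for
            stronger pre-filtering / coarser coding), 1 = chaotic (texture to keep)."""
            h, w = frame.shape
            labels = np.zeros((h // block, w // block), dtype=np.uint8)
            for by in range(h // block):
                for bx in range(w // block):
                    blk = frame[by*block:(by+1)*block, bx*block:(bx+1)*block]
                    labels[by, bx] = 1 if blk.var() > var_threshold else 0
            return labels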

    Information fusion based techniques for HEVC

    Addressing the conflicting objectives in a multi-parameter H.265/HEVC encoder system, this paper presents an analysis of a set of optimizations intended to improve the trade-off between quality, performance, and power consumption for a range of applications requiring reliability and accuracy. The method is based on Pareto optimization and has been tested at different resolutions on real-time encoders.
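
    To make the Pareto idea concrete (an illustrative sketch only; the metric names and values are hypothetical and not taken from the paper), the fragment below keeps only the non-dominated encoder configurations when bit rate, distortion, and power are all to be minimized.

        def dominates(a, b, keys=('bitrate', 'distortion', 'power')):
            """a dominates b if it is no worse on every metric and better on at least one."""
            return all(a[k] <= b[k] for k in keys) and any(a[k] < b[k] for k in keys)

        def pareto_front(configs):
            return [c for c in configs
                    if not any(dominates(o, c) for o in configs if o is not c)]

        candidates = [
            {'name': 'fast',   'bitrate': 2.0, 'distortion': 38.0, 'power': 1.0},
            {'name': 'medium', 'bitrate': 1.6, 'distortion': 36.5, 'power': 1.8},
            {'name': 'slow',   'bitrate': 1.7, 'distortion': 36.5, 'power': 3.0},
        ]
        print([c['name'] for c in pareto_front(candidates)])  # -> ['fast', 'medium']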

    Characterizing Excitatory and Inhibitory Neuron Responses to Dark and Bright Stimuli in the Visual Cortex of Awake Mice

    An increased understanding of mouse primary visual cortex (V1) will allow us to better characterize the internal electrical and neural-circuit mechanisms of visual processing. So far, most studies directly recording single neurons have mainly been performed in anesthetized animals. This study is novel in that it aims to extend these findings to the awake visual cortex and to multiple retinotopic locations in V1 of mice. To characterize how excitatory and inhibitory neurons in V1 respond to bright and dark visual stimuli, the mice were presented with bars of different colors, ranging in luminance as defined by Michelson contrast. Their responses to these stimuli were recorded using multi-shank probes placed in V1. Once neurons were identified as inhibitory or excitatory, their response properties were then tied back to the different input stimuli. The results do not indicate any statistically significant differences between the responses of the two classes.
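
    For reference, Michelson contrast is defined as C = (Lmax - Lmin) / (Lmax + Lmin), where Lmax and Lmin are the maximum and minimum luminances of the stimulus. A minimal sketch, with made-up luminance values for the example:

        def michelson_contrast(l_max, l_min):
            """Standard Michelson contrast; ranges from 0 (uniform) to 1 (full contrast)."""
            return (l_max - l_min) / (l_max + l_min)

        # Hypothetical bright bar (80 cd/m^2) on a mid-grey background (40 cd/m^2).
        print(round(michelson_contrast(80.0, 40.0), 3))  # -> 0.333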

    Digital signal processing with field programmable gate arrays

    Small Lesions Evaluation Based on Unsupervised Cluster Analysis of Signal-Intensity Time Courses in Dynamic Breast MRI

    An application of an unsupervised neural network-based computer-aided diagnosis (CAD) system is reported for the detection and characterization of small indeterminate breast lesions, average size 1.1 mm, in dynamic contrast-enhanced MRI. This system enables the extraction of spatial and temporal features of dynamic MRI data and additionally provides a segmentation with regard to identification and regional subclassification of pathological breast tissue lesions. Lesions with an initial contrast enhancement ≥50% were selected with semiautomatic segmentation. This conventional segmentation analysis is based on the mean initial signal increase and postinitial course of all voxels included in the lesion. In this paper, we compare the conventional segmentation analysis with unsupervised classification for the evaluation of signal intensity time courses for the differential diagnosis of enhancing lesions in breast MRI. The results suggest that the computerized analysis system based on unsupervised clustering has the potential to increase the diagnostic accuracy of MRI mammography for small lesions and can be used as a basis for computer-aided diagnosis of breast cancer with MR mammography.
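
    As an illustration of unsupervised evaluation of signal-intensity time courses (a hedged sketch, not the reported CAD system; the cluster count and library choice are assumptions), voxel enhancement curves from a lesion ROI can be grouped with k-means so that voxels with similar kinetics form regional subclasses:

        import numpy as np
        from sklearn.cluster import KMeans

        def cluster_time_courses(time_courses, n_clusters=4, random_state=0):
            """time_courses: array of shape (n_voxels, n_timepoints), one
            signal-intensity curve per voxel. Returns per-voxel cluster labels
            and the prototype (mean) curve of each cluster."""
            km = KMeans(n_clusters=n_clusters, random_state=random_state, n_init=10)
            labels = km.fit_predict(np.asarray(time_courses, dtype=float))
            return labels, km.cluster_centers_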

    A space-efficient quantum computer simulator suitable for high-speed FPGA implementation

    Conventional vector-based simulators for quantum computers are quite limited in the size of the quantum circuits they can handle, due to the worst-case exponential growth of even sparse representations of the full quantum state vector as a function of the number of quantum operations applied. However, this exponential-space requirement can be avoided by using general space-time tradeoffs long known to complexity theorists, which can be appropriately optimized for this particular problem in a way that also illustrates some interesting reformulations of quantum mechanics. In this paper, we describe the design and empirical space-time complexity measurements of a working software prototype of a quantum computer simulator that avoids excessive space requirements. Due to its space-efficiency, this design is well-suited to embedding in single-chip environments, permitting especially fast execution that avoids access latencies to main memory. We plan to prototype our design on a standard FPGA development board. Comment: 12 pages, 6 figures; presented at Quantum Information and Computation VII, Orlando, April 2009. Author reprint of the final submitted manuscript.
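
    The space-time tradeoff referred to above can be pictured in a generic form (not the authors' specific design) as a Feynman-style path sum: the amplitude of a single output basis state is obtained by recursing backwards through the circuit and summing over intermediate basis states, so memory grows with circuit depth rather than with 2^n. A small Python sketch for single-qubit gates:

        import numpy as np

        def amplitude(circuit, out_state, in_state):
            """circuit: list of (2x2 gate matrix, target qubit) pairs applied in order.
            States are integers encoding computational-basis states."""
            if not circuit:
                return 1.0 + 0j if out_state == in_state else 0j
            gate, target = circuit[-1]
            out_bit = (out_state >> target) & 1
            total = 0j
            for mid_bit in (0, 1):  # sum over the target qubit's intermediate value
                mid_state = (out_state & ~(1 << target)) | (mid_bit << target)
                total += gate[out_bit, mid_bit] * amplitude(circuit[:-1], mid_state, in_state)
            return total

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate
        print(abs(amplitude([(H, 0)], 0b1, 0b0)) ** 2)      # -> 0.5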

    Multi-stage optimization of a deep model: A case study on ground motion modeling.

    In this study, a multi-stage optimization procedure is proposed for developing deep neural network models, resulting in a powerful deep learning pipeline called intelligent deep learning (iDeepLe). The proposed pipeline is evaluated on a challenging real-world problem: modeling the spectral acceleration experienced by a particle during earthquakes. The approach has three main stages that optimize the deep model topology, the hyper-parameters, and its performance, respectively. The pipeline optimizes the deep model via adaptive learning rate optimization algorithms for both accuracy and complexity in multiple stages, while simultaneously solving for the unknown parameters of the regression model. Among the seven adaptive learning rate optimization algorithms considered, the Nadam optimization algorithm showed the best performance in the current study. The proposed approach is shown to be a suitable tool for generating solid models of this complex real-world system. The results also show that the parallel pipeline of iDeepLe has the capacity to handle big-data problems as well.
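
    As a toy illustration of comparing adaptive learning-rate optimizers (including Nadam) on a regression network, and not the iDeepLe pipeline itself, the Keras sketch below trains the same placeholder model with three optimizers and reports the final training loss; the architecture, data, and epoch count are invented for the example.

        import numpy as np
        import tensorflow as tf

        X = np.random.rand(1000, 8).astype("float32")    # placeholder features
        y = np.random.rand(1000, 1).astype("float32")    # placeholder regression target

        def build_model():
            return tf.keras.Sequential([
                tf.keras.layers.Dense(64, activation="relu"),
                tf.keras.layers.Dense(64, activation="relu"),
                tf.keras.layers.Dense(1),
            ])

        final_loss = {}
        for name in ("adam", "nadam", "rmsprop"):
            model = build_model()
            model.compile(optimizer=name, loss="mse")
            history = model.fit(X, y, epochs=5, batch_size=32, verbose=0)
            final_loss[name] = history.history["loss"][-1]
        print(final_loss)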

    An optimization approach to segment breast lesions in ultra-sound images using clinically validated visual cues

    As long as breast cancer remains the leading cause of cancer deaths among the female population worldwide, developing tools to assist radiologists during the diagnosis process is necessary. However, most of the technologies developed in imaging laboratories are rarely integrated into this assessment process, as they are based on information cues that differ from those used by clinicians. In order to provide Computer Aided Diagnosis (CAD) systems with the information cues used in non-aided diagnosis, better segmentation strategies are needed to automatically produce accurate delineations of the breast structures. This paper proposes a highly modular and flexible framework for segmenting breast tissues and lesions present in Breast Ultra-Sound (BUS) images. This framework relies on an optimization strategy and high-level descriptors designed analogously to the visual cues used by radiologists. The methodology is comprehensively compared to sixteen other published methodologies developed for segmenting lesions in BUS images, and achieves results similar to those reported in the state-of-the-art.
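
    One simple way to picture optimization over cue-based descriptors (a hedged sketch; the cue names and weights are invented and not taken from the paper) is to score each candidate delineation with a weighted combination of cue terms and keep the lowest-cost one:

        def candidate_cost(candidate, weights):
            """candidate: dict mapping cue names (e.g. 'darkness', 'edge_strength',
            'shape_irregularity') to values where lower means a better lesion match."""
            return sum(weights[cue] * candidate[cue] for cue in weights)

        def best_candidate(candidates, weights):
            """Pick the delineation that minimises the weighted cue cost."""
            return min(candidates, key=lambda c: candidate_cost(c, weights))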