
    Adaptive Quantization Matrices for HD and UHD Display Resolutions in Scalable HEVC

    HEVC includes an option to enable custom quantization matrices (QMs), which are designed according to the Human Visual System (HVS) and a 2D Contrast Sensitivity Function (CSF). Visual Display Units (VDUs) capable of displaying video at High Definition (HD) and Ultra HD (UHD) resolutions are now in widespread global use. Video compression artifacts caused by high levels of quantization, which are typically inconspicuous at low display resolutions, are clearly visible on HD and UHD video data and VDUs. The default QM technique in HEVC takes into account neither the resolution of the video data nor the display resolution of the target VDU when determining the levels of quantization required to reduce unwanted compression artifacts. We therefore propose a novel adaptive quantization matrix technique for the HEVC standard, including Scalable HEVC (SHVC). Our technique, a refinement of the current HVS-CSF QM approach in HEVC, takes the display resolution of the target VDU into consideration in order to minimize video compression artifacts. In SHVC SHM 9.0, and compared with anchors, the proposed technique yields important quality and coding improvements for the Random Access configuration, with a maximum of 56.5% luma BD-Rate reductions in the enhancement layer. Furthermore, compared with the default QMs and the Sony QMs, our method yields encoding time reductions of 0.75% and 1.19%, respectively.
    Comment: Data Compression Conference 201
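As a rough illustration of the HVS-CSF idea behind such quantization matrices (a generic sketch, not the paper's actual SHVC method: the Mannos-Sakrison CSF model, the `ppd` viewing parameter, and all constants are assumptions), the spatial frequency of each DCT coefficient can be converted to cycles per degree for a given display resolution and weighted by a contrast sensitivity function, so that less visible frequencies receive coarser quantization:

```python
import numpy as np

def csf_weight(f):
    # Mannos-Sakrison contrast sensitivity approximation, f in cycles/degree
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

def adaptive_qm(block=8, base_q=16.0, ppd=30.0):
    """Illustrative HVS-CSF quantization matrix: frequencies that the CSF
    deems less visible receive larger quantization steps.  `ppd` (pixels per
    degree of visual angle) stands in for the display-resolution dependence.
    The DC term is handled crudely here; real QMs treat it specially."""
    u = np.arange(block)
    fx, fy = np.meshgrid(u, u)
    # spatial frequency of each DCT coefficient in cycles/degree
    f = np.sqrt(fx ** 2 + fy ** 2) * ppd / (2.0 * block)
    w = csf_weight(np.maximum(f, 1e-6))
    w /= w.max()
    return np.clip(np.round(base_q / w), 1, 255).astype(int)
```

A higher `ppd` (a denser display viewed at the same distance) maps the same DCT coefficients to higher retinal frequencies, which is precisely the resolution dependence the default HEVC QMs ignore.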

    JND-Based Perceptual Video Coding for 4:4:4 Screen Content Data in HEVC

    The JCT-VC standardized Screen Content Coding (SCC) extension in the HEVC HM RExt + SCM reference codec offers impressive coding efficiency compared with HM RExt alone; however, it is not significantly perceptually optimized. For instance, it does not include advanced HVS-based perceptual coding methods, such as JND-based spatiotemporal masking schemes. In this paper, we propose SC-PAQ, a novel JND-based perceptual video coding technique for HM RExt + SCM, designed to further improve compression performance on YCbCr 4:4:4 SC video data. In the proposed technique, luminance masking and chrominance masking are exploited to perceptually adjust the Quantization Step Size (QStep) at the Coding Block (CB) level. Compared with HM RExt 16.10 + SCM 8.0, the proposed method considerably reduces bitrates (kbps), with a maximum reduction of 48.3%. In addition, subjective evaluations reveal that SC-PAQ achieves visually lossless coding at very low bitrates.
    Comment: Preprint, 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2018)
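As a hedged sketch of how luminance masking can drive a block-level QStep adjustment (the piecewise JND curve below is the classic Chou-Li luminance-masking model, not necessarily the paper's SC-PAQ formulation, and the constants are illustrative):

```python
import numpy as np

def luminance_jnd(mean_luma):
    """Piecewise luminance-masking JND model (Chou-Li form): very dark and
    very bright regions tolerate larger distortion than mid-grey regions."""
    if mean_luma <= 127:
        return 17.0 * (1.0 - np.sqrt(mean_luma / 127.0)) + 3.0
    return 3.0 / 128.0 * (mean_luma - 127.0) + 3.0

def perceptual_qstep(base_qstep, block):
    """Scale the base quantization step of a coding block by its luminance
    JND relative to a mid-grey reference, so blocks whose distortion is less
    visible are quantized more coarsely."""
    jnd = luminance_jnd(float(np.mean(block)))
    ref = luminance_jnd(127.0)
    return base_qstep * (jnd / ref)
```

A very dark block has a larger JND than a mid-grey block, so it receives a larger QStep (coarser quantization) at the same perceptual cost.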

    Intersection Bounds: Estimation and Inference

    We develop a practical and novel method for inference on intersection bounds, namely bounds defined by either the infimum or supremum of a parametric or nonparametric function, or, equivalently, the value of a linear programming problem with a potentially infinite constraint set. We show that many bounds characterizations in econometrics, for instance bounds on parameters under conditional moment inequalities, can be formulated as intersection bounds. Our approach is especially convenient for models comprising a continuum of inequalities that are separable in parameters, and it also applies to models with inequalities that are non-separable in parameters. Since analog estimators for intersection bounds can be severely biased in finite samples, routinely underestimating the size of the identified set, we also offer a median-bias-corrected estimator of such bounds as a by-product of our inferential procedures. We develop theory for large-sample inference based on the strong approximation of a sequence of series- or kernel-based empirical processes by a sequence of "penultimate" Gaussian processes. These penultimate processes are generally not weakly convergent, and thus non-Donsker. Our theoretical results establish that we can nonetheless perform asymptotically valid inference based on these processes. Our construction also provides new adaptive inequality/moment selection methods. We provide conditions for the use of nonparametric kernel and series estimators, including a novel result that establishes strong approximation for any general series estimator admitting linearization, which may be of independent interest.
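The finite-sample downward bias of plug-in (analog) estimators that motivates the paper's bias correction is easy to see in a toy simulation: even when every bounding function equals the true bound, taking the minimum of noisy point estimates lands below it. A minimal sketch (the grid size, noise level, and replication count are arbitrary illustration choices, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def naive_upper_bound(theta_hat):
    # plug-in analog estimator of an intersection upper bound:
    # the minimum of the estimated bounding functions over the grid
    return theta_hat.min()

# Suppose the true bounding functions all equal 1.0 on a grid of 50 points,
# so the true upper bound min_v theta(v) = 1.0.  Estimating each grid point
# with noise and then taking the minimum is biased downward:
n_rep, n_grid, noise = 2000, 50, 0.1
est = 1.0 + noise * rng.standard_normal((n_rep, n_grid))
plug_in = np.array([naive_upper_bound(e) for e in est])
print(plug_in.mean())  # noticeably below the true bound of 1.0
```

The bias grows with the number of (effective) inequalities and the noise level, which is why the estimated identified set is routinely too small without a correction.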

    Self-adaptive node-based PCA encodings

    In this paper we propose an algorithm, Simple Hebbian PCA, and prove that it is able to compute principal component analysis (PCA) in a distributed fashion across nodes. It simplifies existing network structures by removing intralayer weights, essentially cutting the number of weights that need to be trained in half.
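For comparison, the classic single-neuron Hebbian PCA update in the spirit of this abstract is Oja's rule, which extracts the first principal component with purely local Hebbian updates and no intralayer weights (this is the textbook rule, not necessarily the paper's Simple Hebbian PCA; all parameters below are illustrative):

```python
import numpy as np

def hebbian_first_pc(X, lr=0.01, epochs=100, seed=0):
    """Estimate the first principal component of centered data X
    (shape: n_samples x n_features) with Oja's Hebbian learning rule."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                     # neuron output
            w += lr * y * (x - y * w)     # Hebbian term plus Oja's decay
    return w / np.linalg.norm(w)
```

On data with a dominant direction, the learned weight vector aligns (up to sign) with the top eigenvector of the sample covariance, matching what a batch eigendecomposition would return.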

    Risk Shocks and Housing Markets

    This paper analyzes the role of uncertainty in a multi-sector housing model with financial frictions. We include time-varying uncertainty (i.e., risk shocks) in the technology shocks that affect housing production. The analysis demonstrates that risk shocks to the housing production sector are a quantitatively important impulse mechanism for the business cycle. We also demonstrate that bankruptcy costs act as an endogenous markup factor in housing prices; as a consequence, the volatility of housing prices is greater than that of output, as observed in the data. The model can also account for the observed countercyclical behavior of risk premia on loans to the housing sector.
    Keywords: agency costs, credit channel, time-varying uncertainty, residential investment, housing production, calibration
