
    Estimating factor models for multivariate volatilities: an innovation expansion method

    We introduce an innovation expansion method for estimating factor models for the conditional variance (volatility) of a multivariate time series. We estimate the factor loading space and the number of factors by a stepwise optimization algorithm that expands the "white noise space". A simulation study and a real-data example are given for illustration.
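
    To make the model class concrete, here is a minimal Python sketch of a volatility factor structure together with a crude loading-space estimate via principal components of squared returns. It is only an illustration under invented dimensions and names; it is not the paper's innovation expansion algorithm, for which a PCA step is merely a rough stand-in.

        import numpy as np

        # Toy setup: d series whose conditional variances load on r common factors.
        rng = np.random.default_rng(0)
        T, d, r = 2000, 10, 2
        A = rng.normal(size=(d, r))                    # factor loadings (unknown in practice)
        f = np.abs(rng.normal(size=(T, r)))            # positive variance factors
        sigma2 = f @ (A ** 2).T + 0.1                  # conditional variances
        y = rng.normal(size=(T, d)) * np.sqrt(sigma2)  # observed returns

        # Crude stand-in for the loading-space estimate: principal components
        # of the covariance of squared returns.
        C = np.cov((y ** 2).T)
        eigval, eigvec = np.linalg.eigh(C)
        loading_space_hat = eigvec[:, -r:]   # span of the top-r eigenvectors
        print(eigval[::-1][:4])              # a sharp drop after r values suggests r factors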

    Value at Risk models with long memory features and their economic performance

    We study alternative dynamics for Value at Risk (VaR) that incorporate a slow-moving component and information on recent aggregate returns into established quantile (auto)regression models. These models are compared on their economic performance as well as on metrics of first-order importance such as violation ratios. By better economic performance, we mean that changes in the VaR forecasts should have a lower variance, to reduce transaction costs, and should lead to lower exceedance sizes without raising the average level of the VaR. We find that, in combination with a targeted estimation strategy, our proposed models lead to improved performance in both statistical and economic terms.
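
    The evaluation criteria named above are easy to state in code. The Python sketch below computes a baseline rolling historical-simulation VaR (a simple benchmark, not one of the paper's proposed dynamics) together with the violation ratio, mean exceedance size, and forecast variance used as statistical and economic yardsticks; the VaR level, window, and simulated returns are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        returns = 0.01 * rng.standard_t(df=5, size=1500)   # fake daily returns
        alpha, window = 0.01, 250                          # 1% VaR, one-year window

        # Baseline: rolling historical-simulation VaR forecasts.
        var = np.array([np.quantile(returns[t - window:t], alpha)
                        for t in range(window, len(returns))])
        realized = returns[window:]

        hits = realized < var                               # VaR violations
        print("violation ratio  :", hits.mean() / alpha)    # ~1 means well calibrated
        print("mean exceedance  :", (var[hits] - realized[hits]).mean())
        print("forecast variance:", var.var())              # lower -> cheaper to track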

    Coherent states, constraint classes, and area operators in the new spin-foam models

    Recently, two new spin-foam models have appeared in the literature, both motivated by a desire to modify the Barrett-Crane model in such a way that the imposition of certain second class constraints, called cross-simplicity constraints, is weakened. We refer to these two models as the FKLS model and the flipped model. Both models are based on a reformulation of the cross-simplicity constraints. This paper has two main parts. First, we clarify the structure of the reformulated cross-simplicity constraints and the nature of their quantum imposition in the new models. In particular, we show that in the FKLS model, quantum cross-simplicity implies no restriction on states. The deeper reason is that, with the symplectic structure relevant for FKLS, the reformulated cross-simplicity constraints, in a certain relevant sense, are now first class, and this causes the coherent-state method of imposing the constraints, key in the FKLS model, to fail to give any restriction on states. Nevertheless, cross-simplicity can still be seen as implemented via suppression of intertwiner degrees of freedom in the dynamical propagation. In the second part of the paper, we investigate area spectra in the models. The results of these two investigations highlight how, in the flipped model, the Hilbert space of states as well as the spectra of area operators exactly match those of loop quantum gravity, whereas in the FKLS (and Barrett-Crane) models, the boundary Hilbert spaces and area spectra are different.
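
    For reference, since the abstract says the flipped model reproduces the loop quantum gravity area spectrum without quoting it: the standard LQG spectrum (a well-known result, not taken from this paper) reads, in LaTeX notation,

        \hat{A}_S \, |j_1,\dots,j_n\rangle
          \;=\; 8\pi\gamma\,\ell_{\mathrm{P}}^{2} \sum_{i=1}^{n} \sqrt{j_i(j_i+1)} \;\, |j_1,\dots,j_n\rangle ,

    where \gamma is the Immirzi parameter, \ell_{\mathrm{P}} is the Planck length, and the j_i are the SU(2) spins labelling the spin-network edges puncturing the surface S.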

    Dissociable brain mechanisms for inhibitory control: Effects of interference content and working memory capacity

    In this study, event-related fMRI was used to examine whether the resolution of interference arising from two different information contents activates the same or different neuronal circuitries. In addition, we examined the extent to which these inhibitory control mechanisms are modulated by individual differences in working memory capacity. Two groups of participants with high and low working memory capacity [high-span (HS) and low-span (LS) participants, respectively] performed two versions of an item recognition task with familiar letters and abstract objects as stimulus materials. Interference costs were examined by means of the recent-negative probe technique under otherwise identical testing conditions across both tasks. While the behavioral interference costs were of similar magnitude in both tasks, the underlying brain activation patterns differed between tasks: in the object task, interference effects (higher activation in interference trials than in control trials) were restricted to the anterior intraparietal sulcus (IPS), whereas interference effects for familiar letters were obtained in the anterior IPS, the left postero-ventral and the right dorsolateral prefrontal cortex (PFC), as well as the precuneus. As the letters were more discernible than the objects, the results suggest that the critical feature for PFC and precuneus involvement in interference resolution is the saliency of stimulus-response mappings. The interference effects in the letter task were modulated by working memory capacity: LS participants showed enhanced activation for interference trials only, whereas for HS participants, who showed better performance and lower interference costs in the letter task, the above-mentioned neuronal circuitry was activated for both interference and control trials, thereby attenuating the interference effects. The latter result supports the view that HS individuals allocate more attentional resources to the maintenance of task goals in the face of interfering information from preceding trials with familiar stimulus materials.
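
    The recent-negative probe paradigm used above can be illustrated with a toy trial generator in Python: a recent-negative probe is absent from the current memory set but appeared in the immediately preceding trial's set, creating familiarity-based interference. Set size, stimulus pool, and trial proportions here are invented for the example, not the study's actual design parameters.

        import random

        letters = list("BCDFGHJKLMNPQRSTVWXZ")   # toy stimulus pool
        random.seed(0)

        prev_set, trials = random.sample(letters, 4), []
        for _ in range(10):
            mem_set = random.sample(letters, 4)          # current memory set
            neg_cands = [c for c in prev_set if c not in mem_set]
            if neg_cands and random.random() < 0.5:
                # interference trial: probe recycled from the previous set
                probe, cond = random.choice(neg_cands), "recent_negative"
            else:
                # control trial: probe is novel with respect to both sets
                novel = [c for c in letters if c not in mem_set + prev_set]
                probe, cond = random.choice(novel), "control"
            trials.append((mem_set, probe, cond))        # correct answer is always "no"
            prev_set = mem_set

        for t in trials[:3]:
            print(t)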

    Evaluation Criteria for Selecting NoSQL Databases in a Single Box Environment

    In recent years, NoSQL database systems have become increasingly popular, especially for big-data commercial applications. These systems were designed to overcome the scaling and flexibility limitations plaguing traditional relational database management systems (RDBMSs). Because NoSQL database systems have typically been implemented in large-scale distributed environments serving large numbers of simultaneous users across potentially thousands of geographically separated devices, little consideration has been given to evaluating their value within single-box environments. It is postulated that some of the inherent traits of each NoSQL database type may be useful, perhaps even preferable, regardless of scale. Thus, this paper proposes criteria conceived to evaluate the usefulness of NoSQL systems in small-scale single-box environments. Specifically, key-value, document, column-family, and graph databases are discussed with respect to the ability of each to provide CRUD (create, read, update, delete) transactions in a single-box environment.
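
    As a concrete picture of what CRUD against the key-value type looks like on a single box, here is a minimal Python sketch with an in-memory store; the class and method names are invented for illustration and stand in for the real NoSQL systems the paper evaluates.

        import json

        class KVStore:
            """Toy single-box key-value store exposing CRUD operations."""
            def __init__(self):
                self._data = {}                      # in-memory, single machine
            def create(self, key, doc):
                if key in self._data:
                    raise KeyError(f"{key} already exists")
                self._data[key] = json.dumps(doc)    # values are opaque blobs
            def read(self, key):
                raw = self._data.get(key)
                return json.loads(raw) if raw is not None else None
            def update(self, key, doc):
                if key not in self._data:
                    raise KeyError(key)
                self._data[key] = json.dumps(doc)
            def delete(self, key):
                self._data.pop(key, None)

        store = KVStore()
        store.create("user:1", {"name": "Ada"})
        store.update("user:1", {"name": "Ada Lovelace"})
        print(store.read("user:1"))
        store.delete("user:1")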

    Revisiting the Simplicity Constraints and Coherent Intertwiners

    In the context of loop quantum gravity and spin-foam models, the simplicity constraints are essential in that they allow one to write general relativity as a constrained topological BF theory. In this work, we apply the recently developed U(N) framework for SU(2) intertwiners to the issue of imposing the simplicity constraints on spin network states. In particular, we focus on solving them on individual intertwiners in the 4d Euclidean theory. We review the standard way of solving the simplicity constraints using coherent intertwiners and explain how these fit within the U(N) framework. We then show how these constraints can be written as a closed u(N) algebra and propose a set of U(N) coherent states that solve all the simplicity constraints weakly for an arbitrary Immirzi parameter.
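
    As background for the opening claim, the standard Plebanski-type statement of general relativity as a constrained BF theory (a textbook result, not specific to this paper; signs and Immirzi-dependent terms are suppressed) is, in LaTeX notation,

        S[B,\omega] \;=\; \int_M B_{IJ} \wedge F^{IJ}[\omega] ,
        \qquad
        B^{IJ} \;=\; {}^{\ast}\!\left(e^I \wedge e^J\right) ,

    where F[\omega] is the curvature of the connection \omega and the second equation is the simplicity constraint forcing the B field to come from a tetrad e; the cross-simplicity constraints discussed above are its discrete counterpart on individual triangles, and hence on intertwiners.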

    Estimation of greenhouse gas emissions from spontaneous combustion/fire of coal in opencast mines – Indian context

    There are a significant number of uncontrolled coal mine fires (primarily due to spontaneous combustion of coal) currently burning all over the world. These spontaneous combustion sources emit greenhouse gases (GHGs). A critical review reveals that there are no standard measurement methods for estimating GHG emissions from mine fire/spontaneous combustion areas. The objective of this research was to estimate GHG emissions from spontaneous combustion of coal in the Indian context. A sampling chamber (SC) method was successfully used to assess emissions at two locations of the Enna Opencast Project (OCP), Jharia Coalfield (JCF), for 3 months. The study reveals that the measured cumulative average emission rate varies from 75.02 to 286.03 g s⁻¹ m⁻¹ for CO₂ and from 41.49 to 40.34 g s⁻¹ m⁻¹ for CH₄ between the low- and medium-temperature zones. The total GHG emissions predicted from this single fire-affected mine of JCF vary from 16.86 to 20.19 Mt yr⁻¹.
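
    To show the order-of-magnitude arithmetic linking a per-metre emission rate to an annual total in megatonnes, here is a back-of-envelope Python calculation. The 1 km fire-front length is a purely hypothetical assumption for illustration; the paper's totals rest on its own measured fire geometry.

        # Scale a per-metre emission rate up to an annual total.
        rate_g_per_s_per_m = 286.03            # upper CO2 rate from the abstract
        front_length_m = 1_000                 # assumed fire-front length (hypothetical)
        seconds_per_year = 365.25 * 24 * 3600
        annual_g = rate_g_per_s_per_m * front_length_m * seconds_per_year
        print(f"{annual_g / 1e12:.2f} Mt/yr")  # 1 Mt = 1e12 g -> about 9 Mt/yr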

    The merit of high-frequency data in portfolio allocation

    This paper addresses the open debate about the usefulness of high-frequency (HF) data in large-scale portfolio allocation. Daily covariances are estimated based on HF data of the S&P 500 universe employing a blocked realized kernel estimator. We propose forecasting covariance matrices using a multi-scale spectral decomposition in which volatilities, correlation eigenvalues, and eigenvectors evolve on different frequencies. In an extensive out-of-sample forecasting study, we show that the proposed approach yields less risky and more diversified portfolio allocations than prevailing methods employing daily data. These performance gains hold over longer horizons than previous studies have shown.
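
    The core idea of letting covariance components evolve on different frequencies can be sketched in a few lines of Python. The toy below smooths volatilities with a fast exponential weight and the correlation structure with a slow one; it is a simplification under invented half-lives and simulated inputs, whereas the paper additionally splits correlations into eigenvalues and eigenvectors and uses blocked realized kernel estimates rather than fake matrices.

        import numpy as np

        rng = np.random.default_rng(2)
        d, T = 5, 300

        # Fake sequence of daily realized covariance matrices, standing in
        # for the blocked realized kernel estimates used in the paper.
        base = np.cov(rng.normal(size=(d, 500)))
        covs = []
        for _ in range(T):
            n = rng.normal(scale=0.05, size=(d, d))
            covs.append(base + n @ n.T)
        covs = np.array(covs)

        def ewma(series, lam):
            """Exponentially weighted average over the first axis."""
            out = series[0]
            for x in series[1:]:
                out = lam * out + (1 - lam) * x
            return out

        vols = np.sqrt(np.diagonal(covs, axis1=1, axis2=2))    # (T, d) volatilities
        corrs = covs / (vols[:, :, None] * vols[:, None, :])   # (T, d, d) correlations

        vol_hat = ewma(vols, lam=0.80)     # fast-moving component: volatilities
        corr_hat = ewma(corrs, lam=0.97)   # slow-moving component: correlations
        sigma_hat = np.outer(vol_hat, vol_hat) * corr_hat      # one-step covariance forecast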
