
    Multiday expected shortfall under generalized t distributions: evidence from global stock market

    We apply seven alternative t-distributions to estimate the market risk measures Value at Risk (VaR) and its extension Expected Shortfall (ES). Of these seven, the twin t-distribution (TT) of Baker and Jackson (2014) and the generalized asymmetric t-distribution (GAT) of Baker (2016) are applied for the first time to estimate market risk. We estimate VaR and ES analytically over a one-day horizon and extend this to multi-day horizons using Monte Carlo simulation. We find that, taken together, the TT and GAT distributions provide the best back-testing results across individual confidence levels and horizons for the majority of scenarios. Moreover, we find that the TT and GAT models perform increasingly well as the time horizon lengthens: at the ten-day horizon, GAT provides the best back-testing results for all five indices and TT the second best, irrespective of the period of study and confidence level.
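
    The multi-day Monte Carlo step can be sketched in a few lines. This is a minimal illustration under assumptions not stated in the abstract: scipy's standard Student-t stands in for the TT and GAT distributions, daily returns are taken as i.i.d., and the parameters (df=5, scale=0.01) are illustrative rather than fitted.

        # Minimal sketch: multi-day VaR and ES by Monte Carlo under a
        # Student-t return model. The paper's TT/GAT distributions are
        # replaced by scipy's standard t; parameters are illustrative.
        import numpy as np
        from scipy import stats

        def mc_var_es(df, scale, horizon, alpha=0.99, n_sims=100_000, seed=0):
            rng = np.random.default_rng(seed)
            # Draw i.i.d. one-day returns and sum them over the horizon.
            daily = stats.t.rvs(df, scale=scale, size=(n_sims, horizon),
                                random_state=rng)
            losses = -daily.sum(axis=1)
            var = np.quantile(losses, alpha)   # VaR: alpha-quantile of losses
            es = losses[losses >= var].mean()  # ES: mean loss beyond VaR
            return var, es

        var10, es10 = mc_var_es(df=5, scale=0.01, horizon=10)
        print(f"10-day 99% VaR: {var10:.4f}, ES: {es10:.4f}")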

    A Relative Variation-Based Method to Unraveling Gene Regulatory Networks

    Gene regulatory network (GRN) reconstruction is essential in understanding the functioning and pathology of a biological system. Extensive models and algorithms have been developed to unravel a GRN. The DREAM project aims to clarify both advantages and disadvantages of these methods from an application viewpoint. An interesting yet surprising observation is that, compared with complicated methods such as those based on nonlinear differential equations, methods based on a simple statistic, such as the Z-score, usually perform better. A fundamental problem with the Z-score, however, is that direct and indirect regulations cannot easily be distinguished. To overcome this drawback, a relative expression level variation (RELV) based GRN inference algorithm is suggested in this paper, which consists of three major steps. Firstly, on the basis of wild-type and single-gene knockout/knockdown experimental data, the magnitude of the RELV of a gene is estimated. Secondly, the probability of a direct regulation from a perturbed gene to a measured gene is estimated, which is further utilized to estimate whether a gene can be regulated by other genes. Finally, the normalized RELVs are modified so that genes with an estimated zero in-degree have smaller RELVs in magnitude than the other genes; these values are then used to rank the likelihood of direct regulations among genes and therefore lead to an estimate of the GRN topology. This method can in principle avoid so-called cascade errors under certain situations. Computational results on the Size 100 sub-challenges of DREAM3 and DREAM4 show that, compared with the Z-score based method, prediction performance can be substantially improved, especially in terms of AUPR. Moreover, it can even outperform the best team of both DREAM3 and DREAM4. Furthermore, the high precision of the most reliable predictions shows that the suggested algorithm may be very helpful in guiding the design of biological experiments.
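
    The intuition behind the first step can be illustrated with a short sketch. This is not the authors' full three-step algorithm: the scoring rule below (a putative edge j -> i is scored by the relative change of gene i's expression when gene j is knocked out) and the toy data are assumptions made for illustration.

        # Sketch of the core RELV idea: score candidate edges j -> i by the
        # relative variation of gene i under a knockout of gene j.
        import numpy as np

        def relv_scores(wild_type, knockouts, eps=1e-12):
            """wild_type: (n_genes,) expression profile; knockouts row j is
            the profile measured after knocking out gene j."""
            scores = np.abs(knockouts - wild_type) / (np.abs(wild_type) + eps)
            np.fill_diagonal(scores, 0.0)  # ignore self-edges
            return scores

        # Toy data in which gene 0 strongly regulates gene 2.
        wt = np.array([1.0, 2.0, 3.0])
        ko = np.tile(wt, (3, 1))
        ko[0, 2] = 0.5  # knocking out gene 0 collapses gene 2's expression
        s = relv_scores(wt, ko)
        j, i = np.unravel_index(np.argmax(s), s.shape)
        print(f"top candidate edge: gene {j} -> gene {i}")  # expect 0 -> 2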

    Modeling the Instantaneous Pressure–Volume Relation of the Left Ventricle: A Comparison of Six Models

    Simulations are useful to study the heart's ability to generate flow and the interaction between contractility and loading conditions. The left ventricular pressure–volume (PV) relation has been shown to be nonlinear, but it is unknown whether a linear model is accurate enough for simulations. Six models were fitted to PV data measured in five sheep, and the estimated parameters were used to simulate PV loops. Simulated and measured PV loops were compared with the Akaike information criterion (AIC) and the Hamming distance, a measure of geometric shape similarity. The compared models were: a time-varying elastance model with fixed volume intercept (LinFix); a time-varying elastance model with varying volume intercept (LinFree); Langewouters' pressure-dependent elasticity model (Langew); a sigmoidal model (Sigm); a time-varying elastance model with a systolic flow-dependent resistance (Shroff); and a model with a linear systolic and an exponential diastolic relation (Burkh). Overall, the best model is LinFree (lowest AIC), closely followed by Langew. The remaining models rank: Sigm, Shroff, LinFix and Burkh. If only the shape of the PV loops is important, all models perform nearly identically (Hamming distance between 20 and 23%). For realistic simulation of the instantaneous PV relation, a linear model suffices.
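
    The simplest model compared, LinFix, admits a compact sketch: pressure follows the linear relation P(t) = E(t) * (V(t) - V0) with a time-varying elastance E(t) and a fixed volume intercept V0. The half-sine activation curve and all parameter values below are illustrative assumptions, not the parameters fitted to the sheep data.

        # Minimal sketch of a LinFix-style time-varying elastance model.
        import numpy as np

        def elastance(t, e_min=0.06, e_max=2.0, t_sys=0.3, period=0.8):
            """Half-sine activation during systole, baseline in diastole
            (mmHg/mL); illustrative shape, not a fitted curve."""
            phase = t % period
            act = np.where(phase < t_sys, np.sin(np.pi * phase / t_sys), 0.0)
            return e_min + (e_max - e_min) * act

        def pressure(t, volume, v0=10.0):
            """Instantaneous LV pressure from the linear PV relation."""
            return elastance(t) * (volume - v0)

        t = np.linspace(0.0, 0.8, 200)
        v = 120.0 - 50.0 * np.clip(t / 0.3, 0.0, 1.0)  # crude ejection, mL
        print(f"peak pressure: {pressure(t, v).max():.1f} mmHg")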

    Solar System Abundances of the Elements

    Representative abundances of the chemical elements for use as a solar abundance standard in astronomical and planetary studies are summarized. Updated abundance tables for solar system abundances based on meteorites and photospheric measurements are presented. Comment: 46 pages; 5 figures; 8 tables. In: Principles and Perspectives in Cosmochemistry. Lecture Notes of the Kodai School on 'Synthesis of Elements in Stars' held at Kodaikanal Observatory, India, April 29 - May 13, 2008 (Aruna Goswami and B. Eswar Reddy, eds.), Astrophysics and Space Science Proceedings, Springer-Verlag Berlin Heidelberg, 2010, p. 379-417 (ISBN 978-3-642-10351-3).

    Inferring cellular networks – a review

    In this review we give an overview of computational and statistical methods to reconstruct cellular networks. Although this area of research is vast and fast-developing, we show that most currently used methods can be organized around a few key concepts. The first part of the review deals with conditional independence models, including Gaussian graphical models and Bayesian networks. The second part discusses probabilistic and graph-based methods for data from experimental interventions and perturbations.
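
    As a concrete instance of the conditional independence models covered in the first part, the sketch below recovers a Gaussian graphical model from partial correlations, i.e. the rescaled off-diagonal entries of the inverse covariance matrix. The toy chain data and the omission of shrinkage or sparsity regularization are simplifying assumptions.

        # Gaussian graphical model sketch: edges correspond to nonzero
        # partial correlations, read off the precision matrix.
        import numpy as np

        def partial_correlations(data):
            """data: (n_samples, n_variables) -> partial correlation matrix."""
            precision = np.linalg.inv(np.cov(data, rowvar=False))
            d = np.sqrt(np.diag(precision))
            pcor = -precision / np.outer(d, d)
            np.fill_diagonal(pcor, 1.0)
            return pcor

        rng = np.random.default_rng(1)
        # Chain x0 -> x1 -> x2: x0, x2 correlated yet conditionally independent.
        x0 = rng.normal(size=500)
        x1 = x0 + 0.5 * rng.normal(size=500)
        x2 = x1 + 0.5 * rng.normal(size=500)
        pcor = partial_correlations(np.column_stack([x0, x1, x2]))
        print(np.round(pcor, 2))  # pcor[0, 2] should be near zero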

    Boom and bust of a moose population – a call for integrated forest management

    There is increasing pressure to manage forests for multiple objectives, including ecosystem services and biodiversity, alongside timber production. However, few forests are currently co-managed for timber and wildlife, despite potential economic and conservation benefits. We present empirical data from a commercial Norway spruce (Picea abies) and Scots pine (Pinus sylvestris) production system in southern Norway in which moose (Alces alces) are an important secondary product. Combining long-term hunting and forestry records, we identified temporal variation in clear-felling over the past five decades, peaking in the 1970s. Herbicide treatment of regenerating stands and a fivefold increase in moose harvest have led to a reduction in the availability of successional forest per moose of more than 90% since the 1960s. Field estimates showed that spraying with the herbicide glyphosate reduced forage availability by 60 and 96% in summer and winter, respectively, 4 years after treatment. It also reduced moose use and habitat selection of young spruce stands compared with unsprayed stands. Together these lines of evidence suggest that forest management led to an increase in moose carrying capacity during the 1970s and a subsequent decline thereafter. This is likely to have contributed to observed reductions in moose population productivity in southern Norway and runs counter to sustainable resource management. We therefore call for better integration and long-term planning between forestry and wildlife management to minimise forest damage and the development of large fluctuations in ungulate populations.

    Urban geochemical mapping studies: how and why we do them

    Geochemical mapping is a technique rooted in mineral exploration but has now found worldwide application in studies of the urban environment. Such studies, involving multidisciplinary teams including geochemists, have to present their results in a way that non-geochemists can comprehend. A legislatively driven demand for urban geochemical data, in connection with the need to identify contaminated land and carry out subsequent health risk assessments, has given rise to a greater worldwide interest in the urban geochemical environment. Herein, the aims and objectives of some urban studies are reviewed, and commonly used terms such as baseline and background are defined. Geochemists need to consider more carefully what is meant by the term urban. Whilst the unique make-up of every city precludes a single recommended approach to a geochemical mapping strategy, more should be done to standardise the sampling and analytical methods. How (from a strategic and presentational point of view) and why we do geochemical mapping studies is discussed. Keywords: Background, Baseline, Geochemical mapping, Heavy metals, Pollution, Soil, Urban.