
Assessing technical inefficiency in private, not-for-profit, bachelor's and master's universities in the United States using stochastic frontier estimation

This research explored the technical inefficiency of 813 private, not-for-profit, four-year, bachelor's and master's colleges and universities in the U.S. using data from 2006 to 2011. The goal of the study was to describe and explain the level of technical inefficiency in this group of institutions that can be identified using a stochastic frontier estimation (SFE) method, and to evaluate the applicability of SFE to higher education. Categories of expenditures were used to construct a production frontier against which the inefficiency of individual institutions in producing degreed students was measured. The panel-data analysis showed that this sector of higher education operates with a mean technical inefficiency of 21%. Instructional expenditures had a positive effect on inefficiency, while institutional support and student services had a slight negative effect. It was concluded that SFE can provide a reliable relative measure of technical inefficiency in higher education, and that it can guide decision makers by giving them a way to compare their institutions with others of similar characteristics competing in the same market.
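The abstract does not report the estimation details, so the following is only a minimal sketch of the general technique: a conventional normal/half-normal stochastic production frontier fitted by maximum likelihood. The variable names, the half-normal inefficiency assumption, and the use of `scipy` are our illustrative choices, not the study's specification.

```python
# Minimal normal/half-normal stochastic production frontier via MLE.
# Model: y_i = x_i'beta + v_i - u_i,  v ~ N(0, s_v^2),  u ~ |N(0, s_u^2)|.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(params, X, y):
    k = X.shape[1]
    beta, log_sv, log_su = params[:k], params[k], params[k + 1]
    sv, su = np.exp(log_sv), np.exp(log_su)
    sigma = np.sqrt(sv**2 + su**2)
    lam = su / sv
    eps = y - X @ beta                      # composed error v - u
    ll = (np.log(2) - np.log(sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

def fit_frontier(X, y):
    k = X.shape[1]
    x0 = np.r_[np.linalg.lstsq(X, y, rcond=None)[0], 0.0, 0.0]  # OLS start
    res = minimize(neg_loglik, x0, args=(X, y), method="BFGS")
    su = np.exp(res.x[k + 1])
    mean_ineff = su * np.sqrt(2 / np.pi)    # E[u] for half-normal u
    return res.x[:k], mean_ineff
```

In a higher-education setting like the one described, `X` would hold (logged) expenditure categories plus an intercept and `y` the (logged) count of degreed students, with `mean_ineff` playing the role of the sector-level inefficiency estimate.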

    Are realized volatility models good candidates for alternative Value at Risk prediction strategies?

In this paper, we assess the Value at Risk (VaR) prediction accuracy and efficiency of six ARCH-type models, six realized volatility models and two GARCH models augmented with realized volatility regressors. The α-th quantile of the innovations' distribution is estimated with the fully parametric method using either the normal or the skewed Student distribution, and also with the Filtered Historical Simulation (FHS) or the Extreme Value Theory (EVT) methods. Our analysis is based on two S&P 500 cash index out-of-sample forecasting periods, one of which covers exclusively the 2007-2009 financial crisis. Using an extensive array of statistical and regulatory risk management loss functions, we find that the realized volatility and the augmented GARCH models with the FHS or the EVT quantile estimation methods produce superior VaR forecasts and allow for more efficient regulatory capital allocations. The skewed Student distribution is also an attractive alternative, especially during periods of high market volatility.
Keywords: High frequency intraday data; Filtered Historical Simulation; Extreme Value Theory; Value-at-Risk forecasting; Financial crisis.
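As a rough illustration of one of the methods compared here, the sketch below computes a one-day-ahead FHS VaR from a plain GARCH(1,1) fit using the third-party `arch` package. It does not reproduce the paper's fourteen model specifications or its loss functions; the model, data, and quantile level are placeholders.

```python
# One-day-ahead Value-at-Risk via Filtered Historical Simulation (FHS):
# fit a GARCH(1,1), standardize the residuals, and rescale their empirical
# alpha-quantile by the forecast volatility.
import numpy as np
from arch import arch_model  # pip install arch

def fhs_var(returns, alpha=0.01):
    # Fit GARCH(1,1) on percent returns (the scaling arch recommends).
    res = arch_model(returns * 100, vol="Garch", p=1, q=1).fit(disp="off")
    z = res.resid / res.conditional_volatility        # standardized residuals
    sigma_next = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
    return sigma_next * np.quantile(z, alpha) / 100   # back to return units

# Example with simulated returns (replace with S&P 500 log returns):
rng = np.random.default_rng(0)
r = rng.standard_normal(1500) * 0.01
print(f"1% one-day FHS VaR: {fhs_var(r):.4f}")
```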

    Analysis of Data Clusters Obtained by Self-Organizing Methods

Self-organizing methods were used to investigate the financial market. As an example, we consider the time series of the Dow Jones index for the years 2002-2003 (R. Mantegna, cond-mat/9802256). In order to reveal new structures in the stock market behavior of the companies making up the Dow Jones index, we apply SOM (Self-Organizing Maps) and GMDH (Group Method of Data Handling) algorithms. Using SOM techniques we obtain SOM maps that establish new relationships in the market structure. The analysis of the obtained clusters was carried out with GMDH.
Comment: 10 pages, 4 figures
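The abstract gives no implementation details, so the following is only a hedged sketch of the SOM step: stocks are clustered by the similarity of their normalized return series on a small map, using the third-party `minisom` package. The data, map size, and training parameters are placeholders, not the authors' setup.

```python
# Cluster stocks on a small self-organizing map: each stock is a vector
# of daily returns, and stocks mapped to the same SOM node form a cluster.
import numpy as np
from minisom import MiniSom  # pip install minisom

rng = np.random.default_rng(1)
n_stocks, n_days = 30, 250
returns = rng.standard_normal((n_stocks, n_days)) * 0.01  # placeholder data

# Normalize each stock's return vector so the SOM compares shapes, not scale.
X = (returns - returns.mean(axis=1, keepdims=True)) / returns.std(axis=1, keepdims=True)

som = MiniSom(4, 4, n_days, sigma=1.0, learning_rate=0.5, random_seed=1)
som.train(X, num_iteration=2000)

clusters = {}
for i, x in enumerate(X):
    clusters.setdefault(som.winner(x), []).append(i)  # SOM node -> stock indices
print(clusters)
```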

    Pattern Matching and Neural Networks based Hybrid Forecasting System

Copyright © 2001 Springer-Verlag Berlin Heidelberg. The final publication is available at link.springer.com. Book title: Advances in Pattern Recognition — ICAPR 2001. Second International Conference on Advances in Pattern Recognition (ICAPR 2001), Rio de Janeiro, Brazil, March 11-14, 2001.
In this paper we propose a Neural Net-PMRS hybrid for forecasting time-series data. The neural network model uses the traditional MLP architecture and the backpropagation method of training. Rather than using the last n lags for prediction, the input to the network is determined by the output of the PMRS (Pattern Modelling and Recognition System). PMRS matches current patterns in the time series with historic data and generates input for the neural network that consists of both current and historic information. The results of the hybrid model are compared with those of neural networks and PMRS on their own. In general, there is no outright winner on all performance measures; however, the hybrid model is a better choice for certain types of data or on certain error measures.
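PMRS itself is not specified in the abstract, so the code below is a loose sketch under our own assumptions: a nearest-pattern matcher (Euclidean distance over a sliding window) stands in for PMRS, and its matched historic continuation is concatenated with the current window as input to scikit-learn's MLPRegressor. The window size and network shape are illustrative only.

```python
# Hybrid forecaster sketch: a nearest-pattern matcher supplies historic
# context that is concatenated with the current window as MLP input.
import numpy as np
from sklearn.neural_network import MLPRegressor

def nearest_continuation(series, window, t):
    """Find the past window most similar to series[t-window:t] and
    return the value that followed it."""
    current = series[t - window:t]
    best_j, best_d = None, np.inf
    for j in range(window, t - 1):           # only strictly past windows
        d = np.linalg.norm(series[j - window:j] - current)
        if d < best_d:
            best_j, best_d = j, d
    return series[best_j]                    # value after the matched pattern

def build_dataset(series, window=5):
    X, y = [], []
    for t in range(2 * window, len(series) - 1):
        hist = nearest_continuation(series, window, t)
        X.append(np.r_[series[t - window:t], hist])  # current + historic info
        y.append(series[t])
    return np.array(X), np.array(y)

series = np.sin(np.linspace(0, 20, 400)) + 0.05 * np.random.default_rng(2).standard_normal(400)
X, y = build_dataset(series)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("last prediction:", model.predict(X[-1:])[0])
```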

    The role of high frequency intra-daily data, daily range and implied volatility in multi-period Value-at-Risk forecasting

In this paper, we assess the informational content of daily range, realized variance, realized bipower variation, two time scale realized variance, realized range and implied volatility in daily, weekly, biweekly and monthly out-of-sample Value-at-Risk (VaR) predictions. We use the recently proposed Realized GARCH model combined with the skewed Student distribution for the innovation process and a Monte Carlo simulation approach to produce the multi-period VaR estimates. The VaR forecasts are evaluated in terms of statistical and regulatory accuracy as well as capital efficiency. Our empirical findings, based on the S&P 500 stock index, indicate that almost all realized and implied volatility measures can produce VaR forecasts that are precise in both statistical and regulatory terms across forecasting horizons, with implied volatility being especially accurate for monthly VaR forecasts. The daily range produces inferior forecasting results in terms of regulatory accuracy and Basel II compliance. However, robust realized volatility measures such as the adjusted realized range and the realized bipower variation, which are immune to microstructure noise bias and price jumps respectively, generate superior VaR estimates in terms of capital efficiency, as they minimize the opportunity cost of capital and the Basel II regulatory capital. Our results highlight the importance of robust volatility estimators based on high frequency intra-daily data in a multi-step VaR forecasting context, as they balance statistical and regulatory accuracy against capital efficiency.
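The paper's Realized GARCH specification is not reproduced here. As a hedged sketch of the Monte Carlo multi-period idea only, the code below simulates h-day return paths from an already-fitted plain GARCH(1,1) with normal innovations and reads the VaR off the simulated h-day return distribution; all parameter values are placeholders.

```python
# Multi-period VaR by Monte Carlo: simulate h-step return paths from a
# fitted GARCH(1,1) and take the alpha-quantile of the cumulative return.
import numpy as np

def mc_var(omega, alpha_g, beta_g, sigma2_0, h=10, alpha=0.01,
           n_paths=100_000, seed=0):
    rng = np.random.default_rng(seed)
    sigma2 = np.full(n_paths, sigma2_0)     # starting conditional variance
    cum_ret = np.zeros(n_paths)
    for _ in range(h):
        z = rng.standard_normal(n_paths)    # innovations (normal for brevity)
        r = np.sqrt(sigma2) * z
        cum_ret += r
        sigma2 = omega + alpha_g * r**2 + beta_g * sigma2  # GARCH recursion
    return np.quantile(cum_ret, alpha)      # h-day VaR at level alpha

# Placeholder daily parameters, 21-day (monthly) horizon:
print(mc_var(omega=1e-6, alpha_g=0.08, beta_g=0.9, sigma2_0=1e-4, h=21))
```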

    Building robust prediction models for defective sensor data using Artificial Neural Networks

Predicting the health of components in complex dynamic systems such as an automobile poses numerous challenges. The primary aim of such predictive systems is to use the high-dimensional data acquired from different sensors to predict the state of health of a particular component, e.g., a brake pad. The classical approach involves selecting a smaller set of relevant sensor signals using feature selection and using them to train a machine learning algorithm. However, this fails to address two prominent problems: (1) sensors are susceptible to failure when exposed to extreme conditions over long periods of time; (2) sensors are electrical devices that can be affected by noise or electrical interference. Using failed and noisy sensor signals as inputs greatly reduces prediction accuracy. To tackle this problem, it is advantageous to use the information from all sensor signals, so that the failure of one sensor can be compensated for by another. In this work, we propose an Artificial Neural Network (ANN) based framework to exploit the information from a large number of signals. Secondly, our framework introduces a data augmentation approach to perform accurate predictions in spite of noisy signals. The plausibility of our framework is validated on a real-life industrial application from Robert Bosch GmbH.
Comment: 16 pages, 7 figures. Currently under review. This research has obtained funding from the Electronic Components and Systems for European Leadership (ECSEL) Joint Undertaking, the framework programme for research and innovation Horizon 2020 (2014-2020), under grant agreement number 662189-MANTIS-2014-
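The paper's exact augmentation scheme is not described in the abstract, so the following is a speculative sketch of the general idea: training batches are augmented by randomly zeroing sensor channels (simulating failure) and adding Gaussian noise (simulating interference) before fitting a small network. All shapes, rates, and the use of scikit-learn are our own assumptions.

```python
# Robustness-oriented data augmentation sketch: corrupt copies of the
# training set with dropped (zeroed) channels and additive noise so the
# network learns to compensate for failed or noisy sensors.
import numpy as np
from sklearn.neural_network import MLPRegressor

def augment(X, y, n_copies=3, drop_prob=0.1, noise_std=0.05, seed=0):
    rng = np.random.default_rng(seed)
    Xs, ys = [X], [y]
    for _ in range(n_copies):
        Xc = X + rng.normal(0, noise_std, X.shape)   # electrical noise
        mask = rng.random(X.shape) < drop_prob
        Xc[mask] = 0.0                               # simulated sensor failure
        Xs.append(Xc)
        ys.append(y)
    return np.vstack(Xs), np.concatenate(ys)

rng = np.random.default_rng(3)
X = rng.standard_normal((500, 20))          # 20 sensor channels (placeholder)
y = X[:, :5].sum(axis=1)                    # placeholder state-of-health target
X_aug, y_aug = augment(X, y)
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
model.fit(X_aug, y_aug)
```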

    Tau association with synaptic vesicles causes presynaptic dysfunction

Tau is implicated in more than 20 neurodegenerative diseases, including Alzheimer's disease. Under pathological conditions, Tau dissociates from axonal microtubules and missorts to pre- and postsynaptic terminals. Patients suffer from early synaptic dysfunction prior to Tau aggregate formation, but the underlying mechanism is unclear. Here we show that pathogenic Tau binds to synaptic vesicles via its N-terminal domain and interferes with presynaptic functions, including synaptic vesicle mobility and release rate, lowering neurotransmission in fly and rat neurons. Pathological Tau mutants lacking the vesicle binding domain still localize to the presynaptic compartment but do not impair synaptic function in fly neurons. Moreover, an exogenously applied membrane-permeable peptide that competes for Tau-vesicle binding suppresses Tau-induced synaptic toxicity in rat neurons. Our work uncovers a presynaptic role of Tau that may be part of the early pathology in various Tauopathies and could be exploited therapeutically.