23 research outputs found

    Mean univariate-GARCH VaR portfolio optimization: actual portfolio approach

    In accordance with the Basel Capital Accords, the Capital Requirement (CR) for a bank's market risk exposure is a nonlinear function of Value-at-Risk (VaR). Importantly, the CR is calculated on the bank's actual portfolio, i.e. the portfolio represented by its current holdings. To tackle mean-VaR portfolio optimization within the actual portfolio framework (APF), we propose a novel mean-VaR optimization method in which VaR is estimated using a univariate Generalized AutoRegressive Conditional Heteroscedasticity (GARCH) volatility model. The optimization was performed with the Nondominated Sorting Genetic Algorithm II (NSGA-II). On a sample of 40 large US stocks, our procedure provided superior mean-VaR trade-offs compared with those obtained from the more customary mean-multivariate GARCH and historical VaR models. The results hold in both low- and high-volatility samples.
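
    A minimal sketch of the core calculation the abstract describes: the one-day parametric VaR of the actual (current-holdings) portfolio obtained from a univariate GARCH(1,1) volatility forecast. The parameter values, the equal weights, the normal-innovation assumption and all variable names are illustrative assumptions rather than the paper's calibration; in the paper such a VaR estimate would enter the mean-VaR objectives searched by NSGA-II, whereas here the weights are fixed to the current holdings for illustration only.

```python
import numpy as np
from scipy.stats import norm

def garch_var(portfolio_returns, omega, alpha, beta, confidence=0.99):
    """One-day-ahead parametric VaR from a GARCH(1,1) volatility filter
    (illustrative parameters; returns VaR as a positive loss fraction)."""
    r = np.asarray(portfolio_returns)
    sigma2 = np.empty(len(r) + 1)
    sigma2[0] = r.var()                          # initialise with the sample variance
    for t in range(len(r)):                      # GARCH(1,1) variance recursion
        sigma2[t + 1] = omega + alpha * r[t] ** 2 + beta * sigma2[t]
    sigma_next = np.sqrt(sigma2[-1])             # one-day-ahead volatility forecast
    return -norm.ppf(1 - confidence) * sigma_next

# Hypothetical actual portfolio: equal weights over 40 stocks, simulated returns
rng = np.random.default_rng(0)
weights = np.full(40, 1 / 40)
asset_returns = rng.normal(0.0, 0.01, size=(500, 40))
portfolio_returns = asset_returns @ weights
print(garch_var(portfolio_returns, omega=1e-6, alpha=0.08, beta=0.90))
```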

    ONE-YEAR CARDIOVASCULAR OUTCOME IN PATIENTS ON CLOPIDOGREL ANTI-PLATELET THERAPY AFTER ACUTE MYOCARDIAL INFARCTION

    The aim of this study was to determine the risk factors for cardiovascular mortality, re-hospitalization and admission to the emergency care unit in patients on clopidogrel anti-platelet therapy after acute myocardial infarction. We followed 175 patients on dual antiplatelet therapy with clopidogrel and acetylsalicylic acid for 1 year after acute myocardial infarction, both STEMI and NSTEMI. Besides demographic and clinical characteristics, the genetic ABCB1, CYP2C19 and CYP2C9 profiles were analyzed using Cox regression analysis. The end-points were cardiovascular mortality, re-hospitalization and emergency care visits. During the accrual and follow-up period, 8 patients (4.6%) died, mostly as a direct consequence of an acute myocardial infarction. Re-hospitalization was needed in 27 patients (15.4%), in nine of them (33.3%) with a diagnosis of re-infarction. Thirty-two patients (18.3%) were admitted to the emergency care unit for cardiovascular causes, up to 15 times during the follow-up. NSTEMI was an independent predictor of all three registered events (mortality OR=7.4, p<0.05; re-hospitalization OR=2.8, p<0.05; emergency care visit OR=2.4, p<0.05). Other significant predictors were related to kidney function (urea and creatinine levels, creatinine clearance), co-morbidities such as arterial hypertension, decreased left ventricular ejection fraction, and the clopidogrel dosing regimen. In conclusion, NSTEMI appears to be one of the most significant predictors of cardiovascular events (mortality, re-hospitalization and emergency care visits). In addition, clopidogrel administration according to up-to-date guidelines, with high loading doses and doubled initial maintenance doses, improves the 1-year prognosis in patients with AMI.
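
    A hedged sketch of the kind of Cox regression analysis mentioned above, written with the lifelines package; the covariates, follow-up times and event indicator below are randomly generated placeholders, not the study data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 175                                            # cohort size taken from the abstract
df = pd.DataFrame({
    "nstemi":     rng.integers(0, 2, n),           # 1 = NSTEMI, 0 = STEMI
    "creatinine": rng.normal(100, 25, n),          # umol/L (hypothetical values)
    "lvef":       rng.normal(50, 10, n),           # left ventricular ejection fraction, %
    "time_days":  rng.integers(30, 366, n),        # follow-up time in days
    "event":      rng.integers(0, 2, n),           # 1 = re-hospitalization occurred
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="event")
cph.print_summary()                                # hazard ratios and p-values per covariate
```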

    The Effectiveness and Students’ Perception of an Adaptive Mobile Learning System based on Personalized Content and Mobile Web

    As the whole world goes mobile, the application of mobile devices in education, also known as m-learning, is becoming one of the most popular areas of educational research. This paper presents the implementation of an adaptive mobile learning system based on personalized content and the mobile web, and evaluates its effectiveness and students' attitudes toward it. Personalization of learning materials is based on the Felder-Silverman learning style model, and the features of the accessing mobile device are identified using a device library. The results of the study confirm students' positive attitudes toward mobile learning and the developed adaptive m-learning system. They also demonstrate the effectiveness of the system and of m-learning as an additional educational tool in terms of increasing students' knowledge and scores.
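
    An illustrative sketch (not the authors' implementation) of the personalization idea described above: choosing the learning-object variant that matches the learner's Felder-Silverman style and the capabilities reported for the accessing device. The catalogue, the capability lookup and every name below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class LearningObject:
    topic: str
    style: str        # e.g. "visual" or "verbal" (one Felder-Silverman dimension)
    media: str        # "video", "image" or "text"

CATALOGUE = [
    LearningObject("loops", "visual", "video"),
    LearningObject("loops", "visual", "image"),
    LearningObject("loops", "verbal", "text"),
]

# stand-in for a device-library lookup keyed by the detected device class
DEVICE_CAPABILITIES = {"basic-phone": {"text"}, "smartphone": {"text", "image", "video"}}

def select_content(topic, learner_style, device):
    supported = DEVICE_CAPABILITIES[device]
    for lo in CATALOGUE:
        if lo.topic == topic and lo.style == learner_style and lo.media in supported:
            return lo
    # fall back to any variant of the topic that the device can display
    return next(lo for lo in CATALOGUE if lo.topic == topic and lo.media in supported)

print(select_content("loops", "visual", "basic-phone").media)   # -> "text"
```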

    The impact of the functional characteristics of a credit bureau on the level of indebtedness per capita: Evidence from East European countries

    The institution of the credit bureau is one of the most important elements in controlling the indebtedness levels of a population. All credit bureaus have specific functional characteristics that can influence the development of indebtedness. This research aims to identify the most important characteristics of a credit bureau, to quantify those characteristics and to identify causal relationships between the characteristics of the credit bureau and trends in indebtedness per capita. The paper introduces the Credit Bureau Functional Index, which quantifies the functional characteristics of a credit bureau. The paper establishes a correlation between this index and indebtedness per capita and finds the relationship to be linear. The paper concludes that indebtedness levels can be targeted through a mix of credit bureau characteristics. Research on this theme is absent from the academic literature to date.
    Keywords: credit bureau, credit information, personal indebtedness, coverage ratio, functional characteristic
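
    A minimal sketch of estimating the linear relationship the paper reports between the Credit Bureau Functional Index and indebtedness per capita. The index values and indebtedness figures are hypothetical placeholders, not the paper's East European data.

```python
import numpy as np

cbfi = np.array([0.35, 0.48, 0.52, 0.61, 0.70, 0.78])          # hypothetical index per country
indebtedness = np.array([820, 1100, 1230, 1450, 1700, 1920])   # hypothetical EUR per capita

slope, intercept = np.polyfit(cbfi, indebtedness, deg=1)       # ordinary least-squares line
corr = np.corrcoef(cbfi, indebtedness)[0, 1]                   # strength of the linear link

print(f"indebtedness ~ {intercept:.0f} + {slope:.0f} * CBFI  (r = {corr:.2f})")
```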

    MEASURING THE DATA MODEL QUALITY IN THE E-SUPPLY CHAINS

    The implementation of Internet technology in business has enabled the development of e-business supply chains with large-scale information integration among all partners. The development of information systems (IS) is based on established business objectives whose achievement, among other things, directly depends on the quality of IS development and design. In the process of analysing the key elements of a company's operations in the supply chain, a process model and a corresponding data model are designed, which should enable the selection of an appropriate information system architecture. The quality of the implemented information system that supports an e-supply chain directly depends on the quality of the data model. One of the serious limitations of a data model is its complexity: with a large number of entities, a data model is difficult to analyse, monitor and maintain. The problem becomes even greater for an integrated data model at the level of the participating partners in the supply chain, where the data model usually consists of hundreds or even thousands of entities. The paper analyses the key elements affecting the quality of data models and shows their interactions and factors of significance. In addition, the paper presents measures for assessing the quality of the data model that make it possible to locate problems easily and to focus effort on specific parts of a complex data model where it is not economically feasible to review every detail of the model.
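
    An assumed sketch (not the paper's metric set) of simple size and complexity measures for a data model: entity count, relationship count and relationships per entity, which help flag the parts of a large integrated model that deserve closer review.

```python
from collections import Counter

# a data model represented as (entity, related_entity) pairs -- hypothetical example
relationships = [
    ("Order", "Customer"), ("Order", "OrderLine"), ("OrderLine", "Product"),
    ("Shipment", "Order"), ("Invoice", "Order"), ("Invoice", "Customer"),
]

entities = {e for pair in relationships for e in pair}
degree = Counter(e for pair in relationships for e in pair)   # relationships touching each entity

print("entities:", len(entities))
print("relationships:", len(relationships))
print("relationships per entity:", round(len(relationships) / len(entities), 2))
# entities with the most relationships are candidates for focused review
print("hotspots:", degree.most_common(2))
```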