
    Machine Learning in Insurance

    Machine learning is a relatively new field without a unanimous definition. In many ways, actuaries have long been machine learners: in pricing and reserving, and more recently in capital modelling, they have combined statistical methodology with a deep understanding of the problem at hand and of how any solution may affect the company and its customers. One aspect that has, perhaps, not been so well developed among actuaries is validation. Discussions of actuaries' "preferred methods" have often lacked solid scientific arguments, including validation for the case at hand. Through this collection, we aim to promote good practice in machine learning in insurance by considering three key issues: a) who is the client, sponsor, or otherwise interested real-life target of the study? b) what is the reason for working with a particular data set, and what extra knowledge (which we also call prior knowledge) is available beyond the data set alone? c) what is the mathematical statistical argument for the validation procedure?
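
    As a minimal illustration of issue c), the sketch below validates a Poisson claim-frequency model out of sample against a trivial benchmark using the Poisson deviance. The data, model, and all names are hypothetical placeholders, not taken from the collection itself.

```python
# Minimal sketch: out-of-sample validation of a claim-frequency model.
# All data and column choices are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.metrics import mean_poisson_deviance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
X = rng.normal(size=(n, 3))                       # rating factors (standardised)
mu = np.exp(0.3 * X[:, 0] - 0.2 * X[:, 1] - 2.0)  # true mean frequency
y = rng.poisson(mu)                               # observed claim counts

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = PoissonRegressor(alpha=1e-4).fit(X_tr, y_tr)

# Compare against a trivial benchmark: predict the training-set mean everywhere.
dev_model = mean_poisson_deviance(y_te, model.predict(X_te))
dev_null = mean_poisson_deviance(y_te, np.full_like(y_te, y_tr.mean(), dtype=float))
print(f"hold-out Poisson deviance: model {dev_model:.4f} vs null {dev_null:.4f}")
```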

    Current Topics on Risk Analysis: ICRA6 and RISK2015 Conference

    Papers presented at the International Conference on Risk Analysis ICRA 6/RISK 2015, held in Barcelona, 26-29 May 2015.

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    This paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units, all located in Portugal, is established using Stochastic Frontier Analysis (SFA). This methodology makes it possible to separate measurement error from systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are offered for each hotel studied.
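
    As a sketch of the underlying technique (not the paper's own specification or data), the following fits a normal/half-normal stochastic production frontier by maximum likelihood on simulated data; all names and parameter values are illustrative.

```python
# Minimal sketch of a normal/half-normal stochastic production frontier,
# estimated by maximum likelihood (Aigner-Lovell-Schmidt form).
# Data and variable names are hypothetical placeholders.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(theta, y, X):
    k = X.shape[1]
    beta, log_sv, log_su = theta[:k], theta[k], theta[k + 1]
    sv, su = np.exp(log_sv), np.exp(log_su)
    sigma = np.hypot(sv, su)                # sigma^2 = sigma_v^2 + sigma_u^2
    lam = su / sv
    eps = y - X @ beta                      # composed error v - u
    ll = (np.log(2.0 / sigma)
          + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

rng = np.random.default_rng(1)
n = 400
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + log input
u = np.abs(rng.normal(0.0, 0.3, n))                     # inefficiency >= 0
v = rng.normal(0.0, 0.2, n)                             # measurement noise
y = X @ np.array([1.0, 0.6]) + v - u                    # log output

theta0 = np.zeros(X.shape[1] + 2)
res = minimize(neg_loglik, theta0, args=(y, X), method="BFGS")
print("beta:", res.x[:2], "sigma_v, sigma_u:", np.exp(res.x[2:]))
```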

    MODELLING OPERATIONAL RISK MEASUREMENT IN ISLAMIC BANKING: A THEORETICAL AND EMPIRICAL INVESTIGATION

    With the emergence and development of the Islamic banking industry, the need to address operational risk issues has attracted the attention of academics in recent years. Such studies commonly agree that operational risk is higher and more serious than credit risk and market risk for Islamic banks. However, no single study in the context of Islamic banking thoroughly tackles operational risk in its three main aspects: theoretical, methodological, and empirical. This may be because operational risk is a relatively new area, requiring further research to understand the complexities it carries. This is the source of motivation for the present research, which aims to fill the observed gap in the literature by responding to those three aspects. The research, hence, aims to develop a new measurement model of operational risk exposures in Islamic banking, with the objective of theoretically determining the underlying features of operational risk exposures and their measurement particularly for Islamic banks. In developing a theoretical framework for the proposed model, the research provides a classification of operational risks in the major Islamic financial contracts. In addition, rather than adopting existing operational risk measurement methods, this research develops a proposed measurement model termed the Delta Gamma Sensitivity Analysis-Extreme Value Theory (DGSA-EVT) model. DGSA-EVT measures both high frequency-low severity (HF-LS) and low frequency-high severity (LF-HS) types of operational risk. This is the core of the research's methodological contribution. As regards the empirical contributions, in analysing operational value at risk (opVaR) this research carefully analyses the behaviour of the data by taking into account the volatility, skewness, and kurtosis of the variables. In the modelling, volatility analysis employs two models: a constant-variance model and an exponentially weighted moving average (EWMA) model. Results of the empirical tests show that the operational risk variables in this research are non-normal; thus, non-normality, involving skewness and kurtosis as well as volatility, has to be taken into account in the estimation of VaR. To do so, this research employs the Cornish-Fisher expansion, under which the confidence interval of the operational variables is an explicit function of the skewness and kurtosis as well as the volatility. Empirical findings from a set of econometric tests reveal that, for financing activities, the role of maintaining operational efficiency as part of an Islamic bank's fiduciary responsibilities is immensely high. People risk, however, is enormous and plays a dominant role in driving the level of operational risk exposure in Islamic banks' investment activities.
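
    As a hedged illustration of the VaR machinery described above (EWMA volatility plus a Cornish-Fisher quantile adjustment), the sketch below applies both to a simulated fat-tailed series. It is not the thesis's DGSA-EVT model, and all numbers are illustrative.

```python
# Minimal sketch: VaR via an EWMA volatility estimate and a Cornish-Fisher
# quantile adjustment for skewness and excess kurtosis. Simulated data only.
import numpy as np
from scipy.stats import norm, skew, kurtosis

rng = np.random.default_rng(2)
r = 0.01 * rng.standard_t(df=5, size=500)   # hypothetical fat-tailed loss-driver series

# RiskMetrics-style EWMA variance, lambda = 0.94.
lam, var = 0.94, r.var()
for x in r:
    var = lam * var + (1.0 - lam) * x**2
sigma = np.sqrt(var)

# Cornish-Fisher adjusted 99% quantile.
z = norm.ppf(0.99)
S, K = skew(r), kurtosis(r)                 # kurtosis() returns *excess* kurtosis
z_cf = (z + (z**2 - 1.0) * S / 6.0
          + (z**3 - 3.0 * z) * K / 24.0
          - (2.0 * z**3 - 5.0 * z) * S**2 / 36.0)

var_99 = r.mean() + z_cf * sigma
print(f"99% Cornish-Fisher VaR: {var_99:.4f}")
```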

    Application of Machine Learning Algorithms to Actuarial Ratemaking within Property and Casualty Insurance

    A scientific pricing assessment is essential for maintaining viable customer relationship management (CRM) solutions for various stakeholders, including consumers, insurance intermediaries, and insurers. This thesis examines research problems surrounding the ratemaking process, including relaxing the conventional loss model assumptions of homogeneity and independence. It identifies three major research scopes within multiperil insurance settings: heterogeneity in consumer behaviour towards pricing decisions, loss trending under non-linearity and temporal dependence, and loss modelling in the presence of inflationary pressure. Heterogeneity in consumer response to pricing decisions is examined using a demand- and loyalty-based strategy. A hybrid decision tree classification framework is implemented that includes a semi-supervised learning model, a variable selection technique, and a partitioning approach with different treatment effects in order to achieve adequate risk profiling. The thesis also explores a supervised tree learning mechanism under highly imbalanced, overlapping classes and a non-linear response-predictor relationship. The two-phase classification framework is applied to an owner-occupied property portfolio from a personal insurance brokerage powered by a digital platform within the Canadian market. A hybrid three-phase tree algorithm, which includes conditional inference trees, a random forest wrapped by the Boruta algorithm (see the sketch below), and model-based recursive partitioning under a multinomial generalized linear model, is proposed to study the price sensitivity ranking of digital consumers. The empirical results suggest a well-defined segmentation of digital consumers with differential price sensitivity. Further, with highly imbalanced and overlapping classes, a resampling technique is modelled together with the decision tree algorithm, providing a more principled approach to the classification problem than traditional multinomial regression. The resulting segmentation identifies a high-sensitivity consumer group, for which premium rate reductions are recommended to reduce the churn rate, and an insensitive group, for which a strategy of increasing the premium rate is expected to have only a slight impact on the closing ratio and retention rate.
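
    As a minimal sketch of the feature-screening step named above (a random forest wrapped by the Boruta algorithm, here via the third-party BorutaPy package), the code below confirms informative rating factors on simulated data. The data, names, and settings are hypothetical, not the thesis's portfolio.

```python
# Minimal sketch: Boruta-wrapped random forest feature screening
# (BorutaPy package, `pip install Boruta`). Simulated data only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from boruta import BorutaPy

rng = np.random.default_rng(3)
n = 1_000
X = rng.normal(size=(n, 6))                       # hypothetical rating factors
logit = 1.5 * X[:, 0] - 1.0 * X[:, 2]             # only features 0 and 2 matter
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

rf = RandomForestClassifier(max_depth=5, n_jobs=-1, random_state=3)
selector = BorutaPy(rf, n_estimators="auto", random_state=3)
selector.fit(X, y)                                # BorutaPy expects numpy arrays

print("confirmed features:", np.where(selector.support_)[0])
```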
    Incurred insurance losses exhibit abnormal characteristics such as temporal dependence, non-linear relationships between dependent and independent variables, seasonal variation, and mixture distributions resulting from an implicit claim inflation component. With such characteristics, the severity and frequency components may exhibit an altered trending pattern that changes over time and never repeats. This can have a profound impact on the experience rating model, where the estimates of the pure premium and the rate relativities of tariff classes are likely to be under- or over-estimated. A discussion of the pros and cons of the conventional loss trending approach leads to an alternative framework for the loss cost structure. The conventional pure premium is further split into base severity and severity deflator random variables using a do(·) operator from causal inference. The components are modelled separately on different time-basis predictors using a semiparametric generalized additive model (GAM) with spline curves. To capture the calendar year effect of claim inflation and improve the efficiency of severity trending, the thesis refines the claim inflation estimate by adapting Taylor's [86] separation method, which estimates the inflation index from a loss development triangle. In the second phase of developing the severity trend model, both the base severity and the severity deflator are integrated under a new generalized mechanism known as Discount, Model, and Trend (DMT). This two-phase modelling is built to overcome the mixture distribution effect on the final trend estimates. A simulation study constructed from the claims paid development triangle of a Canadian Insurtech broker's houseowners/householders portfolio is used to analyse severity trend movement predictions; it shows that the conventional framework understates severity trends more than the separation cum DMT framework does. GAM provides a flexible and effective mechanism for modelling non-linear time series in studies of the frequency loss trend. However, GAM assumes that residuals are independent and identically distributed (iid), while frequency loss time series can be correlated at adjacent time points. The thesis therefore introduces a new model, the Generalized Additive Model with Seasonal Autoregressive term (GAMSAR), that accounts for temporal dependence and seasonal variation in order to improve prediction confidence intervals. Parameters of the GAMSAR model are estimated by maximum partial likelihood using a modified Newton's method developed by Yang et al. [97], and the goodness of fit of GAM and GAMSAR is compared in a simulation study. Simulation results show that the mean estimates from GAM differ greatly from their true values, while the proposed GAMSAR model proves superior, especially in the presence of seasonal variation. Further, a comparison with the Generalized Additive Model with Autoregressive term (GAMAR) developed by Yang et al. [97] shows, via the coverage rate of the 95% confidence interval, that the GAMSAR model can incorporate non-linear trend effects as well as capture the serial correlation between observations. In the empirical analysis, a claims dataset of personal property insurance obtained from digital brokers in Canada is used to show that the GAMSAR(1)12 model captures the periodic dependence structure of the data more precisely than standard regression models. The proposed frequency and severity trend models support the thesis's goal of establishing a scientific approach to pricing that is robust under different trending processes.
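
    As a sketch of the GAM backbone only (the seasonal autoregressive residual term that distinguishes GAMSAR is omitted), the following fits a Poisson GAM with a smooth trend and a cyclic monthly seasonality using the pygam package on simulated counts; all names and settings are illustrative.

```python
# Minimal sketch: Poisson GAM with a smooth long-term trend and a cyclic
# monthly seasonality term (pygam). Not the thesis's GAMSAR model -- the
# seasonal autoregressive residual term is omitted. Simulated data only.
import numpy as np
from pygam import PoissonGAM, s

rng = np.random.default_rng(4)
t = np.arange(120)                                 # 10 years of monthly data
month = t % 12
mu = np.exp(0.8 + 0.004 * t + 0.3 * np.sin(2 * np.pi * month / 12))
y = rng.poisson(mu)                                # monthly claim counts

X = np.column_stack([t, month])
gam = PoissonGAM(s(0, n_splines=10)                # smooth trend in t
                 + s(1, basis="cp")).fit(X, y)     # cyclic spline in month
gam.summary()
```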

    Bayesian stochastic mortality modelling under serially correlated local effects

    The vast majority of stochastic mortality models in the academic literature are intended to explain the dynamics underpinning the process by a combination of age, period, and cohort effects. In principle, the more such effects are included in a stochastic mortality model, the better the in-sample fit to the data. Estimates of those parameters are usually obtained under some distributional assumption about the occurrence of deaths, which leads to the optimisation of a relevant objective function. The present thesis develops an alternative framework in which the local mortality effect is appreciated, by employing a parsimonious multivariate process that models the latent residual effects of a simple stochastic mortality model as dependent rather than conditionally independent variables. Under the suggested extension, the cells of the examined dataset are supplied with a serial dependence structure by relating the residual terms through a simple vector autoregressive model. The method is applicable to any of the popular mortality modelling structures in academia and industry, and is accommodated herein for the Lee-Carter and Cairns-Blake-Dowd models. The additional residuals model is used to compensate for factors of a mortality model that might mostly be affected by local effects within given populations. Using those two modelling bases, the importance of the number of factors in a stochastic mortality model is emphasised through the properties of the prescribed residuals model. The resultant hierarchical models are set under the Bayesian paradigm, and samples from the joint posterior distribution of the latent states and parameters are obtained by developing Markov chain Monte Carlo algorithms. Along with the imposed short-term dynamics, we also examine the impact of the joint estimation on the long-term factors of the original models. The Bayesian solution aids in recognising the different levels of uncertainty for the two naturally distinct types of dynamics across different populations. The forecasted rates, mortality improvements, and other relevant mortality-dependent metrics under the developed models are compared to those produced by their benchmarks and by other standard stochastic mortality models in the literature.
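
    For orientation, the sketch below fits the Lee-Carter base structure, log m(x,t) = a_x + b_x k_t, by SVD on a simulated mortality surface. The thesis's contribution (VAR-dependent residuals estimated by Bayesian MCMC) sits on top of such a base and is not reproduced here.

```python
# Minimal sketch: Lee-Carter fit by SVD on a simulated log-mortality surface.
# log m(x,t) = a_x + b_x * k_t, with constraints sum(b)=1 and sum(k)=0.
import numpy as np

rng = np.random.default_rng(5)
ages, years = 40, 30
a_true = np.linspace(-8.0, -2.0, ages)             # log baseline by age
b_true = np.full(ages, 1.0 / ages)
k_true = np.cumsum(rng.normal(-0.5, 0.3, years))   # random-walk period index
log_m = (a_true[:, None] + np.outer(b_true, k_true)
         + rng.normal(0.0, 0.02, (ages, years)))   # simulated observations

a_hat = log_m.mean(axis=1)
Z = log_m - a_hat[:, None]
U, sv, Vt = np.linalg.svd(Z, full_matrices=False)
b_hat = U[:, 0] / U[:, 0].sum()                    # identifiability: sum(b) = 1
k_hat = sv[0] * Vt[0, :] * U[:, 0].sum()
a_hat = a_hat + b_hat * k_hat.mean()               # shift so that sum(k) = 0
k_hat = k_hat - k_hat.mean()

fitted = a_hat[:, None] + np.outer(b_hat, k_hat)
print("RMSE of fitted log-rates:", np.sqrt(np.mean((fitted - log_m) ** 2)))
```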

    Life settlement pricing with fuzzy parameters

    Existing literature asserts that the growth of life settlement (LS) markets, where they exist, is hampered by limited policyholder participation, and suggests that appropriate pricing of LS transactions is crucial to foster this growth. The pricing of LSs relies on quantifying two key variables: the insured's mortality multiplier and the internal rate of return (IRR). However, the available information on these parameters is often scarce and vague. To address this issue, this article proposes a novel framework that models these variables as triangular fuzzy numbers (TFNs). This modelling approach aligns with how mortality multiplier and IRR data are typically provided in insurance markets, and has the advantage of offering a natural interpretation for practitioners. When both the mortality multiplier and the IRR are represented as TFNs, the resulting LS price becomes a fuzzy number (FN) that no longer retains the triangular shape. Therefore, the paper introduces three alternative triangular approximations to simplify computations and enhance interpretation of the price. Additionally, six criteria are proposed to evaluate the effectiveness of each approximation method. These criteria go beyond the typical approach of assessing the quality of the approximation to the FN itself; they also consider usability and comprehensibility for financial analysts with no prior knowledge of FNs. In summary, the framework presented in this paper represents a significant advancement in LS pricing. By incorporating TFNs, offering several triangular approximations, and proposing goodness criteria for them, it addresses the challenges posed by limited and vague data while also considering the practical needs of industry practitioners.
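
    As a rough illustration of the mechanics (not the paper's pricing specification), the sketch below propagates TFN alpha-cuts for the mortality multiplier and the IRR through a toy LS cash-flow model. All figures are invented, and monotonicity of the price in both inputs is assumed.

```python
# Minimal sketch: alpha-cuts of a life settlement price when the mortality
# multiplier and IRR are triangular fuzzy numbers (TFNs). A TFN (a, b, c)
# has alpha-cut [a + (b - a)*alpha, c - (c - b)*alpha]. The cash-flow model
# and all numbers are illustrative, not the paper's specification.
import numpy as np

def tfn_cut(a, b, c, alpha):
    return a + (b - a) * alpha, c - (c - b) * alpha

def ls_price(mult, irr, q=0.06, db=1_000_000.0, prem=20_000.0, horizon=25):
    """PV of death benefit minus premiums under a multiplied flat mortality q,
    with end-of-year cash-flow approximation."""
    qx = min(mult * q, 1.0)
    alive, pv = 1.0, 0.0
    for t in range(1, horizon + 1):
        v = (1.0 + irr) ** -t
        pv += alive * qx * db * v        # benefit paid on death in year t
        pv -= alive * prem * v           # premium paid for year t
        alive *= (1.0 - qx)
    return pv

mult_tfn = (1.8, 2.0, 2.3)               # fuzzy mortality multiplier
irr_tfn = (0.10, 0.12, 0.15)             # fuzzy internal rate of return

for alpha in (0.0, 0.5, 1.0):
    (m_lo, m_hi), (i_lo, i_hi) = tfn_cut(*mult_tfn, alpha), tfn_cut(*irr_tfn, alpha)
    # Price is increasing in the multiplier and decreasing in the IRR here,
    # so the cut endpoints pair as (m_lo, i_hi) and (m_hi, i_lo).
    lo, hi = ls_price(m_lo, i_hi), ls_price(m_hi, i_lo)
    print(f"alpha={alpha:.1f}: price in [{lo:,.0f}, {hi:,.0f}]")
```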