
    Correntropy-Based Evolving Fuzzy Neural System

    In this paper, a correntropy-based evolving fuzzy neural system (correntropy-EFNS) is proposed for the approximation of nonlinear systems. Unlike the commonly used mean-square error criterion, correntropy has a strong outlier-rejection ability because it captures the higher-order moments of the error distribution. Exploiting these merits, this paper builds an EFNS on the correntropy concept to achieve a more stable evolution of the rule base and update of the rule parameters than the mean-square error criterion allows. The correntropy-EFNS (CEFNS) begins with an empty rule base, and all rules are evolved online based on the correntropy criterion. The consequent-part parameters are tuned under the maximum correntropy criterion, where correntropy serves as the cost function, so as to improve the rejection of non-Gaussian noise. The steady-state convergence of the CEFNS is studied through the calculation of the steady-state excess mean-square error (EMSE) in two cases: i) Gaussian noise; and ii) non-Gaussian noise. Finally, the CEFNS is validated on a benchmark system identification problem, the Mackey-Glass time series prediction problem, and five other real-world benchmark regression problems under both noise-free and noisy conditions. Compared with other evolving fuzzy neural systems, the simulation results show that the proposed CEFNS achieves better approximation accuracy with the fewest rules and the least training time, and also offers superior handling of non-Gaussian noise.
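    The abstract's central contrast can be illustrated with a minimal sketch (the kernel width `sigma` and the toy error data are illustrative choices, not values from the paper): the empirical correntropy of an error vector uses a Gaussian kernel, so a handful of gross outliers barely move it, while the mean-square error is dominated by them.

```python
import numpy as np

def correntropy(errors, sigma=1.0):
    """Empirical correntropy of an error vector with a Gaussian kernel."""
    return np.mean(np.exp(-errors**2 / (2 * sigma**2)))

rng = np.random.default_rng(0)
e_clean = rng.normal(0, 0.1, 1000)     # well-behaved residuals
e_outlier = e_clean.copy()
e_outlier[:10] += 50.0                 # inject ten gross outliers

# MSE blows up under the outliers; correntropy barely moves.
mse_ratio = np.mean(e_outlier**2) / np.mean(e_clean**2)
c_clean = correntropy(e_clean)
c_outlier = correntropy(e_outlier)
```

    Because each error contributes at most 1 to the correntropy sum, maximising correntropy as a cost function caps the influence any single outlier can exert, which is the robustness property the paper exploits.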

    Particle Swarm Optimized Autonomous Learning Fuzzy System

    The antecedent and consequent parts of a first-order evolving intelligent system (EIS) determine the validity of the learning results and overall system performance. Nonetheless, the state-of-the-art techniques mostly stress novelty from the system-identification point of view but pay less attention to the optimality of the learned parameters. Using the recently introduced autonomous learning multiple model (ALMMo) system as the implementation basis, this paper introduces a particle-swarm-based approach for EIS optimization. The proposed approach is able to simultaneously optimize the antecedent and consequent parameters of ALMMo and effectively enhance system performance by iteratively searching for optimal solutions in the problem spaces. In addition, the proposed optimization approach does not adversely influence the "one pass" learning ability of ALMMo. Once the optimization process is complete, ALMMo can continue to learn from new data to incorporate unseen data patterns recursively without a full retraining. Experimental studies with a number of real-world benchmark problems validate the proposed concept and general principles. It is also verified that the proposed optimization approach can be applied to other types of EISs with similar operating mechanisms.
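    The particle-swarm step can be sketched generically; the global-best topology, the coefficient values, and the toy quadratic objective below are standard textbook choices and stand-ins, not the ALMMo-specific formulation.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, seed=0):
    """Minimal particle swarm optimiser (global-best topology)."""
    rng = np.random.default_rng(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social
    x = rng.uniform(-5, 5, (n_particles, dim))  # particle positions
    v = np.zeros_like(x)                        # particle velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
        x = x + v
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy stand-in for tuning consequent parameters: fit a 3-parameter rule
# by minimising squared distance to a known target vector.
target = np.array([1.0, -2.0, 0.5])
best, best_f = pso(lambda p: np.sum((p - target)**2), dim=3)
```

    In the paper's setting the objective would instead score ALMMo's prediction error under a candidate parameter set, with the swarm searching antecedent and consequent parameters jointly.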

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    The present paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units, all located in Portugal, is established using Stochastic Frontier Analysis. This methodology discriminates between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions for efficiency improvement are offered for each hotel studied.
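    The decomposition that makes this possible can be sketched with the standard stochastic-frontier specification (a generic textbook form, not necessarily the exact model estimated in the paper):

```latex
\ln y_i = \beta^{\top} x_i + v_i - u_i, \qquad
v_i \sim \mathcal{N}(0,\sigma_v^2), \qquad
u_i \sim \mathcal{N}^{+}(0,\sigma_u^2)
```

    Here $y_i$ is the output of hotel $i$, $x_i$ its inputs, $v_i$ a symmetric noise term capturing measurement error, and $u_i \ge 0$ a one-sided term capturing systematic inefficiency; technical efficiency is then recovered as $TE_i = \exp(-u_i)$, which is what underlies the ranking of the four units.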

    Improving risk-adjusted performance in high-frequency trading: The role of fuzzy logic systems

    In recent years, algorithmic and high-frequency trading have been the subject of increasing risk concerns. A general theme that we adopt in this thesis is that trading practitioners are predominantly interested in risk-adjusted performance. Likewise, regulators are demanding stricter risk controls. First, we scrutinise conventional AI model-design approaches with the aim of increasing the risk-adjusted trading performance of the proposed fuzzy logic models. We show that applying risk-return objective functions and accounting for transaction costs improve out-of-sample results. Our experiments identify that neuro-fuzzy models exhibit superior performance stability across multiple risk regimes compared to the popular neural network models identified in the AI literature. Moreover, we propose an innovative ensemble model approach which combines multiple risk-adjusted objective functions and dynamically adapts risk tolerance according to time-varying risk. Next, we extend our findings to the money-management aspects of trading algorithms. We introduce an effective fuzzy logic approach which dynamically discriminates across different regions in the trend and volatility space. The model prioritises higher-performing regions at an intraday level and adapts capital-allocation policies with the objective of maximising global risk-adjusted performance. Finally, we explore trading improvements that can be attained by advancing our type-1 fuzzy logic ideas to higher-order fuzzy systems, in view of the increased noise (uncertainty) inherent in high-frequency data. We propose an innovative approach to designing type-2 models with minimal increase in design and computational complexity. As a further step, we identify a relationship between the increased trading-performance benefits of the proposed type-2 model and higher trading frequencies.
In conclusion, this thesis sets a framework for practitioners, researchers and regulators in the design of fuzzy logic systems for better management of risk in the field of algorithmic and high-frequency trading.

    Volatility forecast for the brazilian stock market (Bovespa index)

    Volatility is a measure of risk that, based on the behaviour of an asset over a given period, indicates the speed at which it swings between falls and rises. Highly volatile assets show rapid oscillations that can be very sharp. Studies of volatility as investment guidance and as a risk-classification instrument are widely used in the capital market. The present work analyzes and forecasts the volatility of the BOVESPA Index (IBOV) using ARCH/GARCH-family models, and examines whether there is a long-term relationship, a short-term relationship, or both between the IBOV and auxiliary macroeconomic indicators using VAR/VEC/ARDL models. The series analyzed consist of non-sequential daily data from January 2, 2019, to April 30, 2020 (322 observations), with a forecast period from May 5, 2020, to July 10, 2020 (42 observations). The following variables were used as auxiliary indicators: the price of a barrel of Brent oil in US dollars (Brent); the spread between the return on Brazilian bonds and the rate offered by US Treasury bonds (EMBI); and the exchange rate between the US dollar and the Brazilian real (BRL_USD). The results favour an IGARCH(1,1) model with GED errors and indicate a short-term relationship when the IBOV is the dependent variable, although the ARDL model was considered unsatisfactory.
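    The selected IGARCH(1,1) recursion is simple enough to sketch directly; the simulated returns and parameter values below are illustrative stand-ins, not the estimates reported for the IBOV series.

```python
import numpy as np

def igarch_variance(returns, omega, alpha, sigma2_0=None):
    """Conditional variance recursion for IGARCH(1,1):
        sigma2_t = omega + alpha*r_{t-1}^2 + (1 - alpha)*sigma2_{t-1},
    i.e. GARCH(1,1) under the unit-persistence constraint alpha + beta = 1,
    so shocks to variance never fully die out."""
    beta = 1.0 - alpha
    sigma2 = np.empty(len(returns))
    # Initialise with the sample variance unless a value is supplied.
    sigma2[0] = sigma2_0 if sigma2_0 is not None else np.var(returns)
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1]**2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(1)
r = rng.normal(0, 0.02, 300)          # stand-in for daily IBOV returns
s2 = igarch_variance(r, omega=1e-6, alpha=0.1)
```

    In a full estimation the parameters would be fitted by maximum likelihood under a GED error distribution, as the abstract's model selection indicates; the recursion itself is unchanged.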

    Risk Management using Model Predictive Control

    Forward planning and risk management are crucial for the success of any system or business dealing with the uncertainties of the real world. Previous approaches have largely assumed that the future will be similar to the past, or used simple forecasting techniques based on ad-hoc models. Improving solutions requires better projection of future events, and necessitates robust forward-planning techniques that consider forecasting inaccuracies. This work advocates risk management through optimal control theory, and proposes several techniques to combine it with time-series forecasting. Focusing on applications in foreign exchange (FX) and battery energy storage systems (BESS), the contributions of this thesis are three-fold. First, a short-term risk management system for FX dealers is formulated as a stochastic model predictive control (SMPC) problem in which the optimal risk-cost profiles are obtained through dynamic control of the dealers' positions on the spot market. Second, grammatical evolution (GE) is used to automate non-linear time-series model selection, validation, and forecasting. Third, a novel measure for evaluating forecasting models, as a part of the predictive model in finite-horizon optimal control applications, is proposed. Using both synthetic and historical data, the proposed techniques were validated and benchmarked. It was shown that the stochastic FX risk management system exhibits better risk management on a risk-cost Pareto frontier compared to rule-based hedging strategies, with up to 44.7% lower cost for the same level of risk. Similarly, for a real-world BESS application, it was demonstrated that the GE-optimised forecasting models outperformed other prediction models by at least 9%, improving the overall peak-shaving capacity of the system to 57.6%.
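    The receding-horizon idea behind the FX formulation can be sketched in a few lines. This is a deterministic, certainty-equivalence toy with a made-up quadratic trade cost, the function name `mpc_hedge` and all parameter values are illustrative, and it is not the thesis's SMPC model: at each step the dealer plans a short sequence of trades against a client-flow forecast, applies only the first trade, then re-plans.

```python
import numpy as np
from scipy.optimize import minimize

def mpc_hedge(p0, flow_forecast, horizon, risk_aversion, cost):
    """One receding-horizon step: choose trades x_1..x_H balancing
    inventory risk (p_t^2) against a quadratic transaction cost."""
    def objective(x):
        # Position evolves with forecast client flow plus our own trades.
        p = p0 + np.cumsum(flow_forecast[:len(x)] + x)
        return risk_aversion * np.sum(p**2) + cost * np.sum(x**2)
    res = minimize(objective, np.zeros(horizon))
    return res.x[0]   # apply only the first planned trade, then re-plan

# Toy run: dealer starts long 10 units and expects no client flow,
# so the planner should sell off most of the position immediately.
trade = mpc_hedge(p0=10.0, flow_forecast=np.zeros(5), horizon=5,
                  risk_aversion=1.0, cost=0.1)
```

    The stochastic version replaces the point forecast with scenarios of client flow and exchange-rate moves, which is where the risk-cost Pareto frontier described in the abstract comes from.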


    A fresh engineering approach for the forecast of financial index volatility and hedging strategies

    This thesis attempts to shed new light on a problem of importance in financial engineering. Volatility is a commonly accepted measure of risk in the investment field. Daily volatility is the determining factor in evaluating option prices and in conducting different hedging strategies. Volatility estimation and forecasting are still far from complete enough for industry acceptance, judging by their generally lower-than-50% forecasting accuracy. By judiciously coordinating current engineering theory and analytical techniques, such as the wavelet transform and evolutionary algorithms within a Time Series Data Mining framework, together with Markov-chain-based discrete stochastic optimization methods, this work formulates a systematic strategy to characterize and forecast crucial financial time series. Typical forecast features have been extracted from different index volatility data sets, which exhibit abrupt drops, jumps and other embedded nonlinear characteristics, so that forecasting accuracy can be markedly improved in comparison with the methods currently prevalent in industry. The key aspect of the presented approach is "transformation and sequential deployment": i) transform the data from non-observable to observable, i.e., from variance into integrated volatility; ii) conduct the wavelet transform to determine the optimal forecasting horizon; iii) transform the wavelet coefficients into 4-lag recursive data sets, or, viewed differently, into a Markov chain; iv) apply genetic algorithms to extract a group of rules that characterize different patterns embedded or hidden in the data and forecast the directions/ranges of the one-step-ahead events; and v) apply genetic programming to forecast the values of the one-step-ahead events. By following such a step-by-step approach, complicated time series forecasting problems become less complex and readily resolvable for industry application.
To implement such an approach, one-year, two-year and five-year S&P 100 historical data are used as training sets to derive a group of 100 rules that best describe their respective signal characteristics. These rules are then used to forecast the subsequent out-of-sample time series data. This set of tests produces an average of over 75% correct forecasts, surpassing other publicly available forecast results on financial indices. Genetic programming was then applied to the out-of-sample data set to forecast the actual value of the one-step-ahead event. The forecasting accuracy reaches an average of 70%, a marked improvement over other current forecasts. To validate the proposed approach, S&P 500 as well as S&P 100 index data are tested with the discrete stochastic optimization method, which is based on Markov chain theory and involves genetic algorithms. Results are further validated by a bootstrapping operation. All these trials showed good reliability of the proposed methodology. Finally, the established methodology has been shown to have broad applications in option pricing, hedging, risk management, VaR determination, etc.
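    The 4-lag embedding in step iii) is mechanical enough to sketch; the helper name `four_lag_dataset` and the toy series are illustrative, and in the thesis the input would be wavelet coefficients rather than raw prices.

```python
import numpy as np

def four_lag_dataset(series):
    """Embed a 1-D series into 4-lag feature vectors with one-step-ahead
    direction labels (+1 up, -1 down), as in the rule-extraction stage."""
    x = np.asarray(series, dtype=float)
    # Row t holds (x_t, x_{t+1}, x_{t+2}, x_{t+3}); the label is the sign
    # of the next move, x_{t+4} - x_{t+3}.
    feats = np.column_stack([x[i:len(x) - 4 + i] for i in range(4)])
    labels = np.sign(x[4:] - x[3:-1])
    return feats, labels

series = [1.0, 1.2, 1.1, 1.3, 1.25, 1.4, 1.35]
X, y = four_lag_dataset(series)
```

    Each row of `X` is one state of the implied Markov chain; a genetic algorithm can then search for rules over these 4-lag states that predict the direction label, which is the structure steps iv) and v) operate on.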