28 research outputs found

    Forecasting Financial Problems Using Metaheuristic Models

    Investors need to assess and analyze financial statements in order to make sound decisions, and the use of financial ratios is one of the most common methods for doing so. The main purpose of this research is to predict financial crisis using liquidity ratios. Four models were compared: support vector machine (SVM), back-propagation neural network, decision tree, and Adaptive Neuro-Fuzzy Inference System (ANFIS). The liquidity ratios cover the period 2011–2015. The research method is both qualitative and quantitative, of the causal-comparative type. The results show that the neural network, decision tree, and ANFIS were significant at the 0.000 and 0.005 levels, exceeding the SVM result, which was significant at the 0.001 level. The neural network was able to make correct predictions two years before bankruptcy. All four models were statistically significant, and no significant differences were found among them: all four have the precision to predict the financial crisis.
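    For illustration, a minimal sketch of the comparison described above, under stated assumptions: three of the four classifiers (SVM, back-propagation neural network, decision tree) have direct scikit-learn counterparts, while ANFIS would require a dedicated library and is omitted here; the data file, column layout, and hyperparameters are hypothetical, not values from the study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Hypothetical file: liquidity ratios per firm-year, last column = 1 if the
# firm later entered financial distress, 0 otherwise.
data = np.loadtxt("liquidity_ratios.csv", delimiter=",")
X, y = data[:, :-1], data[:, -1].astype(int)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "BP neural network": make_pipeline(
        StandardScaler(), MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000)
    ),
    "decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
}
# 5-fold cross-validated accuracy, the comparison criterion used informally above.
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {acc.mean():.3f} +/- {acc.std():.3f}")
```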

    Prediction of Spot Price of Iron Ore Based on PSR-WA-LSSVM Combined Model

    Aiming at the problems that existing single time series models are not accurate and robust enough for forecasting iron ore prices and that the parameters of the traditional LSSVM model are difficult to determine, we propose a combined model based on Phase Space Reconstruction (PSR), wavelet transform, and LSSVM (PSR-WA-LSSVM) to tackle these issues. The ARIMA, LSTM, PSR-LSSVM, and PSR-WA-LSSVM models were compared in simulations forecasting the spot price of 61.5% PB powder at Ningbo Zhoushan port from January 30, 2019, to February 1, 2021. The experimental results show that the PSR-WA-LSSVM combined model achieves better prediction results and also performs well in multistep prediction of the iron ore price.
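    As a rough illustration of the pipeline named in the title, the sketch below denoises the price series with a wavelet transform, embeds it via phase space reconstruction, and fits a kernel ridge regressor, which is mathematically close to LSSVM regression and stands in for it here since scikit-learn ships no LSSVM. The data file, embedding parameters, and hyperparameters are assumptions, not values from the paper.

```python
import numpy as np
import pywt
from sklearn.kernel_ridge import KernelRidge

def wavelet_denoise(x, wavelet="db4", level=2):
    """Soft-threshold the detail coefficients (the 'WA' step)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def phase_space_embed(x, m=5, tau=1):
    """Takens embedding: rows are delay vectors, target is the next value."""
    n = len(x) - (m - 1) * tau - 1
    X = np.column_stack([x[k * tau : k * tau + n] for k in range(m)])
    y = x[(m - 1) * tau + 1 : (m - 1) * tau + 1 + n]
    return X, y

prices = np.loadtxt("pb_powder_spot.csv")               # hypothetical data file
X, y = phase_space_embed(wavelet_denoise(prices))
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)  # LSSVM stand-in
model.fit(X[:-100], y[:-100])                           # hold out the last 100 days
rmse = np.sqrt(np.mean((model.predict(X[-100:]) - y[-100:]) ** 2))
print("held-out RMSE:", rmse)
```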

    Sustainable Reservoir Management Approaches under Impacts of Climate Change - A Case Study of Mangla Reservoir, Pakistan

    Reservoir sedimentation is a major issue for water resource management around the world. It has serious economic, environmental, and social consequences, such as reduced water storage capacity, increased flooding risk, decreased hydropower generation, and deteriorated water quality. Increased rainfall intensity, higher temperatures, and more extreme weather events due to climate change are expected to exacerbate the problem. Sedimentation must therefore be managed to ensure the long-term viability of reservoirs and their associated infrastructure. Effective management in the face of climate change requires an understanding of the sedimentation process and the factors that influence it, such as land use practices, erosion, and climate; monitoring and modelling sedimentation rates are also useful tools for forecasting future impacts and informing management decisions. The goal of this research is to develop long-term reservoir management strategies under climate change by simulating the effects of various reservoir-operating strategies on sedimentation and sediment delta movement at Mangla Reservoir in Pakistan (the second-largest dam in the country). To assess sedimentation and its impact on the life of the Mangla Reservoir, a framework was developed that couples hydrological and morphodynamic models with various soft computing models. In addition to accounting for climate change uncertainty, the framework incorporates sediment sources, sediment delivery, and changes in reservoir morphology, and it is designed as a practical methodology based on the limited data available. The first phase of this study investigated how to accurately quantify missing suspended sediment load (SSL) data in rivers using techniques such as sediment rating curves (SRC) and soft computing models (SCMs), including local linear regression (LLR), artificial neural networks (ANN), and wavelet-cum-ANN (WANN). The Gamma test and M-test were performed to select the best input variables and an appropriate data length for SCM development. An evaluation of all leading models for SSL estimation showed that SCMs are more effective than SRC approaches, and that the WANN model was the most accurate for reconstructing the SSL time series because it is capable of identifying the salient characteristics in a data series. The second phase examined the feasibility of using four satellite precipitation datasets (SPDs), GPM, PERSIANN-CDR, CHIRPS, and CMORPH, to predict streamflow and sediment loads (SL) in a poorly gauged mountainous catchment, employing the SWAT hydrological model as well as SWAT coupled with soft computing models: artificial neural networks (SWAT-ANN), random forests (SWAT-RF), and support vector regression (SWAT-SVR). The SCMs were developed from the outputs of un-calibrated SWAT models to improve the predictions. The results indicate that over the entire simulation, GPM performs best in both schemes, PERSIANN-CDR and CHIRPS also perform well, and CMORPH predicts streamflow for the Upper Jhelum River Basin (UJRB) relatively poorly.
Among the best GPM-based models, SWAT-RF offered the best performance for simulating the entire streamflow, while SWAT-ANN excelled at simulating the SL. Hence, hydrological models coupled with SCMs and driven by SPDs can be an effective technique for simulating streamflow and SL, particularly in complex terrain where the gauge network is sparse or uneven. The third and final phase investigated the impact of different reservoir operating strategies on Mangla Reservoir sedimentation using a 1D sediment transport model. To improve accuracy, more precise boundary conditions for flow and sediment load, derived from the first and second phases of this study, were incorporated into the numerical model so that the successive morphodynamic model could precisely predict bed level changes under given climate conditions. To assess the long-term effect of a changing climate, a Global Climate Model (GCM) under Representative Concentration Pathway (RCP) scenarios 4.5 and 8.5 for the 21st century was used. The long-term modelling results showed that a gradual increase in the reservoir minimum operating level (MOL) slows the delta movement rate and the rise of the bed level close to the dam, although it may compromise downstream irrigation supply during periods of high water demand. These findings may help reservoir managers improve operation rules and ultimately support the objective of sustainable reservoir use for societal benefit. In summary, this study provides comprehensive insights into reservoir sedimentation and recommends an operational strategy that is both feasible and sustainable over the long term under climate change, especially where data are scarce. Improving the accuracy of sediment load estimates is essential for designing reservoir structures and operating plans that respond to incoming sediment loads and for making accurate reservoir lifespan predictions. Highly accurate streamflow forecasts, particularly when on-site data are limited, can be achieved by using satellite-based precipitation data in conjunction with hydrological and soft computing models. Ultimately, soft computing methods produce significantly improved input data for sediment load and discharge, enabling one-dimensional hydro-morphodynamic models to evaluate sediment dynamics and reservoir useful life under the influence of climate change at various operating conditions.
Contents: Chapter 1: Introduction; Chapter 2: Reconstruction of Sediment Load Data in Rivers; Chapter 3: Assessment of the Hydrological and Coupled Soft Computing Models, Based on Different Satellite Precipitation Datasets, to Simulate Streamflow and Sediment Load in a Mountainous Catchment; Chapter 4: Simulating the Impact of Climate Change with Different Reservoir Operating Strategies on Sedimentation of the Mangla Reservoir, Northern Pakistan; Chapter 5: Conclusions and Recommendations.
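    A minimal sketch of the wavelet-cum-ANN (WANN) idea from the first phase, assuming hypothetical discharge and sediment records: wavelet sub-series of the discharge signal feed a small neural network that fills gaps in the suspended sediment load record. The wavelet choice, network size, and file names are illustrative, not the thesis configuration.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

def wavelet_features(x, wavelet="db4", level=3):
    """Stationary wavelet transform: one feature column per sub-band."""
    n = len(x) - len(x) % (2 ** level)  # swt needs a length divisible by 2**level
    coeffs = pywt.swt(x[:n], wavelet, level=level, trim_approx=True)
    return np.column_stack(coeffs), n   # shape (n, level + 1)

# Hypothetical records: daily discharge (m^3/s) and SSL (tons/day) with NaNs
# marking the missing observations to be reconstructed.
discharge = np.loadtxt("jhelum_discharge.csv")
ssl = np.loadtxt("jhelum_ssl_with_gaps.csv")

X, n = wavelet_features(discharge)
ssl = ssl[:n]
observed = ~np.isnan(ssl)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X[observed], ssl[observed])          # train on gauged days
ssl[~observed] = model.predict(X[~observed])   # reconstruct the gaps
```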

    Real-time inflation forecasting using non-linear dimension reduction techniques

    In this paper, we assess whether using non-linear dimension reduction techniques pays off for forecasting inflation in real time. Several recent methods from the machine learning literature are adopted to map a large-dimensional dataset into a lower-dimensional set of latent factors. We model the relationship between inflation and the latent factors using constant-parameter and time-varying parameter (TVP) regressions with shrinkage priors. Our models are then used to forecast monthly US inflation in real time. The results suggest that sophisticated dimension reduction methods yield inflation forecasts that are highly competitive with linear approaches based on principal components. Among the techniques considered, the Autoencoder and squared principal components yield factors with high predictive power for one-month- and one-quarter-ahead inflation. Zooming into model performance over time reveals that controlling for non-linear relations in the data is of particular importance during recessionary episodes of the business cycle and the current COVID-19 pandemic.
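    As a sketch of one technique the paper highlights, squared principal components, the example below augments linear PCA factors with their element-wise squares and forecasts inflation with a ridge regression, a simple stand-in for the Bayesian shrinkage priors used in the paper. The data files and horizon are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.preprocessing import StandardScaler

panel = np.loadtxt("macro_panel.csv", delimiter=",")  # T x N predictor panel
inflation = np.loadtxt("us_inflation.csv")            # length-T target series
h = 1                                                 # one-month-ahead horizon

Z = StandardScaler().fit_transform(panel)
F = PCA(n_components=5).fit_transform(Z)              # linear latent factors
X = np.hstack([F, F ** 2])                            # squared-PC augmentation

# Direct h-step forecast: regress inflation at t+h on factors observed at t.
model = Ridge(alpha=10.0).fit(X[:-h], inflation[h:])
print("forecast of next month's inflation:", model.predict(X[-1:])[0])
```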

    Toward Building an Intelligent and Secure Network: An Internet Traffic Forecasting Perspective

    Internet traffic forecasting is a crucial component of the proactive management of self-organizing networks (SON), ensuring better Quality of Service (QoS) and Quality of Experience (QoE). Given the volatile and random nature of traffic data, this forecasting influences strategic development and investment decisions in the Internet Service Provider (ISP) industry. Modern machine learning algorithms have shown potential for complex Internet traffic prediction tasks, yet challenges persist. This thesis systematically explores these issues across five empirical studies conducted over the past three years, focusing on four key research questions: How do outlier data samples impact prediction accuracy for both short-term and long-term forecasting? How can a denoising mechanism enhance prediction accuracy? How can robust machine learning models be built with limited data? How can out-of-distribution traffic data be used to improve the generalizability of prediction models? Based on extensive experiments, we propose a novel traffic forecasting framework and associated models that integrate outlier management and noise reduction strategies, outperforming traditional machine learning models. Additionally, we propose a transfer-learning-based framework combined with a data augmentation technique to provide robust solutions with smaller datasets, and a hybrid model with signal decomposition techniques to enhance generalization to out-of-distribution data samples. We also bring cyber threats into our forecasting research, acknowledging their substantial influence on traffic unpredictability and forecasting difficulty. The thesis presents a detailed exploration of cyber-attack detection, employing methods validated on multiple benchmark datasets. First, we combined ensemble feature selection with ensemble classification to improve DDoS (Distributed Denial-of-Service) attack detection accuracy with minimal false alarms. We then introduce a stacking ensemble framework for classifying diverse forms of cyber-attacks, followed by a weighted voting mechanism for Android malware detection to secure Mobile Cyber-Physical Systems, which integrate various mobile smart devices to exchange information between physical and cyber systems. Lastly, we employed Generative Adversarial Networks to generate flow-based DDoS attacks in Internet of Things environments. By considering the impact of cyber-attacks on traffic volume and the challenges they pose to traffic prediction, our research bridges the gap between traffic forecasting and cyber security, enhancing proactive network management and contributing to a resilient and secure Internet infrastructure.
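    A minimal sketch of a stacking ensemble of the kind described above for multi-class attack classification, assuming a hypothetical flow-feature matrix and label file; the base learners and hyperparameters are illustrative choices, not the thesis configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Hypothetical flow features X and attack labels y (e.g. benign, DDoS, probe),
# which in a real experiment would come from a benchmark intrusion dataset.
X = np.loadtxt("flow_features.csv", delimiter=",")
y = np.loadtxt("flow_labels.csv", dtype=int)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("dt", DecisionTreeClassifier(max_depth=12, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=7)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),  # meta-learner
    cv=5,  # out-of-fold base predictions train the meta-learner
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
print("held-out accuracy:", stack.fit(X_tr, y_tr).score(X_te, y_te))
```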

    Symmetry-Adapted Machine Learning for Information Security

    Symmetry-adapted machine learning has shown encouraging ability to mitigate security risks in information and communication technology (ICT) systems. It is a subset of artificial intelligence (AI) that relies on the principle of predicting future events by learning from past events or historical data. The autonomous nature of symmetry-adapted machine learning supports effective data processing and analysis for security detection in ICT systems without human intervention. Many industries are developing machine-learning-adapted solutions to support security for smart hardware, distributed computing, and the cloud. In this Special Issue book, we focus on the deployment of symmetry-adapted machine learning for information security in various application areas. This security approach supports effective methods for handling the dynamic nature of security attacks by extracting and analyzing data to identify hidden patterns. The main topics of this Issue include malware classification, intrusion detection systems, image watermarking, color image watermarking, a battlefield target aggregation behavior recognition model, IP cameras, Internet of Things (IoT) security, service function chains, indoor positioning systems, and cryptanalysis.

    Decision Support Systems for Risk Assessment in Credit Operations Against Collateral

    With the global economic crisis, which reached its peak in the second half of 2008, and facing a market shaken by economic instability, financial institutions took steps to protect themselves against default risk, which directly affected how credit analysis is performed for individuals and for corporate entities. To mitigate risk in credit operations, most banks use a graded scale of customer risk, which determines the provision a bank must make according to the default risk level of each credit transaction. Credit analysis involves the ability to make a credit decision in a scenario of uncertainty, constant change, and incomplete information; it depends on the capacity to logically analyze situations that are often complex and to reach a conclusion that is clear, practical, and feasible to implement. Credit scoring models are used to predict the probability that a customer applying for credit will default at some point, based on personal and financial information that may influence the customer's ability to pay the debt. This estimated probability, called the score, is an estimate of the risk of default of a customer in a given period. This increased concern has in no small part been caused by the weaknesses of existing risk management techniques revealed by the recent financial crisis and by the growing demand for consumer credit. Constant change affects several banking areas because it limits the ability to investigate the data that are produced and stored in systems still too often dependent on manual techniques. Among the many alternatives used around the world to balance this risk, the provision of collateral in the formalization of credit agreements stands out. In theory, collateral does not guarantee the return of the credit, as it is not computed as payment of the obligation, and it is only effective if triggered, which involves the legal department of the banking institution. In practice, however, collateral is a mitigating element of credit risk. Collateral is divided into two types: the personal guarantee (sponsor) and the asset guarantee (fiduciary). Both aim to increase security in credit operations as a payment alternative for the credit holder should the borrower be unable to meet its obligations on time; for the creditor, this provides liquidity from the recovery operation. The measurement of credit recoverability is a system that evaluates the efficiency of the mechanism for recovering the capital secured by collateral. In an attempt to identify the sufficiency of collateral in credit operations, this thesis presents an assessment of smart classifiers that use contextual information to assess whether collateral provides for the recovery of the credit granted, within the decision-making process, before the credit transaction becomes insolvent.
The results, compared with other approaches in the literature, together with a comparative analysis of the most relevant artificial intelligence solutions, show that classifiers that use collateral as a parameter in risk calculation contribute to advancing the state of the art, increasing the commitment to financial institutions.
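    A minimal sketch of a credit-scoring classifier that, as in the thesis, includes collateral information among its predictors; the feature set, file name, and the logistic regression choice are illustrative assumptions, not the thesis's classifiers.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed columns: income, debt_ratio, past_delinquencies,
# collateral_value / loan_value, collateral_type (0 = personal guarantee,
# 1 = fiduciary asset). Label column 5: 1 if the loan defaulted.
X = np.loadtxt("loans.csv", delimiter=",", usecols=range(5))
y = np.loadtxt("loans.csv", delimiter=",", usecols=5, dtype=int)

scorer = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scorer.fit(X, y)

# The predicted default probability is the "score"; applicants can then be
# binned into the graded risk scale that drives provisioning.
applicant = np.array([[45_000, 0.35, 0, 1.4, 1]])
print("estimated default probability:", scorer.predict_proba(applicant)[0, 1])
```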

    Advanced Operation and Maintenance in Solar Plants, Wind Farms and Microgrids

    This reprint presents advances in the operation and maintenance of solar plants, wind farms, and microgrids. This compendium of scientific articles is intended to clarify the current state of advances in the subject for the reader.

    Robust modeling for information extraction in biophysical and critical environments

    Unpublished doctoral thesis, Universidad Complutense de Madrid, Facultad de Informática, Departamento de Arquitectura de Computadores y Automática, defended on 12/07/2018. The era of information and Big Data is an environment in which multiple always-connected devices generate huge volumes of information (the Internet of Things paradigm). This paradigm is present in different areas: smart cities, sports tracking, lifestyle, and health. The goal of this thesis is the development and implementation of a robust predictive modeling methodology using low-cost wearable devices in biophysical and critical scenarios. In this manuscript we present a multilevel architecture that covers everything from on-node data processing up to data management in data centers. The methodology applies energy-aware optimization techniques at each level of the network, and the decision system makes use of data from different sources, leading to an expert decision system...