
    A review of regression models employed for predicting diffuse solar radiation in North-Western Africa

    Get PDF
    The knowledge of diffuse solar radiation (Hd) is of utmost importance for determining gross primary productivity, net ecosystem exchange of carbon dioxide, light use efficiency and the changing colour of the sky. However, routine measurements of Hd are not available at most locations in North-Western Africa. Over the past 36 years, several regression models have been developed for numerous locations in North-Western Africa to predict Hd on a horizontal surface on an hourly, daily and monthly mean basis. As a result, several input parameters have been utilized and different functional forms applied. The regression models utilized so far were classified into six main categories based on the input parameters applied. The models were further reclassified into numerous main groups and finally presented according to their year of development. In general, 188 regression models, 33 functional forms and 20 groups were reported in the literature for predicting Hd in North-Western Africa. The regression and soft computing models developed within North-Western Africa and across the globe were examined in order to determine the best prediction technique. The results revealed that soft computing models are more suitable for predicting Hd in North-Western Africa and across the globe. Citation: Ogbulezie, J., Ushie, O., and Nwokolo, S. (2017). A review of regression models employed for predicting diffuse solar radiation in North-Western Africa. Trends in Renewable Energy, 3(2), 160-206. DOI: 10.17737/tre.2017.3.2.004
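As a concrete illustration of the simplest functional form covered by such reviews, a first-order diffuse-fraction regression Hd/H = a + b*kt (with kt the clearness index) can be fitted in a few lines. The data points below are synthetic placeholders, not measurements from any North-Western African station:

```python
import numpy as np

# Synthetic (kt, Hd/H) pairs for illustration only.
kt = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.7])                       # clearness index
diffuse_fraction = np.array([0.95, 0.88, 0.75, 0.60, 0.45, 0.32])   # Hd/H

# Fit Hd/H = a + b*kt by ordinary least squares.
b, a = np.polyfit(kt, diffuse_fraction, 1)   # polyfit returns highest degree first

predicted = a + b * 0.55                     # predicted diffuse fraction at kt = 0.55
print(round(a, 3), round(b, 3), round(predicted, 3))
```

The negative slope reflects the physical expectation that clearer skies (higher kt) carry a smaller diffuse fraction; published models differ mainly in the inputs and the polynomial order of this relationship.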

    Big Data - Supply Chain Management Framework for Forecasting: Data Preprocessing and Machine Learning Techniques

    Full text link
    This article systematically identifies and comparatively analyzes state-of-the-art supply chain (SC) forecasting strategies and technologies. A novel framework is proposed that incorporates Big Data Analytics into SC management (problem identification, data sources, exploratory data analysis, machine-learning model training, hyperparameter tuning, performance evaluation, and optimization) and covers the effects of forecasting on the human workforce, inventory, and the overall SC. Initially, the need to collect data according to the SC strategy, and how to collect them, is discussed. The article then discusses the need for different types of forecasting according to the period or SC objective. SC KPIs and error-measurement systems are recommended for optimizing the top-performing model. The adverse effects of phantom inventory on forecasting are illustrated, as is the dependence of managerial decisions on SC KPIs for determining model performance parameters and improving operations management, transparency, and planning efficiency. The cyclic connection within the framework introduces preprocessing optimization based on the post-process KPIs, optimizing the overall control process (inventory management, workforce determination, cost, production and capacity planning). The contribution of this research lies in the proposed standard SC process framework, the recommended forecasting data analysis, the forecasting effects on SC performance, the machine-learning algorithm optimization followed, and in shedding light on future research.
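The preprocess → forecast → evaluate → optimize loop that such a framework formalizes can be sketched minimally as follows, with an invented weekly demand series and a naive seasonal baseline standing in for a trained ML model:

```python
import numpy as np

# Invented weekly demand series (units shipped), for illustration only.
demand = np.array([100, 120, 130, 110, 105, 125, 135, 115], float)

# Preprocessing: min-max scaling (one common choice; others may fit better).
scaled = (demand - demand.min()) / (demand.max() - demand.min())

# "Model": forecast each of the last 4 weeks with the value observed
# 4 weeks earlier (a naive seasonal baseline, not a tuned ML model).
forecast, actual = scaled[:4], scaled[4:]

# KPI / error measurement: mean absolute error on the scaled series,
# the quantity a hyperparameter-tuning loop would try to minimize.
mae = np.mean(np.abs(actual - forecast))
print(round(mae, 3))   # -> 0.143
```

In the full framework this error feeds back into preprocessing and model selection, closing the cyclic connection the article describes.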

    Forecasting modeling and analytics of economic processes

    Get PDF
    The book will be useful for economists, finance and valuation professionals, market researchers, public policy analysts, data analysts, and teachers or students in graduate-level classes. It is aimed at students and beginners who are interested in forecasting modelling and analytics of economic processes and want an overview of their implementation.

    Some further studies on improving QFD methodology and analysis

    Get PDF
    Quality Function Deployment (QFD) starts and ends with the customer. In other words, how it ends may depend largely on how it starts. Any QFD practitioner will start by collecting the voice of the customer, which reflects the customer's needs, so as to make sure that the product will eventually sell or the service will satisfy the customer. On the basis of those needs, a product or service creation process is initiated. It always takes a certain period of time for the product or service to be ready for the customer. The question here is whether those customer needs remain exactly the same during the product or service creation process. The answer is very likely 'no', especially in today's rapidly changing environment of increased competition and globalization. The focus of this thesis is placed on dealing with the change in the relative importance of the customer's needs during the product or service creation process. In other words, the assumption is that no new need is discovered over time and no old one becomes outdated; only the change in the relative importance of the existing needs is dealt with. Considering the latest developments in QFD research, especially the increasingly extensive use of the Analytic Hierarchy Process (AHP) in QFD, this thesis aims to enhance the current QFD methodology and analysis with respect to change during the product or service creation process, so as to continually meet or exceed the needs of the customer. The entire research work is divided into three main parts, namely, the further use of AHP in QFD, the incorporation of the dynamics of AHP-based priorities in QFD, and decision-making analysis with respect to those dynamics. The first part focuses on the question "In what ways does AHP, considering its strengths and weaknesses, contribute to an improved QFD analysis?" The usefulness of AHP in QFD is demonstrated through a case study on improving the higher education quality of an education institution.
Furthermore, a generalized model of using AHP in QFD is proposed. The generalized model not only provides an alternative way to construct the house of quality (HoQ), but also creates the possibility of including other relevant factors in QFD analysis, such as new product development risks. The second part addresses the question "How can AHP be used in QFD to deal with the dynamics of priorities?" A novel quantitative method to model the dynamics of AHP-based priorities in the HoQ is proposed. The method is simple and time-efficient, and it is especially useful when historical data are limited, which is the case in a highly dynamic environment. To further improve QFD analysis, the modeling method is applied in two areas. The first is to enhance the use of Kano's model in QFD by considering its dynamics; this not only extends the use of Kano's model in QFD, but also advances the academic literature on modeling the life cycle of quality attributes quantitatively. The second is to enhance the benchmarking part of QFD by including the dynamics of competitors' performance in addition to the dynamics of the customer's needs. The third part deals with the question "How can decisions be made in a QFD analysis with respect to the dynamics in the house of quality?" Two decision-making approaches are proposed to prioritize and/or optimize the technical attributes with respect to the modeling results. Considering that almost all QFD translation processes employ the relationship matrix, a guideline is developed for QFD practitioners to decide whether the relationship matrix should be normalized. Furthermore, a practical implication of the research work for the possible use of QFD in helping a company develop more innovative products is also discussed. In brief, the main contribution of this thesis is in providing novel methods and approaches to enhance the use of QFD with respect to change during the product or service creation process.
For the scientific community, this means that existing QFD research has been considerably improved, especially regarding the use of AHP in QFD. For engineering practice, a better way of performing QFD analysis, as a customer-driven engineering design tool, has been proposed. It is hoped that this research provides a first step toward a better customer-driven product or service design process and eventually increases the possibility of creating more innovative and competitive products or services over time.
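For readers unfamiliar with the AHP step that feeds the HoQ, priority weights for customer needs are derived from a pairwise comparison matrix; the three needs and the judgments below are invented for illustration, and the geometric-mean approximation stands in for the full eigenvector computation:

```python
import numpy as np

# Pairwise comparison matrix over three hypothetical customer needs:
# A[i, j] = how much more important need i is than need j (invented values).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Geometric-mean approximation of the principal eigenvector,
# normalized so the priorities sum to one.
gm = A.prod(axis=1) ** (1 / A.shape[0])
weights = gm / gm.sum()
print(np.round(weights, 3))   # approx. [0.648, 0.230, 0.122]
```

These weights become the customer-need importances in the HoQ; the thesis's contribution is modeling how such weights drift during the product or service creation process.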

    Fault Diagnosis and Failure Prognostics of Lithium-ion Battery based on Least Squares Support Vector Machine and Memory Particle Filter Framework

    Get PDF
    A novel data-driven approach is developed for fault diagnosis and remaining useful life (RUL) prognostics for lithium-ion batteries using a Least Squares Support Vector Machine (LS-SVM) and a Memory Particle Filter (M-PF). Unlike traditional data-driven models for capacity fault diagnosis and failure prognosis, which require multidimensional physical characteristics, the proposed algorithm uses only two variables: Energy Efficiency (EE) and Work Temperature. The aim of this novel framework is to improve the accuracy of incipient and abrupt fault diagnosis and failure prognosis. First, the LS-SVM is used to generate a residual signal based on the capacity fade trends of the Li-ion batteries. Second, an adaptive threshold model is developed based on several factors, including input, output model error, disturbance, and a drift parameter. The adaptive threshold is used to tackle the shortcomings of a fixed threshold. Third, the M-PF is proposed as a new method for failure prognostics to determine the RUL. The M-PF is based on the assumption of the availability of real-time observations and historical data, where the historical failure data can be used instead of a physical failure model within the particle filter. The feasibility of the framework is validated using Li-ion battery prognostic data obtained from the National Aeronautics and Space Administration (NASA) Ames Prognostics Center of Excellence (PCoE). The experimental results show the following: (1) fewer data dimensions are required for the input data compared to traditional empirical models; (2) the proposed diagnostic approach provides an effective way of diagnosing Li-ion battery faults; (3) the proposed prognostic approach can predict the RUL of Li-ion batteries with small error and high prediction accuracy; and (4) the proposed prognostic approach shows that historical failure data can be used instead of a physical failure model in the particle filter.
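The advantage of an adaptive over a fixed threshold can be illustrated with a toy residual check (this is a sketch, not the authors' exact formulation: their threshold also depends on input, output model error, disturbance and a drift parameter). The residual series below is synthetic deterministic "noise" with an abrupt fault injected at cycle 40:

```python
import numpy as np

# Synthetic residual between measured and model-predicted capacity.
residual = 0.005 * np.sin(np.arange(50.0))   # low-level deterministic "noise"
residual[40:] += 0.08                        # abrupt fault injected at cycle 40

window, k = 10, 4.0        # history length and sensitivity (assumed values)
faults = []
for t in range(window, len(residual)):
    hist = residual[t - window:t]
    # Threshold widens with the recent residual level and variability,
    # so benign noise is tolerated while a genuine jump is flagged.
    threshold = np.abs(hist).mean() + k * hist.std()
    if abs(residual[t]) > threshold:
        faults.append(t)

print(faults[0])           # -> 40, the injected fault cycle
```

A fixed threshold tight enough to catch the fault this quickly would risk false alarms whenever the background noise level rises; adapting the bound to recent history avoids that trade-off.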

    Machine Learning with Metaheuristic Algorithms for Sustainable Water Resources Management

    Get PDF
    The main aim of this book is to present various implementations of ML methods and metaheuristic algorithms to improve the modelling and prediction of hydrological and water resources phenomena of vital importance in water resources management.

    Design of optimal reservoir operating rules in large water resources systems combining stochastic programming, fuzzy logic and expert criteria

    Full text link
    Given the high degree of development of hydraulic infrastructure in developed countries, and the increasing opposition to constructing new facilities in developing countries, the focus of water resource systems analysis has turned to defining adequate operation strategies. Better management is necessary to cope with the challenge of supplying increasing demands and resolving conflicts over water allocation while facing climate change impacts. To this end, a large set of mathematical simulation and optimization tools has been developed. However, the real-world application of these techniques is still limited. One of the main lines of research to address this issue concerns the involvement of expert knowledge in the definition of mathematical algorithms. To define operating rules on which system operators can rely, their expert knowledge should be fully accounted for and merged with the results from mathematical algorithms. This thesis develops a methodological framework and the required tools to improve the operation of large-scale water resource systems. In such systems, decision-making processes are complex and supported, at least partially, by the expert knowledge of decision-makers. This importance of expert judgment in operation strategies requires mathematical tools able to embed it and combine it with optimization algorithms. The methods and tools developed in this thesis rely on stochastic programming, fuzzy logic and the involvement of system operators during the whole rule-defining process. An extended stochastic programming algorithm, able to be used in large-scale water resource systems including stream-aquifer interactions, has been developed (the CSG-SDDP). The proposed methodological framework uses fuzzy logic to capture expert knowledge in the definition of optimal operating rules.
Once the current decision-making process is faithfully reproduced using fuzzy logic and expert knowledge, stochastic programming results are introduced and the performance of the rules is thereby improved. The framework proposed in this thesis has been applied to the Jucar river system (Eastern Spain), in which scarce resources are allocated following complex decision-making processes. We present two applications. In the first, the CSG-SDDP algorithm is used to define economically optimal conjunctive-use strategies for the joint operation of reservoirs and aquifers. In the second, we implement a collaborative framework that couples historical records with expert knowledge and criteria to define a decision support system (DSS) for the seasonal operation of the reservoirs of the Jucar river system. The co-developed DSS tool explicitly reproduces the decision-making processes and criteria considered by the system operators. Two fuzzy logic systems have been developed and linked for this purpose, along with fuzzy regressions to forecast future inflows. The DSS was validated against historical records. The developed framework offers managers a simple way to define a priori suitable decisions, as well as to explore the consequences of any of them. The resulting representation has then been combined with the CSG-SDDP algorithm in order to improve the rules while following the current decision-making process. Results show that reducing pumping from the Mancha Oriental aquifer would lead to higher system-wide benefits due to increased flows from stream-aquifer interaction. The operating rules developed successfully combine fuzzy logic, expert judgment and stochastic programming, increasing water allocations to the demands by changing the way in which the Alarcon, Contreras and Tous reservoirs are balanced. These rules follow the same decision-making processes currently used in the system, so system operators should find them familiar.
In addition, they can be contrasted with the current operating rules to determine which operation options are coherent with the current management and, at the same time, achieve an optimal operation. Macián Sorribes, H. (2017). Design of optimal reservoir operating rules in large water resources systems combining stochastic programming, fuzzy logic and expert criteria [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/82554
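By way of illustration only, the kind of fuzzy rule base used to mimic operators' seasonal release decisions can be reduced to a toy two-rule Sugeno-style system; the membership functions, rules and output levels here are invented, whereas the thesis links two full fuzzy logic systems plus fuzzy inflow regressions:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def release_fraction(storage_pct, inflow_pct):
    """Toy fuzzy rules (invented):
    R1: IF storage low AND inflow low  THEN release small (0.2)
    R2: IF storage high OR inflow high THEN release large (0.8)"""
    low_s  = tri(storage_pct, -1, 0, 60)
    high_s = tri(storage_pct, 40, 100, 101)
    low_i  = tri(inflow_pct, -1, 0, 60)
    high_i = tri(inflow_pct, 40, 100, 101)
    w1 = min(low_s, low_i)     # AND -> min
    w2 = max(high_s, high_i)   # OR  -> max
    # Weighted-average defuzzification over singleton outputs.
    return (0.2 * w1 + 0.8 * w2) / (w1 + w2)

print(round(release_fraction(30, 20), 3))   # dry conditions  -> 0.2
print(round(release_fraction(90, 70), 3))   # wet conditions  -> 0.8
```

Because the rules are stated in the operators' own linguistic terms ("storage low", "inflow high"), a rule base like this can be elicited from and validated by the operators themselves, which is the collaborative point of the thesis.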

    Rails Quality Data Modelling via Machine Learning-Based Paradigms

    Get PDF

    Quasi-optimization of Neuro-fuzzy Expert Systems using Asymptotic Least-squares and Modified Radial Basis Function Models: Intelligent Planning of Operational Research Problems

    Get PDF
    The uncertainty found in many industrial systems poses a significant challenge, particularly in modelling production planning and optimizing manufacturing flow. In aggregate production planning, a key requirement is the ability to accurately predict demand from a range of influencing factors, such as consumption. Accurately building such causal models can be problematic if significant uncertainties are present, such as when the data are fuzzy, uncertain, fluctuating and non-linear. AI models, such as Adaptive Neuro-Fuzzy Inference Systems (ANFIS), can cope with this better than most, but even these well-established approaches fail if the data are scarce, poorly scaled and noisy. ANFIS is a combination of two approaches: a Sugeno-type Fuzzy Inference System (FIS) and Artificial Neural Networks (ANN). Two sets of parameters are required to define the model: premise parameters and consequent parameters. Together, they ensure that the correct number and shape of membership functions are used and combined to produce reliable outputs. However, optimal values for these parameters can only be determined if there are enough data samples representing the problem space to ensure that the method can converge. Mitigation strategies are suggested in the literature, such as fixing the premise parameters to avoid over-fitting, but for many practitioners this is not an adequate solution, as their expertise lies in the application domain, not in the AI domain. The work presented here is motivated by a real-world challenge in modelling and predicting demand for the gasoline industry in Iraq, an application where both the quality and the quantity of the training data can significantly affect prediction accuracy. To overcome data scarcity, we propose novel data expansion algorithms that are able to augment the original data with new samples drawn from the same distribution.
By using a combination of carefully chosen and suitably modified radial basis function models, we show how robust methods can overcome problems of over-smoothing at boundary values and turning points. We further show how a transformed least-squares (TLS) approximation of the data can be constructed to asymptotically bound the effect of outliers and enable accurate data expansion to take place. Though the problem of scaling/normalization is well understood in some AI applications, we assess the impact on model accuracy of two specific scaling techniques. By comparing and contrasting a range of data scaling and data expansion methods, we evaluate their effectiveness in reducing prediction error. Throughout this work, the various methods are explained and expanded upon using the case study drawn from the oil and gas industry in Iraq, which focuses on the accurate prediction of yearly gasoline consumption. This case study, and others, are used to demonstrate empirically the effectiveness of the approaches presented when compared to the current state of the art. Finally, we present a tool developed in Matlab that allows practitioners to experiment with all the methods and options presented in this work.
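The data-expansion idea can be sketched with a plain Gaussian-RBF interpolant. This is a deliberately simplified stand-in: the modified RBF models and TLS fitting that control boundary behaviour and outliers are omitted, and the yearly consumption figures are invented:

```python
import numpy as np

# Invented scarce training set: 5 yearly gasoline-consumption samples.
years = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
consumption = np.array([5.0, 5.6, 6.5, 7.1, 8.0])

eps = 1.0
phi = lambda r: np.exp(-(eps * r) ** 2)   # Gaussian radial basis function

# Solve for weights so the interpolant passes through every sample.
G = phi(np.abs(years[:, None] - years[None, :]))
w = np.linalg.solve(G, consumption)

# "Expand" the data: synthesize samples at mid-year points from the
# same fitted surface, augmenting the original scarce set.
new_x = np.array([0.5, 1.5, 2.5, 3.5])
new_y = phi(np.abs(new_x[:, None] - years[None, :])) @ w
print(np.round(new_y, 2))
```

The augmented set (original plus synthesized points) can then be fed to a data-hungry model such as ANFIS; the thesis's contribution lies in shaping the basis functions and the fitting criterion so the synthesized points stay faithful near boundaries, turning points and outliers, which this naive interpolant does not guarantee.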