10 research outputs found

    Identification of continuous-time model of hammerstein system using modified multi-verse optimizer

    This thesis implements a novel nature-inspired metaheuristic optimization algorithm, the modified Multi-Verse Optimizer (mMVO), to identify the continuous-time model of a Hammerstein system. The Multi-Verse Optimizer (MVO) is one of the most recent robust nature-inspired metaheuristic algorithms. It has been successfully applied in areas such as machine learning, engineering, networking and parameter control to solve optimization problems. However, such metaheuristics have limitations, including the local optima problem, low searching capability and an imbalance between exploration and exploitation. To address these limitations, two modifications were made to the conventional MVO in the proposed mMVO algorithm. The first modification is an average design parameter updating mechanism that tackles the local optima issue of the traditional MVO: it helps any trapped design parameter jump out of the local optima region and continue along a new search track. The second modification is the hybridization of MVO with the Sine Cosine Algorithm (SCA) to improve the low searching capability of the conventional MVO. The hybridization combines the advantages of the MVO and SCA algorithms while minimizing their disadvantages, such as low searching capability and the imbalance between exploration and exploitation. In particular, the search capability of MVO is improved using the sine and cosine functions of SCA, which balance the exploration and exploitation processes. The mMVO-based method is then used to identify the parameters of the linear and nonlinear subsystems of the Hammerstein model from given input and output data; the structures of both subsystems are assumed to be known. Moreover, a continuous-time linear subsystem is considered in this study, for which few identification methods exist. Two numerical examples and one real-world application, the Twin Rotor System (TRS), are used to illustrate the efficiency of the mMVO-based method, with various nonlinear subsystems such as quadratic and hyperbolic (sine and tangent) functions. The numerical and experimental results are analyzed in terms of the convergence curve of the fitness function, the parameter deviation index, frequency- and time-domain responses and the Wilcoxon rank test. For the numerical identifications, three levels of white noise variance were considered, and the mean of the parameter deviation index was used to quantify the improvement achieved by the proposed algorithm. For Example 1, the improvements are 29%, 33.15% and 36.68% for noise variances of 0.01, 0.25 and 1.0, respectively; for Example 2, the corresponding improvements are 39.36%, 39.61% and 66.18%. Finally, for the real TRS application, the improvement is 7%. The numerical and experimental results also show that both Hammerstein model subsystems are identified effectively by the mMVO-based method, particularly in terms of the quadratic output estimation error and the parameter deviation index. The results further confirm that the proposed mMVO-based method provides better solutions than other optimization techniques such as PSO, GWO, ALO, MVO and SCA.
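    The abstract does not reproduce the update equations, but the sine-cosine position update that the SCA side of such a hybrid typically contributes has a well-known general form. The following is a minimal sketch of that update only, not of the thesis's mMVO implementation; the population layout, the decay constant a and all variable names are assumptions.

```python
import numpy as np

def sca_position_update(positions, best, t, t_max, a=2.0, rng=None):
    """One Sine Cosine Algorithm (SCA) position update step.

    positions : (n_agents, n_dims) array of current candidate solutions
    best      : (n_dims,) best solution found so far
    t, t_max  : current and maximum iteration, controlling the decay of r1
    a         : assumed initial amplitude of the r1 decay schedule
    """
    rng = np.random.default_rng() if rng is None else rng
    r1 = a - t * (a / t_max)                         # shrinks from a to 0 over the run
    updated = np.empty_like(positions)
    for i, x in enumerate(positions):
        r2 = rng.uniform(0.0, 2.0 * np.pi, x.shape)  # angle of the sine/cosine move
        r3 = rng.uniform(0.0, 2.0, x.shape)          # weight on the best solution
        r4 = rng.uniform(0.0, 1.0, x.shape)          # per-dimension sine/cosine switch
        step_sin = r1 * np.sin(r2) * np.abs(r3 * best - x)
        step_cos = r1 * np.cos(r2) * np.abs(r3 * best - x)
        updated[i] = x + np.where(r4 < 0.5, step_sin, step_cos)
    return updated
```

    In a hybrid of the kind described above, an update like this would be interleaved with MVO's own universe-inflation mechanics, so that the shrinking r1 term gradually moves the search from exploration towards exploitation around the current best universe.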

    The use of computational geometry techniques to resolve the issues of coverage and connectivity in wireless sensor networks

    Wireless Sensor Networks (WSNs) enhance the ability to sense and control the physical environment in various applications. The functionality of WSNs depends on aspects such as node localization, node deployment strategies, node lifetime and routing techniques. Coverage is an essential requirement in WSNs, whereby every point of the targeted area should be covered by at least one node. Computational Geometry (CG)-based techniques can significantly improve the coverage and connectivity of WSNs. This paper is a step towards employing some of these popular techniques in WSNs in a productive manner, and it surveys the existing research that applies Computational Geometry-based methods to WSNs. To address coverage and connectivity issues in WSNs, the Voronoi Diagram, Delaunay Triangulation, Voronoi Tessellation and the Convex Hull have played a prominent role. Finally, the paper concludes by discussing various research challenges and proposed solutions based on Computational Geometry techniques.
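    As an illustration of how one of these structures supports coverage analysis (this example is not taken from the paper), the sketch below flags coverage holes by testing the Voronoi vertices of a sensor layout against a circular sensing model; scipy, the field size and the sensing radius are assumptions.

```python
import numpy as np
from scipy.spatial import Voronoi

def coverage_holes(sensors, sensing_radius, region_min, region_max):
    """Return Voronoi vertices inside the region that no sensor covers.

    Each Voronoi vertex is a local maximum of distance to the nearest
    sensor, so an uncovered vertex inside the region indicates a hole.
    """
    vor = Voronoi(sensors)
    holes = []
    for v in vor.vertices:
        if np.any(v < region_min) or np.any(v > region_max):
            continue  # ignore vertices outside the area of interest
        nearest = np.min(np.linalg.norm(sensors - v, axis=1))
        if nearest > sensing_radius:
            holes.append(v)
    return np.array(holes)

# Example: 30 random sensors in a 100 m x 100 m field with 15 m sensing range
rng = np.random.default_rng(0)
sensors = rng.uniform(0.0, 100.0, size=(30, 2))
print(coverage_holes(sensors, 15.0, 0.0, 100.0))
```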

    Feature Selection for Document Classification : Case Study of Meta-heuristic Intelligence and Traditional Approaches

    Doctor of Philosophy (Computer Engineering), 2020. Nowadays, the culture of accessing news around the world has changed from paper to electronic format, and the rate of publication of newspapers and magazines on websites has increased dramatically. Meanwhile, text feature selection for automatic document classification (ADC) has become a major challenge because of the unstructured nature of text features, known as the "multi-dimension feature problem". Although various powerful schemes for text feature selection continue to be developed, there is still a research gap around the "optimization of the feature selection problem (OFSP)", which seeks the globally optimal feature subset. Meta-heuristic intelligence also plays a critical role in the knowledge discovery process (KDP), helping to overcome the NP-hard nature of the OFSP with effective performance and efficient computation time. Therefore, this research proposes a meta-heuristic-based approach to feature selection optimization that searches for the globally optimal features for ADC. This thesis presents a case study of meta-heuristic intelligence and traditional approaches to the feature selection optimization process in document classification. It includes eleven meta-heuristic algorithms, namely Ant Colony search, Artificial Bee Colony search, Bat search, Cuckoo search, Evolutionary search, Elephant search, Firefly search, Flower search, Genetic search, Rhinoceros search and Wolf search, for finding the optimal feature subset for document classification. The results of the proposed model are then compared with three traditional search algorithms: Best First search (BFS), Greedy Stepwise (GS) and Ranker search (RS). In addition, a data mining framework is applied, involving data preprocessing, feature engineering, building the learning model and evaluating the performance of the proposed meta-heuristic intelligence-based feature selection using various performance and computational complexity evaluation schemes. Data preprocessing applies tokenization, stop-word handling, stemming and lemmatizing, and normalization. In the feature engineering process, n-gram TF-IDF feature extraction is used to implement the feature vector, and both filter and wrapper approaches are applied to observe different cases. Three different classifiers, J48, Naïve Bayes and Support Vector Machine, are used to build the document classification models. According to the results, the proposed system can dramatically reduce the number of selected features, discarding those that would deteriorate learning model performance. In addition, the selected global feature subset yields better performance than the traditional search methods under the single objective function of the proposed model.
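    The thesis pipeline is described in terms of WEKA-style components (J48, Best First search, Greedy Stepwise), so the following is only a rough illustration of the filter branch of such a pipeline in scikit-learn, not the author's setup; the dataset, the bigram range and the choice of k = 2000 retained features are assumptions.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

# n-gram TF-IDF features -> filter-style feature selection -> Naive Bayes classifier
pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), stop_words="english")),
    ("select", SelectKBest(chi2, k=2000)),  # keep the 2000 highest-scoring features
    ("clf", MultinomialNB()),
])

docs = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])
scores = cross_val_score(pipeline, docs.data, docs.target, cv=5)
print("mean accuracy:", scores.mean())
```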

    Holistic, data-driven, service and supply chain optimisation: linked optimisation.

    The intensity of competition and technological advancement in the business environment have made companies collaborate and cooperate as a means of survival. This creates a chain of companies and business components with unified business objectives. However, managing the decision-making processes (such as scheduling, ordering, delivering and allocating) across the various business components while maintaining a holistic objective is a huge business challenge, as these operations are complex and dynamic. This is because the overall chain of business processes is widely distributed across all the supply chain participants; therefore, no individual collaborator has a complete overview of the processes. Increasingly, such decisions are automated and strongly supported by optimisation algorithms, for example in manufacturing optimisation, B2B ordering, financial trading, and transportation scheduling and allocation. However, most of these algorithms do not incorporate the complexity associated with interacting decision-making systems such as supply chains. It is well known that decisions made at one point in a supply chain can have significant consequences that ripple through linked production and transportation systems. Recently, global shocks to supply chains (COVID-19, climate change, the blockage of the Suez Canal) have demonstrated the importance of these interdependencies and the need to create supply chains that are more resilient and have significantly reduced impact on the environment. Such interacting decision-making systems need to be considered through an optimisation process; however, the interactions between these decision-making systems are not currently modelled. We therefore believe that modelling such interactions is an opportunity to provide computational extensions to current optimisation paradigms. This research study aims to develop a general framework for formulating and solving holistic, data-driven optimisation problems in service and supply chains. The research achieved this aim and contributes to scholarship by, firstly, considering the complexities of supply chain problems from a linked-problem perspective, which leads to a formalism for characterising linked optimisation problems as a model for supply chains. Secondly, the research adopts a method for creating a linked optimisation problem benchmark by linking existing classical benchmark sets, using a mix of classical optimisation problems, typically relating to supply chain decision problems, to describe different modes of linkage in linked optimisation problems. Thirdly, several techniques for linking fragmented supply chain data have been proposed in the literature to identify data relationships; this thesis explores some of these techniques and combines them in specific ways to improve the data discovery process. Lastly, the research investigates the resilient state-of-the-art optimisation algorithms presented in the literature and then designs suitable algorithmic approaches, inspired by the existing algorithms and the nature of the problem linkages, to address different problem linkages in supply chains.
    Considering the research findings and future perspectives, the study demonstrates the suitability of the algorithms for different linked structures involving two sub-problems, which suggests further investigation of issues such as the suitability of the algorithms for more complex structures, benchmark methodologies, holistic goals and evaluation, process mining, game theory and dependency analysis.
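    As a toy illustration of the linkage idea only (it is not the thesis's formalism or benchmark), the sketch below chains two classical decision problems so that the output of the first becomes a parameter of the second: a knapsack-style order-acceptance choice fixes which customers a greedy delivery route must then visit. All data, names and the greedy routing heuristic are assumptions.

```python
from itertools import combinations

def choose_orders(orders, capacity):
    """Sub-problem A: pick the most profitable order subset within capacity (knapsack-style)."""
    best, best_profit = (), 0.0
    for r in range(len(orders) + 1):
        for subset in combinations(orders, r):
            weight = sum(o["weight"] for o in subset)
            profit = sum(o["profit"] for o in subset)
            if weight <= capacity and profit > best_profit:
                best, best_profit = subset, profit
    return list(best)

def cheapest_route(selected, distances, depot="D"):
    """Sub-problem B: greedy delivery route over the customers of the *selected* orders."""
    remaining = {o["customer"] for o in selected}
    route, here, cost = [depot], depot, 0.0
    while remaining:
        nxt = min(remaining, key=lambda c: distances[(here, c)])
        cost += distances[(here, nxt)]
        route.append(nxt)
        here = nxt
        remaining.remove(nxt)
    return route, cost

orders = [
    {"customer": "A", "weight": 4, "profit": 9.0},
    {"customer": "B", "weight": 3, "profit": 5.0},
    {"customer": "C", "weight": 5, "profit": 7.0},
]
distances = {("D", "A"): 2, ("D", "B"): 4, ("D", "C"): 3,
             ("A", "B"): 1, ("A", "C"): 5, ("B", "C"): 2,
             ("B", "A"): 1, ("C", "A"): 5, ("C", "B"): 2}

picked = choose_orders(orders, capacity=8)   # decision of sub-problem A...
print(cheapest_route(picked, distances))     # ...parameterises sub-problem B
```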

    Structural optimization in steel structures, algorithms and applications

    The abstract is available in the attachment.

    AI meets CRNs : a prospective review on the application of deep architectures in spectrum management

    The conundrum of low spectrum utilization and high demand created a bottleneck towards fulfilling the requirements of next-generation networks. Cognitive radio (CR) technology was advocated as a de facto technology to alleviate the scarcity and under-utilization of spectrum resources by exploiting temporarily vacant spectrum holes in the licensed spectrum bands. As a result, CR technology became the first step towards the intelligentization of mobile and wireless networks, and in order to strengthen its intelligent operation, the cognitive engine needs to be enhanced through the exploitation of artificial intelligence (AI) strategies. Since comprehensive literature reviews covering the integration and application of deep architectures in cognitive radio networks (CRNs) are still lacking, this article aims at filling that gap by presenting a detailed review that addresses the integration of deep architectures into the intricacies of spectrum management. This is a prospective review whose primary objective is to provide an in-depth exploration of the recent trends in AI strategies employed in mobile and wireless communication networks. The existing reviews in this area have not considered the relevance of incorporating the mathematical fundamentals of each AI strategy and how to tailor them to specific mobile and wireless networking problems; this review addresses that shortcoming by detailing how deep architectures can be integrated into spectrum management problems. Beyond reviewing the different ways in which deep architectures can be integrated into spectrum management, model selection strategies and ways in which different deep architectures can be tailored to the CR space to achieve better performance in complex environments are reported in the context of future research directions. The Sentech Chair in Broadband Wireless Multimedia Communications (BWMC) at the University of Pretoria.

    Intelligent Circuits and Systems

    ICICS-2020 is the third conference initiated by the School of Electronics and Electrical Engineering at Lovely Professional University. It explored recent innovations by researchers working on the development of smart and green technologies in the fields of Energy, Electronics, Communications, Computers, and Control. ICICS gives innovators the opportunity to identify new avenues for the social and economic benefit of society. The conference bridges the gap between academia, R&D institutions, social visionaries, and experts from all strata of society, allowing them to present their ongoing research activities and fostering research relations between them. It provides opportunities for the exchange of new ideas, applications, and experiences in the field of smart technologies, and for finding global partners for future collaboration. ICICS-2020 was conducted in two broad categories: Intelligent Circuits & Intelligent Systems, and Emerging Technologies in Electrical Engineering.

    CACIC 2015 : XXI Congreso Argentino de Ciencias de la Computación. Libro de actas

    Proceedings of the XXI Argentine Congress of Computer Science (CACIC 2015), held at the UNNOBA Junín campus from 5 to 9 October 2015. Red de Universidades con Carreras en Informática (RedUNCI).

    Thickness estimation, automated classification and novelty detection in ultrasound images of the plantar fascia tissues

    The plantar fascia (PF) tissue plays an important role in the movement and stability of the foot during walking and running, so overuse and the associated medical problems can cause injuries and some severe common diseases. Ultrasound (US) imaging offers significant potential in the diagnosis of PF injuries and the monitoring of treatments. Despite the advantages of US, the generated PF images are difficult to interpret during medical assessment, partly because of the size and position of the PF in relation to the adjacent tissues. This limits the use of US in clinical practice and therefore impacts patient services for what is a common problem and a major cause of foot pain and discomfort. There is therefore a need for an automated system that allows better and easier interpretation of PF US images during diagnosis. This study is concerned with developing a computer-based system that combines medical image processing techniques so that different PF US images can be visually improved, segmented, analysed and classified as normal or abnormal, providing more information to doctors and the clinical treatment department for early diagnosis and detection of PF-associated medical problems. More specifically, this study investigates a proposed model for localizing the PF and estimating its thickness across three different sections (rearfoot, midfoot and forefoot) using a supervised ANN segmentation technique. The segmentation method uses an RBF artificial neural network (RBF-ANN) module to classify small overlapping patches into PF and non-PF tissue. A feature selection technique was applied as a post-processing step after feature extraction to reduce the number of extracted features, and the trained RBF-ANN was then used to segment the desired PF region. The PF thickness was calculated using two different methods: distance transformation and a proposed area-length calculation algorithm. Additionally, different machine learning approaches were investigated and applied to the segmented PF region in order to distinguish between symptomatic and asymptomatic PF subjects using the best normalized and selected feature set; this facilitates the characterization and classification of the PF area for identifying patients with inferior heel pain who are at risk of plantar fasciitis. Finally, a novelty detection framework is proposed for detecting symptomatic PF samples (with plantar fasciitis disorder) using only asymptomatic samples for training. This framework involves feature analysis, building a normality model by training a one-class SVDD classifier on asymptomatic PF training data only, and computing novelty scores for the asymptomatic training and testing datasets and the symptomatic testing datasets using the trained SVDD classifier. The performance evaluation results showed that the proposed approaches obtained favourable results compared with other methods reported in the literature.
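    The thesis's own thickness routines are not reproduced in this abstract; purely to illustrate the distance-transformation idea mentioned above, the following minimal sketch estimates the thickest point of a binary segmentation mask, with scipy and the pixel spacing being assumptions.

```python
import numpy as np
from scipy import ndimage

def thickness_from_mask(mask, pixel_spacing_mm=0.1):
    """Estimate tissue thickness from a binary segmentation mask.

    The Euclidean distance transform gives, for every foreground pixel,
    the distance to the nearest background pixel; twice its maximum
    approximates the thickness at the thickest point of the tissue.
    """
    dist = ndimage.distance_transform_edt(mask)   # distance to background, in pixels
    return 2.0 * dist.max() * pixel_spacing_mm    # thickest point, in millimetres

# Example: a synthetic 20-pixel-thick horizontal band (20 px * 0.1 mm = 2 mm)
mask = np.zeros((100, 200), dtype=bool)
mask[40:60, :] = True
print(thickness_from_mask(mask))  # prints approximately 2.0
```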