397 research outputs found

    Toward enhancement of deep learning techniques using fuzzy logic: a survey

    Deep learning has recently emerged as a branch of artificial intelligence (AI) and machine learning (ML) that imitates the way humans acquire certain kinds of knowledge. It is considered an essential element of data science, encompassing statistics and predictive modeling, and it makes collecting, interpreting, and analyzing big data easier and faster. Deep neural networks are ML models in which layers of non-linear processing units extract particular features from the inputs. However, training such networks is computationally expensive and depends on the optimization method used, so optimal results are not guaranteed, and deep learning techniques are also vulnerable to data noise. For these reasons, fuzzy systems are used to improve the performance of deep learning algorithms, especially in combination with neural networks, and to improve the representation accuracy of deep learning models. This survey reviews fuzzy-logic-based deep learning models and techniques presented in previous studies, in which fuzzy logic is used to improve deep learning performance. The approaches are divided into two categories according to how the two paradigms are combined, and the practicality of the models in real-world settings is discussed.
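
    The survey abstract above does not include code; the minimal sketch below is an assumption for illustration (not a model from any surveyed paper) showing one common way the two paradigms are combined: crisp inputs are fuzzified with Gaussian membership functions before being passed to a small neural network, which can make the model less sensitive to input noise. The class name FuzzyNeuralNet, the layer sizes, and the membership centres are hypothetical.

```python
import numpy as np

def gaussian_membership(x, centers, sigma=1.0):
    """Map each crisp input to fuzzy membership degrees (one per center)."""
    # result shape: (n_samples, n_features, n_centers)
    return np.exp(-((x[..., None] - centers) ** 2) / (2 * sigma ** 2))

class FuzzyNeuralNet:
    """Tiny fuzzy-input neural network: fuzzification layer + one hidden layer."""

    def __init__(self, n_features, n_centers=3, n_hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.linspace(-1.0, 1.0, n_centers)              # membership centres
        self.w1 = rng.normal(0, 0.1, (n_features * n_centers, n_hidden))
        self.w2 = rng.normal(0, 0.1, (n_hidden, 1))

    def forward(self, x):
        mu = gaussian_membership(x, self.centers).reshape(len(x), -1)  # fuzzify inputs
        h = np.tanh(mu @ self.w1)                                      # hidden layer
        return h @ self.w2                                             # crisp output

# Usage: noisy inputs are smoothed by the membership layer before the network sees them.
x = np.random.default_rng(1).normal(size=(5, 4))
print(FuzzyNeuralNet(n_features=4).forward(x).shape)   # (5, 1)
```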

    Classification of microarray gene expression cancer data by using artificial intelligence methods

    Advances in computer technology have affected work in many fields, and developments in molecular biology and computing have given rise to the discipline of bioinformatics. Rapid progress in bioinformatics has contributed greatly to solving many problems awaiting solutions in this field, one of which is the classification of DNA microarray gene expressions. DNA microarray studies are a technology used in bioinformatics, and DNA microarray data analysis plays a very effective role in the diagnosis of gene-related diseases such as cancer. By identifying the gene expressions associated with a disease type, whether an individual carries the diseased gene can be determined with a high success rate. To determine whether an individual is healthy, it is therefore very important to apply high-performance classification techniques to microarray gene expressions. There are many methods for classifying DNA microarrays; statistical methods such as Support Vector Machines, Naive Bayes, k-Nearest Neighbors, and Decision Trees are widely used. However, when used alone, these methods do not always achieve high success rates in classifying microarray data, so previous studies have also employed artificial intelligence based methods to obtain higher accuracy. In this study, in addition to these statistical methods, an artificial intelligence based method, ANFIS, was used with the aim of achieving higher success rates. k-Nearest Neighbors, Naive Bayes, and Support Vector Machines were used as the statistical classification methods. Experiments were carried out on two different cancer data sets, Breast cancer and Central Nervous System cancer. According to the results, the artificial intelligence based ANFIS technique was generally found to be more successful than the statistical methods.
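
    As an illustration of the statistical baselines named in the abstract (k-Nearest Neighbors, Naive Bayes, Support Vector Machines), the sketch below runs them with scikit-learn on a synthetic high-dimensional data set shaped like microarray data (many features, few samples). The data set and parameter choices are assumptions; ANFIS is omitted because it has no standard scikit-learn implementation, and the paper's Breast and Central Nervous System data sets are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic stand-in for microarray data: many features, few samples.
X, y = make_classification(n_samples=100, n_features=2000, n_informative=20, random_state=0)

for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=3)),
                  ("Naive Bayes", GaussianNB()),
                  ("SVM", SVC(kernel="linear"))]:
    scores = cross_val_score(clf, X, y, cv=5)              # 5-fold cross-validated accuracy
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```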

    Forecasting Cryptocurrency Value by Sentiment Analysis: An HPC-Oriented Survey of the State-of-the-Art in the Cloud Era

    This chapter surveys the state of the art in forecasting cryptocurrency value by Sentiment Analysis. Key perspectives on current challenges are addressed, including blockchains, data collection, annotation and filtering, and sentiment analysis metrics computed over data streams and cloud platforms. We explore the domain from this problem-solving perspective, i.e., as technical analysis, forecasting, and estimation on a standardized ledger-based technology. Forecasting-based tools are then suggested, i.e., ranking Initial Coin Offering (ICO) values for incoming cryptocurrencies, trading strategies employing the new Sentiment Analysis metrics, and risk aversion in cryptocurrency trading through multi-objective portfolio selection. Our perspective is grounded in the elastic demand for computational resources on cloud infrastructures.
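
    As a toy illustration of the forecasting idea (not a method from the chapter), the sketch below fits a lagged linear regression of next-day returns on a daily aggregated sentiment score; both series are synthetic, and the lag structure and coefficients are assumptions. A real pipeline would stream social-media text, annotate and filter it, and aggregate sentiment per time window, as the chapter describes.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
days = 200
sentiment = rng.normal(size=days)                                     # daily aggregated sentiment score
returns = 0.4 * np.roll(sentiment, 1) + 0.1 * rng.normal(size=days)   # returns with a lag-1 dependence

X = sentiment[:-1].reshape(-1, 1)   # today's sentiment ...
y = returns[1:]                     # ... predicts tomorrow's return
model = LinearRegression().fit(X[:150], y[:150])
print("out-of-sample R^2:", model.score(X[150:], y[150:]))
```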

    A Fuzzy Logic-Based System for Soccer Video Scenes Classification

    Massive video surveillance worldwide captures data but lacks the detailed activity information needed to flag events of interest, while the human burden of monitoring video footage is untenable. Artificial intelligence (AI) can be applied to raw video footage to identify and extract the required information and summarize it in linguistic form. Automated video summarization usually relies on text-based data such as subtitles, segmenting text and semantics, with little attention paid to summarization from the video footage alone. Classification problems in recorded video are often highly complex and uncertain due to the dynamic nature of the video sequence, lighting conditions, background, camera angle, occlusions, indistinguishable scene features, etc. Video scene classification forms the basis of linguistic video summarization, an open research problem of major commercial importance. Soccer video scenes present added challenges because specific objects and events share similar features (e.g. "people" includes audiences, coaches, and players) and because the videos consist of quickly changing, dynamic frames with small inter-frame variations. There is the further difficulty of needing lightweight video classification systems that work in real time on massive data volumes. In this thesis, we introduce a novel system based on Interval Type-2 Fuzzy Logic Classification Systems (IT2FLCS) whose parameters are optimized by the Big Bang–Big Crunch (BB-BC) algorithm, allowing automatic scene classification in broadcast soccer match videos using optimized rules. Type-2 fuzzy logic systems provide a highly interpretable and transparent model that is well suited to handling the uncertainties encountered in video footage and to converting the accumulated data into linguistic form that can easily be stored and analysed, whereas traditional black-box techniques, such as support vector machines (SVMs) and neural networks, do not produce models that human users can easily analyse and understand. BB-BC optimization is a heuristic, population-based evolutionary approach characterized by ease of implementation, fast convergence, and low computational cost. We employ BB-BC to optimize the parameters of the fuzzy membership functions and fuzzy rules, balancing system transparency (through a small rule set) with increased scene classification accuracy. The proposed fuzzy-based system thus achieves relatively high classification accuracy with a small number of rules, increasing interpretability and allowing real-time processing. The type-2 Fuzzy Logic Classification System (T2FLCS) obtained 87.57% prediction accuracy in scene classification on our testing data, better than its type-1 fuzzy classification system and neural network counterparts. The BB-BC optimization decreases the size of the rule bases of both the T1FLCS and the T2FLCS; with the reduced rules the T2FLCS reached 85.716%, still outperforming the T1FLCS and neural network counterparts, especially on out-of-range data, which validates the capability of the T2FLCS to handle the high level of uncertainty faced. We also present a novel approach that combines the scene classification system with the dynamic time warping algorithm to detect video events for real-world processing. The proposed system can run on recorded or live video clips and outputs a label describing the event, providing a high-level summarization of the video to the user.
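
    Below is a minimal sketch of the Big Bang–Big Crunch loop described above, assuming the candidate solution is a flat vector of fuzzy membership-function parameters and using a placeholder fitness function; the thesis instead evaluates candidates by the classification accuracy of the resulting rule base on soccer scenes.

```python
import numpy as np

def fitness(params):
    # Placeholder cost: in the thesis this would be the classification error of the
    # fuzzy rule base built from `params` on the training scenes.
    return np.sum((params - 0.3) ** 2)

def bb_bc(dim, pop_size=40, iters=60, lower=0.0, upper=1.0, seed=0):
    rng = np.random.default_rng(seed)
    center = rng.uniform(lower, upper, dim)                  # initial centre of mass
    for k in range(1, iters + 1):
        # Big Bang: scatter candidates around the centre; the spread shrinks each iteration
        spread = (upper - lower) / k
        pop = np.clip(center + spread * rng.standard_normal((pop_size, dim)), lower, upper)
        costs = np.array([fitness(p) for p in pop])
        # Big Crunch: contract to a new centre of mass weighted by inverse cost
        weights = 1.0 / (costs + 1e-12)
        center = (weights[:, None] * pop).sum(axis=0) / weights.sum()
    return center, fitness(center)

best, best_cost = bb_bc(dim=6)
print(best.round(3), best_cost)
```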

    Short-Term Load Demand Forecasting For Transnet Port Terminal (Tpt) In East London Using Artificial Neural Network

    The daily and weekly energy consumption patterns at the Transnet Port Terminal (TPT) in East London vary stochastically as a result of the transient weather patterns at the harbor. It has therefore become imperative to manage this load wisely in order to save electricity costs and support future infrastructure development. The ongoing supply of electricity to port consumers thus requires an accurate and adequate short-term load forecast (STLF) for quality, quantity, and efficient management. Many researchers have recently proposed artificial neural networks (ANNs) for short-term load prediction, but most studies have not considered the quickly changing weather patterns at the port. The objective of this study is therefore to establish supervised short-term load prediction using ANN models and to verify the effectiveness of such predictions using real load data from the TPT. The suggested architecture uses open-loop training with real load and weather data, after which a closed-loop network produces predictions with the predicted load as its feedback input. Data collection points were set up in the port's ring network by installing new power meters, and weather data were obtained from local meteorology offices, creating a localised database for all gathered data. Load profiling at the TPT and load forecasting were then carried out, leading to improved load management strategies for the harbor terminal. ANN short-term load prediction (STLP) models were developed that use their own output to improve precision, essentially implementing a load feedback loop that is less reliant on external data. To model the time-series data recorded at the port, a nonlinear autoregressive exogenous (NARX) model for load prediction was developed using mean squared error (MSE) as the performance metric. To demonstrate the efficacy of the proposed model for STLP, an adaptive neuro-fuzzy inference system (ANFIS) was trained on the same data for short-term predictions. The minimum mean squared errors obtained for the NARX and ANFIS models were 0.0010939 and 0.0032 respectively, indicating that the NARX model is more accurate for forecasting departmental loads. Predictions on the hourly time series showed a close match between the forecasted and actual load demand at the port terminal. The forecasts could serve as a guide for internal load management plans, such as emergency electricity generation and the implementation programme for demand-side management policies.
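
    A minimal NARX-style sketch under assumed synthetic data (not the dissertation's model or measurements): the load at time t is predicted from its own lagged values plus an exogenous weather input, and the forecast is scored with MSE, mirroring the performance metric used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 500
temp = 20 + 5 * np.sin(np.arange(n) * 2 * np.pi / 24) + rng.normal(0, 0.5, n)   # hourly "weather"
load = 100 + 3 * temp + rng.normal(0, 2, n)                                     # load driven by weather

lags = 3
# Features: [y(t-3), y(t-2), y(t-1), u(t)]; target: y(t)
X = np.column_stack([load[i:n - lags + i] for i in range(lags)] + [temp[lags:]])
y = load[lags:]

split = 400
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X[:split], y[:split])
print("test MSE:", mean_squared_error(y[split:], model.predict(X[split:])))
```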

    Intelligent Computing: The Latest Advances, Challenges and Future

    Computing is a critical driving force in the development of human civilization. In recent years, we have witnessed the emergence of intelligent computing, a new computing paradigm that is reshaping traditional computing and promoting the digital revolution in the era of big data, artificial intelligence, and the Internet of Things, with new computing theories, architectures, methods, systems, and applications. Intelligent computing has greatly broadened the scope of computing, extending it from traditional computing on data to increasingly diverse paradigms such as perceptual intelligence, cognitive intelligence, autonomous intelligence, and human-computer fusion intelligence. Intelligence and computing have long followed different paths of evolution and development but have become increasingly intertwined in recent years: intelligent computing is not only intelligence-oriented but also intelligence-driven. Such cross-fertilization has prompted the emergence and rapid advancement of intelligent computing. Intelligent computing is still in its infancy, and an abundance of innovations in its theories, systems, and applications is expected soon. We present the first comprehensive survey of the literature on intelligent computing, covering its theoretical fundamentals, the technological fusion of intelligence and computing, important applications, challenges, and future perspectives. We believe that this survey is highly timely and will provide a comprehensive reference and valuable insights into intelligent computing for academic and industrial researchers and practitioners.

    Transforming of traditional commerce into e-commerce: Trends in the world and in Ukraine

    Given the circumstances of the late 2010s and early 2020s in Ukraine (the pandemic, the war), e-commerce has gained a stable basis for even greater growth, so the implementation of e-commerce tools in businesses of all sizes is especially relevant. The goal of the study was to analyse the development of commerce in the world and in Ukraine and to assess the current state of e-commerce. To achieve this goal, the methods of analysis, synthesis, projection, and expert opinion were used. The evolutionary nature of transformational processes in world commerce is substantiated, and the importance of introducing e-commerce tools in businesses of all sizes is demonstrated. A comprehensive analysis of the latest research on e-commerce development, devoted to the improvement of all stages of online commerce, is performed. The reasons behind the current shape of global commerce are identified and structured by their social, economic, and technological nature, and the trend of further development of trade, namely the growing share of online commerce, is highlighted and substantiated. The impact of the COVID-19 pandemic shock on both traditional and e-commerce is assessed, and the state of commerce in the years following the pandemic, along with promising e-commerce tools, is projected. The main factors and features of the global transformation of traditional commerce into e-commerce are determined. A comparative analysis of the state of e-commerce in the world and in Ukraine, as well as across different product categories, is carried out. Trends in the development of commerce in the South Korean cosmetics market are identified using the example of the Ukrainian company Lovely Bunny Group LLC. Current functional trends that allow increases in traffic and sales are described, and the advisability of implementing innovative tools in e-commerce marketing is substantiated. The study results may be of interest to businesses of different sizes seeking to significantly increase sales efficiency in both the short and medium term.

    High-Performance Modelling and Simulation for Big Data Applications

    This open access book was prepared as a final publication of the COST Action IC1406 "High-Performance Modelling and Simulation for Big Data Applications (cHiPSet)" project. Long considered important pillars of the scientific method, Modelling and Simulation have evolved from traditional discrete numerical methods to complex data-intensive continuous analytical optimisations. Resolution, scale, and accuracy have become essential for predicting and analysing natural and complex systems in science and engineering. As their level of abstraction rises to provide a better discernment of the domain at hand, their representation becomes increasingly demanding of computational and data resources. High Performance Computing, on the other hand, typically entails the effective use of parallel and distributed processing units coupled with efficient storage, communication, and visualisation systems to underpin complex data-intensive applications in distinct scientific and technical domains. Seamless interaction between High Performance Computing and Modelling and Simulation is therefore arguably required in order to store, compute, analyse, and visualise large data sets in science and engineering. Funded by the European Commission, cHiPSet has provided a dynamic trans-European forum for its members and distinguished guests to openly discuss novel perspectives and topics of interest for these two communities. This cHiPSet compendium presents a set of selected case studies related to healthcare, biological data, computational advertising, multimedia, finance, bioinformatics, and telecommunications.

    The 1st International Conference on Computational Engineering and Intelligent Systems

    Computational engineering, artificial intelligence, and smart systems constitute a hot multidisciplinary topic spanning computer science, engineering, and applied mathematics that has produced a variety of fascinating intelligent systems. Computational engineering encompasses fundamental engineering and science blended with advanced knowledge of mathematics, algorithms, and computer languages. It is concerned with the modeling and simulation of complex systems and data processing methods. Computing and artificial intelligence lead to smart systems: advanced machines designed to fulfill particular specifications. This proceedings book is a collection of papers presented at the first International Conference on Computational Engineering and Intelligent Systems (ICCEIS2021), held online from December 10 to 12, 2021. The collection covers a wide range of engineering topics, including smart grids, intelligent control, artificial intelligence, optimization, microelectronics, and telecommunication systems. The contributions included in this book are of high quality, present their topics succinctly, and can serve as an excellent reference and support for readers in the fields of computational engineering, artificial intelligence, and smart systems.