
    What attracts vehicle consumers’ buying: A Saaty scale-based VIKOR (SSC-VIKOR) approach from after-sales textual perspective?

    Purpose: The booming development of e-commerce has stimulated vehicle consumers to express individual reviews through online forums. The purpose of this paper is to probe into vehicle consumer consumption behavior and make recommendations for potential consumers from the viewpoint of textual comments. Design/methodology/approach: A big data analytic-based approach is designed to discover vehicle consumer consumption behavior from an online perspective. To reduce the subjectivity of expert-based approaches, a parallel Naïve Bayes approach is designed to perform sentiment analysis, and the Saaty scale-based (SSC) scoring rule is employed to obtain a specific sentiment value for each attribute class, contributing to multi-grade sentiment classification. To achieve intelligent recommendation for potential vehicle customers, a novel SSC-VIKOR approach is developed to prioritize vehicle brand candidates from a big data analytical viewpoint. Findings: The big data analytics show that the “cost-effectiveness” characteristic is the most important factor vehicle consumers care about, and the data mining results enable automakers to better understand consumer consumption behavior. Research limitations/implications: The case study illustrates the effectiveness of the integrated method, contributing to much more precise operations management in marketing strategy, quality improvement and intelligent recommendation. Originality/value: Research on consumer consumption behavior is usually based on survey methods, and most previous studies of comment analysis focus on binary sentiment analysis. The hybrid SSC-VIKOR approach is developed to fill this gap from the big data perspective.
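
    As a concrete illustration of the ranking step described above, the following is a minimal Python sketch of the standard VIKOR procedure; the sentiment scores, attribute weights and brand count are hypothetical, and the paper's SSC-specific scoring rule is not reproduced here.

        import numpy as np

        def vikor_rank(scores, weights, v=0.5):
            """Classic VIKOR over benefit criteria; lower Q means a better alternative."""
            f_best, f_worst = scores.max(axis=0), scores.min(axis=0)
            norm = (f_best - scores) / (f_best - f_worst)   # 0 = ideal, 1 = anti-ideal
            S = (weights * norm).sum(axis=1)                # group utility
            R = (weights * norm).max(axis=1)                # individual regret
            Q = (v * (S - S.min()) / (S.max() - S.min())
                 + (1 - v) * (R - R.min()) / (R.max() - R.min()))
            return np.argsort(Q), Q                         # best candidate first

        # hypothetical multi-grade sentiment scores: 4 brands x 3 attribute classes
        scores = np.array([[0.82, 0.64, 0.71],
                           [0.75, 0.70, 0.66],
                           [0.90, 0.55, 0.60],
                           [0.68, 0.72, 0.74]])
        weights = np.array([0.5, 0.3, 0.2])  # e.g. cost-effectiveness weighted highest
        ranking, Q = vikor_rank(scores, weights)
        print(ranking, Q.round(3))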

    Analysis of Segmentation Parameters Effect towards Parallel Processing Time on Fuzzy C Means Algorithm

    The Fuzzy C-Means (FCM) algorithm is one of many clustering algorithms with good accuracy on segmentation problems, and it is applied in almost every aspect of life and in many disciplines of science. However, the algorithm has some shortcomings, one of which is its large processing time. This research was conducted mainly to analyze the effect of segmentation parameters on sequential and parallel processing time, and to reduce the processing time of segmentation using a parallel approach. Parallel processing was implemented on an Nvidia GeForce GT540M GPU using the CUDA v8.0 framework. The experiments were conducted on natural RGB color images sized 256x256 and 512x512, with segmentation parameters set as follows: weight in the range 2-3, number of iterations 50-150, number of clusters 2-8, and error tolerance (epsilon) from 0.1 to 1e-06. The results show that parallel processing is 4.5 times faster than sequential processing, with 100% similarity between the segmentations generated by the two processing types. The influence of the segmentation parameters can be summarized as follows: the greater the weight parameter, the shorter the sequential processing time, while it has no effect on parallel processing time; larger iteration and cluster values increase both sequential and parallel processing time; and the epsilon parameter has no effect, or an unpredictable tendency, on either processing time.
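
    For reference, below is a minimal NumPy sketch of the sequential FCM loop, exposing the same four parameters the study varies (weight m, iteration count, cluster count and epsilon); the image is synthetic, and a CUDA port such as the one benchmarked here would parallelize the distance and membership updates across pixels.

        import numpy as np

        def fcm(pixels, n_clusters=4, m=2.0, max_iter=100, eps=1e-4, seed=0):
            """Plain sequential Fuzzy C-Means on flattened pixel vectors."""
            rng = np.random.default_rng(seed)
            u = rng.random((len(pixels), n_clusters))
            u /= u.sum(axis=1, keepdims=True)        # memberships sum to 1 per pixel
            for _ in range(max_iter):
                um = u ** m                          # weight (fuzzifier) parameter
                centers = um.T @ pixels / um.sum(axis=0)[:, None]
                d = np.linalg.norm(pixels[:, None, :] - centers[None], axis=2) + 1e-12
                inv = d ** (-2.0 / (m - 1))          # standard membership update
                u_new = inv / inv.sum(axis=1, keepdims=True)
                if np.abs(u_new - u).max() < eps:    # epsilon (error tolerance) stop
                    u = u_new
                    break
                u = u_new
            return u.argmax(axis=1), centers

        img = np.random.default_rng(1).random((256, 256, 3))  # stand-in RGB image
        labels, centers = fcm(img.reshape(-1, 3))
        segmented = centers[labels].reshape(img.shape)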

    Enhancement Techniques and Methods for Brain MRI Imaging

    In this paper, we review and compare the different methods for enhancing DICOM brain MRI images used in preprocessing and segmentation. Image segmentation is the process of partitioning an image into multiple segments, so as to change the representation of an image into something more meaningful and easier to analyze. Several general-purpose algorithms and techniques have been developed for image segmentation, and this paper describes the segmentation techniques used in ultrasound, MR and SAR image processing. The preprocessing and enhancement stage is used to eliminate noise and high-frequency components from the DICOM image. Various preprocessing and enhancement techniques and segmentation algorithms are described and compared.
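
    As a small example of the kind of preprocessing and enhancement stage discussed here, the sketch below denoises a slice and stretches its contrast; the filter choices are illustrative rather than any single method from the survey, and the DICOM file name is hypothetical.

        import numpy as np
        from scipy import ndimage

        def enhance_slice(img):
            """Suppress noise, then stretch intensities to the full [0, 1] range."""
            den = ndimage.median_filter(img.astype(float), size=3)  # impulse noise
            lo, hi = np.percentile(den, (1, 99))                    # clip outliers
            return np.clip((den - lo) / (hi - lo), 0.0, 1.0)

        # img = pydicom.dcmread("slice.dcm").pixel_array  # typical DICOM load
        img = np.random.default_rng(0).random((128, 128))  # synthetic stand-in
        enhanced = enhance_slice(img)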

    Forecasting Long-Term Government Bond Yields: An Application of Statistical and AI Models

    This paper evaluates several artificial intelligence and classical algorithms on their ability to forecast the monthly yield of US 10-year Treasury bonds from a set of four economic indicators. Due to the complexity of the prediction problem, the task represents a challenging test for the algorithms under evaluation. At the same time, the study is of particular significance given the important and paradigmatic role played by the US market in the world economy. Four data-driven artificial intelligence approaches are considered, namely a manually built fuzzy logic model, a machine-learned fuzzy logic model, a self-organising map model and a multi-layer perceptron model. Their performance is compared with that of two classical approaches, namely a statistical ARIMA model and an econometric error correction model. The algorithms are evaluated on a complete series of end-month US 10-year Treasury bond yields and economic indicators from 1986:1 to 2004:12. In terms of prediction accuracy and reliability of the modelling procedure, the best results are obtained by the three parametric regression algorithms, namely the econometric, the statistical and the multi-layer perceptron model. Due to the sparseness of the learning data samples, the manual and the automatic fuzzy logic approaches fail to follow the range of variation of the US 10-year Treasury bonds with adequate precision. For similar reasons, the self-organising map model gives an unsatisfactory performance. Analysis of the results indicates that the econometric model has a slight edge over the statistical and the multi-layer perceptron models. This suggests that pure data-driven induction may not fully capture the complicated mechanisms governing changes in interest rates. Overall, the prediction accuracy of the best models is only marginally better than that of a basic one-step lag predictor. This result highlights the difficulty of the modelling task and, in general, of building reliable predictors for financial markets.
    Keywords: interest rates; forecasting; neural networks; fuzzy logic.
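
    The one-step lag benchmark mentioned in the closing sentences is simple to state: predict this month's yield with last month's. A small sketch, on a synthetic random-walk series standing in for the 228 monthly observations (1986:1 to 2004:12):

        import numpy as np

        def lag_rmse(y):
            """RMSE of the naive predictor y_hat[t] = y[t - 1]."""
            y = np.asarray(y, dtype=float)
            return np.sqrt(np.mean((y[1:] - y[:-1]) ** 2))

        rng = np.random.default_rng(1)
        yields = 6.0 + np.cumsum(rng.normal(0.0, 0.15, 228))  # synthetic series
        print(f"one-step-lag RMSE: {lag_rmse(yields):.3f}")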

    The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms

    We present the Stochastic Optimisation Software (SOS), a Java platform facilitating the algorithmic design process and the evaluation of metaheuristic optimisation algorithms. SOS reduces the burden of coding miscellaneous methods for dealing with several bothersome and time-demanding tasks such as parameter tuning, implementation of comparison algorithms and testbed problems, collecting and processing data to display results, measuring algorithmic overhead, etc. SOS provides numerous off-the-shelf methods including: (1) customised implementations of statistical tests, such as the Wilcoxon rank-sum test and the Holm–Bonferroni procedure, for comparing the performances of optimisation algorithms and automatically generating result tables in PDF and LaTeX formats; (2) an original advanced statistical routine for accurately comparing pairs of stochastic optimisation algorithms; (3) a novel testbed suite for continuous optimisation, derived from the IEEE CEC 2014 benchmark, allowing controlled activation of the rotation on each testbed function. Moreover, we briefly comment on the current state of the literature in stochastic optimisation and highlight similarities shared by modern metaheuristics inspired by nature. We argue that the vast majority of these algorithms are simply a reformulation of the same methods and that metaheuristics for optimisation should be treated as stochastic processes with less emphasis on the inspiring metaphor behind them.
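
    To make the statistical machinery in point (1) concrete, here is a sketch of the same idea in Python (SOS itself is Java): a Wilcoxon rank-sum test on the final errors of repeated runs, with Holm–Bonferroni correction over the multiple comparisons; the error samples are synthetic.

        import numpy as np
        from scipy.stats import ranksums

        def holm_bonferroni(pvals, alpha=0.05):
            """Step-down Holm procedure: returns a reject flag per hypothesis."""
            m, order = len(pvals), np.argsort(pvals)
            reject = np.zeros(m, dtype=bool)
            for rank, idx in enumerate(order):
                if pvals[idx] <= alpha / (m - rank):
                    reject[idx] = True
                else:
                    break                    # stop at the first non-rejection
            return reject

        # synthetic final errors: one reference optimiser vs. three competitors
        rng = np.random.default_rng(0)
        ref = rng.normal(0.0, 1.0, 30)
        pvals = np.array([ranksums(ref, rng.normal(mu, 1.0, 30)).pvalue
                          for mu in (0.1, 0.8, 1.5)])
        print(pvals.round(4), holm_bonferroni(pvals))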

    Development of Neurofuzzy Architectures for Electricity Price Forecasting

    In the 20th century, many countries liberalized their electricity markets. This liberalization has exposed generation companies as well as wholesale buyers to much greater risk than under the old centralized framework. In this setting, electricity price prediction has become crucial for any market player's decision-making process and strategic planning. In this study, a prototype asymmetric-based neuro-fuzzy network (AGFINN) architecture has been implemented for short-term electricity price forecasting in the ISO New England market. The AGFINN framework has been designed with two different defuzzification schemes. Fuzzy clustering is explored as an initial step for defining the fuzzy rules, while an asymmetric Gaussian membership function is utilized in the fuzzification part of the model. Results for the minimum and maximum electricity prices in ISO New England emphasize the superiority of the proposed model over well-established learning-based models.
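
    The asymmetric Gaussian membership function used in the fuzzification layer has a standard form: one spread to the left of the centre, another to the right. A minimal sketch, with hypothetical parameters for a normalised price input:

        import numpy as np

        def asym_gaussian(x, c, sigma_l, sigma_r):
            """Gaussian MF with separate spreads on each side of the centre c."""
            sigma = np.where(np.asarray(x) < c, sigma_l, sigma_r)
            return np.exp(-0.5 * ((x - c) / sigma) ** 2)

        # membership of normalised prices in a hypothetical "medium" fuzzy set
        prices = np.linspace(0.0, 1.0, 5)
        print(asym_gaussian(prices, c=0.5, sigma_l=0.1, sigma_r=0.25).round(3))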