
    Neuro-rough trading rules for mining Kuala Lumpur composite index

    The stock market plays a vital role in economic performance and is typically used to infer the economic situation of a nation. However, stock market information is usually incomplete, uncertain and vague, making future economic performance a challenge to predict. Representing the market therefore requires attending to granular information. In recent years, much research in stock market prediction has been conducted using diverse Artificial Intelligence approaches, and these applications have shown superior prediction results. As such, in this study, a prediction method called Neuro-Rough (NR) is proposed to forecast movements of the Kuala Lumpur Stock Exchange Composite Index (KLCI). NR hybridizes the high generality of artificial neural networks (ANN) with the rule-extraction ability of rough set theory (RST), demonstrating the capability to simplify time series data and deal with uncertain information. Features of the stock market data are extracted and presented as a set of decision attributes to the NR system. The length of the stock market trend is used to assist in identifying trading signals. A pilot experiment is conducted to discover the best discretization algorithm and ANN structure. NR is implemented in a trading simulation and its effectiveness is verified by analyzing the classifier output against the information provided in Bursa Malaysia's annual reports. Experiments using 10 years of training and testing data reveal that NR achieves an accuracy of 70%, with a generated annual profit of 74.33% in the trading simulation.
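    The rule-extraction side of such a hybrid rests on classical rough-set approximations: objects are grouped into indiscernibility granules on the conditional attributes, and a decision class is bounded by the granules certainly inside it (lower approximation) and those merely overlapping it (upper approximation). A minimal sketch, using a hypothetical toy decision table (the attribute names and values are illustrative, not the paper's data):

```python
from collections import defaultdict

def approximations(table, cond_attrs, decision, target):
    # Group objects into indiscernibility classes (granules) on the
    # conditional attributes: same attribute values -> same granule.
    granules = defaultdict(list)
    for obj in table:
        key = tuple(obj[a] for a in cond_attrs)
        granules[key].append(obj)
    lower, upper = [], []
    for granule in granules.values():
        labels = {obj[decision] for obj in granule}
        if labels == {target}:   # granule lies entirely inside the class
            lower.extend(granule)
        if target in labels:     # granule overlaps the class
            upper.extend(granule)
    return lower, upper

# Hypothetical decision table: market features -> trading signal.
table = [
    {"trend": "up",   "volume": "high", "signal": "buy"},
    {"trend": "up",   "volume": "high", "signal": "buy"},
    {"trend": "up",   "volume": "low",  "signal": "buy"},
    {"trend": "down", "volume": "low",  "signal": "sell"},
    {"trend": "down", "volume": "low",  "signal": "buy"},  # inconsistent granule
]

lower, upper = approximations(table, ["trend", "volume"], "signal", "buy")
print(len(lower), len(upper))  # prints "3 5"
```

    Objects in the lower approximation support certain "buy" rules; the gap between upper and lower (here, the inconsistent down/low granule) is exactly the uncertain region rough sets are designed to expose.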

    Computing fuzzy rough approximations in large scale information systems

    Rough set theory is a popular and powerful machine learning tool. It is especially suitable for dealing with information systems that exhibit inconsistencies, i.e. objects that have the same values for the conditional attributes but a different value for the decision attribute. In line with the emerging granular computing paradigm, rough set theory groups objects together based on the indiscernibility of their attribute values. Fuzzy rough set theory extends rough set theory to data with continuous attributes, and detects degrees of inconsistency in the data. Key to this is turning the indiscernibility relation into a gradual relation, acknowledging that objects can be similar to a certain extent. In very large datasets with millions of objects, computing the gradual indiscernibility relation (or, in other words, the soft granules) is very demanding, both in terms of runtime and in terms of memory. It is, however, required for the computation of the lower and upper approximations of concepts in the fuzzy rough set analysis pipeline. Current non-distributed implementations in R are limited by memory capacity. For example, we found that a state-of-the-art non-distributed implementation in R could not handle 30,000 rows and 10 attributes on a node with 62 GB of memory. This is clearly insufficient to scale fuzzy rough set analysis to massive datasets. In this paper we present a parallel and distributed solution based on Message Passing Interface (MPI) to compute fuzzy rough approximations in very large information systems. Our results show that our parallel approach scales with problem size to information systems with millions of objects. To the best of our knowledge, no other parallel and distributed solutions have been proposed so far in the literature for this problem.
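    The pipeline the abstract describes can be sketched in a few lines for the non-distributed case. The sketch below uses a common choice of operators (the Kleene-Dienes implicator for the lower approximation, the minimum t-norm for the upper) and a simple averaged per-attribute similarity as the gradual indiscernibility relation; the similarity measure and the toy data are assumptions for illustration, not the paper's MPI implementation:

```python
import numpy as np

def fuzzy_rough_approximations(X, concept):
    # Gradual indiscernibility relation R: per-attribute similarity
    # (1 - range-normalised distance), averaged over attributes.
    ranges = X.max(axis=0) - X.min(axis=0)
    diffs = np.abs(X[:, None, :] - X[None, :, :]) / ranges
    R = 1.0 - diffs.mean(axis=2)  # n x n similarity matrix (the soft granules)
    # Lower approximation: inf_y I(R(x,y), A(y)) with Kleene-Dienes I(a,b) = max(1-a, b).
    lower = np.min(np.maximum(1.0 - R, concept[None, :]), axis=1)
    # Upper approximation: sup_y T(R(x,y), A(y)) with T = min.
    upper = np.max(np.minimum(R, concept[None, :]), axis=1)
    return lower, upper

# Toy data: two close objects in the concept, one distant object outside it.
X = np.array([[0.10, 0.20], [0.15, 0.25], [0.90, 0.80]])
A = np.array([1.0, 1.0, 0.0])  # fuzzy membership of each object in the concept
low, up = fuzzy_rough_approximations(X, A)
```

    The materialised n x n matrix `R` is precisely what becomes infeasible at millions of objects, motivating the distributed computation: each worker can hold a block of rows of `R` and reduce the min/max over its block.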