25 research outputs found

    Enhanced Manhattan-based Clustering using Fuzzy C-Means Algorithm for High Dimensional Datasets

    The problem of mining high dimensional data includes a high computational cost: a high dimensional dataset is composed of thousands of attributes and/or instances. The efficiency of an algorithm, specifically its speed, is often sacrificed when this kind of dataset is supplied to it. The Fuzzy C-Means clustering algorithm suffers from this problem, as it requires high computational resources whether it processes low or high dimensional data. The Netflix rating data, small round blue cell tumors (SRBCT) and Colon Cancer datasets (52,308, and 2,000 attributes and 1,500, 83 and 62 instances, respectively) were identified as high dimensional datasets. As such, a Manhattan distance measure employing a trigonometric function was used to enhance the Fuzzy C-Means algorithm. Results show an increase in the efficiency of processing large amounts of data: on the Netflix, Colon Cancer and SRBCT datasets, the Euclidean distance measure took 39,296, 38,952 and 85,774 milliseconds to complete the different clusters, an average of 54,674 milliseconds, while the Manhattan distance measure took 36,858, 36,501 and 82,86 milliseconds, respectively, an average of 52,703 milliseconds, to cluster the entire datasets. The enhanced Manhattan distance measure, on the other hand, took 33,216, 32,368 and 81,125 milliseconds, respectively, an average of 48,903 milliseconds, to cluster the datasets. Given these results, the enhanced Manhattan distance measure is 11% more efficient than the Euclidean distance measure and 7% more efficient than the Manhattan distance measure.
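    The abstract does not spell out the enhanced distance formula, but as a rough orientation the following is a minimal NumPy sketch of Fuzzy C-Means with the usual Euclidean distance swapped for a plain Manhattan (L1) distance. The trigonometric enhancement described in the paper is not reproduced here, and all function and parameter names are illustrative.

```python
import numpy as np

def manhattan(X, V):
    """Pairwise Manhattan (L1) distances between samples X (n, d) and centroids V (c, d)."""
    return np.abs(X[:, None, :] - V[None, :, :]).sum(axis=2)  # shape (n, c)

def fuzzy_c_means(X, c=3, m=2.0, max_iter=100, tol=1e-5, seed=None):
    """Standard Fuzzy C-Means iteration, with the distance measure replaced by L1."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # random fuzzy membership matrix, rows sum to 1
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # weighted-mean centroid update, as in standard FCM
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]
        D = manhattan(X, V) + 1e-12  # avoid division by zero
        # membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U_new = 1.0 / ((D[:, :, None] / D[:, None, :]) ** (2.0 / (m - 1.0))).sum(axis=2)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, V

# usage on a synthetic high dimensional matrix (rows = instances, columns = attributes)
U, V = fuzzy_c_means(np.random.rand(200, 500), c=4, seed=0)
```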

    Towards An Enhanced Backpropagation Network for Short-Term Load Demand Forecasting

    Artificial neural networks (ANNs) are ideal for the prediction and classification of non-linear relationships; however, they are also known for computational intensity and long training times, especially when large data sets are used. A two-tiered approach combining data mining algorithms is proposed to enhance an artificial neural network's performance when applied to a phenomenon that exhibits predictable changes every calendar year, such as electrical load demand. This approach is simulated using the French zonal load data for 2016 and 2017. The first tier performs clustering into seasons and classification into day-types; the second tier uses artificial neural networks to forecast 24-hour loads. The first-tier results are the focus of this paper. The K-means algorithm is first applied to the morning-slope feature of the data set, and the Naïve Bayes algorithm is then compared with the k-Nearest Neighbors algorithm to determine the better classifier for this particular data set. The first-tier results show that calendar-based clustering does not accurately reflect electrical load behavior, and that k-Nearest Neighbors is the better classifier for this particular data set. It is expected that, by optimizing the data set and reducing training time, the learning performance of ANN-based short-term load demand forecasting will improve.
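    As a rough illustration of the first tier, the sketch below clusters days by a morning-slope feature with K-means and then compares Naïve Bayes against k-Nearest Neighbors using cross-validation. The synthetic data, the day-type labels, and the 06:00–10:00 definition of the morning slope are stand-in assumptions, not the paper's actual feature engineering.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# hypothetical stand-in for two years of 24-hour load profiles (one row per day)
hourly_load = np.random.rand(730, 24)
day_type = np.random.randint(0, 7, size=730)  # stand-in day-type labels

# "morning slope" feature: rate of change of load over the morning ramp (assumed 06:00-10:00)
morning_slope = (hourly_load[:, 10] - hourly_load[:, 6]) / 4.0

# Tier 1a: cluster days into seasons using the morning-slope feature
seasons = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    morning_slope.reshape(-1, 1)
)

# Tier 1b: compare Naive Bayes and k-Nearest Neighbors as day-type classifiers
X = np.column_stack([morning_slope, seasons])
for name, clf in [("GaussianNB", GaussianNB()),
                  ("kNN (k=5)", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, day_type, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```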

    Search for dark matter produced in association with bottom or top quarks in √s = 13 TeV pp collisions with the ATLAS detector

    A search for weakly interacting massive particle dark matter produced in association with bottom or top quarks is presented. Final states containing third-generation quarks and missing transverse momentum are considered. The analysis uses 36.1 fb⁻¹ of proton–proton collision data recorded by the ATLAS experiment at √s = 13 TeV in 2015 and 2016. No significant excess of events above the estimated backgrounds is observed. The results are interpreted in the framework of simplified models of spin-0 dark-matter mediators. For colour-neutral spin-0 mediators produced in association with top quarks and decaying into a pair of dark-matter particles, mediator masses below 50 GeV are excluded assuming a dark-matter candidate mass of 1 GeV and unitary couplings. For scalar and pseudoscalar mediators produced in association with bottom quarks, the search sets limits on the production cross-section of 300 times the predicted rate for mediators with masses between 10 and 50 GeV, assuming a dark-matter mass of 1 GeV and unitary coupling. Constraints on colour-charged scalar simplified models are also presented. Assuming a dark-matter particle mass of 35 GeV, mediator particles with mass below 1.1 TeV are excluded for couplings yielding a dark-matter relic density consistent with measurements.

    Measurements of top-quark pair differential cross-sections in the eμ channel in pp collisions at √s = 13 TeV using the ATLAS detector


    Measurement of the W boson polarisation in tt̄ events from pp collisions at √s = 8 TeV in the lepton + jets channel with ATLAS


    Search for new phenomena in events containing a same-flavour opposite-sign dilepton pair, jets, and large missing transverse momentum in √s = 13 TeV pp collisions with the ATLAS detector


    Measurement of jet fragmentation in Pb+Pb and pp collisions at √s_NN = 2.76 TeV with the ATLAS detector at the LHC


    Healthcare Expert System based on the Group Cooperation Model

    Expert systems in healthcare services are widely studied, with diagnostic accuracy and system efficiency as the main considerations. Recent research on decision support services, expert medical services, and autonomous management is also based on multi-agent systems. The cooperation of the agents is crucial in analyzing and managing patient data to detect abnormal patterns, provide advanced treatment, and prevent loss of life. This research shows an object implementation of the healthcare expert system (HES) based on the group cooperation model. An object performs data mining for a specific disease-diagnosing tool used by the expert. The proposed agent managers coordinate the processing of client requests. Replicas of objects on servers are introduced to provide quality of service for clients. An adaptive scheme managed by the load balancing service is presented, and the implementation of the healthcare expert system using the group cooperation model is described.
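    The abstract does not detail the cooperation protocol, but a minimal sketch of the replica/agent-manager idea might look like the following, where an agent manager routes each client request to the least-loaded object replica. All class and method names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DiagnosisObject:
    """A replicated expert-system object for one disease-specific diagnosing tool."""
    replica_id: str
    active_requests: int = 0

    def diagnose(self, patient_record: dict) -> str:
        # placeholder for the data-mining / rule-based diagnosis performed by this tool
        return f"{self.replica_id}: assessment for patient {patient_record.get('id')}"

@dataclass
class AgentManager:
    """Coordinates client requests across object replicas with adaptive least-load selection."""
    replicas: list = field(default_factory=list)

    def handle_request(self, patient_record: dict) -> str:
        # adaptive load balancing: route to the replica with the fewest active requests
        replica = min(self.replicas, key=lambda r: r.active_requests)
        replica.active_requests += 1
        try:
            return replica.diagnose(patient_record)
        finally:
            replica.active_requests -= 1

manager = AgentManager([DiagnosisObject(f"replica-{i}") for i in range(3)])
print(manager.handle_request({"id": 42, "symptoms": ["fever", "fatigue"]}))
```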

    Optimization of Location Management using Multi-Agent for Distributed Location Based Services

    Location management is very important in location-based services, which provide services such as banking and city guides to mobile users. Ubiquitous and mobile devices are the source of data in location management, and its key operations are the update and search methods. Previous studies improved these using optimal sequential paging, location area schemes, and hierarchical database schemes. In addition, not all location-based services use the same data access methods, which leads to difficulties in providing services. This study presents a multi-agent approach to location management and shows how the agents coordinate across distributed location-based services. The proposal focuses on the location management of mobile objects through hierarchical search and update, and introduces an optimal search for a mobile object: the agents learn from previous mobile request data and decide the location of the mobile object. The resulting technique optimizes the search method of location management.
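    A minimal sketch of the hierarchical update/search idea, with agents that cache the results of previous requests to shortcut later searches, is shown below. The two-level hierarchy and all names are illustrative assumptions rather than the paper's actual design.

```python
class LocationAgent:
    """Agent at one level of the location hierarchy (e.g. area -> region)."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.registry = {}       # mobile_id -> location (or child agent name at upper levels)
        self.request_cache = {}  # mobile_id -> last result, learned from previous requests

    def update(self, mobile_id, location):
        """Hierarchical update: record locally and propagate upward."""
        self.registry[mobile_id] = location
        if self.parent:
            # the parent only records which child agent to ask
            self.parent.update(mobile_id, self.name)

    def search(self, mobile_id):
        """Check the cache learned from previous requests before the hierarchical lookup."""
        if mobile_id in self.request_cache:
            return self.request_cache[mobile_id]
        node = self
        # walk up the hierarchy until some level knows about the mobile object
        while node and mobile_id not in node.registry:
            node = node.parent
        location = node.registry.get(mobile_id) if node else None
        if location is not None:
            self.request_cache[mobile_id] = location  # learn from this request
        return location

# toy two-level hierarchy
region = LocationAgent("region")
area = LocationAgent("area-1", parent=region)
area.update("mobile-7", "cell-42")
print(area.search("mobile-7"))    # resolved locally: cell-42
print(region.search("mobile-7"))  # resolved at the regional level: area-1
```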