
    An Improved Binary Grey-Wolf Optimizer with Simulated Annealing for Feature Selection

    This paper proposes improvements to the binary grey-wolf optimizer (BGWO) for the feature selection (FS) problem, which arises from high data dimensionality and from irrelevant, noisy, and redundant data, so that machine learning algorithms can attain better classification/clustering accuracy in less training time. We propose three variants of BGWO in addition to the standard variant, applying different transfer functions to tackle the FS problem. Because BGWO generates continuous values and FS needs discrete values, a number of V-shaped, S-shaped, and U-shaped transfer functions were investigated for incorporation with BGWO to convert its continuous values to binary. We find that the performance of BGWO depends on the choice of transfer function. In the first variant, we reduce the local-minima problem by adding an exploration step that, with a certain probability, updates the position of a grey wolf randomly within the search space; this variant is abbreviated IBGWO. Next, a novel mutation strategy is proposed that selects a number of the worst grey wolves in the population and, based on a certain probability, updates each of them either toward the best solution or randomly within the search space. The number of worst grey wolves selected by this strategy increases linearly with the iterations. This strategy is combined with IBGWO to produce the second variant, abbreviated LIBGWO. In the last variant, simulated annealing (SA) is integrated with LIBGWO to search around the best-so-far solution at the end of each iteration in order to identify better solutions. The performance of the proposed variants was validated on 32 datasets taken from the UCI repository and compared with six wrapper feature selection methods. The experiments show the superiority of the proposed variants in producing better classification accuracy than the other selected wrapper feature selection algorithms.
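
    As a rough illustration of the binarization step described above, the sketch below applies an S-shaped (sigmoid) or V-shaped transfer function to a continuous wolf position; the function choices, the probabilistic thresholding rule, and all parameters are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def s_shaped(x):
    """S-shaped (sigmoid) transfer function: maps a continuous value into [0, 1]."""
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    """V-shaped transfer function: |tanh(x)| also maps into [0, 1]."""
    return np.abs(np.tanh(x))

def binarize(position, transfer=s_shaped, rng=None):
    """Convert a continuous wolf position into a binary feature mask.

    A feature is selected (bit = 1) when a uniform random draw falls below
    the transfer-function value for that dimension. (V-shaped variants often
    flip the current bit instead; this sketch keeps the simpler rule.)
    """
    rng = np.random.default_rng(0) if rng is None else rng
    probs = transfer(np.asarray(position, dtype=float))
    return (rng.random(probs.shape) < probs).astype(int)

# Example: a 10-dimensional continuous position -> binary feature subset
continuous_position = np.random.default_rng(1).normal(size=10)
print(binarize(continuous_position))
```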

    An Optimization Model for Appraising Intrusion-Detection Systems for Network Security Communications: Applications, Challenges, and Solutions

    Cyber-attacks are getting increasingly complex, and as a result, the functional concerns of intrusion-detection systems (IDSs) are becoming increasingly difficult to resolve. The credibility of security services, such as privacy preservation, authenticity, and accessibility, may be jeopardized if breaches are not detected. Different organizations currently utilize a variety of tactics, strategies, and technologies to protect their systems' credibility and combat these dangers. Safeguarding approaches include establishing rules and procedures, developing user awareness, deploying firewall and verification systems, regulating system access, and forming computer-issue management groups. However, the effectiveness of intrusion-detection systems is not sufficiently recognized. IDSs are used in businesses to examine potentially harmful tendencies occurring in technological environments. Determining an effective IDS is a complex task for organizations and requires consideration of many key criteria and their sub-aspects. To deal with these multiple and interrelated criteria and their sub-aspects, a multi-criteria decision-making (MCDM) approach was applied. Because these criteria and their sub-aspects can also involve ambiguity and uncertainty, they were treated using q-rung orthopair fuzzy sets (q-ROFSs) and q-rung orthopair fuzzy numbers (q-ROFNs). Additionally, the problem of combining expert and specialist opinions was handled using the q-rung orthopair fuzzy weighted geometric (q-ROFWG) operator. Initially, the entropy method was applied to assess the priorities of the key criteria and their sub-aspects. Then, the combined compromise solution (CoCoSo) method was applied to evaluate six IDSs according to their effectiveness and reliability. Afterward, comparative and sensitivity analyses were performed to confirm the stability, reliability, and performance of the proposed approach. The findings indicate that most of the IDSs appear to be systems with high potential. According to the results, Suricata, which relies on multi-threading performance, is the best IDS.
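
    As a rough, crisp-valued illustration of the entropy weighting step (not the q-rung orthopair fuzzy formulation used in the paper), the sketch below derives objective criteria weights from a decision matrix; the alternatives, criteria, and scores are made up for the example.

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Shannon-entropy weights for a crisp decision matrix (alternatives x criteria)."""
    X = np.asarray(decision_matrix, dtype=float)
    P = X / X.sum(axis=0)                      # normalize each criterion column
    m = X.shape[0]                             # number of alternatives
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(m)    # entropy of each criterion
    d = 1.0 - E                                # degree of divergence
    return d / d.sum()                         # normalized weights

# Hypothetical example: four IDS alternatives scored on three criteria
scores = [[0.8, 0.6, 0.9],
          [0.7, 0.9, 0.4],
          [0.9, 0.5, 0.7],
          [0.6, 0.8, 0.8]]
print(entropy_weights(scores))
```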

    Fast T wave detection calibrated by clinical knowledge with annotation of P and T waves

    There are limited studies on the automatic detection of T waves in arrhythmic electrocardiogram (ECG) signals, perhaps because no arrhythmia dataset with annotated T waves is available. There is a growing need to develop numerically efficient algorithms that can accommodate the new trend of battery-driven ECG devices, and to analyze long-term recordings in a reliable and time-efficient manner, thereby improving the diagnostic ability of mobile devices and point-of-care technologies. Here, the T wave annotation of the well-known MIT-BIH Arrhythmia Database is discussed and provided. Moreover, a simple, fast method for detecting T waves is introduced. A typical T wave detection method is reduced to a basic approach consisting of two moving averages and dynamic thresholds. The dynamic thresholds were calibrated using four clinically known types of sinus-node response to atrial premature depolarization (compensation, reset, interpolation, and reentry). T wave peaks are detected and the proposed algorithm is evaluated on two well-known databases, the QT and MIT-BIH Arrhythmia databases. The detector obtained a sensitivity of 97.14% and a positive predictivity of 99.29% over the first lead of the validation databases (221,186 beats in total). We present a simple yet very reliable T wave detection algorithm that can potentially be implemented on mobile battery-driven devices; in contrast to complex methods, it can be easily implemented in a digital filter design.
    Mohamed Elgendi, Bjoern Eskofier and Derek Abbott
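
    A minimal sketch of the two-moving-average idea is given below; the window lengths, the offset, and the absence of any pre-filtering are simplifying assumptions, not the clinically calibrated thresholds described in the paper.

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average with a rectangular window."""
    return np.convolve(x, np.ones(window) / window, mode="same")

def t_wave_blocks(ecg, fs, w_event=0.07, w_cycle=0.6, offset=0.02):
    """Mark candidate T-wave regions in a (QRS-suppressed) ECG segment.

    A short 'event' moving average is compared against a longer 'cycle'
    moving average plus an offset; samples where the short one is larger
    form blocks of interest. The window lengths (in seconds) and the
    offset are placeholders, not the paper's calibrated thresholds.
    """
    ma_event = moving_average(ecg, max(1, int(w_event * fs)))
    ma_cycle = moving_average(ecg, max(1, int(w_cycle * fs)))
    return ma_event > (ma_cycle + offset)
```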

    Thrombotic and Hemorrhagic Complications Following Left Ventricular Assist Device Placement: An Emphasis on Gastrointestinal Bleeding, Stroke, and Pump Thrombosis

    The left ventricular assist device (LVAD) is a mechanical circulatory support device that supports the heart failure patient as a bridge to transplant (BTT) or as a destination therapy for those who have other medical comorbidities or complications that disqualify them from meeting transplant criteria. In patients with severe heart failure, LVAD use has extended survival and improved signs and symptoms of cardiac congestion and low cardiac output, such as dyspnea, fatigue, and exercise intolerance. However, these devices are associated with specific hematologic and thrombotic complications. In this manuscript, we review the common hematologic complications of LVADs.

    Detection of a and b waves in the acceleration photoplethysmogram

    Background: Analyzing acceleration photoplethysmogram (APG) signals measured after exercise is challenging. In this paper, a novel algorithm that can detect a waves, and consequently b waves, under these conditions is proposed. Accurate a and b wave detection is an important first step for the assessment of arterial stiffness and other cardiovascular parameters. Methods: Nine algorithms based on fixed thresholding are compared, and a new algorithm is introduced to improve the detection rate using a testing set of heat-stressed APG signals containing a total of 1,540 heartbeats. Results: The new a wave detection algorithm demonstrates the highest overall detection accuracy (99.78% sensitivity, 100% positive predictivity) over signals that suffer from (1) non-stationary effects, (2) irregular heartbeats, and (3) low-amplitude waves. In addition, the proposed b wave detection algorithm achieved an overall sensitivity of 99.78% and a positive predictivity of 99.95%. Conclusions: The proposed algorithm presents an advantage for real-time applications by avoiding human intervention in threshold determination.
    Mohamed Elgendi, Ian Norton, Matt Brearley, Derek Abbott, and Dale Schuurman
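
    For reference, the two reported figures of merit are computed from true positives (TP), false positives (FP), and false negatives (FN) as in the generic sketch below; the counts in the example are hypothetical, not the study's.

```python
def sensitivity(tp, fn):
    """Sensitivity (recall): fraction of annotated waves that were detected."""
    return tp / (tp + fn)

def positive_predictivity(tp, fp):
    """Positive predictivity (precision): fraction of detections that are real waves."""
    return tp / (tp + fp)

# Hypothetical counts (not the study's) for a set of annotated beats
tp, fp, fn = 1520, 5, 20
print(f"Se = {sensitivity(tp, fn):.2%}, +P = {positive_predictivity(tp, fp):.2%}")
```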

    Arm movement speed assessment via a Kinect camera: A preliminary study in healthy subjects

    Background: Many clinical studies have shown that the arm movement of patients with neurological injury is often slow. In this paper, the speed of arm movements in healthy subjects is evaluated in order to validate the efficacy of using a Kinect camera for automated analysis. The consideration of arm movement appears trivial at first glance, but in reality it is a very complex neural and biomechanical process that can potentially be used for detecting neurological disorders. Methods: We recorded hand movements using a Kinect camera from 27 healthy subjects (21 males), with a mean age of 29 years, performing arm movements at three different arbitrary speeds: fast, medium, and slow. Results: Our developed algorithm is able to classify the three arbitrary speed classes with an overall error of 5.43% for interclass speed classification and 0.49% for intraclass classification. Conclusions: This is the first step toward laying the foundation for future studies that investigate abnormality in arm movement via use of a Kinect camera.
    Mohamed Elgendi, Flavien Picon, Nadia Magnenat-Thalmann and Derek Abbott
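
    As a simple illustration of how movement speed can be derived from Kinect joint tracking, the sketch below computes a mean hand speed from a sequence of 3-D joint positions; it is a generic example and does not reproduce the study's features or classifier.

```python
import numpy as np

def mean_hand_speed(joint_positions, fs):
    """Mean hand speed (m/s) from a sequence of 3-D joint positions.

    joint_positions: array of shape (n_frames, 3) in metres (e.g. a Kinect
    hand joint); fs is the camera frame rate in Hz.
    """
    p = np.asarray(joint_positions, dtype=float)
    step_lengths = np.linalg.norm(np.diff(p, axis=0), axis=1)
    return step_lengths.mean() * fs

# Synthetic example: a hand moving steadily along x at 0.2 m/s, sampled at 30 Hz
t = np.linspace(0.0, 2.0, 61)
trajectory = np.stack([0.2 * t, np.zeros_like(t), np.zeros_like(t)], axis=1)
print(mean_hand_speed(trajectory, fs=30))  # -> 0.2
```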

    Systolic peak detection in acceleration photoplethysmograms measured from emergency responders in tropical conditions

    Photoplethysmogram (PPG) monitoring is not only essential for critically ill patients in hospitals or at home, but also for those undergoing exercise testing. However, processing PPG signals measured after exercise is challenging, especially if the environment is hot and humid. In this paper, we propose a novel algorithm that can detect systolic peaks under challenging conditions, as in the case of emergency responders in tropical conditions. Accurate systolic-peak detection is an important first step for the analysis of heart rate variability. Algorithms based on local maxima-minima, first-derivative, and slope sum are evaluated, and a new algorithm is introduced to improve the detection rate. With 40 healthy subjects, the new algorithm demonstrates the highest overall detection accuracy (99.84% sensitivity, 99.89% positive predictivity). Existing algorithms, such as Billauer's, Li's, and Zong's, have comparable although lower accuracy. However, the proposed algorithm presents an advantage for real-time applications by avoiding human intervention in threshold determination. For best performance, we show that a combination of two event-related moving averages with an offset threshold has an advantage in detecting systolic peaks, even in heat-stressed PPG signals.
    Mohamed Elgendi, Ian Norton, Matt Brearley, Derek Abbott, Dale Schuurman
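
    A hedged sketch of the two event-related moving averages with an offset threshold is shown below; the window lengths and offset are placeholders rather than the values tuned in the paper, and pre-processing is reduced to simple clipping and squaring.

```python
import numpy as np

def detect_systolic_peaks(ppg, fs, w_peak=0.111, w_beat=0.667, offset=0.02):
    """Sketch of peak detection with two event-related moving averages.

    A short moving average (roughly one systolic-peak width) is compared
    against a longer one (roughly one beat) plus an offset; each contiguous
    region where the short average wins yields one candidate peak, taken as
    the local maximum of the PPG inside that region. Window lengths (seconds)
    and the offset are illustrative, not the paper's tuned parameters.
    """
    def ma(x, w):
        return np.convolve(x, np.ones(w) / w, mode="same")

    squared = np.clip(ppg, 0, None) ** 2          # emphasize systolic upstrokes
    ma_peak = ma(squared, max(1, int(w_peak * fs)))
    ma_beat = ma(squared, max(1, int(w_beat * fs)))
    interest = ma_peak > (ma_beat + offset)

    peaks, start = [], None
    for i, flag in enumerate(interest):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            peaks.append(start + int(np.argmax(ppg[start:i])))
            start = None
    if start is not None:
        peaks.append(start + int(np.argmax(ppg[start:])))
    return np.array(peaks)
```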

    Generalized β-conformal change and special Finsler spaces

    In this paper, we investigate the change of Finsler metrics $L(x,y) \to \bar{L}(x,y) = f(e^{\sigma(x)}L(x,y), \beta(x,y))$, which we refer to as a generalized $\beta$-conformal change. Under this change, we study some special Finsler spaces, namely, quasi C-reducible, semi C-reducible, C-reducible, $C_2$-like, $S_3$-like and $S_4$-like Finsler spaces. We also obtain the transformation of the T-tensor under this change and study some interesting special cases. We then impose a certain condition on the generalized $\beta$-conformal change, which we call the b-condition, and investigate the geometric consequences of this condition. Finally, we give the conditions under which a generalized $\beta$-conformal change is projective and generalize some known results in the literature.
    Comment: references added, some modifications performed, LaTeX file, 24 pages
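
    For orientation, the change and a few standard special cases can be written out as follows; this note is a reader's sketch, not quoted from the paper.

```latex
% Generalized beta-conformal change, as written in the abstract:
\[
  \bar{L}(x,y) \;=\; f\bigl(e^{\sigma(x)} L(x,y),\, \beta(x,y)\bigr).
\]
% Some familiar special cases (orientation only):
%   f(u,v) = u                 : conformal change,  \bar{L} = e^{\sigma(x)} L
%   \sigma = 0                 : beta-change,       \bar{L} = f(L, \beta)
%   \sigma = 0, f(u,v) = u + v : Randers change,    \bar{L} = L + \beta
```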