
    Next-generation cyber attack prediction for IoT systems: leveraging multi-class SVM and optimized CHAID decision tree

    Billions of devices are already online, making the Internet of Things (IoT) an essential part of daily life. However, the interconnected nature of IoT devices also leaves them open to cyber threats, and the number and sophistication of attacks on IoT systems have risen sharply in recent years. This paper proposes a next-generation cyber attack prediction framework for IoT systems that combines a multi-class support vector machine (SVM) with an improved CHAID decision tree. IoT traffic is classified with the multi-class SVM to identify different types of attacks, and the SVM model is then optimized with the CHAID decision tree, which prioritizes the attributes most relevant to attack categorization. The framework was evaluated on a real-world IoT traffic dataset, and the findings demonstrate its ability to categorize attacks accurately. By identifying the attributes that matter most for attack categorization, the framework improves the precision of the SVM model. The proposed technique focuses on network traffic characteristics that can indicate cybersecurity threats on IoT networks and affected network nodes; feature vectors were also constructed from the attributes collected on each IoT console. Evaluation on the Multistep Cyber-Attack Dataset (MSCAD) shows that the proposed CHAID decision tree predicts multi-stage cyber attacks with 99.72% accuracy. Such accurate prediction is essential for managing cyber attacks in real-time communication. Because of its efficiency and scalability, the model can forecast cyber attacks in real time, even in large IoT installations, and its computational efficiency allows accurate predictions to be made quickly enough for prompt detection and response. By locating possible attack entry points and mitigating them, the framework helps strengthen the security of IoT systems.
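    As a rough illustration of the two-stage approach described above, the sketch below ranks traffic attributes with a decision tree and trains a multi-class SVM on the top-ranked ones. It is only an outline: scikit-learn has no CHAID implementation, so a CART-style DecisionTreeClassifier stands in for it, and the CSV file name, label column, and feature count are placeholders rather than the MSCAD's actual format.

```python
# Two-stage sketch: a decision tree ranks features (stand-in for CHAID),
# then a multi-class SVM is trained on the top-ranked attributes only.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import classification_report

df = pd.read_csv("mscad_traffic.csv")                       # hypothetical export of the traffic features
X, y = df.drop(columns=["attack_label"]), df["attack_label"]  # "attack_label" is an assumed column name

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

# Stage 1: rank attributes with a decision tree.
tree = DecisionTreeClassifier(max_depth=8, random_state=42).fit(X_tr, y_tr)
top_idx = np.argsort(tree.feature_importances_)[::-1][:15]   # keep the 15 most informative columns

# Stage 2: multi-class SVM on the prioritized features only.
scaler = StandardScaler().fit(X_tr.iloc[:, top_idx])
svm = SVC(kernel="rbf", C=10, decision_function_shape="ovr")
svm.fit(scaler.transform(X_tr.iloc[:, top_idx]), y_tr)

print(classification_report(y_te, svm.predict(scaler.transform(X_te.iloc[:, top_idx]))))
```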

    Forecasting students' adaptability in online entrepreneurship education using modified ensemble machine learning model

    Entrepreneurship education has become essential in recent years, driven by the global demand for value creation, employability skills, and job creation. Entrepreneurial training equips students with the skills to create marketable and profitable solutions to emerging problems, and many aspiring entrepreneurs rely on technology to access this education online. This study presents a machine learning technique for predicting students' adaptability level in online entrepreneurship education. The suitability of several algorithms, including Random Forest, C5.0, CART, and an Artificial Neural Network, was examined using a Kaggle educational dataset. The algorithms achieved high accuracy, confirming the ability of machine learning techniques to forecast students' adaptation to online entrepreneurship training. The findings contribute to online entrepreneurship education by providing a reliable and efficient approach for predicting students' adaptability. The proposed modified ensemble machine learning model can assist educators and administrators in identifying students who may require additional support, tailoring instructional strategies, and designing targeted interventions to improve adaptability and the overall learning experience in online entrepreneurship education.
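    A minimal sketch of how such an ensemble could be assembled in scikit-learn is shown below. C5.0 has no scikit-learn counterpart and is omitted; the file name and target column refer loosely to the Kaggle students-adaptability data and are assumptions, not the paper's setup.

```python
# Soft-voting ensemble over the base learners named in the abstract (minus C5.0).
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

df = pd.read_csv("students_adaptability.csv")   # hypothetical local copy of the Kaggle data
y = df.pop("Adaptivity Level")                  # assumed target column name
X = pd.get_dummies(df)                          # one-hot encode the categorical predictors

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=300, random_state=0)),
        ("cart", DecisionTreeClassifier(max_depth=10, random_state=0)),
        ("ann", MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)),
    ],
    voting="soft",   # average class probabilities instead of a hard majority vote
)
print("mean CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```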

    Design and Implementation of an ML and IoT Based Adaptive Traffic-Management System for Smart Cities

    The rapid growth in the number of vehicles has led to traffic congestion, pollution, and logistics delays in metropolitan areas. The IoT is an emerging innovation that is moving the world towards automated processes and intelligent management systems, and it is a critical enabler of automation and smart cities. Effective and reliable congestion management and traffic control help save many precious resources. In an IoT-based intelligent traffic-management (ITM) system, a set of sensors embedded in vehicles and intelligent roadside devices recognizes, collects, and transmits data, while machine learning (ML) offers a further means of improving the transport system. Existing transport-management solutions face several challenges that result in traffic congestion, delay, and a high fatality rate. This work presents the design and implementation of an Adaptive Traffic-Management (ATM) system based on ML and IoT. The design is built around three essential entities: vehicles, infrastructure, and events, and it considers various scenarios to cover the possible issues of the transport system. The proposed ATM system also uses the ML-based DBSCAN clustering method to detect accident-related anomalies. The ATM model continually updates traffic signal schedules according to traffic volume and the estimated movements from nearby crossings; it significantly lowers travel time by moving vehicles progressively through green signals and reduces congestion by producing smoother transitions. The experimental outcomes reveal that the proposed ATM system significantly outperforms the conventional traffic-management strategy and is a strong candidate for transportation planning in smart-city transport systems. The proposed solution minimizes vehicle waiting times and congestion, reduces road accidents, and improves the overall journey experience.
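    The anomaly-detection step can be sketched as follows: DBSCAN clusters roadside sensor readings, and any point labelled as noise (-1) is flagged as a possible incident. The synthetic features (speed, headway, lane occupancy) and the DBSCAN parameters are illustrative assumptions only.

```python
# DBSCAN-based anomaly check on synthetic traffic readings.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
normal = rng.normal(loc=[50.0, 20.0, 0.4], scale=[5.0, 3.0, 0.05], size=(500, 3))
incident = np.array([[3.0, 1.0, 0.95]])        # a stalled, tightly packed lane
readings = np.vstack([normal, incident])       # columns: speed (km/h), headway (m), occupancy

labels = DBSCAN(eps=0.7, min_samples=10).fit_predict(StandardScaler().fit_transform(readings))
anomalies = np.where(labels == -1)[0]          # noise points are treated as suspected incidents
print("suspected incident readings at rows:", anomalies)
```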

    Enhanced Convolutional Neural Network Model for Cassava Leaf Disease Identification and Classification

    Cassava is a crucial food and nutrition security crop cultivated by small-scale farmers, and it can survive in harsh environments. It is a significant source of carbohydrates in African countries. Cassava crops can be infected by leaf diseases, affecting overall production and reducing farmers’ income. Existing Cassava disease research faces several challenges, such as poor detection rates, high processing times, and low accuracy. This research provides a comprehensive learning strategy for real-time Cassava leaf disease identification based on an enhanced CNN model (ECNN). The standard CNN model relies on extensive data processing, which increases the computational overhead. In the proposed ECNN model, a depth-wise separable convolution layer is used to address this issue, reducing the feature count and the computational overhead. The proposed ECNN model uses a distinct block-processing feature to handle imbalanced images and a Gamma correction feature to resolve the color segregation issue. To reduce the variable selection effort and increase computational efficiency, the model uses global average pooling with batch normalization. An experimental analysis was performed on an online Cassava image dataset containing 6256 images of Cassava leaves with five classes: class 0: “Cassava Bacterial Blight (CBB)”; class 1: “Cassava Brown Streak Disease (CBSD)”; class 2: “Cassava Green Mottle (CGM)”; class 3: “Cassava Mosaic Disease (CMD)”; and class 4: “Healthy”. Performance measures, i.e., precision, recall, F-measure, and accuracy, were calculated for the standard CNN and the proposed ECNN model. The proposed ECNN classifier significantly outperforms the standard CNN and achieves 99.3% accuracy on the balanced dataset. The test findings show that using a balanced image database improves classification performance.
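    The building blocks named in this abstract, depth-wise separable convolutions, batch normalization, and global average pooling in place of a large dense head, can be sketched in Keras as below. Layer widths, depth, and input resolution are assumptions, not the paper's architecture.

```python
# Minimal ECNN-like classifier built from separable convolutions,
# batch normalization, and global average pooling.
import tensorflow as tf
from tensorflow.keras import layers, models

def ecnn_like(num_classes: int = 5, input_shape=(224, 224, 3)) -> tf.keras.Model:
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.SeparableConv2D(32, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.SeparableConv2D(64, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.MaxPooling2D(),
        layers.SeparableConv2D(128, 3, padding="same", activation="relu"),
        layers.BatchNormalization(),
        layers.GlobalAveragePooling2D(),   # replaces flatten + large dense layer
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    return model

model = ecnn_like()
model.summary()
```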

    Hybrid CNN-LSTM model with efficient hyperparameter tuning for prediction of Parkinson’s disease

    Vocal changes caused by Parkinson’s disease (PD) can be identified early, allowing management before physically incapacitating symptoms appear. This work examines static as well as dynamic speech characteristics that are relevant to PD identification. Speech changes and communication issues are among the challenges that people with Parkinson’s may encounter, so avoiding the potential consequences of disease-related speech difficulties depends on obtaining the appropriate diagnosis early. The speech signals of PD patients differ significantly from those of healthy individuals. This research presents a hybrid CNN-LSTM model that uses enhanced speech signals with dynamic feature decomposition. The model combines a pre-trained CNN with an LSTM to recognize PD from linguistic features, using Mel-spectrograms derived from the normalized voice signal and dynamic mode decomposition. The hybrid model works in several phases: noise removal, extraction of Mel-spectrograms, feature extraction with the pre-trained CNN model ResNet-50, and a final classification stage. An experimental analysis was performed on the PC-GITA dataset. The proposed hybrid model is compared with a traditional neural network and well-known machine learning models, CART, SVM, and XGBoost, which achieve accuracies of 72.69%, 84.21%, 73.51%, and 90.81%, respectively. Under tenfold cross-validation with dataset splits in which no individual’s samples overlap between folds, the proposed hybrid model achieves an accuracy of 93.51%, significantly outperforming traditional ML models that rely on static features for detecting Parkinson’s disease.
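    The described pipeline (normalized voice signal, Mel-spectrogram, pre-trained ResNet-50 features, LSTM classifier) might look roughly like the sketch below. The sample rate, chunking, LSTM size, and the omission of ImageNet preprocessing are assumptions for illustration, not the paper's settings.

```python
# Sketch: voice -> Mel-spectrogram image -> frozen ResNet-50 embedding -> LSTM head.
import numpy as np
import librosa
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

def mel_image(path: str, sr: int = 16000) -> np.ndarray:
    y, _ = librosa.load(path, sr=sr)
    y = y / (np.max(np.abs(y)) + 1e-9)                   # simple amplitude normalization
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=224)
    mel_db = librosa.power_to_db(mel, ref=np.max)
    img = np.stack([mel_db] * 3, axis=-1)                # repeat to 3 channels for ResNet input
    return tf.image.resize(img, (224, 224)).numpy()

# Frozen ResNet-50 turns each spectrogram chunk into a 2048-d embedding
# (ImageNet preprocessing omitted for brevity in this sketch).
backbone = ResNet50(weights="imagenet", include_top=False, pooling="avg")
backbone.trainable = False

# LSTM head over a sequence of per-chunk embeddings (binary PD vs. healthy).
head = models.Sequential([
    layers.Input(shape=(None, 2048)),                    # (time steps, embedding size)
    layers.LSTM(64),
    layers.Dense(1, activation="sigmoid"),
])
head.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```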

    Predicting the risk of heart failure based on clinical data

    Cardiovascular disease (CVD) is the disorder that directly impacts the heart and the blood vessels in the body. According to World Health Organization (WHO) reports, CVDs are the leading cause of mortality worldwide, claiming the lives of nearly 23.6 million people annually. CVD encompasses coronary heart disease, strokes and transient ischaemic attacks (TIA), peripheral arterial disease, and aortic disease. Most CVD fatalities are caused by strokes and heart attacks, and an estimated one third of these deaths currently occur before the age of 60. The New York Heart Association (NYHA) categorizes the stages of heart failure as Class I: no symptoms; Class II: mild symptoms; Class III: comfortable only at rest; Class IV: severe condition, patient is bed-bound; and Class V: class cannot be determined. Machine learning-based methods play an essential role in clinical data analysis. This research examines the importance of various clinical attributes related to heart disease using a hybrid machine learning model, SVM-GA, which combines a Support Vector Machine (SVM) with a Genetic Algorithm (GA). The analysis uses an online dataset available in the UCI machine learning repository containing the medical records of 299 patients who suffered heart failure and are classified as NYHA Class III or IV. The dataset was collected during the patients' follow-up and check-up period and includes thirteen clinical characteristics. The proposed machine learning models were used to calculate feature importance. The proposed model and existing well-known machine learning models, i.e., the Bayesian Generalized Linear Model, ANN, Bagged CART, Bag Earth, and SVM, were implemented in Python, and performance measures, i.e., accuracy, processing time, precision, recall, and F-measure, were calculated. Experimental analysis shows that the proposed SVM-GA model outperforms the existing methods in terms of accuracy, processing time, precision, recall, and F-measure.
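    A toy version of the SVM-GA idea is sketched below: a small genetic algorithm searches for the feature subset that maximizes cross-validated SVM accuracy on the heart-failure records. The file name, GA settings, and the use of plain feature selection (rather than whatever the paper's GA actually tunes) are assumptions for illustration.

```python
# Toy GA-driven feature selection with an SVM fitness function.
import numpy as np
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

df = pd.read_csv("heart_failure_clinical_records_dataset.csv")   # assumed UCI file and column names
X, y = df.drop(columns=["DEATH_EVENT"]).values, df["DEATH_EVENT"].values

rng = np.random.default_rng(1)
n_features, pop_size, generations = X.shape[1], 20, 15

def fitness(mask: np.ndarray) -> float:
    if not mask.any():
        return 0.0
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

pop = rng.integers(0, 2, size=(pop_size, n_features)).astype(bool)
for _ in range(generations):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-pop_size // 2:]]            # keep the fitter half
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_features)                         # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_features) < 0.05                      # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", list(df.drop(columns=["DEATH_EVENT"]).columns[best]))
```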

    Comparative analysis of metaheuristic load balancing algorithms for efficient load balancing in cloud computing

    Load balancing is a serious problem in cloud computing: it makes it challenging to maintain proper service operation with respect to Quality of Service, performance assessment, and compliance with the service contracts that organizations demand from cloud service providers (CSPs). The primary objective of load balancing is to map workloads onto computing resources in a way that significantly improves performance. Load balancing in cloud computing is an NP-hard problem because of its vast solution space, so finding the best possible solution takes considerable time, and few techniques can produce an ideal solution in polynomial time. Previous research has shown that metaheuristic strategies can reach accurate solutions in a reasonable time for such problems. This paper provides a comparative analysis of various metaheuristic load balancing algorithms for cloud computing based on performance factors, i.e., makespan, degree of imbalance, response time, data center processing time, flow time, and resource utilization. The simulation results show the performance of the various metaheuristic load balancing methods on these factors, and the Particle Swarm Optimization method performs best in improving makespan, flow time, throughput time, response time, and degree of imbalance.
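    To make the comparison concrete, the sketch below applies Particle Swarm Optimization to a toy load-balancing instance, where each particle encodes a task-to-VM assignment and fitness is the makespan (completion time of the busiest VM). Task lengths, VM speeds, and swarm settings are invented for the example and do not reflect the paper's simulation setup.

```python
# PSO over task-to-VM assignments, minimizing makespan.
import numpy as np

rng = np.random.default_rng(7)
task_len = rng.integers(100, 1000, size=40)        # instruction counts for 40 tasks
vm_mips = np.array([500, 750, 1000, 1500])         # processing speeds of 4 VMs
n_vm, n_tasks = len(vm_mips), len(task_len)

def makespan(position: np.ndarray) -> float:
    assign = np.clip(np.rint(position), 0, n_vm - 1).astype(int)
    load = np.zeros(n_vm)
    np.add.at(load, assign, task_len)              # total work sent to each VM
    return float(np.max(load / vm_mips))           # busiest VM's completion time

swarm, iters, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.uniform(0, n_vm - 1, size=(swarm, n_tasks))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([makespan(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, n_vm - 1)
    vals = np.array([makespan(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best makespan found:", makespan(gbest))
```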