    A Novel Hierarchical Extreme Machine-Learning-Based Approach for Linear Attenuation Coefficient Forecasting

    The development of reinforced polymer composite materials has had a significant influence on the challenging problem of shielding against high-energy photons, particularly X-rays and γ-rays, in industrial and healthcare facilities. The shielding characteristics of heavy materials hold considerable potential for reinforcing concrete blocks. The mass attenuation coefficient is the main physical factor used to measure the narrow-beam γ-ray attenuation of various combinations of magnetite and mineral powders with concrete. Data-driven machine learning approaches can be investigated to assess the gamma-ray shielding behavior of composites as an alternative to theoretical calculations, which are often time- and resource-intensive during workbench testing. We developed a dataset using magnetite and seventeen mineral powder combinations at different densities and water/cement ratios, exposed to photon energies ranging from 1 to 1006 kiloelectronvolts (keV). The National Institute of Standards and Technology (NIST) photon cross-section database and software (XCOM) was used to compute the concrete's γ-ray linear attenuation coefficients (LACs). The XCOM-calculated LACs and seventeen mineral powders were then modeled with a range of machine learning (ML) regressors. The goal was to investigate whether the available dataset and XCOM-simulated LACs can be replicated by ML techniques in a data-driven approach. The mean absolute error (MAE), root mean square error (RMSE), and R² score were employed to assess the performance of our proposed ML models: a support vector machine (SVM), 1D convolutional neural network (CNN), multi-layer perceptron (MLP), linear regressor, decision tree, hierarchical extreme learning machine (HELM), extreme learning machine (ELM), and random forest networks. Comparative results showed that our proposed HELM architecture outperformed state-of-the-art SVM, decision tree, polynomial regressor, random forest, MLP, CNN, and conventional ELM models. Stepwise regression and correlation analysis were further used to evaluate the forecasting capability of the ML techniques against the benchmark XCOM approach. According to the statistical analysis, the HELM model showed strong consistency between XCOM and predicted LAC values. Additionally, the HELM model was more accurate than the other models in this study, yielding the highest R² score and the lowest MAE and RMSE.
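
    As a minimal sketch of the evaluation protocol the abstract describes (train/test split, fit several regressors, score each with MAE, RMSE, and R²), the snippet below uses placeholder random data and off-the-shelf scikit-learn regressors; the feature columns and array shapes are illustrative, not the authors' dataset or HELM implementation.

```python
# Fit several regressors on mix-design features and score predicted LAC
# values against (here, synthetic placeholder) targets with MAE, RMSE, R^2.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.random((500, 4))   # placeholder: density, w/c ratio, photon energy, powder fraction
y = rng.random(500)        # placeholder: XCOM-computed LAC

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "SVM": SVR(),
    "Decision tree": DecisionTreeRegressor(random_state=0),
    "Random forest": RandomForestRegressor(random_state=0),
    "MLP": MLPRegressor(max_iter=2000, random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: MAE={mae:.4f} RMSE={rmse:.4f} R2={r2_score(y_te, pred):.4f}")
```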

    Arabic Sentiment Analysis Based on Word Embeddings and Deep Learning

    Get PDF
    Social media networks have grown exponentially over the last two decades, giving internet users the opportunity to communicate and exchange ideas on a variety of topics. As a result, opinion mining plays a crucial role in analyzing user opinions and applying them to guide choices, making it one of the most popular areas of research in the field of natural language processing. Although several languages, including English, have been studied extensively, comparatively little work has been conducted on the Arabic language. Its morphological complexity and numerous dialects make semantic analysis particularly challenging, and the lack of accurate pre-processing tools and limited resources are further constraints. This study was motivated by the accomplishments of deep learning algorithms and word embeddings in English sentiment analysis. Extensive supervised machine learning experiments were conducted in which word embeddings were exploited to determine the sentiment of Arabic reviews. Three deep learning models were introduced: convolutional neural networks (CNNs), long short-term memory (LSTM), and a hybrid CNN-LSTM. The models used features learned by word embeddings such as Word2Vec and fastText rather than hand-crafted features. They were tested, with different setups, on two benchmark Arabic datasets: the Hotel Arabic Reviews Dataset (HARD) for hotel reviews and the Large-Scale Arabic Book Reviews (LABR) dataset for book reviews. Comparative experiments utilized the three models with the two word embeddings and different setups of the datasets. The main novelty of this study is to explore the effectiveness of various word embeddings and different setups of the benchmark datasets with respect to balance, imbalance, and binary and multi-class classification. Findings showed that, in most cases, the best results were obtained when applying the fastText word embedding on the HARD 2-imbalance dataset for all three proposed models: CNN, LSTM, and CNN-LSTM. Further, with fastText the proposed CNN model outperformed the LSTM and CNN-LSTM models on the benchmark HARD dataset, the three models achieving 94.69%, 94.63%, and 94.54% accuracy, respectively. Although the worst results were obtained on the LABR 3-imbalance dataset using both Word2Vec and fastText, they still outperformed other researchers' state-of-the-art outcomes on the same dataset.
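
    A minimal Keras sketch of the hybrid CNN-LSTM described above: a frozen embedding layer (which would hold pretrained fastText or Word2Vec vectors) feeds a 1D convolution whose feature maps are summarized by an LSTM before binary polarity classification. The vocabulary size, sequence length, and zero-filled embedding matrix are placeholders, not the paper's configuration.

```python
# Hybrid CNN-LSTM for sentiment classification over pretrained embeddings.
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.initializers import Constant
from tensorflow.keras.layers import Input, Embedding, Conv1D, MaxPooling1D, LSTM, Dense

vocab_size, seq_len, embed_dim = 20000, 100, 300
# Placeholder; in practice, filled with fastText/Word2Vec vectors per token.
embedding_matrix = np.zeros((vocab_size, embed_dim), dtype="float32")

model = Sequential([
    Input(shape=(seq_len,)),
    Embedding(vocab_size, embed_dim,
              embeddings_initializer=Constant(embedding_matrix),
              trainable=False),
    Conv1D(128, 5, activation="relu"),   # local n-gram features
    MaxPooling1D(pool_size=2),
    LSTM(128),                           # sequential summary of conv features
    Dense(1, activation="sigmoid"),      # binary polarity; softmax for multi-class
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```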

    IoT-Based Cotton Plant Pest Detection and Smart-Response System

    IoT technology and drones are indeed a step towards modernization. Everything from field monitoring to pest identification is being conducted through these technologies. In this paper, we consider the issue of smart pest detection and management of cotton plants, an important crop for agricultural countries. We propose an IoT framework that detects insects through motion-detection sensors and triggers an automatic response via drone-based targeted spraying. We also explore the use of drones to improve field surveillance and propose a predictive algorithm for the pest-detection response system based on decision-making theory. To validate the behavior of our framework, we include simulation results for the tested scenarios in the CupCarbon IoT simulator. The purpose of our work is to modernize pest management so that farmers can not only attain higher profits but also increase the quantity and quality of their crops.
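
    The abstract does not spell out the decision-making rule, so the following is only a plausible sketch in its spirit: dispatch the spray drone to a field zone when the expected crop loss implied by motion-sensor activity exceeds the cost of a spray sortie. All names, costs, and thresholds below are hypothetical.

```python
# Illustrative decision-theoretic pest-response rule (not the paper's algorithm).
from dataclasses import dataclass

@dataclass
class ZoneReading:
    zone_id: int
    motion_events: int   # motion-sensor triggers in the last window
    crop_value: float    # estimated value of the crop in this zone

SPRAY_COST = 40.0        # hypothetical per-sortie cost (fuel + pesticide)
LOSS_PER_EVENT = 0.02    # hypothetical crop-loss fraction per pest event

def should_spray(reading: ZoneReading) -> bool:
    # Expected loss if we do nothing, capped at total crop value.
    expected_loss = min(1.0, reading.motion_events * LOSS_PER_EVENT) * reading.crop_value
    return expected_loss > SPRAY_COST

for r in [ZoneReading(1, 3, 500.0), ZoneReading(2, 12, 800.0)]:
    action = "dispatch drone" if should_spray(r) else "keep monitoring"
    print(f"zone {r.zone_id}: {action}")
```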

    Customer Profiling Using Internet of Things Based Recommendations

    The digital revolution has caused major changes in the world: not only are people increasingly connected, but companies are also turning more to the use of intelligent systems. The large amount of information provided about each product by e-commerce websites may confuse customers in their choices. Recommendation systems and the Internet of Things (IoT) are being used by an increasing number of e-commerce websites to help customers find products that fit their profile and purchase what they have already chosen. This paper proposes a novel IoT-based system that serves as the foundation for creating a profile, which stores all the contextual data, personalizes the content, and creates a personal profile for each user. In addition, customer segmentation is used to determine which items the client wants. Next, statistical analysis is performed on the extracted data, where feelings, state of mind, and categorization play a critical role in forecasting what customers think about products, services, and so on. We assess the accuracy of the forecasts to identify the most appropriate products based on multi-source data, thanks to the IoT, which assigns a digital footprint linking customers, processes, and things through identity-based information and recommendations; this is implemented using a Raspberry Pi and sensors such as a camera. Moreover, we perform experiments on the recommendation system to gauge the precision of its predictions and recommendations.
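
    The abstract names customer segmentation but not a specific algorithm; one common way to realize it, sketched below under that assumption, is k-means clustering over per-user behavioral features, after which items popular within a user's segment can be recommended. The features and cluster count are illustrative.

```python
# Segment users by behavioral features, then assign a new user to a segment.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
profiles = rng.random((200, 3))   # placeholder: visit frequency, basket value, category affinity

scaler = StandardScaler().fit(profiles)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=1).fit(scaler.transform(profiles))

new_user = rng.random((1, 3))     # placeholder profile built from IoT/contextual data
segment = kmeans.predict(scaler.transform(new_user))[0]
print(f"new user assigned to segment {segment}")
# Recommendation step (not shown): rank items by popularity within this segment.
```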

    COV-CTX: A Deep Learning Approach to Detect COVID-19 from Lung CT and X-Ray Images

    With the massive outbreak of the coronavirus (COVID-19) disease, the demand for automatic and quick detection of COVID-19 has become a crucial challenge for scientists around the world. Many researchers are working on finding an automated and effective system for detecting COVID-19, and have found that computed tomography (CT) scans and X-ray images of COVID-19-infected patients can provide more accurate and faster results. In this paper, an automated system named COV-CTX is proposed that can detect COVID-19 from CT-scan and X-ray images. The system consists of three different CNN models: VGG16, VGG16-InceptionV3-ResNet50, and Francois CNN. The models are trained with CT-scan and X-ray images individually to classify COVID-19 and non-COVID patients. Finally, the results of the models are combined into a voting ensemble of classifiers to ensure more accurate and precise results. The three models are trained and validated with 9412 CT-scan images (4756 COVID-positive and 4656 non-COVID) and 3257 X-ray images (1647 COVID-positive and 1610 non-COVID). The proposed system, COV-CTX, achieves up to 96.37% accuracy, 96.71% precision, 96.02% F1-score, 97.24% sensitivity, 95.35% specificity, and a 92.68% Cohen's kappa score for CT-scan-based COVID-19 detection, and 99.23% accuracy, 99.37% precision, 99.22% F1-score, 99.39% sensitivity, 99.07% specificity, and a 98.46% Cohen's kappa score for X-ray-based COVID-19 detection.
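
    The voting-ensemble step can be sketched compactly: each of the three CNNs emits a COVID-positive probability per image, the probabilities are thresholded into hard labels, and the majority label wins. The probability values below are placeholders, not model outputs from the paper.

```python
# Majority-vote ensemble over per-model COVID-positive probabilities.
import numpy as np

def majority_vote(probs: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """probs: (n_models, n_images) array of COVID-positive probabilities."""
    votes = (probs >= threshold).astype(int)             # per-model hard labels
    return (votes.sum(axis=0) > probs.shape[0] // 2).astype(int)

probs = np.array([
    [0.91, 0.40, 0.75],   # placeholder outputs of model 1 on three images
    [0.88, 0.55, 0.30],   # model 2
    [0.95, 0.35, 0.60],   # model 3
])
print(majority_vote(probs))   # -> [1 0 1]
```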

    Double Cloak Area Approach for Preserving Privacy and Reliability of Crowdsourcing Data

    Crowdsourcing has emerged as a pivotal data source for diverse smart city applications, ranging from health and traffic to security and safety. However, the integration of users' location data in crowdsourced information poses a significant privacy challenge. Current privacy-protection approaches for location-based services have become inadequate against evolving attacker techniques and tools. Moreover, these protection methods ignore the issue of preserving the accuracy and reliability of the data. This paper introduces a novel approach, termed Double Cloak Area (DCL-Ar), designed to effectively safeguard users' location privacy and ensure the reliability of crowdsourced data. DCL-Ar differentiates itself by offering dual-layer identity protection: in the first layer, users create an initial cloak zone, while in the second layer, fog nodes establish an extended cloak zone. Furthermore, the proposed method introduces three distinct scenarios for managing collaboration among fog nodes to select the optimal anonymizer and to address the limitations of existing protection methods in preserving the reliability and accuracy of data. DCL-Ar maintains maximum entropy, achieving complete uncertainty about user locations and thereby ensuring a high level of privacy protection. Through simulation and comparative analysis, the efficacy of the proposed approach is demonstrated: it provides a superior privacy level without significant performance cost. Experimental results show that DCL-Ar outperforms traditional methods, improving cache hit ratios and response times while reducing server query loads. Specifically, our approach reduces the number of queries sent to the service provider (SP) by up to 50% compared to existing methods and maintains a cache hit ratio of nearly 100% over time, further improving on the traditional cloak-area and other protection approaches.
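
    The entropy metric the abstract cites is the standard one for cloak-area privacy: if an attacker assigns probability p_i to each of the k users in a cloak zone being the true requester, privacy is measured by H = -Σ p_i log₂ p_i, which peaks at log₂ k when the users are indistinguishable. A small worked example (with hypothetical probability values):

```python
# Location-privacy entropy of a cloak zone: maximal when users are indistinguishable.
import math

def location_entropy(probs: list[float]) -> float:
    return -sum(p * math.log2(p) for p in probs if p > 0)

k = 8
uniform = [1 / k] * k           # ideal cloak: attacker learns nothing
skewed = [0.65] + [0.05] * 7    # leaky cloak: one user stands out
print(f"ideal cloak: {location_entropy(uniform):.3f} bits (log2 {k} = {math.log2(k):.3f})")
print(f"leaky cloak: {location_entropy(skewed):.3f} bits")
```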

    Two-Stage Classification Model for the Prediction of Heart Disease Using IoMT and Artificial Intelligence

    Internet of Things (IoT) technology has recently been applied in healthcare systems as the Internet of Medical Things (IoMT) to collect sensor information for the diagnosis and prognosis of heart disease. The main objective of the proposed research is to classify data and predict heart disease using medical data and medical images. The proposed model is a medical data classification and prediction model that operates in two stages; if the result from the first stage is sufficient to predict heart disease, stage two is not needed. In the first stage, data gathered from medical sensors affixed to the patient's body are classified; in stage two, echocardiogram image classification is performed for heart disease prediction. A hybrid linear discriminant analysis with modified ant lion optimization (HLDA-MALO) technique was used for sensor data classification, while a hybrid Faster R-CNN with SE-ResNet-101 model was used for echocardiogram image classification. Both classification methods were carried out, and the classification findings were consolidated and validated to predict heart disease. The HLDA-MALO method obtained 96.85% accuracy in detecting normal sensor data and 98.31% accuracy in detecting abnormal sensor data. The proposed hybrid Faster R-CNN with SE-ResNet-101 transfer learning model performed better in classifying echocardiogram images, with 98.06% precision, 98.95% recall, 96.32% specificity, a 99.02% F-score, and a maximum accuracy of 99.15%.
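
    The two-stage control flow can be sketched as follows: stage 1 classifies the sensor data, and only when its confidence falls below a threshold is stage 2 (echocardiogram image classification) invoked. The classifiers and the confidence threshold below are stand-ins, not the paper's HLDA-MALO or Faster R-CNN models.

```python
# Two-stage prediction: escalate to image classification only when needed.
from typing import Callable
import numpy as np

def two_stage_predict(sensor_x: np.ndarray,
                      image_x: np.ndarray,
                      stage1: Callable[[np.ndarray], float],
                      stage2: Callable[[np.ndarray], float],
                      confidence: float = 0.9) -> tuple[int, str]:
    p = stage1(sensor_x)                 # P(heart disease | sensor data)
    if max(p, 1 - p) >= confidence:      # stage 1 is decisive -> stop here
        return int(p >= 0.5), "stage 1 (sensor data)"
    p = stage2(image_x)                  # otherwise escalate to echocardiogram
    return int(p >= 0.5), "stage 2 (echocardiogram)"

# Toy stand-in classifiers: stage 1 is uncertain, so stage 2 decides.
label, used = two_stage_predict(np.zeros(8), np.zeros((64, 64)),
                                stage1=lambda x: 0.7,
                                stage2=lambda x: 0.95)
print(label, "via", used)
```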

    A Simulation Model for Forecasting COVID-19 Pandemic Spread: Analytical Results Based on the Current Saudi COVID-19 Data

    The coronavirus (COVID-19) pandemic spread worldwide during the first half of 2020. As was the case for many countries, the Kingdom of Saudi Arabia (KSA), where the number of reported cases exceeded 392,000 in the first week of April 2021, was heavily affected. In this study, we introduce a new simulation model to examine the pandemic's evolution in two major cities in KSA, namely Riyadh (the capital) and Jeddah (the second-largest city), and use it to estimate and predict the number of COVID-19 infections in the coming months. A major advantage of this model is that it is based on real data for KSA, which makes it more realistic. Furthermore, this paper examines the model parameters in order to better understand, and more accurately predict, the shape of the infection curve, particularly in KSA. The obtained results show the importance of several parameters in reducing the pandemic's spread: the infection rate, social distancing, and the walking distance of individuals. Through this work, we aim to raise the awareness of the public and officials about the seriousness of future pandemic waves. In addition, we analyze the current data on infected cases in KSA using a novel Gaussian curve fitting method; the results show that the expected pandemic curve is flattening, which is reflected in the real infection data. We also propose a new method to predict new cases. Experimental results on KSA's updated case counts reveal that the proposed method outperforms some current prediction techniques and is therefore more efficient in fighting possible future pandemics.
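
    In the spirit of the Gaussian curve fitting the abstract mentions, the sketch below fits a Gaussian to a daily-new-cases series with scipy and projects ahead; the synthetic series and parameter values are illustrative, not KSA case data.

```python
# Fit a Gaussian to daily new cases and extrapolate a short-term forecast.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, a, mu, sigma):
    return a * np.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

days = np.arange(120)
# Synthetic placeholder series: a noisy Gaussian wave, not real case counts.
cases = gaussian(days, 400, 60, 15) + np.random.default_rng(2).normal(0, 10, 120)

(a, mu, sigma), _ = curve_fit(gaussian, days, cases, p0=[cases.max(), 60, 10])
forecast = gaussian(days[-1] + 14, a, mu, sigma)   # project 14 days ahead
print(f"fitted peak ~day {mu:.0f}; expected new cases in 14 days: {forecast:.0f}")
```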