
    Machine learning-based models to predict the conversion of normal blood pressure to hypertension within 5-year follow-up

    BACKGROUND: Factors contributing to the development of hypertension vary significantly across countries and regions. Our objective was to predict individuals at risk of developing hypertension within a 5-year period in a rural Middle Eastern area. METHODS: This longitudinal study utilized data from the Fasa Adults Cohort Study (FACS). The study initially included 10,118 participants aged 35–70 years in rural districts of Fasa, Iran, with a follow-up of 3,000 randomly sampled participants after 5 years. A total of 160 variables were included in the machine learning (ML) models, and feature scaling and one-hot encoding were employed for data processing. Ten supervised ML algorithms were utilized: logistic regression (LR), support vector machine (SVM), random forest (RF), Gaussian naive Bayes (GNB), linear discriminant analysis (LDA), k-nearest neighbors (KNN), gradient boosting machine (GBM), extreme gradient boosting (XGB), CatBoost (CAT), and light gradient boosting machine (LGBM). Hyperparameter tuning was performed over various combinations of hyperparameters to identify the optimal model. The Synthetic Minority Over-sampling Technique (SMOTE) was used to balance the training data, and feature selection was conducted using SHapley Additive exPlanations (SHAP). RESULTS: Of the 2,288 participants who met the criteria, 251 (10.9%) were diagnosed with new hypertension. The LGBM model (determined to be the optimal model) with the top 30 features achieved an AUC of 0.67, an F1-score of 0.23, and an AUC-PR of 0.26. The top three predictors of hypertension were baseline systolic blood pressure (SBP), gender, and waist-to-hip ratio (WHR), with AUCs of 0.66, 0.58, and 0.63, respectively. Hematuria on urinalysis and family history of hypertension ranked fourth and fifth.
CONCLUSION: ML models have the potential to be valuable decision-making tools for evaluating the need for early lifestyle modification or medical intervention in individuals at risk of developing hypertension.
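The class-balancing step named in the abstract can be illustrated with a minimal numpy sketch of the interpolation SMOTE performs: each synthetic minority sample lies on the segment between a real minority sample and one of its k nearest minority-class neighbors. This is a sketch of the technique, not the implementation the authors used; the array sizes and the toy minority class below are invented for illustration.

```python
import numpy as np

def smote_like_oversample(X_min, n_new, k=5, rng=None):
    """Generate synthetic minority samples by interpolating each chosen
    sample toward one of its k nearest minority-class neighbors."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # a point is not its own neighbor
    neighbors = np.argsort(d, axis=1)[:, :k]     # k nearest neighbors per sample
    base = rng.integers(0, n, size=n_new)        # samples to interpolate from
    nb = neighbors[base, rng.integers(0, k, size=n_new)]
    gap = rng.random((n_new, 1))                 # interpolation fraction in [0, 1)
    return X_min[base] + gap * (X_min[nb] - X_min[base])

rng = np.random.default_rng(0)
X_minority = rng.normal(size=(25, 4))            # stand-in for the minority class
X_synth = smote_like_oversample(X_minority, n_new=100, rng=1)
print(X_synth.shape)                             # (100, 4)
```

In practice the oversampling is applied only to the training split, as the abstract notes, so the test set keeps the true ~11% prevalence.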

    Performance of the LGBM model with different numbers of features.

    Abbreviations: LGBM, Light Gradient Boosting Machine; AUC, Area Under the ROC Curve; ROC, Receiver Operating Characteristic; AUC-PR, Area Under the Precision-Recall Curve. (DOCX)
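The table's procedure (rank features once, then refit on progressively smaller top-k subsets and record performance) can be sketched as follows. scikit-learn's GradientBoostingClassifier stands in for LGBM, the data are synthetic with roughly the cohort's 11% positive rate, and the ranking here uses impurity-based importances rather than the paper's SHAP ranking.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced data: ~11% positives, as in the FACS follow-up sample
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           weights=[0.89, 0.11], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

full = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
order = np.argsort(full.feature_importances_)[::-1]   # most important first

results = {}
for k in (5, 10, 20):
    cols = order[:k]                                   # keep only the top-k features
    clf = GradientBoostingClassifier(random_state=0).fit(X_tr[:, cols], y_tr)
    results[k] = roc_auc_score(y_te, clf.predict_proba(X_te[:, cols])[:, 1])
print(results)
```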

    Flowchart of this study.


    Interpretation of the LGBM model with top-30 features and its performance.

    (A), SHAP beeswarm plot for top features. The plot sorts features by the sum of SHAP value magnitudes over all samples, and uses SHAP values to show the distribution of the impacts each feature has on the model output. The color represents the feature value (red high, blue low). This reveals, for example, that a high baseline systolic blood pressure increases the predicted risk of hypertension. (B), The SHAP values of top-30 variables. The input features on the y-axis are arranged in descending importance, and the values on the x-axis represent the mean influence of each feature on the model output based on SHAP analysis. (C), Receiver operating characteristic (ROC) curves of top-3 features. Abbreviations: LGBM, Light Gradient Boosting Machine; AUC, Area Under the ROC Curve; ROC, Receiver Operating Characteristic; SHAP, SHapley Additive exPlanations; SBP, Systolic Blood Pressure; WHR, Waist-to-Hip Ratio.
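The "mean |SHAP|" ranking that panel B plots can be reproduced exactly for a linear model, where SHAP values have a closed form: phi_i(x) = w_i * (x_i - mean(x_i)). The numpy-only sketch below uses invented coefficients on four hypothetical predictors; the paper itself computes SHAP values for the LGBM model with the shap package.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))            # 4 hypothetical standardized predictors
w = np.array([2.0, -0.5, 1.0, 0.0])      # assumed linear-model coefficients

# Closed-form SHAP values for a linear model with independent features
phi = w * (X - X.mean(axis=0))           # shape (200, 4): one value per sample/feature
mean_abs = np.abs(phi).mean(axis=0)      # bar lengths in a SHAP summary plot
ranking = np.argsort(mean_abs)[::-1]     # feature indices, most influential first
print(ranking)                           # -> [0 2 1 3]
```

Note the ranking follows |w_i|: a feature with a zero coefficient gets exactly zero SHAP attribution.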

    Appropriate hyperparameter values for each algorithm after hyperparameter tuning.

    Abbreviations: LR, Logistic Regression; SVM, Support Vector Machine; RF, Random Forest; GNB, Gaussian Naive Bayes; LDA, Linear Discriminant Analysis; KNN, K-Nearest Neighbors; GBM, Gradient Boosting Machine; XGB, Extreme Gradient Boosting; CAT, CatBoost; LGBM, Light Gradient Boosting Machine. (DOCX)
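The tuning step can be sketched as an exhaustive grid search with cross-validation scored by AUC. The grid below is a small illustrative one, not the paper's actual search space, and scikit-learn's GradientBoostingClassifier again stands in for LGBM.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# Illustrative grid: 2 x 2 x 2 = 8 candidate hyperparameter combinations
grid = {"n_estimators": [50, 100],
        "learning_rate": [0.05, 0.1],
        "max_depth": [2, 3]}
search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      grid, scoring="roc_auc", cv=3)
search.fit(X, y)
print(search.best_params_)   # the combination with the best mean CV AUC
```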

    Performance of the ten machine learning algorithms using all features.

    Abbreviations: LR, Logistic Regression; SVM, Support Vector Machine; RF, Random Forest; GNB, Gaussian Naive Bayes; LDA, Linear Discriminant Analysis; KNN, K-Nearest Neighbors; GBM, Gradient Boosting Machine; XGB, Extreme Gradient Boosting; CAT, CatBoost; LGBM, Light Gradient Boosting Machine; AUC, Area Under the ROC Curve; ROC, Receiver Operating Characteristic; AUC-PR, Area Under the Precision-Recall Curve. (DOCX)
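The comparison table's structure (several algorithms, each scored on AUC, F1, and AUC-PR against a held-out split) can be sketched with the subset of the named algorithms that ships with scikit-learn; LGBM, XGB, and CatBoost require their own packages. The data are synthetic, with the cohort's approximate class imbalance.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=800, n_features=15,
                           weights=[0.89, 0.11], random_state=0)  # ~11% positives
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {"LR": LogisticRegression(max_iter=1000),
          "RF": RandomForestClassifier(random_state=0),
          "GNB": GaussianNB(),
          "KNN": KNeighborsClassifier()}

table = {}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    p = clf.predict_proba(X_te)[:, 1]            # predicted probability of class 1
    table[name] = {"AUC": roc_auc_score(y_te, p),
                   "F1": f1_score(y_te, clf.predict(X_te)),
                   "AUC-PR": average_precision_score(y_te, p)}
for name, m in table.items():
    print(name, {k: round(v, 3) for k, v in m.items()})
```

On imbalanced data like this, AUC-PR is the more informative of the two area metrics, which is presumably why the paper reports it alongside AUC and F1.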

    Comparative analysis of model performance indicators and diagnostic visualizations for full-featured and simplified models across ten algorithms.

    (A), Comparison of the AUC, F1-score, and AUC-PR among the models with ten algorithms using all features. (B), Comparison of the AUC, F1-score, and AUC-PR among simplified models and the model with all features. (C-D), ROC curve and confusion matrix of the LGBM model with top-30 features. Abbreviations: LR, Logistic Regression; SVM, Support Vector Machine; RF, Random Forest; GNB, Gaussian Naive Bayes; LDA, Linear Discriminant Analysis; KNN, K-Nearest Neighbors; GBM, Gradient Boosting Machine; XGB, Extreme Gradient Boosting; CAT, CatBoost; LGBM, Light Gradient Boosting Machine; AUC, Area Under the ROC Curve; ROC, Receiver Operating Characteristic; AUC-PR, Area Under the Precision-Recall Curve.
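The panel C-D diagnostics correspond directly to two scikit-learn utilities: roc_curve yields the (FPR, TPR) points traced by the ROC plot, and confusion_matrix gives the 2x2 count table at a chosen decision threshold. The labels and scores below are a toy example, not the study's predictions.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, roc_curve

y_true = np.array([0, 0, 0, 0, 1, 1, 0, 1])                     # toy outcomes
scores = np.array([0.1, 0.3, 0.2, 0.8, 0.7, 0.9, 0.4, 0.35])   # toy predicted risks

fpr, tpr, thresholds = roc_curve(y_true, scores)   # points of the ROC plot
cm = confusion_matrix(y_true, scores >= 0.5)       # rows: true class, cols: predicted
print(cm)                                          # [[4 1]
                                                   #  [1 2]]
```

The threshold (0.5 here) is a free choice; the ROC curve shows performance across all thresholds, while the confusion matrix fixes one.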